
Consumer decisions are never made in a vacuum of pure logic and objective assessment. User experience (UX) design exerts more psychological influence on consumer choices than we realize, and user interfaces (UI) have been hijacked to steer users into taking unintended actions.

These hijacked designs are called dark patterns.

By using specific psychological techniques, designers can create UX and UI products that deceptively guide users into making decisions, often without their full understanding.

Tactics such as strategic visual design and persuasive microcopy can manipulate people into making unintended purchases, consenting to invasive privacy settings, or spending more time in an application than they meant to.

And they’re not restricted to the fringes of e-commerce sites, either: A recent study found more than 1,800 instances of dark pattern use on 1,254 online shopping websites.

Big Tech’s Use of Dark Patterns

Big Tech behemoths are no strangers to using dark patterns to further their interests.

Companies have used dark patterns to conceal an “opt out” data-sharing option, burying it beneath a lengthy list of Terms and Conditions. Other design tricks include highlighting an “Accept and Continue” button in blue while leaving the alternative in plain white, where a simple “Decline” button would have sufficed.

The exploitation of people’s personal interactions and information on social media platforms has had consequences for perceived freedoms and liberties, as the Cambridge Analytica scandal showed.

We can see the use of dark design across other services, too. Homestay and lodging companies have been hit with criticism for the ways in which they display per-night pricing, often excluding costs such as cleaning, service fees and tax until the final booking process. 

Other tactics include burying the option to cancel an account deep in a company site’s architecture, forcing you to go through pages outlining all the services you’ll be missing out on.

The Push for Regulation Is Growing

As these techniques become increasingly widespread, legal steps have been taken to curb the use of dark design practices among tech companies.

Last year, the Deceptive Experiences To Online Users Reduction (DETOUR) Act was put before Congress; it would restrict deceptive design practices by large online platforms with more than 100 million active users.

Dark patterns and data privacy issues go hand in hand, so it’s vital to understand the interlinking of the legislation that looks to regulate both of these potential threats.

If passed, the act would make it illegal for companies to “design, modify or manipulate a user interface with the purpose or substantial effect of obscuring, subverting or impairing user autonomy, decision-making or choice to obtain consent or user data.”

Actions like this, along with legislation like GDPR in Europe and the upcoming California Consumer Privacy Act, signal the dawn of a new information economy. While it’s difficult to predict how Big Tech will respond, some big players like Microsoft and Mozilla have actually come out in favor of the bill.

Will Regulatory Action Be Enough?

At the moment, few companies are likely to move much beyond surface-level principles or manifestos. Such principles often lack real accountability and fail to push tech companies toward genuinely non-deceptive, ethical design that gives users a clear and fair picture.

And we can expect the drive for data to continue unhindered: Data is an invaluable resource for Big Tech that the vast majority of companies will not be willing to compromise on.

The first major GDPR crackdown on Big Tech in Europe is already being delayed by a procedural complaint from WhatsApp’s lawyers, a sign that convincing tech companies to comply won’t be simple, especially when they can easily absorb financial penalties.

Despite this, there is increasing recognition of the impact and importance of personal data.

Politicians and citizens alike are waking up to abuses: We recently saw Google get hit with a record $170 million fine for violating children’s privacy on YouTube, while Amazon, Apple and Google have all faced complaints about their alleged failure to comply with GDPR this year.


Risk of a Consumer Backlash

While regulation is necessary, it’s not just the job of governments to curb dark patterns. Tech companies ultimately need to embed this sense of responsibility to the user as deeply as they embed the need for business success through design.

UX and UI design teams need to ask important questions: How does this design influence user behavior? How does extensive data collection infringe on the rights of individuals? Neglecting these issues will not only result in legal consequences: As consumers become increasingly aware of how they use technology and how tech companies use their data, organizations leveraging dark design and misusing user data put their reputation on the line and risk alienating consumers.

Collaboration on Ethics

Foundational change that ensures respect of users’ rights may even mean collaboration between designers and regulatory bodies to develop ways to move toward applied ethics. It’s vital for both users and these digital teams to recognize the duality taking place: On the one hand, we enjoy the opportunities and convenience given to us by digital apps. On the other, we must carefully assess the risks posed to individuals and societies as a whole and ensure that our civil liberties are protected from the long arm of data-hungry Big Tech.

Going forward, understanding this duality will be even more crucial in preventing future generations from blindly interacting with technology, while a stronger focus on responsibility in design will help them maintain this control.

Only through collaboration between regulators and ethically minded digital teams will dark patterns lose their normative status and the tide of unethical practices be pushed back.




