The button did what?! Identifying misleading patterns in UX design

  • ‘Dark design’, ‘dark UX’ or ‘dark patterns’ are user experience elements that nudge users into doing things they may not fully understand or intend to do.
  • A variety of techniques, such as confusing wording, cognitive overload, added friction, misdirection and anti-privacy defaults, elicit actions that favour a business over its customers.
  • As people grow more concerned about their data and privacy, dark patterns, whether intentional or unintentional, can have dire consequences for customer loyalty.

Are you frustrated that the spinning progress wheel on your screen is so slow? Or elated when you open your preferred ride-hailing app and see plenty of cars near your location? Perhaps you’ve been buying an item online and seen, with alarm, that it’s nearly sold out and hurried to get out your credit card.

You may have been a victim of dark patterns, a form of user experience (UX) design that isn’t what it seems: progress bars that linger to make you believe something serious is being accomplished, ghost cars that are not, in fact, at the end of your street, and random number generators making it appear that purchasing your item (or, in a pre-COVID world, booking the ‘last seats’ on a flight or room in a hotel) is more urgent than it really is.

These dark designs play on human psychology to push customers towards a company’s desired response, often at the customers’ own expense. While it might seem like a great way to report better metrics, in an age of privacy breaches and data mishandling, dark design can do serious damage to a business’s brand.

The dark arts

Nudging people in a particular direction, appealing to behavioural psychology or persuading customers to purchase are not inherently bad things, nor are they always evidence of dark design. Deceptive UX is not just bad design; it is manipulative, often unethical, and aims to trick a user into doing what the business — not the customer — wants.

It’s important to note that this doesn’t always happen on purpose. Many dark patterns emerge because businesses want to see results — a certain number of newsletter signups, or items in a shopping basket, for example. Without a complete understanding of the sanctity and importance of the overall customer experience, it is easy to see why a manager asks a designer to tweak the ‘buy’ button to be more prominent than the option to remove an item from a cart.

There are many ways that individual UX elements can create a ‘misleading experience’.1 Here’s a rundown of the most obvious to watch out for:

  • Friction occurs when unnecessary steps are added to user choices to favour an ‘easier’ option (the one the business would prefer they make). For instance, ‘one step’ sign-up versus the many clicks, emails or even phone calls required to unsubscribe or cancel an account. Hiding undesirable options, for example by rendering them in small grey text, is another form of friction.2
  • Misdirection tactics make a user think one thing will happen and then trigger something else instead. Placing a brightly coloured “next” button in a consistent location and then swapping it for a button that does something different, say downloading an additional program, is an example of this. So is treating the ‘X’ button — commonly understood to close pop-up windows — as affirmative user agreement, or disguising ads as legitimate search results.3 4
  • Framing choices in a certain way can, even unwittingly, play upon user emotions. Focusing on positives and downplaying negatives, or setting disproportionate consequences (e.g. “tick yes to share your data for additional benefits or tick no to delete your account”), are two ways options are loaded towards a ‘right’ choice. Biased language is also used to scare or shame users away from a perceived ‘wrong’ choice (e.g. “are you sure you want to miss out?”, “download now or allow hackers to steal your data”).
  • Confusing language, such as double negatives, can baffle users into accepting things they don’t want. For example, when signing up to a newsletter one option may read “tick ‘yes’ to opt out of receiving marketing emails” and another “tick ‘no’ if you would not like to receive third party emails”, leaving the user unclear about what ticking each box actually gets them.
  • Default settings are often ignored by users, and dark design takes advantage of this by setting them to favour the corporation over the individual. This is particularly pernicious when it comes to privacy and data controls. While many services advertise privacy control options to reassure customers, in reality they know most won’t change them, and added friction will dissuade those who try. For instance, having one toggle to turn data sharing on, but multiple toggles (and thus more work) to turn it all off (a minimal sketch of this appears after this list).
  • Timing can be used to engender a (false) sense of urgency that forces actions, such as hurrying a user to agree to terms of service in order to read a ‘new’ message from a friend or access a site’s full functions. Messaging that implies a user is in immediate danger, to encourage the purchase of VPN or anti-virus programs, or the inclusion of ‘optional extras’ such as insurance when a user must buy quickly to avoid missing out, are both examples where the user is not given time to think through their actions. Indeed, making an optional choice (such as sharing contacts) appear mandatory to sign up to a service is another example of forced action (especially condemned when it is really a further marketing tool).5
  • Dark patterns can be the result of unintentional or misguided decision-making, but there are plenty of examples where they are purposefully designed for nefarious ends. Images with ‘dust’ or a hair superimposed on them to trick a user into swiping it off the screen, following a link in the process, for example, or ads disguised to look like a legitimate part of a site’s function, such as ‘next’ buttons that download programs or ‘download’ buttons that lead to external sites.6
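To make the ‘defaults’ and ‘friction’ patterns above concrete, here is a minimal, hypothetical sketch in TypeScript. Nothing in it comes from a real product; the setting names and structure are invented purely to contrast anti-privacy defaults (everything on, several separate switches to undo) with a lighter alternative (everything off until the user opts in, via one symmetric control).

```typescript
// Hypothetical privacy-settings model; the names are invented for illustration only.
interface PrivacySettings {
  sharePersonalisedAds: boolean;
  shareWithPartners: boolean;
  shareUsageAnalytics: boolean;
}

// Dark-pattern defaults: everything is switched on, and a user who wants out
// must find and flip three separate toggles (extra clicks, extra friction).
const darkDefaults: PrivacySettings = {
  sharePersonalisedAds: true,
  shareWithPartners: true,
  shareUsageAnalytics: true,
};

// Light-pattern defaults: nothing is shared until the user opts in.
const lightDefaults: PrivacySettings = {
  sharePersonalisedAds: false,
  shareWithPartners: false,
  shareUsageAnalytics: false,
};

// One symmetric control: opting in and opting out take exactly the same effort.
function setAllSharing(settings: PrivacySettings, enabled: boolean): PrivacySettings {
  return {
    ...settings,
    sharePersonalisedAds: enabled,
    shareWithPartners: enabled,
    shareUsageAnalytics: enabled,
  };
}

// Most users never open the settings screen, so whichever default ships is the
// choice they effectively 'make', which is why defaults carry so much weight.
console.log("dark defaults  :", darkDefaults);
console.log("light defaults :", lightDefaults);
console.log("explicit opt-in:", setAllSharing(lightDefaults, true));
```

In a real product this distinction lives in copy, layout and flow rather than in a data model, but the asymmetry the sketch makes explicit is the same one customers (and regulators) notice.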

But where is the line between marketing and trickery? In the most obvious cases, it is intent. A/B testing an email is a legitimate way to find the version of your message with the most impact. Signing everyone who visits your website up to your newsletter without asking is about boosting your metrics at the expense of your customer. Does it matter, and will the customer even notice? Increasingly, the answer is emphatically yes.

Go towards the light

As UX designer Harry Brignull, who coined the term ‘dark patterns’, told TechCrunch in 2018, “UX design can be described as the way a business chooses to behave towards its customers.”7 And customers are paying attention. In a world where transparency and trust are high on customer agendas and questionable practices are being outed in mainstream news, it is more important than ever that businesses hold onto their followers. It simply doesn’t make sense to risk loyalty for the short-term gain of additional clicks.8 Not to mention the proliferation of regulation aimed at protecting consumer rights, such as the European Union’s GDPR and the California Consumer Privacy Act, which will eventually lead to monetary penalties for misleading behaviour.9

So what then is light design? Designed ethically, user experiences should respect those interacting with them. Inherently, we all know from personal experience what this means. Not only should our interactions with brands be seamless, easy, understandable and honest, they should also allow true — informed — consent over that experience, and our data. Customers must possess true agency over their interactions with a brand, as a choice based on a false or partial understanding of reality is not really a choice at all.10

Instead of manipulating customers into choosing what a business wants, companies should trust that users will make appropriate selections when given genuine choice — and that those users will reward the honesty. After all, now more than ever, customers will abandon a brand — even one they are loyal to — after just a couple of bad experiences.

Safeguarding customers with responsible technology practices will go a long way to building real, sustainable trust that leads to business success. That way, you’ll always be in the light.

With thanks to Christopher Ong.



References

  1. https://www.darkpatterns.org/types-of-dark-pattern
  2. https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf
  3. https://www.extremetech.com/extreme/229040-microsofts-latest-trick-clicking-x-to-dismiss-windows-10-upgrade-doesnt-stop-upgrade-process
  4. https://techcrunch.com/2020/01/23/squint-and-youll-click-it/
  5. https://www.fastcompany.com/3051906/after-lawsuit-settlement-linkedins-dishonest-design-is-now-a-13-million-problem
  6. https://uxdesign.cc/stop-calling-these-dark-design-patterns-or-dark-ux-these-are-simply-asshole-designs-bb02df378ba
  7. https://techcrunch.com/2018/07/01/wtf-is-dark-pattern-design/
  8. https://www.fastcompany.com/3060553/why-dark-patterns-wont-go-away
  9. https://www.fastcompany.com/90452333/why-you-still-cant-escape-dark-patterns
  10. https://www.accc.gov.au/media-release/accc-takes-action-against-trivago-over-hotel-price-advertisements