Be Aware of Deceptive Design — Don’t Let It Use You (or Anyone Else)
With great power comes great responsibility
User experience design, at its core, blends multiple disciplines to solve people's problems. We need to know about visual design, technology, psychology, and more to create the experiences our users rely on to accomplish their goals. Within those disciplines, we need to understand how people think, so that we can structure an experience that lets them move through the system easily and efficiently.
All this knowledge is a double-edged sword. On the one hand, knowing someone's motivations and how they perceive the elements on a screen is very useful for helping them: we can highlight the most valuable elements on a screen so that people engage with them first. On the other hand, if we know where people want to look and click, we can put whatever we want in those places, good or bad.
As the designers of these experiences, we have great power to influence what our users do. We can use design techniques to shape minds, nudge users, and adjust behavior. We can design an experience to shift metrics and numbers in our favor, at the expense of users. Or we can use that power to do good, treat our users right, and help them accomplish their goals.
Too often, there is a temptation to steer the product toward taking advantage of human behavior, manipulating users into engaging with a system for the benefit of the business, not the user. This temptation often leads teams to practice deceptive design.
What is deceptive design?
Deceptive design, sometimes called "dark patterns", is the practice of using psychology and human nature against users to elicit specific behavior from them. It's based on tricking users into doing things they don't want to or didn't mean to do, like signing up for a service, using a platform for extended periods of time, or engaging with content unwittingly.
Deceptive design is insidious. It promotes metrics and revenue at the expense of users. It’s recurring credit card charges for a service you forgot you signed up for. It’s cancellation policies that prevent users from stopping a service. It’s fear of missing out on sales, deals, or amazing content through the use of shame, “limited time offers”, and false exclusivity.
In user experience design, it shows up in many places, and in many forms.
Upsells
One of the most common places for deceptive design to occur is pricing. Businesses want people to pay as much as possible for their services, and they make more money by convincing users to spend more, sometimes on things they don't even need.
Most pricing pages leverage this tactic, which predates its prevalence in digital design. In this example from Pluralsight, an online learning platform, we can see several elements of the pattern that are consistent across businesses pushing users toward specific pricing tiers.
The most visually dominant action on the page is the Start Free Trial CTA in the Premium pricing option. The Premium option has visual dominance over the Standard option, with copy that implies it's the "best opportunity" for users. It shows a side-by-side comparison of the benefits of Premium versus Standard pricing, and explains how much money users will save by choosing Premium over Standard.
Leveraging color, contrast, hierarchy, and copy, Pluralsight pushes users toward its preferred option, not necessarily the user's.
False notifications
False notifications are a pattern where the system sends users notifications in the hope that they will engage with the platform. The pattern is effective at getting users to check in, because a notification implies something new related to their account or preferences.
In the case of Facebook, the service will send notifications to users in an attempt to increase platform engagement. Here we see a notification about a Facebook Marketplace item the user may want to purchase. Effectively, the company is hiding an advertisement in its notifications, prompting the user to look at the platform thinking there’s a new post or message, when in fact it’s a call to action to go to Facebook’s site to spend money on an item they showed no interest in buying.
LinkedIn does the same thing, notifying users whenever someone looks at their profile. Hoping to leverage fear of missing out, LinkedIn lets users know that someone took a peek at their credentials, but doesn't say who that person was unless the user signs up for a Premium account. LinkedIn gates a feature behind a paywall, then notifies users about the feature and relies on loss aversion to incentivize people to pay for the service.
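Under the hood, a notification like the Marketplace example is just a payload with a title, a body, and a deep link back into the app. The sketch below is purely illustrative; every field name and value is invented, and it does not depict any real Facebook API. What makes the pattern deceptive is visible in the shape of the data: nothing in it references the user's own activity.

```typescript
// Hypothetical engagement-bait notification payload. All fields are
// invented for illustration; no real platform API is shown here.
const marketplaceBait = {
  title: "New listing near you",
  body: "A mountain bike you might like was just posted",
  deepLink: "app://marketplace/item/12345",
  // Conspicuously absent: any reference to the user's own messages,
  // posts, or friends. It's an advertisement dressed as an alert.
};
```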
Roadblocks
Roadblocks are artificial barriers companies put in place to make it harder to opt out during a task. Canceling a service is a common example: several screens of confirmations and account information stand between the user and the screen with the cancellation functionality, if they can find the button at all.
Can you find the cancel subscription option in the example above?
How long did it take you to notice it? If you can't find it, it's toward the upper middle-right of the screen.
A subset of roadblocks like the one above is commonly referred to as the "roach motel": it's really easy to opt in to something, but extremely hard to leave.
This is the registration page of Planet Fitness, a gym franchise. It's really easy to sign up for the service online: by entering a few details, you can create an account, make a payment, and become a member. However, canceling is significantly harder.
Planet Fitness has no issue letting you sign up online, but canceling your gym membership is not something you can do digitally. You have to either go in person to the specific gym you signed up for, or mail a physical letter to your home club, and only if that club accepts written notices.
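The asymmetry is easy to see when expressed as data. Here is a hypothetical sketch of the same membership flow as a channel matrix; the shape and every name in it are invented for illustration, not taken from any real system.

```typescript
// A hypothetical "roach motel" expressed as a channel matrix.
// Every convenient channel supports joining; none supports leaving.
const membershipChannels = {
  signUp: ["web", "mobileApp", "inPerson"],
  upgrade: ["web", "mobileApp", "inPerson"],
  cancel: ["inPerson", "certifiedMail"], // no digital way out
};
```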
Phantom content
Phantom content is a design practice where the system serves up fake users or interactions in an attempt to increase engagement, improve morale, or otherwise affect the emotional state of users to promote satisfaction with the product.
This example comes from an online game called Pokemon Unite, in which users fight 5v5 battles against real-world opponents in an esports-style competition. Only one team can win each battle. If a user loses a few battles in a row, the game stops matching them against real opponents and instead pairs them with computer-generated opponents of a significantly lower skill level. This lets the user feel better about beating an opponent and prevents the kind of losing streak that would frustrate them into abandoning the game.
We can tell from the screenshot above that the user has encountered these "bots" by looking at the player cards and names of each character. The first sign: the characters in the first row all have the exact same pose and background, which is very unlikely in a game where players can personalize their avatar's appearance, pose, and background. The second sign: none of the opponents have names. To play the game, a user needs to create a username, so no username implies no user.
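The game's actual matchmaking logic isn't public, but the behavior described above can be sketched in a few lines. Everything here, from the Player shape to the lossStreak field to the threshold of three, is an assumption made purely for illustration.

```typescript
// Hypothetical sketch of loss-streak matchmaking. All names and the
// threshold value are invented; the real logic is not public.
interface Player {
  username: string;
  lossStreak: number;
}

const BOT_MATCH_THRESHOLD = 3; // assumed consecutive losses before bots

function chooseOpponents(player: Player): "realPlayers" | "easyBots" {
  // After enough losses in a row, quietly substitute weak computer
  // opponents so the next match is a near-guaranteed, morale-saving win.
  return player.lossStreak >= BOT_MATCH_THRESHOLD
    ? "easyBots"
    : "realPlayers";
}
```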
Addictive patterns
Addictive patterns leverage human psychology to drive repeated behavior, at the expense of the user and for the profit of the system. They hook users into behavior they may not want to engage in, but cannot stop or don't realize they are repeating.
One of the most common addictive patterns in digital design is the "infinite scroll". Perhaps you've experienced it before: as you scroll to the bottom of a screen, it looks like the content ends. But after a moment or two, the screen reloads and there's additional content to consume. So you keep scrolling until you hit the end, just for the content to refresh and for you to repeat the process.
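Mechanically, the pattern is simple. Below is a minimal sketch of how an infinite scroll might be wired up in a browser, assuming a hypothetical fetchNextPage() API and renderItem() helper (both invented here): a sentinel element sits at the bottom of the feed, and whenever it scrolls into view, another page of content is appended, so the end of the feed keeps retreating.

```typescript
// Hypothetical paging API; in a real product this would hit a server.
async function fetchNextPage(): Promise<string[]> {
  return Array.from({ length: 10 }, (_, i) => `post ${Date.now()}-${i}`);
}

// Hypothetical renderer for a single feed item.
function renderItem(text: string): HTMLElement {
  const el = document.createElement("article");
  el.textContent = text;
  return el;
}

const feed = document.querySelector("#feed")!;
const sentinel = document.querySelector("#feed-sentinel")!;

// Fire whenever the sentinel at the bottom of the feed becomes visible.
const observer = new IntersectionObserver(async (entries) => {
  if (entries[0].isIntersecting) {
    const items = await fetchNextPage();
    for (const item of items) {
      feed.appendChild(renderItem(item)); // the "end" moves further away
    }
  }
});

observer.observe(sentinel);
```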
This pattern is extremely common in social media products. Platforms like Facebook, Twitter, and Instagram all want their users to stay and be part of the community, producing and consuming content, so that the companies can monetize user attention via ads. As a result, you can't really stop scrolling on these sites, because the businesses don't want you to.
Shaming
Shaming (sometimes called confirmshaming) is a way of structuring an opportunity for a user to opt in to a service. By using visual design to highlight the intended action and copy that frames the effect of each choice, products can set up an opt-in so that a user feels bad if they don't participate.
In the example above, we can see a pop-up from Google asking users to try Gmail on their phones. The two options for a user are "try the service" or "don't try the service". However, Google wrote the copy to make users feel bad about opting out and good about opting in. Making users click "I don't want smarter email" to cancel implies that they prefer a "stupider" way to check their email. As a result, the product tries to shame users into using it.
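Note that the mechanics of the dialog are identical either way; only the decline copy changes. A quick sketch makes the contrast plain (the strings below paraphrase the example above and a neutral alternative):

```typescript
// The same two-button dialog with two different decline strings.
// The behavior is identical; only the framing of opting out changes.
const shamingDialog = {
  accept: "Try the new Gmail app",
  decline: "No thanks, I don't want smarter email", // opting out = self-insult
};

const honestDialog = {
  accept: "Try the new Gmail app",
  decline: "Not now", // opting out is just a choice
};
```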
Design for clarity, not deception
As designers, it's our responsibility to design in a way that helps people. We should help people reach their goals, solve their problems, and make their lives better. We shouldn't design products that shame people into participating, lie to them, or make it hard for them to leave. Use your power as a designer to let people use your products with clarity, not to let the products use them.