
Mobile Dark Patterns

Mobile Matters

Designing for every screen

A column by Steven Hoober
November 4, 2019

In addition to the misuse, misunderstanding, and bad implementations of perfectly good UX design patterns, we’ve long understood the concept of anti-patterns. These are things that we know don’t work well for users. We’ve clearly defined and documented them so we can avoid using them.

However, as much as I’ve studied the concept of pattern languages and libraries, as cynical as I am about how businesses use and abuse product design, I simply didn’t expect the rise of dark patterns. Dark patterns are design patterns that are effective, but evil. When they succeed, they drive users to make accidental or uninformed decisions against their best interests.

During the creation of dark patterns, there’s typically much argument that they are positive for the business and that they merely surface options and encourage behaviors the average person would choose anyway. I don’t think I need to convince you that businesses often place their own success above that of users and that it is the job of UX professionals to remind everyone that user-centric design must also be moral, ethical, and unambiguously legal design.


Dark Patterns Work

Dark patterns are not anti-patterns, because they work as intended. However, their intent is counter to user needs. But have no doubt—they do work. A recent study found that, with a neutral presentation of an add-on product—a free, identity-protection service—only 11% of users accepted it, while the most aggressive dark patterns pushed acceptance to 42%.

Whether a dark pattern constitutes deception by forcing accidental clicks or through persuasion makes little difference. A design change caused users to accept a decision they would not have made on their own. In other contexts, we’d call a practitioner of such behavior a con man.

Beware of Sneaky Dark Patterns

For UX designers to fall prey to dark patterns, it is not necessary for there to be an actual cackling villain; to have your meetings in a dank, candelabra-lit castle; or to decide to use dark patterns because you hate users. These patterns sneak up on you and weasel their way into your products as you try to meet business requirements or simply accept that what others are doing is best practice.

Your boss might coerce you by saying, “Encourage users to buy the upgrade. It makes people happy.” Your legal department might influence you by saying that it is their formal decision to preselect a checkbox for the user. Or you might just copy something neat looking from a design-inspiration, social-media account, without understanding that the user interface causes unwitting interactions to occur.

Be a conscious, conscientious designer. Always understand what you are doing, why, and what effect it could have on the user’s experience of your product.

Thousands of dark patterns exist. Many have been discussed exhaustively, in articles all over the place. But in this column, I’ll describe some mobile-specific dark patterns to watch out for. Many go beyond typical user-interface layer tricks, such as hiding the labels for alternative actions by giving them inadequate contrast.

Too Much Location Information

Location privacy often gets discussed after various exploits and breaches have already occurred. However, one thing that we too often ignore is that many apps and Web sites simply don’t need the location data they request. Even those that do need some location information often do not need your precise location.

For example, a mobile app or site can provide weather information using coarse location—the location information that a cell site, sector, or triangulation provides. Knowing the user’s location to within a block or three is accurate enough for many services such as news, sports, and weather.

However, all too often, they don’t give users that option—instead insisting that users allow fine location, which GPS, other GNSS, or Wi-Fi provides. This is often a bit sneaky: most apps or sites don’t admit which location service they’ll enable. Weather Underground, however, simply admits that it wants GPS, as Figure 1 shows.

Figure 1—Location-decision screen on the Weather Underground app

Sure, users can also enter a location manually, but there’s no provision for the automatic use of coarse location, so the app or site still gets fine location information.
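If an app truly needs automatic location for something like weather, the platform makes the less invasive choice easy. The following Kotlin code is a minimal sketch, assuming a standard AndroidX Activity, of requesting only coarse location; the class name and request code are illustrative.

import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

class WeatherActivity : AppCompatActivity() {

    private val locationRequestCode = 42 // arbitrary request code

    // Ask for coarse location only. A block-level fix is plenty for a
    // weather forecast, so ACCESS_FINE_LOCATION never appears here.
    private fun requestCoarseLocation() {
        val alreadyGranted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.ACCESS_COARSE_LOCATION
        ) == PackageManager.PERMISSION_GRANTED

        if (!alreadyGranted) {
            ActivityCompat.requestPermissions(
                this,
                arrayOf(Manifest.permission.ACCESS_COARSE_LOCATION),
                locationRequestCode
            )
        }
    }
}

The app’s manifest would likewise declare only the coarse permission, so both the store listing and the system prompt reflect the narrower request.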

What does an app or site do with this location information? Well, once it has it, anything it wants. Again, most apps and sites hide how they’ll use location information and simply say they need it. AccuWeather at least admits that content—which we presume means weather reports—is only one use; it also uses location for advertising and other vaguely defined purposes, as Figure 2 shows.

Figure 2—AccuWeather dialog explains their use of location information

Many mobile apps and sites imply that they need location information to work better, but then use it too precisely and apply it too broadly.

When designing for mobile, ask for location information only if you must have it. Then use it only for what customers might expect, instead of tricking them into sharing their personal data with every advertiser.

Notification Fatigue

For some thirty years, we have been lamenting the misuse and overuse of pop-up dialog boxes. Somehow, mobile designers have taken the wrong lesson from this and now happily design a notification layer at the top of a page, an upsell at the bottom, and a dialog box on top of everything.

On installation of an app or a user’s first visit to a Web site, many products insist on throwing far too much at users all at once: permissions requests, app tours, upsells, and in-app purchases—and, far too often, trickery. That AccuWeather dialog box in Figure 2 might seem open and honest, but all apps ask for permissions, so I’d bet almost no one reads it, and users aren’t aware that they’ve just explicitly given permission for location-based advertising.

The Pinterest app, which is shown in Figure 3, provides an even better example. It has a drawer, in which the user must confirm her email address, on top of the second of two banners informing users about privacy-policy changes, and all of this after several other permissions pop-up dialog boxes and onboarding steps.

Figure 3—Pinterest app’s permissions drawer over permissions banner

Look closely at that drawer. You’ll see a checkbox—and it’s preselected, of course.

Users’ natural inclination has always been to get rid of every dialog box as quickly as possible and get back to the actual app or Web site so they can get to what they want to do. In this case, we can expect that most people would just tap Save without reading to get on with using the app.

This dark pattern takes advantage of that typical behavior in the hope that users won’t think too hard about granting permissions, sharing all their contact info, or agreeing to receive promotional email messages all the time.
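The ethical alternative is straightforward: make such choices opt-in. Here is a minimal Kotlin sketch of what that could look like, assuming a hypothetical Android consent drawer; the view names and the saveConsent helper are illustrative, not any real app’s API.

import android.widget.Button
import android.widget.CheckBox

// Hypothetical persistence call; a real app would write to its own
// preferences store or consent service here.
fun saveConsent(marketingEmail: Boolean) { /* ... */ }

fun bindConsentControls(marketingCheckBox: CheckBox, saveButton: Button) {
    // Opt-in, not opt-out: the box starts unchecked, so a hurried tap on
    // Save grants nothing the user did not actively choose.
    marketingCheckBox.isChecked = false

    saveButton.setOnClickListener {
        // Persist only what the user explicitly selected.
        saveConsent(marketingEmail = marketingCheckBox.isChecked)
    }
}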

Limiting Mobile Functionality

I never thought we’d be having this discussion in 2019, but many organizations still think of the mobile experience as an offshoot, so they make some functions or information available only on the desktop Web site.

As the Strava heat-map story famously revealed in 2018, the fitness-tracking service had let private data leak out, allowing such things as secret military bases to be found. This occurred because Strava allowed the sharing of too much data by default, then hid the settings on mobile, displaying them only on the desktop Web site. The vast majority of their users spent all of their time in the mobile app, so didn’t know this was happening. They couldn’t have discovered it even if they had gone into the app’s settings to check their location-privacy and sharing options.

Strava has since added most settings to the mobile app and no longer shares location information by default. Most, but not all—the Training Log sharing settings, for one, still do not appear in the mobile app.

Strava also has sharing functions that operate independently, but they are not visible on mobile. While the app now defaults to keeping location information private, the popups that appear on the Web site were designed with the best dark patterns in mind—a big Accept button and a tiny Dismiss button—so it’s all too easy to accidentally share that information. A user could visit the Web site once and accidentally share location information, then use only the mobile app going forward and never again have a way of seeing that there is a data leak.

Do-or-Die Permissions

Many apps start out by presenting some permissions requests, then close if the user rejects them. While some of these permissions are marginally understandable—for example, accepting Terms and Conditions—many seem unnecessary; and by closing the app, causing errors, or presenting the permissions over and over, these apps push users into accepting permissions they neither need nor want.

Users can get into the AccuWeather app after rejecting permissions, but if they actually try to do anything, they’ll run into roadblocks. Trying to load the menu to see what the app can do displays the message “A location is required to use this application,” as shown in Figure 4.

Figure 4—A location error message in the AccuWeather app

But this is not actually true. Other weather apps allow manual location entry and, if you look at this message closely, there’s a way to provide location information manually here as well. It’s small and grayed out, but it does work. Once the user enters a location, the app generally works.

But the app doesn’t want the user to enter this information manually, so it doesn’t try to be helpful. Instead, it tries to trick the user into providing fine location data—elsewhere, it refers to using GPS for location—that is not necessary, implying the app won’t even run without it.

On the menu in Zillow, there’s a convenient-looking Nearby Open Houses item. This might be a useful feature for those spending time driving around to look at houses on a Sunday. But if the user clicks it and hasn’t previously agreed to share location information, a permission dialog box appears, as shown in Figure 5.

Figure 5—Location permissions dialog box for the Zillow app

Denying permission simply closes the dialog box, leaving the user still staring at the menu. No fallback is provided. Plus, we know location information isn’t necessary. Their Web site also has an open-house search tool that doesn’t ask for location at all. The user can just type or scroll to an area and see open houses. So the mobile app is going out of its way to trick users into sharing precise location information.

Remember, not only might people want their own locational privacy, they could be trying to help someone remotely or be seeking information about someplace they’ll be in the future. So requiring location information is not just unhelpful; it is another instance of the overuse of sensor data.

Such attempts to get users to reveal private data are not just tricky, they fail to meet the needs of every user. Whoever rejects sharing information or giving permissions should be allowed to do so. Then, whatever services you can provide with less data or with manually entered information should work as well as possible.
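As a sketch of what that graceful fallback might look like on Android, the following Kotlin uses the AndroidX Activity Result API to request coarse location and, when the user declines, routes to manual entry rather than a dead end. The activity and both show functions are hypothetical placeholders for the app’s existing screens.

import android.Manifest
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class OpenHousesActivity : AppCompatActivity() {

    // Register the permission request up front; the callback receives a
    // simple Boolean indicating whether the user granted it.
    private val locationPermission = registerForActivityResult(
        ActivityResultContracts.RequestPermission()
    ) { granted ->
        if (granted) {
            showNearbyOpenHouses()     // use coarse location
        } else {
            showManualLocationSearch() // same feature, no sensor data
        }
    }

    private fun onNearbyOpenHousesTapped() {
        locationPermission.launch(Manifest.permission.ACCESS_COARSE_LOCATION)
    }

    // Hypothetical placeholders for the app’s existing screens.
    private fun showNearbyOpenHouses() { /* query open houses by location */ }
    private fun showManualLocationSearch() { /* let the user type or scroll to an area */ }
}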

Unclear or Impossible Touch Interactions

From dark patterns on the Web, we’re all familiar with popups or ads whose close X is so small that it’s hard to click, so the user misses and gets a full-screen signup form or a loud video ad instead.

Just as with notifications, mobile makes this dark pattern worse. Mouse users can slow down to achieve greater accuracy, so they have started to learn how to cope with tricky ads. But touch has inherent accuracy limits, so mobile users don’t get that option. For example, Figure 6 shows an all-too-typical advertising banner at the bottom of a news article.

Figure 6—Dark-pattern Close button on the National Review banner

While it does provide a close icon, the touch area of the close icon is approximately 30 times too small. Adding to the darkness of this pattern, the button is on top of the banner, so some 80% of misses would cause the banner’s link destination to load. This is precisely what the user does not want, but this dark pattern drives clicks and, therefore, revenue to the Web site.
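There is no technical excuse for a hit area that small. As a minimal Kotlin sketch, assuming an Android banner whose close icon is a direct child of the banner view, the code below expands the icon’s touch target with a TouchDelegate so that a near miss dismisses the banner instead of loading the ad.

import android.graphics.Rect
import android.view.TouchDelegate
import android.view.View

// Assumes closeIcon is a direct child of banner; TouchDelegate bounds are
// expressed in the parent view’s coordinate system.
fun expandCloseTouchTarget(banner: View, closeIcon: View) {
    banner.post {
        val hitRect = Rect()
        closeIcon.getHitRect(hitRect)

        // Pad the hit area by roughly 24dp on every side, pushing the
        // target toward the ~48dp minimum that platform guidelines suggest.
        val extraPx = (24 * banner.resources.displayMetrics.density).toInt()
        hitRect.inset(-extraPx, -extraPx)

        // Route touches in the padded rectangle to the close icon.
        banner.touchDelegate = TouchDelegate(hitRect, closeIcon)
    }
}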

Many Web sites try to tell users to download their app instead. I’ve been in meetings where people have proposed this solution, with all good intent on the part of most of them. While users might well have a better experience in the app, it’s hard to justify the assumption that people aren’t already aware these apps exist. Implementing a banner such as the one Medium uses, shown in Figure 7, becomes a fully dark pattern.

Figure 7—App redirect banner on an article page on the Medium site

The banner has a prominent Open in App button and is formatted like a notification—priming users to deal with it automatically—but the button is actually meaningless. Clicking anywhere on the banner opens the app store, enabling the user to download the app and, eventually, perhaps read the article there.

At the far left of the banner is a close icon that is far too small to click easily, so most attempts to dismiss the banner would instead open the banner-link’s destination. Not recognizing that this dark pattern is at play, those looking at analytics would celebrate the huge number of users clicking to get the app, believing they are doing a good thing.

The Waze App Tour is typical of these sorts of tour interstitials. It tries to force users down the continue path to view every feature, then often to agree to things they don’t want to do. The close icon shown in Figure 8 is bigger than usual—although still too small and unlabeled—but its placement is very odd.

Figure 8—App tour page in the Waze app

Close and cancel functions should either appear in a corner or be presented as the opposite of a positive action. In this case, the close button is not near anything in particular; plus, it is almost completely grayed out, while the GO button has a button shape, making it far more prominent.

Suddenly Not Mobile

As I’ve said many times before, we need to create mobile experiences that are mobile specific. For example, they should use contact intents instead of Web forms. But a lot of mobile apps still fall apart around the edges, linking to other channels very badly.

In many apps, navigating to read legal or customer-care information takes the user to an organization’s Web site. Sometimes it just opens a browser and shows a desktop view, but some apps—such as Flipboard, which is shown in Figure 9—embed the page in a WebView.

Figure 9—Flipboard app’s privacy policy

Although it’s a perfectly good tactic to include WebViews in apps, many app developers do this poorly. In this case, the app-development team forgot to tell the Web-development team they were doing this, so the whole Web site appears, with a second masthead and title. Plus, if the user opens the menu, the app indicates that the user is not logged in.

Where personal privacy settings are concerned, this approach creates a disconnect that risks users making uninformed choices. Even simply failing to consider the type and format of the content can make it nearly impossible for users to read it and find the information they need to make informed decisions.
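If an app must embed Web content, it should at least tell the site that it is doing so. In the Kotlin sketch below, the URL and the embedded=true query parameter are hypothetical; the point is simply to give the Web team a signal they can use to serve an app-friendly page without the second masthead or sign-in prompts.

import android.webkit.WebView
import android.webkit.WebViewClient

fun showPrivacyPolicy(webView: WebView) {
    // Keep navigation inside the WebView rather than bouncing to a browser.
    webView.webViewClient = WebViewClient()

    // Signal the embedding context so the site can drop its own masthead,
    // navigation, and sign-in prompts when rendered in-app.
    webView.loadUrl("https://example.com/privacy?embedded=true")
}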

But the content could be even more poorly formatted. Consider the Web-site checkout process of Fastenal, an industrial supplier, shown in part in Figure 10.

Figure 10—Dialog on Fastenal does not display properly on mobile

The entire Web site is a poor attempt at responsive design, with many parts that do not fit on the page, wrap oddly, or disappear, making their content unreadable. In this case, what are apparently important disclaimers cannot be read at all. But if the reader cannot read the content, how does he know what he is missing, accidentally agreeing to, or signing up for?

Ethical and Legal Design

You might argue that the last example is not really a dark pattern because it happens by accident, more or less. But think about the whole structure that allows these sorts of things to exist. Often, there’s no deliberate attempt to make an app, site, or function hard to use; it just turns out that way because teams are not paying attention to broader issues.

I have worked on many projects whose goal was to increase sales because analytics showed lower-than-expected close rates. But does anyone think about revisiting a page design because people inadvertently sign up for email blasts or agree to share their contacts more than expected? On the page shown in Figure 10, a preselected checkbox for signing up for package insurance might be invisible, effectively tricking people into spending money on something they didn’t truly agree to purchase. This is not ethical design.

I have by no means provided a comprehensive list of mobile dark patterns—just a few wide-ranging examples to help make sure you’re keeping your ethical obligations in mind. UX design takes its name from our intent to keep the user in mind. Even when it’s not obvious how a design solution might be misused, misunderstood, or abused, as in some of these examples, your intent should be to do the right thing for users.

Remember, this isn’t some niche design-only discussion. Many people are noticing these ethical violations, including regulators, lawmakers, and courts. The move to prevent dark patterns has already started, with laws around accessibility covering some cases and other laws being proposed specifically to prohibit dark patterns. If technology companies don’t regulate themselves, governments will do it for them. 

Resources

Mathur, Arunesh, Gunes Acar, Michael Friedman, Elena Lucherini, Jonathan Mayer, Marshini Chetty, and Arvind Narayanan. “Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites.” Proceedings of the ACM Conference on Human-Computer Interaction (CHI 2019). Glasgow, Scotland: Association for Computing Machinery, 2019. Retrieved October 20, 2019.

Luguri, Jamie, and Lior Strahilevitz. “Shining a Light on Dark Patterns.” University of Chicago Coase-Sandor Institute for Law & Economics Research Paper No. 879. Chicago, Illinois: University of Chicago, August 1, 2019. Retrieved October 20, 2019.

Hsu, Jeremy. “The Strava Heat Map and the End of Secrets.” Wired, January 29, 2018. Retrieved October 20, 2019.

Hoober, Steven. “Understanding Location.” UXmatters, March 5, 2018. Retrieved October 20, 2019.

Ekblom, Jonas. “Internet Users Must Actively Consent to Use of Cookies, EU Court Rules.” Reuters, October 1, 2019. Retrieved October 20, 2019.

President of 4ourth Mobile

Mission, Kansas, USA

Steven Hoober

For his entire 15-year design career, Steven has been documenting design process. He started designing for mobile full time in 2007 when he joined Little Springs Design. Steven’s publications include Designing by Drawing: A Practical Guide to Creating Usable Interactive Design, the O’Reilly book Designing Mobile Interfaces, and an extensive Web site providing mobile design resources to support his book. Steven has led projects on security, account management, content distribution, and communications services for numerous products, in domains ranging from construction supplies to hospital record-keeping. His mobile work has included the design of browsers, ereaders, search, Near Field Communication (NFC), mobile banking, data communications, location services, and operating system overlays. Steven spent eight years with the US mobile operator Sprint and has also worked with AT&T, Qualcomm, Samsung, Skyfire, Bitstream, VivoTech, The Weather Channel, Bank Midwest, IGLTA, Lowe’s, and Hallmark Cards. He runs his own interactive design studio at 4ourth Mobile.
