Open up a web browser or power up a smartphone—pretty much essential for modern-day living—and you’re walking straight into a privacy minefield. That much you know. Especially after the news earlier this week that Unroll.me, a popular service that lets you unsubscribe from multiple email lists with a single click, was selling data it had mined from all your mail. What you might not realize is that surrendering your privacy isn’t just an accident—it’s the purposeful design of a particular breed of app makers and web designers employing a practice known as “dark patterns.”
The practice of using interface design, social engineering, and other tricks to funnel users in a particular direction has come to be known as “dark patterns,” a concept that even has its own website, complete with a hall of shame featuring some alarming examples.
We’re talking about bonus purchases that appear by default in your shopping basket (a trick now illegal in the EU), confusing mixes of opt-in and opt-out check boxes when you sign up for services, and other forms of misdirection on the web and in apps. Before you know it, you’ve agreed to share your location for the next 40 years or upgraded your plane ticket by accident.
User experience consultant Harry Brignull set up darkpatterns.org, and coined the phrase “dark patterns” itself, to try to highlight—and stop—this kind of sleight-of-hand.
“It started with one lone example—a low cost airline that was using a shady technique to trick users into buying insurance with their flights,” Brignull told Gizmodo. “I came up with this idea that by giving them a catchy name and publicizing them, it will help consumer awareness and deter companies from using them.”
Brignull points to two particularly pertinent types of dark pattern on the modern web: Friend Spam and Privacy Zuckering (yes, named after serial offender Facebook’s founder). You’ve probably come across both in the past.
With Friend Spam, you’re asked to give access to your contacts list, ostensibly for your own benefit—to find friends you might know on a particular service. But what actually happens, most of the time, is your friends get spammed with invitations to join whatever new instant messenger you’re testing out.
It’s an underhanded tactic—and LinkedIn’s attempt at one version of it ended up costing it in the region of $13 million. You’ll find some version of it (often less aggressive) used by almost every social media company, including Facebook and Twitter. There’s no way to be sure if you’re about to bombard your contact list with spam beyond some light googling to see if the site you’re using is a serial offender. Or you could just avoid using the “find friends via email” feature altogether.
Privacy Zuckering covers all the ways companies try and get you to share more about yourself. Ever been invited to complete some extra boxes on your Facebook bio? Or contribute a bunch of unnecessary information when trying to get a free credit score online? Then you’ve been Zuckered.
Privacy Zuckering is a little less egregious than it used to be. Consumers are growing more internet savvy. Which is why, according to Brignull, much of this practice has been moved behind the scenes, via terms and conditions you can’t avoid if you want to use a particular site or social network.
There are plenty more examples (have a look at @darkpatterns)—making premium selections by default, or making it increasingly hard to opt out of services, or making very different options look too similar. Much of the time companies are relying on you to either not notice what’s going on or to be too busy to do anything about it.
According to Brignull, the low cost airline that sparked his initiative has since mended its ways, but other companies have filled the gap.
“The dark patterns initiative has worked to some extent, but the web is a very big place,” says Brignull. “The situation has matured.”
The ethics of dark patterns aren’t always clear-cut either. Uber, for instance, would argue its app design makes more money for drivers and leads to a better service for passengers, even if you’re ordering the fanciest Uber when you’d prefer something cheaper. But other user experience (UX) design choices undoubtedly cross the line into straight-out deception.
Chris Nodder, a UX consultant and author of Evil by Design, says it’s difficult for users to stay ahead of the curve. “Once users become aware of a certain kind of trick, the sites start changing it out slightly so users don’t notice any more,” he told Gizmodo. “And more sites seem to be making use of dark patterns than ever before.”
“Maybe they think that because other sites are doing it, it’s okay for them to do it too.”
Nodder highlights the usual suspect as being particularly problematic: terms and conditions. In Evil by Design he cites the example of a software company that hid a reward in its terms and conditions—it took four months and 3,000 downloads for someone to claim it.
“The T&Cs are presented at a time when you’re trying to complete a very different task,” he says. “[They’re] a barrier between you and using the app, so it’s no wonder that people hit the big shiny ‘next’ button rather than reading through the whole thing.”
Users should also watch out for apps and sites that try and collect information or permissions piecemeal, according to Nodder. These dark pattern attempts usually include some simple-sounding rationale for collecting the information—like apps claiming to want location data to show you when your friends are around.
“If you saw one long form with all the personal data fields on it, you’d never fill it in,” says Nodder. “By the time you do realize, you’re typically so invested that it’s hard to quit.”
Both our experts suggest getting yourself educated about what you might be signing up for, and weighing up the benefits you’re getting in return. Sites like Dark Patterns and Terms of Service; Didn’t Read can help, but no matter how sinister the tricks, the onus is still on all of us to clarify what we’re agreeing to. Ultimately, you might just be better off using fewer apps and services in the future.