Senators Mark Warner (D-Virginia) and Deb Fischer (R-Nebraska) have introduced legislation to ban so-called “dark patterns,” tactics designed to trick users into handing over access to their data, Reuters reported on Tuesday.
Dark patterns, a term first popularized by the website darkpatterns.org, describe everything from UI elements to technical tricks designed to lure users into taking actions they might not otherwise agree to, such as presenting in-app purchase buttons or data-sharing agreements designed to look like more mundane functions. UX expert Harry Brignull characterizes it as a catch-all term that also covers things like a “roach motel: a design that makes it very easy for you to get into a situation, but very hard to get out.” One such example is the maze of menus Amazon customers must navigate to delete their accounts.
If the concept is still a little fuzzy, Dark Patterns runs a Wall of Shame highlighting some of the worst practices.
The bill, called the Deceptive Experiences To Online Users Reduction (DETOUR) Act, does not distinguish between mobile apps and desktop browsing experiences.
According to a draft of the bill, the legislation makes it illegal for large, public online services of more than 100 million monthly active users to “design, modify, or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.” It also mandates that such services cannot “subdivide or segment” users into groups for “the purposes of behavioral or psychological experiments” without informed consent, and says they cannot target those under the age of 13 “with the purpose or substantial effect of cultivating compulsive usage, including video auto-play functions initiated without the consent of a user.”
Additionally, Reuters noted, companies would have to create independent review boards for the approval of behavioral or psychological experiments, as well as create professional standards bodies that would work in coordination with the Federal Trade Commission:
Under the terms of the bill, social media companies would create a professional standards body to create best practices to deal with the issue. The Federal Trade Commission, which investigates deceptive advertising, would work with the group.
According to ZDNet, practices that could be targeted under the bill include suddenly interrupting tasks unless users hit consent buttons, setting “agree” as the default option for privacy settings, and creating convoluted procedures for users to opt out of data collection or barring access “until the user agrees to certain terms.”
In other words, this would radically change how a handful of massive companies whose entire business model relies on monetizing troves of user data operate. It’s also very different from how those companies would prefer any forthcoming regulation to look, so it would likely face significant resistance from those huge platforms.
Warner said the bill may be included in a Senate Commerce Committee privacy overhaul that is still in the drafting process (and thus still a long way from becoming law).
“For years, social media platforms have been relying on all sorts of tricks and tools to convince users to hand over their personal data without really understanding what they are consenting to,” Warner said in a joint statement. “Our goal is simple: to instill a little transparency in what remains a very opaque market and ensure that consumers are able to make more informed choices about how and when to share their personal information.”
According to Reuters, in a recent interview with CNBC, Warner said, “The platform companies are now going to have an opportunity to put their money where their mouth is, to see if they support this legislation and other approaches.”