The Half-a-Billion Fortnite Fine Kicks Off a New Era of Regulating User Interfaces

With Epic Games in its crosshairs, the FTC demonstrates a newfound willingness to go after the manipulative power of interfaces.

Photo: Fortnite on a Nintendo Switch. Credit: Wachiwit (Shutterstock)

In a sweeping settlement announced Monday, the Federal Trade Commission fined Epic Games a whopping $520 million after accusing the Fortnite-maker of a variety of unsavory business practices. The complaint touches on a range of issues from alleged violations of children’s privacy to tricking users into unintentional purchases, but there’s an overarching theme: deceptive design.

Epic agreed to make a number of changes to its interfaces as part of the settlement, including adding friction to the purchase process to prevent accidental payments, introducing an instant purchase-cancellation system, and turning voice chat off for minors.

“Epic used privacy-invasive default settings and deceptive interfaces that tricked Fortnite users, including teenagers and children,” said FTC Chair Lina Khan in a statement. “Protecting the public, and especially children, from online privacy invasions and dark patterns is a top priority for the Commission, and these enforcement actions make clear to businesses that the FTC is cracking down on these unlawful practices.”


After years of discussion, regulators are zeroing in on the manipulative powers of digital interfaces, and the government appears ready to act against them.

“The FTC has been doing work on deceptive design practices for years, but this is the biggest step up in terms of enforcement we’ve ever seen,” said John Davisson, director of litigation and senior counsel at the Electronic Privacy Information Center, better known as EPIC (unrelated to Epic Games).


Lawmakers have a newfound eye for the flaws of digital design, paying increased attention to layout and composition on the web. An update to the California Consumer Privacy Act last year banned dark patterns, a term for deceptive design. California also passed the Age Appropriate Design Code in September, which obligates companies to prioritize kids’ safety and well-being in the design of online services. A UK law of the same name went into effect last year—netting a $30 million fine for TikTok—and New York state is considering an even more aggressive children’s design bill of its own. U.S. federal regulators are taking up the mantle, too: the FTC held a dark patterns workshop in 2021.

“There’s definitely been a shift towards regulating design,” said Justin Brookman, director of technology policy for Consumer Reports, and former director of technology research at the FTC. “There’s recognition that choices about platform architecture are within the scope of what regulators can go after, and there’s more thinking about requiring companies to consider other values in designing products.” (Disclosure: this reporter formerly worked at Consumer Reports’ journalism division, which is separate from its advocacy wing, where Brookman works.)


Regulating design is complicated. You can influence user behavior by making one button blue and the other red, but no one wants the government dictating the colors on websites. In cases like Fortnite’s, however, the problems are clearer.

Epic’s “counterintuitive, inconsistent, and confusing button configuration” tricked players into making hundreds of millions of dollars in unwanted purchases, the FTC said. Players could accidentally buy things when attempting to wake the game from sleep mode, or by tapping the instant purchase button, located right next to the preview item toggle, for example. When over a million users complained about the problem, Epic allegedly ignored them. “Using internal testing, Epic purposefully obscured cancel and refund features to make them more difficult to find,” the FTC said. Epic froze users’ accounts if they tried to dispute charges with their credit card companies.


Epic issued a statement about the settlement and its plans to address the problems raised by the FTC. “No developer creates a game with the intention of ending up here,” Epic said. “The laws have not changed, but their application has evolved and long-standing industry practices are no longer enough. We accepted this agreement because we want Epic to be at the forefront of consumer protection and provide the best experience for our players.”

“This settlement is going to wake companies up, they’re going to be taking a close look at what the FTC sees as manipulative design to make sure they’re not committing the same practices,” said EPIC’s Davisson.


Perhaps the most surprising part of the settlement has to do with Fortnite’s voice chat feature. Chats were turned on by default, even for children, which exposed kids to the risk of harassment or even sexual abuse. According to the FTC, this violated laws against unfair business practices. But what sets that argument apart is that it treats the voice chats as intrinsically dangerous and therefore subject to regulatory scrutiny.

“To say turning voice chat on by default is per se harmful is a brand new principle for the FTC. I can’t think of any analogous cases where they said that sort of design choice was inherently harmful,” Brookman said.


That logic could have broader implications considering other tech features and services that may have built-in risks. Think of criticisms that TikTok’s algorithm is too addictive, for example, or Instagram’s links to suicidal thoughts and eating disorders among teen girls.

“In a sense Fortnite is a social media platform, to the extent that it has chat features, and the FTC is saying companies have more of an obligation to design their systems to repudiate harms,” Brookman said.


According to Davisson, Fortnite’s shift is an encouraging one, especially when you think of dark patterns in the context of privacy problems. “There’s an evolving understanding and acceptance that the design of platforms and websites is a major contributing factor to extractive commercial surveillance,” Davisson said. “That’s something that needs to be addressed as part of a broader data protection push.”

Update: 12/21/2022 5:00 p.m. ET: This story has been updated with a statement from Epic.