FTC’s Aggressive Enforcement of Children’s Privacy and Dark Patterns: A Cautionary Tale and Simple Steps Companies Can Take To Reduce Risk

By: Tyler G. Newby

On December 19, 2022, the Federal Trade Commission (FTC) announced a settlement with Epic Games Inc. (Epic) over its wildly popular game “Fortnite.” The settlement requires Epic to pay $275 million in penalties to resolve alleged children’s privacy violations and $245 million to refund consumers for allegedly unfair billing practices that duped players into making in-game purchases. These actions are the latest chapter in the FTC’s continued aggressive enforcement of the Children’s Online Privacy Protection Act (COPPA) and mark its most significant action against so-called dark patterns.


COPPA Allegations

“Fortnite” is a free-to-play multiplayer online game that launched in July 2017. Players glide onto an island, where they compete against other online players in a battle royale format. Today, “Fortnite” has more than 400 million players worldwide, many of whom, according to the FTC’s COPPA Complaint, are under the age of 13. The Complaint alleges that “Fortnite” is directed to children under 13 and that Epic collected kids’ personal information and shared their voices and screen names without verifiable parental consent.

There is no bright-line rule defining when a site or app is “directed” to children under 13 and therefore subject to COPPA’s verifiable parental consent requirements. Instead, the COPPA Rule—the regulations issued by the FTC that implement the statute—provides that the FTC will consider numerous factors, including the site’s subject matter, content, use of animated characters or child-oriented activities, child-directed advertising, the presence of child celebrities, evidence regarding the intended audience and empirical evidence of the ages of users.

The Complaint points to the game’s cartoonish characters, its absence of blood and gore, and the fact that players are not “killed” but eliminated. While it would likely be a stretch to conclude that these factors alone make the game “child directed,” the Complaint also points to numerous other factors, including evidence that Epic oversaw the licensing of child-directed merchandise such as toys and costumes, survey data showing that 53% of U.S. kids between the ages of 10 and 12 played “Fortnite” weekly, and employee statements reflecting both an intent to appeal to young players and knowledge that many users were under 13.

The FTC alleged that because “Fortnite” was child-directed from the start, Epic was required to notify parents of its privacy practices and obtain verifiable parental consent before kids could create accounts or play. Two years after launch, “Fortnite” implemented an age gate asking players to self-identify their age, but the FTC alleged the gate was ineffective because it had no effect on players who did not create an account and did not apply to all previously registered users.

The Complaint alleges that child users were harmed by the game’s failure to comply with COPPA. Much of the Complaint focuses on the game’s matchmaking function, which paired young players with older players, and on an on-by-default voice chat feature that exposed children to harassing and abusive voice and text communications from older players. The FTC also alleged that the on-by-default chat functions for both teen and child users were an unfair practice that violated Section 5(a) of the FTC Act.

To resolve these allegations, Epic agreed to a consent order that prohibits it from enabling the voice and text chat functions for both child and teen users without obtaining their affirmative consent through a privacy setting. Notably, this is the first enforcement action in which the FTC has required affirmative consent for chat functions for users who are 13 or older. The order also requires Epic to delete all personal information previously collected from “Fortnite” users unless it obtains verifiable parental consent, or those users identify as 13 or older through a neutral age gate. The $275 million penalty is the largest penalty ever obtained for violating an FTC rule, eclipsing the next-largest COPPA penalty (against YouTube) by more than $100 million. The settlement was unanimously approved by the FTC.

Dark Pattern Allegations

In a separate administrative Complaint, the FTC alleges that Epic employed myriad design techniques known as “dark patterns” to trick consumers into making in-game purchases without their express, informed consent. The term “dark patterns” describes design practices that trick or manipulate users into making choices they would not otherwise have made and that may cause harm. Once a largely academic concept, dark patterns are now regulated by major privacy laws across the U.S. (e.g., the California Privacy Rights Act) in addition to being the subject of numerous recent enforcement actions by the FTC. In September 2022, the FTC released a report on dark patterns, called “Bringing Dark Patterns to Light,” based on findings from a public workshop of the same name held in 2021. The report identifies the types of misleading and manipulative practices the agency believes can harm consumers.

Many of the practices highlighted in the report are present in the administrative Complaint. Specifically, the FTC alleged that Epic used dark patterns to:

  • Trick users into making unintended in-game purchases. The FTC alleged that “Fortnite” employed counterintuitive, inconsistent and confusing button configurations that led players to incur unwanted charges based on the push of a single button. For example, players could be charged while attempting to wake the game from sleep mode, while the game was in a loading screen, or by pushing an adjacent button while attempting simply to preview an item. According to the Complaint, Epic received more than one million complaints from consumers about unwanted charges but did not adequately address the issue.
  • Allow unauthorized charges by children. According to the FTC, Epic allowed children to make in-game purchases without requiring any parental or cardholder consent or action, such as verifying the card’s CVV number.
  • Deter users from canceling or requesting refunds. The FTC alleged that Epic did not allow users to cancel or undo charges for certain in-game purchases and forced consumers to find and navigate a difficult and lengthy path to request a refund.
  • Block access to purchased content. The FTC alleged that Epic locked the accounts of customers who disputed unauthorized charges with their credit card companies. Consumers whose accounts were locked lost access to all the content they had purchased.

In settling these allegations, Epic has committed to implementing a “hold-to-purchase” button on its store page that reconfirms a player’s intent to buy, as an additional safeguard against unintended purchases. The company is also expanding parental controls, including the ability to authorize real-money purchases before they are made. Finally, Epic announced instant cancellations for certain purchases and an expanded refund system for purchases of digital goods.
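For illustration only, the following TypeScript sketch shows one way a “hold-to-purchase” gate might be wired up in a browser-based storefront. It is not Epic’s actual implementation; the attachHoldToPurchase helper, the hold duration, and the chargeAccount callback are all hypothetical. The design intent is that a charge fires only after a sustained, deliberate press, so a single stray tap (for example, during a loading screen) cannot trigger a purchase.

```typescript
// Hypothetical press-and-hold purchase gate (illustrative sketch only).
// A charge is committed only if the button is held for the full duration;
// releasing early, or sliding off the button, cancels the pending purchase.

const HOLD_DURATION_MS = 1500; // assumed hold time; not from the settlement

function attachHoldToPurchase(
  button: HTMLElement,
  onConfirmed: () => void,
): void {
  let holdTimer: number | undefined;

  button.addEventListener("pointerdown", () => {
    // Start the countdown; the purchase fires only if the press is sustained.
    holdTimer = window.setTimeout(() => {
      holdTimer = undefined;
      onConfirmed();
    }, HOLD_DURATION_MS);
  });

  const cancelHold = () => {
    // An early release or pointer exit abandons the pending purchase.
    if (holdTimer !== undefined) {
      window.clearTimeout(holdTimer);
      holdTimer = undefined;
    }
  };

  button.addEventListener("pointerup", cancelHold);
  button.addEventListener("pointerleave", cancelHold);
}

// Hypothetical usage, assuming a "buy-item" element and a chargeAccount helper:
// attachHoldToPurchase(document.getElementById("buy-item")!, () =>
//   chargeAccount("item-123"),
// );
```

Cancelling the pending timer on pointer release or exit is what distinguishes a deliberate hold from the accidental single-button presses described in the Complaint.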

Best Practices

This enforcement action serves as a cautionary tale to other game developers and online platforms used by children. As Epic points out in its blog post announcing the settlement, the video game industry likes to move fast and innovate, and “statutes written decades ago don’t specify how these ecosystems should operate.” But recent enforcement actions have provided clarity on how the FTC will apply the law. For example, although “Fortnite” was rated “Teen” and Epic took the position that it was directed at an older teen and college-aged audience, the FTC alleged that the combination of empirical evidence about its users, the target audience of its licensed products, and the appearance of the characters in the game demonstrated that it was directed to children under 13. This enforcement makes clear that developers of teen-rated or mature-rated games cannot assume the FTC will agree their games are not directed to children.

Another example is in-app or in-game purchases by children. Although certain games or apps are free to download, the FTC will look closely at the ability of children to charge in-app purchases to parents or other accountholders without consent. The FTC has also made clear that companies designing user interfaces should look not just at the effect their design choices have on sales or other profit-based metrics, but also at how those choices affect consumers’ understanding of the material terms of the transaction.

With robust additional guidance now available from the FTC, game developers and other online platforms would be wise to carefully review their products.

Companies can follow 10 simple steps to reduce risk:

  1. Conduct a holistic review of whether a game is directed to children, including the audience for merchandise and empirical evidence of the age of users.
  2. When in doubt about whether a new game is likely to be heavily used by children, implement a neutral age gate (see the sketch following this list).
  3. Disable default settings for features that may expose young players to harm, such as chat and other communication features.
  4. Make clear and conspicuous disclosures of material terms.
  5. Avoid manipulative language or architecture.
  6. Ensure that procedures for obtaining consent include an affirmative, unambiguous act by the consumer.
  7. Require the express, informed consent of the accountholder (not the user) for any charges.
  8. Make sure purchase, sign-up, and cancellation flows and privacy settings are easy to execute.
  9. Provide cancellation mechanisms that are at least as easy to use as the method the consumer used to buy the product or sign up for the service.
  10. Avoid default settings that lead to the collection, use, or disclosure of consumers’ information in a way they did not expect.
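On the second step, the defining features of a neutral age gate are that the prompt does not reveal the age threshold or steer players toward a “passing” answer, and that a failing answer cannot simply be re-entered. The TypeScript sketch below illustrates the routing logic under those assumptions; the function names, flow labels, and persistence note are hypothetical and are not prescribed by the COPPA Rule.

```typescript
// Hypothetical neutral age gate routing (illustrative sketch only).
// The date-of-birth prompt itself (not shown) should ask for a full birth
// date without mentioning the 13-year cutoff or hinting at a "passing" answer.

const COPPA_AGE_THRESHOLD = 13;

function ageFromBirthDate(birthDate: Date, today: Date = new Date()): number {
  let age = today.getFullYear() - birthDate.getFullYear();
  const hadBirthdayThisYear =
    today.getMonth() > birthDate.getMonth() ||
    (today.getMonth() === birthDate.getMonth() &&
      today.getDate() >= birthDate.getDate());
  if (!hadBirthdayThisYear) age -= 1;
  return age;
}

function routeAgeGate(
  birthDate: Date,
): "standard-flow" | "parental-consent-flow" {
  if (ageFromBirthDate(birthDate) >= COPPA_AGE_THRESHOLD) {
    return "standard-flow";
  }
  // Under-13 players are routed to a verifiable-parental-consent flow.
  // A real implementation would also persist this result (e.g., a device
  // flag) so a player cannot immediately retry with a different birth date.
  return "parental-consent-flow";
}
```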
