Epic Fail: Fortnite Creator to Pay $520 Million FTC Settlement

Epic Games, Inc. (“Epic”) agreed to pay a combined $520 million in two “record-breaking settlements” on Monday. The settlements resolve alleged violations of both the Federal Trade Commission Act (the “FTC Act”) and the Children’s Online Privacy Protection Act (the “COPPA Rule”).

Epic is the creator of Fortnite, a popular cross-platform game played by more than 400 million users worldwide. Epic has earned billions of dollars in revenue through Fortnite, primarily through the sale of in-game content like costumes and dance moves. According to one Epic employee, Fortnite is “living room safe, but barely. We don’t want your mom to love the game – just accept it compared to alternatives.” According to the Federal Trade Commission (“Commission”), however, Epic’s failure to obtain proper parental consents amounts to a violation of the COPPA Rule and its use of dark patterns or tricks to induce unwanted in-game charges amounts to a violation of the FTC Act.

In discussing the settlements, Commission Chair Lina M. Khan noted, “[p]rotecting the public, and especially children, from online privacy invasions and dark patterns is a top priority for the [FTC], and these enforcement actions make clear to businesses that the FTC is cracking down on these unlawful practices.”

Persistent Privacy Violations

The COPPA Rule applies to operators of commercial websites and online services that are “directed to” children under the age of 13 and to operators with actual knowledge of the collection of children’s personal information. The law imposes a variety of obligations including, most significantly, the obligation to obtain verifiable parental consent before collecting personal information from children. The COPPA Rule does not mandate the method a company must use to obtain parental consent. Instead, it states that an operator must choose a method reasonably designed in light of available technology to ensure that the person giving the consent is the child’s parent.

Epic failed, according to the Commission, to meet this standard. Complaint, United States v. Epic Games, Inc., No. 5:22-CV-00518 (E.D.N.C. Dec. 19, 2022). When Fortnite launched in 2017, Epic included a disclaimer designed to place Epic outside the COPPA Rule’s scope: “Epic does not direct its websites, games, game engines, or applications to children ….” This disclaimer allegedly persisted despite Epic’s actual knowledge that identified and identifiable Fortnite players were under the age of 13. When Microsoft personnel raised the issue in 2018, Epic took only partial remedial measures: users under the age of 13 were identified but were blocked from only a narrow portion of gameplay. The consent issues persisted.

Moreover, Epic caused substantial harm by matching children with strangers in interactive gameplay and by permitting real-time communications through voice and text chat options that were enabled by default. Epic had known of these issues since as early as 2017, when its then-Director of User Experience emailed Epic leadership to request “basic toxicity prevention” mechanisms. His request was denied.

The proposed federal court order prohibits Epic from enabling voice and text communications for children unless parents provide their affirmative consent. Epic must also delete personal information previously collected from Fortnite users in violation of the COPPA Rule unless the company obtains verifiable parental consent to retain such data. Finally, Epic must stand up a comprehensive privacy program and subject itself to independent audits.

Use of Dark Patterns

In a separate administrative complaint, the Commission alleged that Epic used dark patterns — or user interfaces designed to trick or deceive users — to induce unwanted in-game charges without parental involvement, in violation of the FTC Act. Complaint, Epic Games, Inc., No. 192-3203, FTC (Dec. 2022). In particular, the Commission alleged that Epic’s practices amount to “unfair or deceptive act[s] or practice[s] in or affecting commerce.” 15 U.S.C. § 45(a).

The Commission raised three principal allegations. First, counterintuitive and confusing button configurations allegedly led users to incur unwanted charges; for example, a player could be charged unwittingly while attempting to wake the game from sleep mode or to preview an item. Dark patterns were also allegedly used to deter refund requests. Second, the complaint challenged Epic’s practice of allowing children to purchase in-game currency, called V-Bucks, with a single button press and without requiring proper cardholder authorization; that practice reportedly stopped in 2018. Finally, the Commission alleged that users who disputed unauthorized charges were denied access to content they had purchased.

What This Means for You

Epic allegedly ignored more than a million user complaints and repeated employee concerns. These largest-ever settlements signal an increased focus on the protection of children’s personal information online. Companies with online services directed to children — or with actual knowledge of users under the age of 13 — should ensure their compliance with the COPPA Rule and with other applicable legislation.

Online operators of all kinds should ensure that dark patterns are not implemented to deceive users or to disincentivize legitimate user activity.

Vinson & Elkins tracks developments related to government and internal investigations, as well as data privacy laws in the United States and abroad, and helps companies navigate this ever-evolving space.

This information is provided by Vinson & Elkins LLP for educational and informational purposes only and is not intended, nor should it be construed, as legal advice.