By Stacy Forsythe and Jamila Hemmerich

Just in time for the holiday gift-giving season, the FTC reminds companies that it’s not all fun and games when it comes to privacy rights and in-game purchases.  In the FTC’s largest administrative settlement to date, Epic Games, Inc. (“Epic”), creator of the extremely popular video game Fortnite, agreed to pay $520 million in fines and customer refunds for violating the Children’s Online Privacy Protection Act (“COPPA”) and using dark patterns to deceive users into making in-game purchases.  These record-breaking penalties resolve two separate complaints filed by the FTC.

What is Fortnite?

If you have seen videos of kids (or adults) doing the “Floss,” the “L,” or the “Orange Justice,” chances are they learned those moves from Fortnite.  Fortnite is an online video game developed by Epic.  Since its release in 2017, it has amassed a following of over 400 million players worldwide.  Players around the world are matched without regard to their ages, either competing against each other in “battle mode” or grouped together as a team in “coop mode.”  The game’s default settings permit live voice and text communication.  Although Fortnite is generally free to download, users can purchase in-game items, such as costumes and dance moves, at additional cost.  The game is especially attractive to younger players because of its cartoonish graphics.  While some parents have complained about the violence in Fortnite, such as players shooting at one another, the depictions of violence are not particularly graphic or gory.  For example, defeated players simply vanish, with no traces of blood.

The FTC Takes Action

In 2022, the FTC filed two separate complaints against Epic: one, filed in federal court, alleged violations of COPPA; the other, an administrative complaint, alleged unlawful billing practices.

The First Complaint

The first complaint alleges that Epic’s failure to notify parents and obtain consent violated COPPA and that the game’s default settings were “harm[ful] to children and teens.”  Epic knew that “many children were playing Fortnite,” as demonstrated by “surveys of Fortnite users, the licensing and marketing of Fortnite toys and merchandise, player support and other company communications.”  Still, the company “collected personal data from children without first obtaining parents’ verifiable consent.”  Those “parents who requested that their children’s personal information be deleted” were forced to “jump through unreasonable hoops.”  Sometimes, Epic “failed to honor [parents’] requests.” 

Further, the game’s default settings enabled live “text and voice communications” for all users and matched children and teens with strangers of all ages to play together.  These settings and practices resulted in children and teens being “bullied, threatened, harassed, and exposed to dangerous and psychologically traumatizing issues such as suicide while on Fortnite.”  As early as 2017, Epic employees expressed concerns about the default settings and urged the company to change them.  Ultimately, the game maker “added a button” that permitted “users to turn voice chat off.”  However, the FTC’s complaint claims that Epic “made it difficult” for users to locate it.

Pursuant to the proposed order to resolve this complaint, Epic agreed to pay a $275 million fine, the “largest penalty ever obtained for violating an FTC rule,” to the U.S. Treasury.  The order also “prohibit[s] Epic from enabling voice and text communications for children and teens unless parents (of users under 13) or teenage users (or their parents) provide their affirmative consent through a privacy setting.”  The company is also required to “delete personal information previously collected from Fortnite users in violation of the COPPA Rule’s parental notice and consent requirements unless the company obtains parental consent to retain such data or the user identifies as 13 or older through a neutral age gate.”  Finally, “Epic must establish a comprehensive privacy program that addresses the problems identified in the FTC’s complaint, and obtain regular, independent audits,” a “first-of-its-kind provision.”

The Second Complaint

The FTC’s second complaint alleges that Epic used “dark patterns” to “trick” players into making unwanted purchases, letting children “rack[] up” unauthorized charges without their parents’ knowledge or approval, and then blocking users’ access to their accounts and the content they had purchased if they disputed the charges.

Epic used a variety of “dark patterns” aimed at getting “consumers of all ages” to make unintended in-game purchases.  Fortnite’s “button configuration” places purchase buttons near other commonly used game functions, leading players to incur unintended charges “based on the press of a single button.”  Fortnite players have incurred charges while trying to wake the game from “sleep mode,” while the game was in a loading screen, or by inadvertently pressing an adjacent button while merely trying to preview an item.  Such schemes resulted in hundreds of millions of dollars in unauthorized charges to consumers.

Until 2018, children who played Fortnite could buy “V-Bucks,” which players use to make “in-game purchases,” “simply by pressing buttons with no parental or card holder action or consent.”  Parents complained that their children had accumulated hundreds of dollars in charges without their knowledge or consent.  Moreover, when customers disputed the unauthorized charges with their credit card companies, Epic locked their accounts, blocking access to all the content the consumers had purchased.  Customers were blocked from accessing content that was not the subject of a billing dispute, which could total thousands of dollars.  If and when Epic unlocked an account, it threatened the customer with a permanent ban if they disputed any future charges.  Those who disputed a subsequent fraudulent or unauthorized charge were permanently banned and never refunded for any of their paid content. 

Under the terms of the proposed administrative order to resolve these billing issues, Epic must pay $245 million, which is to be used to issue refunds to consumers.  The order also forbids Epic from charging consumers by employing dark patterns or from otherwise charging them without their express, informed consent.  The company is also prohibited from temporarily or permanently locking consumers out of their accounts because they dispute unauthorized charges. 

Epic’s Response

The proposed orders do not require Epic to admit to the FTC’s allegations.  Epic released a statement on December 19, 2022, in response to the settlement, acknowledging that the gaming industry should not rely on the status quo and expressing its desire to take positive action to safeguard players.  The company “accepted this agreement because [it] want[s] Epic to be at the forefront of consumer protection and provide the best experience for [its] players.”  “Statutes written decades ago don’t specify how gaming ecosystems should operate. The laws have not changed, but their application has evolved and long-standing industry practices are no longer enough.”

Rather than rely upon common industry practices, Epic is taking action to protect its younger players and to address complaints regarding in-game purchases.  Earlier in December, Epic introduced “Cabined Accounts” for its Fortnite, Fall Guys, and Rocket League video games.  During registration, if a player provides a date of birth “that places them below their country’s age of digital consent (13 in the U.S.), then features [including] chat and purchasing are disabled.”  Furthermore, parents will be notified via email when a child signs up.  Parents then have the option to adjust “their child’s settings.”  Additionally, Epic has made parental controls more accessible and robust, including the “option to require a PIN to send and accept friend requests and enable parents to authorize purchases before they are made.”  Epic has also established “daily spending limit[s] for players under the age of 13” and added more privacy options for the “chat” feature.
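To make the Cabined Accounts logic concrete, the registration flow described above can be sketched in a few lines of Python.  This is an illustrative sketch only, not Epic’s actual implementation: the function names, the country table, and the settings dictionary are all assumptions based solely on the behavior described in the article (disable chat and purchasing, and notify a parent, when the date of birth falls below the country’s age of digital consent).

```python
from datetime import date

# Hypothetical ages of digital consent by country (13 in the U.S.);
# illustrative values only, not a complete or authoritative table.
AGE_OF_DIGITAL_CONSENT = {"US": 13, "GB": 13, "DE": 16}

def years_old(birth_date: date, today: date) -> int:
    """Compute a player's age in whole years as of `today`."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def register_player(birth_date: date, country: str, today: date) -> dict:
    """Return account settings for a new registration.

    If the player is below the country's age of digital consent, the
    account is "cabined": chat and purchasing are disabled, and a parent
    is notified so they can adjust the child's settings.
    """
    threshold = AGE_OF_DIGITAL_CONSENT.get(country, 13)
    cabined = years_old(birth_date, today) < threshold
    return {
        "cabined": cabined,
        "chat_enabled": not cabined,
        "purchasing_enabled": not cabined,
        "notify_parent": cabined,  # email sent so a parent can review settings
    }
```

For example, a U.S. player born in 2015 registering in December 2022 would receive a cabined account with chat and purchasing disabled, while an adult registrant would not.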

Epic has also updated its “payment flows” to provide additional “clarity” to players.  Rather than saving payment information by default, Epic now explicitly offers players the choice as to whether they wish to save their payment information.  The company “has [also] had a refund token system and an undo-purchase system” dating back to May 2018.  Epic has “updated [its] payment flows with a hold-to-purchase mechanic that re-confirms a player’s intent to buy, as an additional safeguard to prevent unintended purchases alongside instant purchase cancellations and self-service refunds.” 
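The hold-to-purchase and instant-cancellation safeguards described above amount to two simple checks, sketched below in Python.  The threshold and window values, function names, and the "unused item" condition are assumptions for illustration; the article does not specify how Epic actually implements these mechanics.

```python
# Hold-to-purchase: a buy button completes a transaction only if held
# continuously past a threshold, so a single stray press (e.g., waking
# the game from sleep mode) cannot trigger a charge.
HOLD_THRESHOLD_SECONDS = 1.5   # assumed value, for illustration only

def purchase_confirmed(press_time: float, release_time: float) -> bool:
    """True only if the button was held past the confirmation threshold."""
    return (release_time - press_time) >= HOLD_THRESHOLD_SECONDS

# Instant cancellation: a purchase can be undone within a window,
# here assumed to require that the purchased item has not been used.
UNDO_WINDOW_SECONDS = 24 * 3600  # hypothetical cancellation window

def can_cancel(purchase_time: float, now: float, item_used: bool) -> bool:
    """True if the purchase is still within the undo window and unused."""
    return not item_used and (now - purchase_time) <= UNDO_WINDOW_SECONDS
```

The design point is that intent is re-confirmed by the duration of the press rather than by a second dialog, and mistakes remain reversible after the fact.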

Further, Epic has changed its “chargeback policy to account for non-fraud related” refund requests “and will only disable accounts when fraud indicators are present.”  Historically, it has been common practice for companies in the gaming industry, including Epic, to disable accounts associated with chargebacks “as a fraud prevention measure.”  Using Epic’s new approach, the company has reinstated “thousands of accounts” that were previously “banned due to reported chargebacks under [its] previous policy.”

Key Implications

  • The FTC is paying close attention to business models that lure users with a “free to play” option but include in-app or in-game purchases.
  • The FTC is using non-monetary provisions in settlement agreements as additional enforcement measures for rule violations, such as requiring the establishment of “a comprehensive privacy program” and “regular, independent audits.”
  • The FTC and the public are similarly concerned over the privacy and safety risks that online game play poses to children.
  • Content developers should be aware that even if their content is geared toward a more mature audience, if there is a possibility it will appeal to younger audiences (whether through graphics, celebrity endorsements, gaming platforms, or otherwise), they should assume children will find a way to access it and implement safeguards to protect their privacy.
  • Developers must recognize and adapt to issues and threats in the ever-evolving gaming industry.  Mere compliance with the letter of the law and “long-standing industry practices” will not protect developers from FTC action if they do not comply with the law’s spirit.

Level Up Your Privacy Safeguards

Online platforms that attract a younger audience need to be especially careful not to run afoul of COPPA and should prepare for forthcoming state regulations that aim to push tech companies to provide additional privacy protections for younger users.  For example, the California Age-Appropriate Design Code Act, which was signed into law on September 15, 2022 and is set to take effect in 2024, requires, among other things, the configuration of default settings “that offer a high level of privacy” for users under the age of 18.  Regardless of their target-age demographic, companies should be prepared to ensure compliance with privacy regulations set to go into effect on January 1, 2023, including the CPRA’s amendments to the CCPA, to avoid getting – as gamers say – “pwned” by epic-level fines.

For assistance with or additional information on this topic, or to discuss data privacy compliance issues more broadly, please contact Martin Tully.

The views expressed in this article are those of the authors and not necessarily those of Redgrave LLP or its clients.