
A guide to reining in data-driven video game design

Syrian refugee children take part in the first e-sports tournament organised in the Zaatari refugee camp near the border city of Mafraq, Jordan, February 1, 2020. REUTERS/Muhammad Hamed

Across the technology industry, the collection of massive amounts of user data is reshaping the way companies interact with their users. This collection of data—and its use in informing business decisions—has fundamentally changed the video game industry, where the greater availability of data has allowed for more personalized game design. Indeed, video games represent much more than a form of interactive entertainment. They pose challenging public policy questions about speech, propaganda, and politics. And as video games play an increasingly important role in society—as hugely popular forms of entertainment but also as an arena for contesting politics—data-driven game design also raises questions about whether user data is being used in manipulative ways, whether privacy rights are being violated, and whether the rights of children are being protected.

Player data collection, analysis, and hypermonetization

Games have always been data-driven. Since its inception, the video game industry has poured resources into marketing, promotion, and play-testing to make sure games would be interesting and challenging to players. As devices became more complex, data collection became easier. Today, the shift to online and mobile gaming has given game developers and publishers the ability to collect and process massive quantities of data about players: not only what they like to play, but what they read online, who they play with, and what makes them spend money.

With this data, game makers are building a host of new in-game features and virtual economies, driving new monetization methods and capabilities to tailor and personalize gaming experiences. The trove of data they collect allows finely tuned play experiences to be locked behind paywalls, many of which resemble gambling. The most popular form of gambling-like paywall, the “loot box,” is central to how Electronic Arts, for example, monetizes its FIFA series of games, which bring in $1 billion in annual revenue. Loot boxes function much like slot machines. In FIFA, loot boxes (or “player packs”) provide access to famous soccer players—but not always. The same $20 a player spends might yield a no-name player with low stats or Lionel Messi. The uncertainty drives engagement, which drives spending. A growing number of countries consider this practice a form of illegal gambling.
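
To make the mechanic concrete, a loot box can be thought of as a weighted random draw in which the rare, high-value items are deliberately improbable. The sketch below is a simplified illustration under assumed odds, not EA’s actual implementation or published drop rates:

```python
import random

# Hypothetical drop table for illustration only; real publishers keep their
# odds proprietary, and actual drop tables are far more granular.
DROP_TABLE = [
    ("common player (low stats)", 0.90),
    ("rare player", 0.09),
    ("star player (e.g., Lionel Messi)", 0.01),
]

def open_loot_box() -> str:
    """Simulate one 'player pack' as a weighted random draw."""
    items, weights = zip(*DROP_TABLE)
    return random.choices(items, weights=weights, k=1)[0]

# A player who spends $200 on ten packs will usually see only common pulls,
# and that uncertainty is what keeps them buying more.
if __name__ == "__main__":
    for _ in range(10):
        print(open_loot_box())
```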

Like many global companies, game makers face concrete obligations under the European Union’s General Data Protection Regulation: they must give individuals access to the data collected about them, and they are held responsible for how they manage that data. After the GDPR went into effect in 2018, one UK-based player filed a request to see how much information was being collected about him while he played FIFA soccer games. Video game designers have used difficulty settings to manage player expectations since the 1980s, and the player hoped to learn whether the game’s publisher, EA, was collecting information to adjust the game’s difficulty on the fly. Instead, he learned that he had spent more than $15,000 on in-app purchases over the prior two years. The sheer scale of the expenditure, generated through small purchases and slot machine-like mechanics, was shocking.

We call this sort of aggressive, data-driven monetization, including the use of manipulative design elements to add pervasive or unnecessary transactions to aspects of a game’s mechanics, hypermonetization. Hypermonetization can be found in the fastest-growing segments of the industry, from mobile games to big-budget blockbusters. Free-to-play games might lock vital, game-advancing content behind paywalls. Expensive first-person shooters might cripple the “base” game experience unless downloadable content is purchased. The use of arbitrary virtual currencies might trick players into spending more money than they realize. The scale of the revenue is extraordinary: One popular game, Candy Crush Saga, reportedly generates more than $4 million in revenue per day for its publisher through these manipulative design tactics. Mobile games especially rely on so-called “whales,” a tiny minority of players (as few as 2% of total users) who generate the majority of a game’s revenue through microtransactions.
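
The arithmetic behind this concentration is straightforward. The figures below are hypothetical, chosen only to illustrate how a small fraction of paying players can account for most of a game’s revenue:

```python
# Hypothetical figures, chosen only to illustrate revenue concentration.
total_players = 1_000_000
whale_share = 0.02                      # "as few as 2% of total users"
whales = int(total_players * whale_share)

avg_whale_spend = 150.00                # assumed monthly spend per whale
avg_other_spend = 0.40                  # assumed monthly spend per everyone else

whale_revenue = whales * avg_whale_spend
other_revenue = (total_players - whales) * avg_other_spend
share = whale_revenue / (whale_revenue + other_revenue)

# With these assumed figures, 2% of players produce roughly 88% of revenue.
print(f"Whales: {whales:,} players generate {share:.0%} of revenue")
```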

Hypermonetization undermines consumer confidence, and its increasing sophistication raises troubling questions for the video game industry regarding consumer protection, data privacy, and gambling regulation. Ultimately, these concerns converge on a single question without an easy answer: Do video games now warrant unique regulatory attention?

The regulatory challenges raised by video games sit at the intersection of technology, law, and policy. Games rely on cutting-edge technologies that drive the adoption of more powerful graphics processing hardware and artificial intelligence. Because they appeal to young people, they are also frequent targets of moral panics among well-meaning parents and policymakers. Prior moral panics focused on the content of games (like violence or sexuality), which prompted the industry to adopt a self-regulatory approach. The resulting Entertainment Software Rating Board issues voluntary content ratings to better inform consumers about the content of a game. Recently, this rating system has evolved into a three-tier system that includes some information about data privacy, random elements like loot boxes, and in-game purchases, but these efforts have not required the video game industry to reckon with the implications of its current design methods.

Data privacy and security

When game makers can track everything a player does in a game, the privacy implications are enormous. Some data collection is necessary to help build better games, like what sort of stories players like to read, how they pass their time, and what sort of mechanics they find engaging. In 2005, Microsoft launched “achievements” on the Xbox 360, which let players win badges for playing a game in a particularly challenging way. Unlocking these badges required the game to monitor every aspect of gameplay, from how many bullets a player used to defeat a boss to how smoothly they traversed a level. Soon, game makers realized they could use this data bonanza to build psychographic profiles of their players. Player behavior can divulge how impulsive players are, their willingness to play with others, and what gameplay aspects motivate them. With access to cameras and GPS data (as in Pokémon GO, for example), a game maker can correlate play with travel patterns, facial expressions, room noises, or whether someone lives alone. Much of this can happen without player knowledge, further undermining not just players’ privacy but their autonomy.
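
To give a sense of what this surveillance looks like in practice, the sketch below shows the kind of telemetry event a game client might emit; the field names and schema are illustrative assumptions rather than any publisher’s actual format:

```python
from dataclasses import dataclass, asdict
import json
import time

# Hypothetical telemetry event; the fields are illustrative, not drawn from
# any particular publisher's analytics pipeline.
@dataclass
class GameplayEvent:
    player_id: str            # persistent identifier tied to the account
    event: str                # e.g., "boss_defeated", "level_completed"
    session_seconds: int      # how long the current play session has lasted
    bullets_fired: int        # fine-grained behavior, as with Xbox achievements
    purchased_currency: bool  # whether the player has spent real money
    timestamp: float

event = GameplayEvent(
    player_id="abc-123",
    event="boss_defeated",
    session_seconds=5400,
    bullets_fired=412,
    purchased_currency=False,
    timestamp=time.time(),
)

# Events like this, aggregated over months of play, are what allow publishers
# to infer impulsiveness, social habits, and spending triggers.
print(json.dumps(asdict(event)))
```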

With invasive gameplay surveillance, game makers have become stewards of huge amounts of user data, and data privacy and security are perhaps the most pressing issues the industry needs to address. Several major publishers have experienced serious data breaches. A 2011 breach compromised more than 77 million accounts on the PlayStation Network. More recently, lax security measures at the Electronic Entertainment Expo, the industry’s preeminent trade show, led to a leak of sensitive information about game journalists. In the shadow of harassment campaigns like GamerGate, which forced several women active in the industry to flee their homes under a cloud of death threats, this breach made real the consequences of the industry’s lax approach to data security.

There is general agreement about certain baseline privacy needs in video games, but vague consensus has failed to establish clear expectations. The Entertainment Software Association, the video game industry’s trade association, supports a federal privacy framework in the United States, but its approach is limited by prioritizing additional transparency to players and parents as the primary solution to invasive data practices. Put bluntly, more disclosures will not stop invasive collection or sloppy data management. Players need better access to the information they provide to game makers, along with easily accessible and comprehensible choices about how—and whether—that information can be used to profile them and make inferences about their gaming behaviors. Players also need meaningful controls over how their data can be used and shared, particularly for certain marketing uses or where video games are used for employment or educational opportunities.

Even if game makers can avoid international laws like the GDPR, emerging state laws like the California Consumer Privacy Act are forcing them to grapple with data protection rules. The CCPA provides three basic rights: the right to know what information is collected, the right to delete that information, and the right to opt out of having that information sold. Regulations and a more comprehensive California Privacy Rights Act have built on these initial requirements. But granting individuals a set of affirmative privacy rights is no panacea, as the European experience with extensive privacy regulations shows. The GDPR created a labyrinthine process for consumers to access their own data, and European authorities have failed to consistently penalize privacy violations.

There is less agreement about what sort of substantive limitations should be placed on how game data is collected or used. Ensuring that game publishers and developers practice what is known as “privacy by design,” which would compel them to safeguard user privacy by default, would be difficult for the industry to accept, as it would force companies to reckon with how they process and monetize data for game development. Many popular mobile games push players to grant access to detailed information about their location, social contacts, and sensor data. Marketing partnerships and obnoxious advertising have increasingly infiltrated video games, and many game makers closely monitor player behavior to generate research for future game development. Embracing privacy by design might not outlaw these practices, but it does present tough questions about whether it is appropriate to profile players, when their information may be shared, and how these practices must be communicated to players and affirmatively authorized. Privacy by design can curtail monetization; it also brings ethical issues about gameplay personalization to the forefront.

When she was California attorney general, Vice President Kamala Harris encouraged mobile app developers to adopt a “surprise minimization” approach to collecting and using information. This approach encourages developers to avoid surprising users with unexpected data practices, to specify how information can be used, and to avoid hoarding data on the “off-chance that it might be useful in the future.” Adopting this approach broadly would mean game developers limit the scope of the data they collect, the amount of data they collect within that narrowed scope, and how they store and use that data.
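
As a rough sketch of what surprise minimization could look like if translated into engineering practice, the example below ties every category of collected data to a stated purpose and a retention window; the categories, values, and policy structure are assumptions for illustration, not requirements drawn from any regulation:

```python
# Hypothetical data-collection policy for a mobile game, sketching what
# "surprise minimization" could look like in practice. Categories and values
# are illustrative assumptions, not regulatory requirements.
COLLECTION_POLICY = {
    "gameplay_events": {
        "purpose": "balancing difficulty and fixing bugs",
        "retention_days": 90,          # delete raw events after a fixed window
        "linked_to_identity": False,   # aggregate rather than per-player profiles
    },
    "precise_location": {
        "purpose": None,               # no stated purpose -> do not collect
        "retention_days": 0,
        "linked_to_identity": False,
    },
}

def may_collect(data_type: str) -> bool:
    """Collect only data with a specified purpose; never 'just in case'."""
    policy = COLLECTION_POLICY.get(data_type)
    return bool(policy and policy["purpose"])

assert may_collect("gameplay_events") is True
assert may_collect("precise_location") is False
```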

The United Kingdom at least offers an example of how to regulate the industry’s impact on children. In September, the Age Appropriate Design Code comes into force, mandating that companies “put the best interests of the child first when they are designing and developing apps, games, connected toys and websites that are likely to be accessed by them.” The Design Code stands in sharp contrast to the Children’s Online Privacy Protection Act in the United States, which only governs data collected from children under 13 on services that are either directed to kids or have actual knowledge that they are collecting information from a child. The Design Code applies to all minors under age 18 and creates rules regarding the collection of geolocation data, platform terms, and manipulative design tools.

Creating privacy-friendly defaults can curb invasive player profiling, as well as hypermonetization. But such defaults are also in tension with the potential of games to provide personalized experiences and positive surprises. Today, a video game that surreptitiously scanned a player’s contacts or other apps without asking permission would raise eyebrows, but most players were delighted when a boss enemy in 1998’s Metal Gear Solid “knew” if the player had played 1997’s Castlevania by scanning the contents of a memory card. (This information was on local storage and inaccessible to game makers.) Efforts to regulate how games use player data may be in tension with some of the most surprising elements in video games, and this is particularly true with respect to efforts to curb manipulative design. 

Manipulative design and dark patterns

In designing video games, developers have a great deal of power over players, and nowhere is this more evident than in the use of so-called “dark patterns”: user interfaces and experiences designed to intentionally confuse players, make it difficult for them to express their actual preferences, or manipulate them into taking certain actions. Dark patterns typically exploit cognitive biases, prompting consumers to purchase goods and services they do not want or to reveal personal information they would prefer not to disclose. They can include “nudges,” which are a crucial way in which video games engage players. Nudges can be gentle, like an on-screen hint about a game mechanic, but they can also be manipulative, like intentionally locking vital game functions behind a loot box or paywall. There have been several efforts to catalog and create a taxonomy of problematic design—including in the context of video games—but establishing clear guidance for when design features or functionality are detrimental to players has been challenging.

Regulators and watchdogs are examining whether this type of design can be legally deceptive, unfair, or even abusive to players. A recent workshop hosted by the Federal Trade Commission showcased some of the challenges in policing dark patterns. The workshop focused on in-app purchases as a type of manipulative design, and the self-regulatory Children’s Advertising Review Unit demonstrated in a presentation just how aggressive in-app monetization requests targeted at children can be. In one instance, a game threatened to take away a child’s virtual pet for neglect unless the player purchased in-game currency. Another child-oriented game, this one set in the Harry Potter universe, forced players to watch a young wizard get strangled unless they purchased in-game assets. Children’s advocates argue that live service games and freemium games are especially reliant on dark patterns like these to generate revenue. In addition to concerns around badges and push notifications, particularly manipulative design features include timers that promote fear of missing out (FOMO), menu manipulations, arbitrarily labeled virtual currencies, and design practices that rely on players’ relationships with a game’s characters to encourage spending or other in-game actions.
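
To illustrate how several of these features combine, the sketch below mocks up a hypothetical limited-time offer that pairs an arbitrary countdown with an arbitrary virtual currency; it is a simplified illustration of the pattern, not code from any actual game:

```python
import time

# Hypothetical "limited-time offer," illustrating how a FOMO-style dark
# pattern combines an arbitrary countdown with an arbitrary virtual currency.
OFFER = {
    "name": "Legendary Starter Bundle",
    "price_gems": 1200,                # arbitrary currency obscures the real cost
    "gems_per_dollar": 80,             # conversion rate buried on a separate screen
    "expires_at": time.time() + 3600,  # countdown resets for every player, every day
}

def render_offer(offer: dict) -> str:
    minutes_left = int(offer["expires_at"] - time.time()) // 60
    real_cost = offer["price_gems"] / offer["gems_per_dollar"]  # $15, never shown to the player
    # The player sees urgency and gems; the dollar figure stays hidden.
    return f"{offer['name']}: {offer['price_gems']} gems -- only {minutes_left} minutes left!"

print(render_offer(OFFER))
```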

Some regulators and lawmakers are eager to rein in these practices. Both the UK Information Commissioner’s Office and the French regulator CNIL have warned companies about the use of dark patterns in app or platform design, and the term has started appearing in new legal rules. The California Privacy Rights Act, for instance, prohibits obtaining consent through “dark patterns,” which it defines as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.” Concerns about dark patterns go beyond their implications for a player’s ability to make a privacy decision. The bipartisan federal DETOUR Act, for example, would make it unlawful for large online companies “to design, modify, or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.” While the proposal’s thresholds exclude most online games, it highlights a shift in legislative and regulatory focus toward nudges, nagging, and toying with individual consumers’ emotions.

Framing a regulatory agenda

If there is a recurring theme in this essay’s exploration of the dark side of game design, it is that hypermonetization comes with steep downsides for consumers, from privacy violations to dark patterns. Directly addressing hypermonetization is difficult, but privacy protections and restrictions on dark patterns might offer an indirect way to curb the most abusive practices and safeguard the rights of both child and adult players.

The ESA is right to push for more transparency about how games process information. After all, one of the ways dark patterns work is by being opaque about mechanics or gameplay so as to manipulate players into spending money without realizing it. But this view of transparency should be a starting point for expanding consumer protection, not its end point. While the exact boundaries of manipulative design are unclear at the moment, both the industry and regulators need to engage in a broader debate about it—especially design that targets children with in-app purchase demands. The current free-for-all, in which players are placed in pressure-filled situations while a game demands payment, is unsustainable and places the interests of consumers last. Manipulative design that undermines player autonomy and exploits gambling-like mechanics such as loot boxes should be curtailed and subject to stronger safeguards against abuse.

But this issue goes beyond its impact on children. Games are increasingly used for education, workplace training, and even benefits eligibility. Transparency and safeguards about how those games are designed, and how they can be prevented from abusively extracting money from families, are vital.

Ultimately, any regulatory agenda regarding video games needs to include a role for public input and for players themselves. There is a serious danger that the regulatory capture that has strangled efforts to protect consumers in the broader technology industry—video games, after all, generate more revenue annually than Hollywood—will also undermine efforts to improve the gaming industry. Though the game industry has so far avoided the degree of scrutiny, or “techlash,” directed at major technology companies, both regulators and legislators will come under intense pressure to neuter any measure that might threaten the current hypermonetization binge.

Video games are wonderful creations, with the power to enrich life and improve mental well-being. Both of us love games and want to see them made better, but the long-term health of the industry will require a commitment to using data responsibly and ethically.

Joshua Foust is a PhD student studying strategic communication at the University of Colorado Boulder’s College of Media, Communication, and Information. His website is joshuafoust.com.
Joseph Jerome serves as the director for platform accountability and state advocacy at Common Sense Media.

Microsoft provides financial support to the Brookings Institution, a nonprofit organization devoted to rigorous, independent, in-depth public policy research.