Over the past decade, the video game industry has developed a refined, familiar playbook of monetization mechanics that fit well within prevailing free-to-play and live-service models: randomized loot boxes, in-game marketplaces, engagement-driven design patterns, and deep social integrations. These features have become staples of modern game design, particularly in free-to-play titles that depend on converting engaged players into paying customers. In recent months, however, these standard, widespread practices have begun to face unprecedented legal and regulatory scrutiny across multiple fronts.
This heightened scrutiny has emerged primarily from two intersecting sources: 1) reevaluation of chance-based rewards under state-level gambling regulations; and 2) concerns surrounding the online safety and privacy of children, especially younger children. Emerging technologies like generative AI may compound the risks from both sources in ways that are not yet clear. For video game companies, especially those with younger audiences and traditional free-to-play monetization, the question is no longer whether these challenges will surface, but how to navigate them strategically. Below are five key areas where traditional game mechanics are facing evolving legal pressure, along with practical considerations for companies managing these risks.
Regulators are taking aim at certain common in-game monetization strategies that they claim amount to illegal gambling. In February 2026, the New York attorney general filed suit alleging that a major developer’s paid loot box mechanics constitute illegal gambling. The complaint highlights that the process “resembles a slot machine” and that resulting virtual items “can be sold online for money, with one item reportedly being sold for more than $1 million.” These allegations echo those in other recent lawsuits from California and the Federal Trade Commission (FTC). Notably, though, the New York lawsuit does not rely on allegations of pay-to-win mechanics, deception, or dark patterns. For game companies, this latest action signals that any mechanic involving paid randomized rewards may draw gambling allegations under state law, particularly when items can be resold for real-world value.
Around the same time, the Washington attorney general also filed suit alleging that a number of popular mobile games constitute “unlicensed electronic gambling applications.” Unlike in the New York lawsuit, the games targeted by the Washington attorney general do not allow players to convert their winnings back into real-world cash. The Washington attorney general instead relies on these games’ resemblance to Las Vegas-style casino games rather than on the ability to “cash out,” and argues that the games are particularly egregious because, according to the complaint, their cartoonish aesthetics (although fairly standard in mobile gaming) target children.
The New York and Washington lawsuits are not the first examples of chance-based reward mechanics facing scrutiny. Several years ago, we were asking and answering the same question: Are loot boxes a gambling mechanic? The answer then was primarily “no,” although we did see some changes to the deployment of loot boxes to narrow the risk, such as disclosing the odds of revealing certain items in a loot box. Today, game companies should consider taking a similar approach to the one they took the last time loot box mechanics came under scrutiny: evaluate the mechanic in the context of each game while applying relevant law. Moreover, even if the law does not require it, implementing changes that provide more disclosures to players may help mitigate risk.
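To make the disclosure point concrete, below is a minimal sketch, in Python, of how published drop rates can be computed from the same weights that drive the draw itself; the loot table, item names, and weights are hypothetical, invented only for illustration.

```python
import random

# Hypothetical loot table: the weights drive both the draw and the
# player-facing disclosure, so published odds match actual behavior.
LOOT_TABLE = {
    "common_skin": 70,
    "rare_skin": 25,
    "legendary_skin": 5,
}

def disclosed_odds(table: dict[str, int]) -> dict[str, str]:
    """Compute player-facing drop rates directly from the live weights."""
    total = sum(table.values())
    return {item: f"{100 * weight / total:.1f}%" for item, weight in table.items()}

def open_loot_box(table: dict[str, int]) -> str:
    """Draw one item using the same weights the disclosure is based on."""
    items = list(table)
    return random.choices(items, weights=[table[i] for i in items], k=1)[0]

print("Drop rates:", disclosed_odds(LOOT_TABLE))  # shown to the player before purchase
print("You received:", open_loot_box(LOOT_TABLE))
```

Keeping the draw and the disclosure on a single source of truth means a balance change to the table automatically updates the player-facing rates, avoiding the stale-disclosure problem a regulator might otherwise seize on.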
Even when loot box mechanics do not rise to the level of gambling, they may attract regulatory scrutiny on fairness grounds. The FTC has long taken an aggressive stance on loot boxes and dark patterns (design choices that manipulate players into spending more than they intend). Past recommendations have included maintaining transparency in reward drop rates and avoiding punishments for missed play sessions. More recently, the FTC took the position that loot boxes can be dark patterns as they relate to children under 16. The FTC is more likely to bring enforcement actions where dark patterns intersect with children’s privacy, and it has penalized developers whose confusing user interfaces allegedly led players to incur unwanted charges with a single button press. Recent enforcement actions have resulted in record-setting penalties for these practices.
Companies should be aware that even well-intentioned or industry-standard design choices may be recharacterized as deceptive if players allege they were misled into making unintended payments. Companies should therefore playtest their games from the perspective of consumer fairness and through the lens of the different kinds of players who will enjoy the game. Where one group of players experiences confusion or frustration, the company should investigate why, and whether that issue could lead a player to spend money they later claim they did not intend to spend.
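As one concrete, hypothetical illustration of designing against unintended charges, the Python sketch below separates the purchase trigger from an explicit, price-aware confirmation step so that no single button press completes a real-money transaction; the item, price, and flow are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class PurchaseRequest:
    item: str
    price_usd: float

def confirm_purchase(request: PurchaseRequest, player_response: str) -> bool:
    """Require a distinct, price-aware second step before any charge.

    A single tap never completes a transaction: the player sees the
    real-money price and must affirmatively confirm in a separate step.
    """
    print(f"Buy {request.item} for ${request.price_usd:.2f}? Type YES to confirm.")
    return player_response.strip().upper() == "YES"

# Example flow: the charge proceeds only after the explicit second step.
request = PurchaseRequest(item="Starter Pack", price_usd=4.99)
if confirm_purchase(request, player_response="YES"):
    print("Charge authorized.")  # hand off to the payment processor here
else:
    print("Purchase cancelled; nothing was charged.")
```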
Children’s safety on gaming platforms has become one of the most active areas of litigation in the video game industry. Multiple state and county governments have filed suits against major platforms, alleging lax age verification, easily bypassed parental controls, and poorly moderated direct messaging. The lawsuits test novel theories of platform liability in the context of inappropriate user-generated content, and their outcomes remain uncertain.
Regulators appear to be looking most closely at games they perceive to be attractive to children. If these lawsuits succeed, they may mark the first wave of new litigation against game companies. Games and platforms that appeal to children, even those that do not collect age data or intentionally target children, may face higher risk. AI may prove a useful tool for game companies seeking to monitor and remove inappropriate user-generated content more effectively.
AI-driven non-player characters (NPCs) capable of dynamic, unscripted conversation introduce unresolved legal questions, especially when the audience includes children. California State Senator Steve Padilla recently sent a letter to industry leaders demanding transparency about safety testing in AI toys to ensure AI systems do not produce harmful or inappropriate outputs. The senator also introduced a first-in-the-nation proposal for a four-year moratorium on the sale of toys with embedded AI chatbots. For game developers deploying AI NPCs in titles popular with younger audiences, the questions are multiplying: What content moderation obligations attach to generative AI outputs? How do existing consumer protection frameworks apply to AI-generated speech? Could a company face liability for an NPC’s unscripted statements? These remain open questions without settled answers.
For now, video game companies should consider implementing internal AI use policies. They should also ensure their privacy and data storage practices continue to comply with the Children’s Online Privacy Protection Rule (COPPA), which the FTC has been actively enforcing against video game companies. Building guardrails around AI-driven NPCs may also be important to help ensure that the unscripted conversations that make NPCs engaging for players do not stray so far from the narrative that a regulator could deem them inappropriate or unfair to the player.
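One minimal shape such a guardrail might take is sketched below in Python. This is an assumption-laden illustration rather than any studio’s actual pipeline: generate_reply stands in for whatever model call a developer uses, the keyword blocklist is a toy substitute for a trained content classifier, and the fallback line is invented.

```python
# Hypothetical guardrail: every unscripted NPC reply passes a content check
# before it reaches the player; failures fall back to pre-approved dialogue.

BLOCKED_TOPICS = {"gambling", "real money", "home address"}  # illustrative only
SCRIPTED_FALLBACK = "Hmm, let's get back to the quest, traveler."

def generate_reply(player_message: str) -> str:
    """Placeholder for a generative model call (an assumption, not a real API)."""
    return f"The innkeeper ponders your words: '{player_message}'..."

def is_safe(reply: str) -> bool:
    """Toy check; a production system would use a trained classifier."""
    lowered = reply.lower()
    return not any(topic in lowered for topic in BLOCKED_TOPICS)

def npc_reply(player_message: str) -> str:
    """Return the model's reply only if it clears the guardrail."""
    candidate = generate_reply(player_message)
    return candidate if is_safe(candidate) else SCRIPTED_FALLBACK

print(npc_reply("Where can I find the hidden sword?"))
```

The design point is that unscripted output never reaches the player directly: it always passes through a check, and a failed check degrades gracefully to scripted dialogue rather than silence.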
Taken together, these developments fundamentally alter the cost-benefit calculus of games with significant minor audiences. Intensifying gambling scrutiny, aggressive dark pattern enforcement, expanding children’s privacy litigation, and emerging AI liability risks create compounding compliance costs and litigation exposure. Games that attract younger demographics often already generate less revenue per user than adult-oriented titles, and they now also carry disproportionately high and growing legal risk. Developers of games that appeal to minors should consider whether alternative revenue models (such as subscriptions or cosmetic storefronts with transparent pricing) might better balance commercial objectives with the evolving compliance landscape, and all developers should consider whether designing games that clearly appeal exclusively to adults might lessen the odds of being caught in the regulatory crosshairs.
As enforcement actions, private litigation, and legislative proposals accelerate across multiple fronts simultaneously, proactive review of how monetization strategies, creative design, and risk profiles interlock is important. Companies navigating this landscape should be prepared to respond through a multi-pronged approach. Just as a privacy-by-design model can help companies think about their systems and the data they collect, fairness-by-design is an approach game companies should consider as they launch new creative content. AI use, data collection, and monetization strategies should be fair to the players a game attracts, and these strategies may vary depending on the game and its audience.