Industry
Published June 11, 2024
Last updated February 26, 2025

From fraud to fairness: Leveraging KYC and age verification for online gaming

KYC can help keep online gamers of all ages safe and reduce fraud. Learn how KYC and age verification can benefit your gaming platform.
Jeff Sakasegawa
14 mins
Key takeaways
Instances of bullying, harassment, cheating, and fraud within online gaming communities are on the rise. 
Know Your Customer (KYC) controls allow gaming platforms to prevent bad users from perpetuating harmful, fraudulent, or unfair environments. 
A flexible and customizable approach to user identification enables your team to build a better user experience and remain compliant with anti-money laundering directives and minor protection laws.

The online gaming industry thrives on the contributions of its players. Yet, because of the relative anonymity of online games, there’s also a risk for bad actors to taint gaming communities by cheating, defrauding, bullying, or harassing other players. 

According to a 2024 survey from Deloitte, almost half of gamers believe there’s too much bullying and harassment within multiplayer gaming communities, and more than half agree that video game publishers need to take more action to combat these negative experiences. Meanwhile, rising rates of cheating and fraud also tarnish the gaming world. One video game developer found that instances of cheating rose 50% within a five-month period. 

So, how can gaming platforms keep a range of bad users, from harassers to fraudsters, out of their communities? While regulators are taking strides to establish legislation to enforce safer online communities, most of these laws currently lack specificity around how gaming platforms should keep bad actors out. Know Your Customer (KYC) controls, therefore, remain a foundational component of ensuring the community you’re building is a safe one.

iGaming vs. online gaming

Before we explore identity verification for online gaming, it’s important to point out that, in this post, we’re not referring to online gambling or “iGaming,” which includes online sports, poker, and video game and casino gambling. iGaming operators must comply with a distinct set of laws, many of which include strict KYC requirements.

In this article, we will focus on video games that are played online, specifically those that offer online interactions with other players. 

The evolution of online gaming: A brief history 

The gaming industry has evolved from simple, single-pixel games to fully immersive online communities. With this evolution, video game publishers have faced new demands to better protect users from harmful experiences. 

Let’s examine the evolution of online games and how advancing technologies have created additional demands on game publishers to protect users. 

1990s 

Milestones:

  • Software advancements allowed game publishers to evolve from single-pixel experiences to more realistic interfaces and multiplayer functionality. 

  • Violent video games grew in popularity, spurring concern among parents. This gave rise to the industry-wide Entertainment Software Rating Board (ESRB), which is still used today.

Impact:

  • Video game publishers must factor in the age appropriateness of their games to adhere to guardrails set out by the ESRB. Meanwhile, consumers are placing greater scrutiny on video games and expecting greater transparency from game publishers when making purchase decisions. 

2000s

Milestones:

  • Internet connections speed up as broadband supplants dial-up modems, allowing companies to develop massively multiplayer online games (MMOs) that involve hundreds or thousands of players interacting simultaneously in a virtual world. 

  • This decade also saw the rise of voice chat, which allows users to interact with other players hands-free while playing. 

Impact:

  • As multiplayer universes expand, so too does the presence of bad actors, who permeate communities with cyberbullying, harassment, and grooming. 

  • Voice chat creates new opportunities for bad actors to engage in bullying and harassment. By the end of this decade and into the 2010s, reports showed that women, in particular, were experiencing greater instances of harassment when using voice chat features. 

2010s

Milestone:

  • Microtransactions make their debut and quickly dominate the online gaming world. This shift in revenue model significantly alters the player experience, with in-game purchases becoming a common practice in mainstream gaming. 

Impact:

  • Microtransactions and in-game economies incentivize fraudsters to take over accounts to access credit card information or to sell accounts on third-party marketplaces for a profit. Early reports in this decade show that games like World of Warcraft with “fully fledged economies” were ripe for fraudulent activity. 

  • Microtransactions and in-game economies also create new opportunities for criminals to commit money laundering schemes. For example, a criminal could use illicit funds to purchase in-game items and then sell the account on third-party marketplaces for “clean” cash. 

2020s

Milestones:

  • The COVID-19 pandemic catalyzed growth in online gaming. As traditional media revenues declined, online gaming revenues increased.

  • This period also saw the arrival of widespread cross-platform functionality, which allows users to interact with one another in multiplayer environments regardless of console compatibility. 

Impact:

  • A survey by ADL found that as multiplayer game communities grew in size over the first year of the pandemic, so did instances of online harassment. In 2020, 81% of players experienced some form of harassment, up 7 percentage points from the prior year. 

The value of KYC controls for establishing safer and fairer online gaming communities

KYC controls allow you to securely and accurately verify user identities to prevent underage users from accessing age-inappropriate games and features and block known bad users, e.g., those on watchlists, from perpetuating harmful, fraudulent, or unfair environments.

With identity verification, you can ensure users are who they claim to be through government ID verification, document verification, database verification, selfie verification, or other methods. After you’ve verified a user’s identity at onboarding, you can then reverify their identity when certain signals are detected (e.g., suspicious activity and other risk signals) to minimize the threat of:

  • Underage users accessing the platform or using certain features within the game that are age-restricted

  • Bad actors reentering a gaming environment after they’ve been banned for cheating, harassing other players, or perpetuating extremist agendas

  • Bad actors taking over another user’s account to steal their in-game winnings and/or sell their account on third-party marketplaces 

  • Criminals perpetuating money laundering schemes or other illegal activities

Let’s explore how KYC controls can be leveraged to reduce these threats and create better gaming environments. 

Prevent toxic, harmful environments

Many game publishers use account bans to limit bad actors from perpetuating toxic environments; however, these players often create new, fraudulent accounts to re-access the game. 

Tools such as link analysis ensure that bad users cannot regain access to your platform under the guise of a new identity. Link analysis surfaces multiple accounts on a platform that share suspicious details, such as a physical address, IP address, or device fingerprint, among other signals. Your team can then investigate further or use identity reverification to determine whether the user is who they say they are. 
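At its simplest, this kind of link analysis amounts to grouping accounts by shared signal values. The sketch below, in Python, is a minimal illustration of that idea; the signal names and the exact-match logic are assumptions for demonstration, and a production system would weigh many more signals with fuzzier matching.

```python
from collections import defaultdict

def find_linked_accounts(accounts, signals=("ip_address", "device_fingerprint", "physical_address")):
    """Group account IDs that share a suspicious signal value.

    `accounts` is a list of dicts such as
    {"id": "a1", "ip_address": "...", "device_fingerprint": "..."}.
    This toy version only matches exact values; real link analysis
    scores partial and probabilistic matches across many signals.
    """
    groups = defaultdict(set)
    for acct in accounts:
        for signal in signals:
            value = acct.get(signal)
            if value:
                groups[(signal, value)].add(acct["id"])
    # Keep only signal values shared by more than one account
    return {key: ids for key, ids in groups.items() if len(ids) > 1}
```

Accounts flagged this way would then feed an investigation queue or trigger a reverification flow, rather than being banned automatically.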

Build user trust

As game interfaces have advanced, so too has software designed to give players unfair advantages. For example, players can download “wallhacks” that enable them to walk through walls and “speed hacks” to increase their character’s speed. In 2023, Call of Duty (COD) banned over 14,000 accounts for cheating and hacking. These accounts were spotted with anti-cheating technology that detected unusual behavior among players.

Simply banning the accounts of cheating players rarely works because users can reenter the platform with a new identity. By combining anti-cheating technology with KYC controls, gaming platforms can spot potential fraudulent accounts, prompt identity reverification, and ensure cheaters are kept off the platform. 

In more high-stakes competitions like gaming tournaments, fraudsters will create fake accounts at scale using bots to increase their chances of winning. This erodes the trust and confidence of genuine players. Using tools like link analysis, gaming platforms can spot instances where multiple accounts have been created from the same physical location or IP address, for example, to kick-start a user flow requiring the gamer to reverify their identity. 

Protect minors from accessing mature content

Since the late 1990s, game publishers have used content rating systems through the ESRB to help parents assess the age appropriateness of games. Now, with the proliferation of multiplayer online gaming communities, charting age appropriateness goes beyond rating the nature of the content itself. For example, a game with age-appropriate content might be considered inappropriate for minors if it allows unfiltered player-to-player chat, which could open up opportunities for predators to groom children or otherwise exploit them. 

To account for these nuances, the ESRB now highlights game features that may concern parents but do not influence the product’s rating assignment. For example, the game may still be rated “E for everyone,” but the rating includes a disclaimer that the game offers in-game purchases.

With KYC controls, including age verification during onboarding, you can identify underage players to ensure they are served age-appropriate experiences. For example, if your gaming platform offers age-appropriate content for minors but also allows all users, regardless of age, to chat with one another (which could expose minors to the risk of grooming or other exploitive behaviors from other users), you might restrict underage accounts from accessing chat features. 
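One way to act on a verified age is to derive account feature flags from it at onboarding. The following Python sketch shows the pattern; the age thresholds and flag names are hypothetical, since the right values depend on your jurisdiction and platform policy (e.g., COPPA's under-13 rules in the U.S.).

```python
from dataclasses import dataclass

# Hypothetical thresholds -- actual values depend on your
# jurisdiction and your platform's own policies.
CHAT_MIN_AGE = 13
MATURE_CONTENT_MIN_AGE = 17

@dataclass
class AccountSettings:
    voice_chat_enabled: bool
    mature_content_enabled: bool
    ad_personalization_enabled: bool

def settings_for_age(verified_age: int) -> AccountSettings:
    """Derive feature flags from an age verified at onboarding."""
    is_minor = verified_age < 18
    return AccountSettings(
        voice_chat_enabled=verified_age >= CHAT_MIN_AGE,
        mature_content_enabled=verified_age >= MATURE_CONTENT_MIN_AGE,
        ad_personalization_enabled=not is_minor,
    )
```

Deriving settings in one place like this keeps age-gating decisions consistent across features and makes the policy easy to audit and adjust as regulations change.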

Reduce instances of account takeover (ATO) fraud and money laundering 

In-game currencies allow players to purchase extra lives, character skins, props, or skills in the game. Players can rack up in-game currency by completing different levels or purchasing currency directly with their credit card. Accounts with large amounts of in-game currency are ripe for ATO from fraudsters who: 

  • Resell the taken-over account on third-party marketplaces for a profit

  • Use the saved credit card information to make unauthorized microtransactions within the game 

Microtransactions and in-game currencies also enable criminals to use gaming platforms to commit money laundering crimes. For example, a criminal may create an account, buy in-game items with illegal money (e.g., stolen credit cards), and then sell the account online for “clean cash.” 

KYC controls such as reverification prevent fraudsters from locking out true account owners and help gaming platforms surface suspicious activity that may become the basis for opening up money laundering investigations. Depending on the specifics of your platform, you might consider automating reverification when:

  • Key account details, such as a password or contact information, are changed

  • A user possesses multiple accounts and attempts to transfer in-game currency to another account

  • A user logs in from a suspicious IP address or with geolocation data that indicates the user is logging in from a jurisdiction with a high risk of money laundering

  • A user attempts to engage in any activity that carries the risk of money laundering or other financial crimes, e.g., if a user attempts to make a large transaction or suddenly spends a large sum of money on the platform 
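The triggers above can be expressed as a simple decision function that your event pipeline calls before allowing a sensitive action to proceed. This Python sketch assumes hypothetical event field names and a hypothetical transaction threshold; it is an illustration of the pattern, not a prescribed rule set.

```python
# Hypothetical threshold in your platform's currency units.
LARGE_TRANSACTION_THRESHOLD = 500.00

def requires_reverification(event: dict) -> bool:
    """Decide whether an account event should trigger identity reverification.

    Mirrors the triggers above: credential changes, cross-account
    currency transfers, risky location signals, and large transactions.
    """
    if event.get("type") in {"password_change", "contact_info_change"}:
        return True
    if event.get("type") == "currency_transfer" and event.get("cross_account"):
        return True
    if event.get("suspicious_ip") or event.get("high_risk_jurisdiction"):
        return True
    if event.get("amount", 0) >= LARGE_TRANSACTION_THRESHOLD:
        return True
    return False
```

Keeping the rules in one function makes it easy to log which trigger fired, which in turn supports any downstream money laundering investigation.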

User verification pathways for online gaming platforms

KYC controls may look different depending on your audience demographics and business objectives. Below are some use cases for implementing identity verification for online gaming, including methods your company might deploy.

Compliance-driven use case: Age verification at onboarding 

Online gaming platforms must comply with data privacy laws like the General Data Protection Regulation (GDPR) in the EU and the Children’s Online Privacy Protection Act (COPPA) in the U.S., which set age thresholds for the use of data. They must also comply with right-to-use regulations like the Digital Services Act (DSA) in the EU and the Online Safety Act (OSA) in the UK, which aim to create safer digital environments for users of all ages.

Age verification typically involves users uploading a government ID to verify their age, which can be further validated with database or selfie verification. If you choose to implement a government ID verification system, it’s critical to thoroughly vet the security and privacy standards of the vendor you choose to ensure protection of users’ personally identifiable information. By incorporating age verification during onboarding, you can identify child and teen users to:

  • Adjust account settings and comply with child data privacy rules (e.g., ensure minor data is not sold to advertisers)

  • Limit access to age-inappropriate features (e.g., removing in-game voice chat capabilities for children under a certain age)

  • Ensure underage users do not access mature-rated gaming content 

Fraud-driven use case: Reverification during high-risk events

Free-to-play games rely on in-game currencies and microtransactions to generate revenue; therefore, maintaining trustworthy in-game economies is essential for these business models. If your platform offers microtransactions, you can reduce instances of fraud by monitoring in-game transactions, identifying suspicious or high-risk users, and requiring reverification to proceed with transactions and other high-risk events. 

Using dynamic, risk-based segmentation, you can build tailored flows that surface different experiences depending on the user’s risk signals. For example, if a user initiates a large transaction — a signal that a user could be riskier — your dynamic flow would automatically require them to reverify their identity with a government ID and selfie to proceed. 
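A risk-based segmentation like the one described above boils down to mapping a risk score to a set of verification steps. The Python sketch below illustrates that mapping; the score bands and step names are assumptions for the example, not a recommended configuration.

```python
# Illustrative risk tiers; the bands and step names are assumptions.
def verification_steps(risk_score: float) -> list[str]:
    """Map a user's risk score (0.0 to 1.0) to the friction they see."""
    if risk_score < 0.3:
        return []  # low risk: no extra friction
    if risk_score < 0.7:
        return ["government_id"]  # medium risk: one verification step
    return ["government_id", "selfie"]  # high risk: step-up verification
```

Because the mapping is data-driven, good users with low risk scores sail through with no added friction, while only the riskiest sessions see the full step-up flow.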

Considerations to augment KYC controls for gaming 

Most online gaming platforms already use some form of KYC controls during onboarding to verify user ages and ensure users are not known criminals. However, it’s worthwhile for game publishers to consider how well these controls stand up against shifting tides, such as advancing cheating technology and the potential for greater regulatory oversight. 

Below are a few points to consider when examining your existing KYC controls and how they could be augmented with advanced KYC tools. 

Key questions and opportunities for KYC enhancement

Consideration: Is your identity verification platform flexible enough to adapt to shifting regulatory requirements?
Opportunity: A configurable platform with multiple verification options enables your business to remain adaptable to regulatory changes without overburdening your engineering resources.

Consideration: Do your identity verification and reverification processes dynamically adjust based on various risk signals?
Opportunity: By dynamically adding friction based on user risk signals, you can let “good users” continue using your platform with limited disruption while deterring potential fraudsters.

Consideration: Does your identity verification complement your anti-cheating technology and other tools used to spot fraudulent players?
Opportunity: By connecting anti-cheating technology with a link analysis tool, you can stop bad actors from reentering your platform with modified credentials during onboarding.

The future of online gaming and KYC controls from a regulatory perspective

Currently, regulators do not outright require online gaming platforms to verify players’ identities; however, as the gaming industry grows, lawmakers could create tighter rules, especially regarding protecting children and enforcing anti-money laundering directives. 

Minor protection regulations 

Several laws already exist to protect children from harmful content and predators online, and many of these laws apply to online gaming. For example, COPPA in the U.S. and the GDPR in the EU protect the privacy rights of children online, including on gaming platforms. Meanwhile, regulations like the DSA in the EU and the OSA in the UK set forth requirements for online platforms, including gaming platforms, to create safer online spaces for users of all ages. 

Anti-money laundering directives

Several anti-money laundering regulations exist, like the Anti-Money Laundering Directives in the EU and the Bank Secrecy Act in the U.S. These regulations were written for financial institutions, such as banks, and regulated financial products, such as credit cards. Interestingly, gaming platforms offer financial vehicles that resemble these regulated products. For example, in-game “loot,” which has real-world value, acts like a stored value account.

As online gaming continues to grow in popularity, and as more third-party marketplaces emerge that enable money laundering activities, regulators could examine gaming platforms more closely. 

Beyond regulatory requirements: The value of KYC controls 

Regardless of whether regulators outright require identity verification in the future, KYC controls are essential for helping safeguard the integrity of gaming spaces and foster inclusivity and trust among participants. In an increasingly crowded market, offering engaging and safe online experiences across age groups could be a competitive differentiator. 

Build user trust and remain compliant with Persona

With Persona’s identity verification platform, you can remain compliant, eliminate toxic users, and better protect minors to create an ideal gaming experience for all users. 

Persona makes it easy to verify gamer identities while meeting data privacy standards and regional legal requirements. We’re certified and compliant with global security standards for data privacy.

Our configurable platform makes it easy to tailor flows to meet user expectations, whether you want to create an on-brand experience, adjust which types of verification methods are accepted, or add a layer of verification for high-risk events or risky users. 

Need user-friendly and compliant identity verification? Contact us to learn more or get started for free.

The information provided is not intended to constitute legal advice; all information provided is for general informational purposes only and may not constitute the most up-to-date information. Any links to other third-party websites are only for the convenience of the reader.
Jeff Sakasegawa
Jeff Sakasegawa is Persona's trust & safety architect. Prior to Persona, Jeff worked in fraud and compliance operations at Square, Facebook, and Google.