The growth of India’s gaming sector is linked to AI and how it will be regulated
The Indian government’s substantial budget allocation towards artificial intelligence (AI) highlights the technology’s growing presence across sectors, as do initiatives such as launching the IndiaAI portal, implementing the ‘National Program on Artificial Intelligence’, and organizing multi-stakeholder consultations on safety and ethics in AI.
This also applies to India’s rapidly evolving gaming sector, the world’s second-largest market, which has attracted over $2.8 billion from domestic and global investors. ‘Real Money Games’ are the sector’s primary revenue driver, accounting for approximately 84% of all gaming revenues.
The Intersection of AI and Gaming
There are multiple ways AI can enhance the gaming sector. Developers are increasingly using AI in game development to create non-player characters (NPCs) that adapt to each player’s behavior, providing a personalized, immersive, and realistic gaming experience. AI also optimizes game performance by generating dynamic environments, designing autonomous opponents that evolve based on player actions to enhance challenges, and automating level design for a more engaging experience.
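To make the adaptive-opponent idea concrete, here is a minimal Python sketch under assumed names and numbers (illustrative only, not any studio’s actual logic): it nudges a hypothetical NPC difficulty parameter toward the player’s recent win rate.

```python
from dataclasses import dataclass

@dataclass
class PlayerStats:
    wins: int
    losses: int

def npc_aggression(stats: PlayerStats, base: float = 0.5) -> float:
    """Scale a hypothetical NPC 'aggression' parameter with the player's recent win rate."""
    total = stats.wins + stats.losses
    if total == 0:
        return base  # no history yet: start at the default difficulty
    win_rate = stats.wins / total
    # Push difficulty up for players who win easily and down for players who struggle,
    # clamped to a range so the game never becomes trivial or unbeatable.
    return max(0.1, min(0.9, base + (win_rate - 0.5)))

print(npc_aggression(PlayerStats(wins=8, losses=2)))  # ~0.8 -> tougher opponent
print(npc_aggression(PlayerStats(wins=2, losses=8)))  # ~0.2 -> gentler opponent
```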
In the context of ‘Real Money Games,’ AI, machine learning, and emerging technologies play a crucial role in fraud detection. AI systems analyze betting patterns across extensive data sets, identifying anomalies that deviate from individual player behaviors. Additionally, AI enhances financial security by monitoring transaction histories, detecting potential money laundering activities, and flagging suspicious transactions for further investigation.
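As a simplified illustration of this anomaly-detection idea (a sketch only; the function, threshold, and data are assumptions, and production fraud systems combine many more signals such as device data, deposit velocity, and network analysis), a per-player z-score check might look like this:

```python
from statistics import mean, stdev

def flag_anomalous_bet(history: list[float], new_bet: float, z_threshold: float = 3.0) -> bool:
    """Flag a stake that deviates sharply from this player's own betting history."""
    if len(history) < 10:
        return False  # too little history to establish a personal baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return new_bet != mu  # any change from a perfectly constant pattern is unusual
    return abs(new_bet - mu) / sigma > z_threshold  # simple z-score test

# A player who usually stakes around 100 suddenly stakes 5,000.
past_bets = [90.0, 110.0, 100.0, 95.0, 105.0, 100.0, 98.0, 102.0, 99.0, 101.0]
print(flag_anomalous_bet(past_bets, 5000.0))  # True -> route to manual review
```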
AI and machine learning can also promote responsible gambling by analyzing players’ gambling patterns, frequency of play, and spending habits to identify potential signs of addiction early. Gambling platforms can then use AI to create a personalized environment that prevents addiction from worsening. Additionally, AI facilitates ‘self-exclusion’ programs, allowing individuals to set restrictions on their gambling activities. Platforms can enforce these limits by suspending or terminating access if users breach their self-imposed restrictions, helping to mitigate gambling-related harm.
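A minimal sketch of how a platform might enforce such self-imposed limits is shown below (illustrative only; the class, fields, and figures are assumptions, and no regulation prescribes this particular implementation):

```python
from datetime import date

class SelfExclusionPolicy:
    """Enforce a user's self-imposed limits: a daily deposit cap and an optional exclusion period."""

    def __init__(self, daily_deposit_limit: float, excluded_until: date | None = None):
        self.daily_deposit_limit = daily_deposit_limit
        self.excluded_until = excluded_until
        self.deposited_today = 0.0  # a real system would reset this on a daily schedule

    def can_deposit(self, amount: float, today: date) -> bool:
        if self.excluded_until is not None and today < self.excluded_until:
            return False  # user is within a self-exclusion period: block access
        return self.deposited_today + amount <= self.daily_deposit_limit

    def record_deposit(self, amount: float) -> None:
        self.deposited_today += amount

policy = SelfExclusionPolicy(daily_deposit_limit=2000.0)
policy.record_deposit(1500.0)
print(policy.can_deposit(600.0, date.today()))  # False -> breach of the self-imposed cap
```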
Regulatory Intent
While AI offers substantial benefits to the gaming sector, its deployment must align with existing laws and any future policies the government may introduce to regulate AI and emerging technologies, as emphasized in the G20 ministerial declaration and the 2023 statement in Parliament. The government currently appears to favour a light-touch approach to AI regulation, aimed at maximizing the sector’s potential.
Since the government is cognizant that AI deployment carries privacy risks and the potential for intellectual property rights infringement, efforts have been made to strike a balance between fostering innovation and ensuring responsible AI deployment. Notably, the ‘Digital India Act’ blueprint, published in March 2023, specifically addressed the regulation of ‘high-risk AI systems’. In 2024, the government issued an advisory requiring entities to obtain approval before deploying certain AI models in India, but this was subsequently withdrawn.
Regulatory Framework for Deploying AI and Emerging Technologies
The use of AI in gaming applications involves compliance with several regulations that impose obligations on gaming platforms. AI relies on vast data sets, including user spending, play duration, and location data. As a result, gaming platforms must implement a robust consent mechanism that clearly outlines the personal data collected, its intended use (including for responsible gaming, if applicable), and the security measures in place to protect it. Additionally, since AI models are trained on extensive data, gaming platforms must ensure they have the necessary rights and permissions to use all data utilized in AI training.
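For illustration, a purpose-bound consent check might look like the sketch below (names and purposes are assumptions, not a statutory template): each use of personal data, including AI training, is gated on recorded consent for that specific purpose.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Records which processing purposes a user has expressly consented to."""
    user_id: str
    granted_purposes: set[str] = field(default_factory=set)

def may_process(record: ConsentRecord, purpose: str) -> bool:
    # Personal data is used only for purposes the user has actually consented to,
    # e.g. "responsible_gaming" monitoring or "ai_model_training".
    return purpose in record.granted_purposes

consent = ConsentRecord(user_id="u-123", granted_purposes={"gameplay", "responsible_gaming"})
print(may_process(consent, "responsible_gaming"))  # True
print(may_process(consent, "ai_model_training"))   # False -> exclude from training data
```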
Online gaming platforms offering ‘Real Money Games’ are classified as ‘Online Gaming Intermediaries’ and regulated under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. These rules impose several obligations, including mandatory KYC and user verification before accepting cash deposits and a prohibition on platforms facilitating credit for players to continue playing ‘Real Money Games.’ AI, machine learning, blockchain, and other emerging technologies can be leveraged to enhance financial security and streamline KYC verification.
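For example, the deposit-gating obligation could be expressed as a simple check like the sketch below (illustrative only; the status values and function are assumptions, and AI/ML would typically sit behind the identity-verification step itself):

```python
from enum import Enum

class KycStatus(Enum):
    PENDING = "pending"
    VERIFIED = "verified"
    REJECTED = "rejected"

def accept_cash_deposit(status: KycStatus, amount: float) -> bool:
    """Accept a cash deposit only once the user's identity has been verified."""
    if status is not KycStatus.VERIFIED:
        return False  # block deposits until KYC is complete
    return amount > 0

print(accept_cash_deposit(KycStatus.PENDING, 500.0))   # False -> prompt the user to finish KYC
print(accept_cash_deposit(KycStatus.VERIFIED, 500.0))  # True
```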
On the policy front, NITI Aayog introduced the National Strategy for Artificial Intelligence (NSAI) in 2018 as a foundational framework for AI regulation, aiming to balance innovation with responsibility. The NSAI also proposes solutions for gaming platforms to access larger AI training datasets and advocates for a decentralized data marketplace powered by blockchain. This marketplace would: (i) enable data providers to implement traceability and access controls, (ii) ensure compliance with local and international regulations, and (iii) establish a robust price discovery mechanism for data. These measures aim to boost data providers’ confidence and promote a structured, formal economy for data exchange.
In 2023, the government established a sub-committee to identify regulatory gaps and propose recommendations for a comprehensive AI framework. The committee published its report in January 2025, advocating for an ‘activity-based regulation’ approach—where specific activities, regardless of the entity performing them, must comply with government-mandated requirements. The report also suggests transitioning to a hybrid model combining ‘activity-based regulation’ with ‘entity-based regulation,’ where sector-specific entities (such as banking, finance, and telecom) must obtain government authorization before deploying AI.
The report additionally clarifies that AI incident reporting should not fall under the existing Information Technology Act, 2000, or the CERT-In directions, as AI-related incidents have a broader scope. Instead, the focus should be on harm mitigation rather than penalizing reporting entities. Given the nature of gaming platforms and their services, any forthcoming AI regulations will likely impose additional obligations on these platforms to prevent AI misuse.
Emerging trends suggest that online gaming could play a more prominent role in policymaking. The government has been advocating for the inclusion of online gaming companies under anti-money laundering and counter-terrorism financing frameworks, further shaping the regulatory landscape with the intention of enforcing stricter KYC protocols and mandating the reporting of suspicious activities. Of note is the Tamil Nadu government’s move to establish an ‘Online Gaming Authority’ that monitors play duration and betting limits to mitigate addictive gaming behaviors. The deployment of AI will help these bodies enforce stricter KYC protocols, report suspicious activity, and promote responsible gambling at feasible cost.
The Future of AI in Gaming
AI regulation is still in its early stages, and India’s approach continues to evolve through ongoing industry consultations. The goal is to establish an AI framework that encourages innovation while ensuring responsible deployment. As AI’s role in gaming expands, its applications will continue to grow, making effective regulation crucial.
Given the highly addictive nature of online gaming, AI regulations must explicitly ensure that the monitoring of user behavior and spending patterns is used solely to promote responsible gambling, not to exploit individuals with gambling addiction. With extensive discussions surrounding ‘Real Money Games,’ gaming platforms implementing AI will likely be subject to additional compliance requirements under the forthcoming AI framework, alongside existing regulations.