On 20 February 2025, we hosted a webinar in collaboration with All India Gaming Federation (AIGF) and All India Gaming Developers Forum (AIGDF), titled ‘Children & Persons with Disabilities: Key Implications of the Draft Digital Personal Data Protection Rules 2025’ (Draft Rules). The session highlighted critical implementation challenges in the free-to-play (F2P) gaming sector, focusing specifically on parental consent, tracking and monitoring restrictions for children, and data processing requirements for persons with disabilities (PWDs).
In this blog, we break down some of the key takeaways from the session, focusing on the practical steps required to implement these requirements along with strategies to make games compliant and future-ready.
1. Verifiable Parental Consent
(i) Age-Gating:
The Draft Rules do not explicitly require platforms to verify the age of users at the outset. Instead, they broadly require platforms to implement ‘appropriate technical and organisational measures’ to obtain verifiable parental consent before processing children's data. In practical terms, platforms must have some mechanism to identify whether their users are children in the first place, without which they would risk inadvertently processing children's data without parental consent.
In recent consultations, officials have suggested that stringent age-gating might not be mandatory. The implication is that a risk-based approach may help platforms balance the risk of non-compliance against compromises to user experience. In some cases, self-declarations may suffice, but platforms catering to children, or those posing higher risks, may require stronger verification measures. For gaming companies, age-gating is a key consideration: games are highly attractive to children, who make up a significant portion of the user base, especially in free-to-play models. These platforms also collect personal data, from device data and gameplay patterns to social interactions and payment information. Additionally, many gaming platforms incorporate social and real-time interactive features. However, not all games pose the same level of risk, making it essential for age verification mechanisms to be adaptable to the specific nature of the platform and its audience.
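To make this concrete, here is a minimal sketch of what a risk-tiered age gate might look like. The tiers, thresholds, and mechanism names are illustrative assumptions on our part, not anything prescribed by the Draft Rules.

```python
from enum import Enum

class RiskTier(Enum):
    LOW = "low"        # e.g. a single-player puzzle game with no chat or payments
    MEDIUM = "medium"  # e.g. leaderboards or cosmetic purchases
    HIGH = "high"      # e.g. real-time chat, social features, child-oriented content

def required_age_check(tier: RiskTier) -> str:
    """Map a game's risk tier to an age-gating mechanism.

    The mapping is an illustrative assumption; the Draft Rules leave the
    choice of 'appropriate technical and organisational measures' to the
    platform rather than prescribing specific mechanisms.
    """
    if tier is RiskTier.LOW:
        return "self_declaration"                      # neutral age screen, no proof required
    if tier is RiskTier.MEDIUM:
        return "self_declaration_plus_parent_contact"  # declared age plus a parental contact point
    return "verified_parental_consent"                 # stronger verification before any processing
```

The point of the tiering is the adaptability noted above: a hyper-casual puzzle title and a chat-enabled multiplayer game do not need to clear the same bar.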
(ii) Obtaining Parental Consent:
The flexibility in obtaining parental consent under the Draft Rules can make compliance manageable – allowing businesses to design consent processes that align with their resource capacities. However, challenges remain, especially around the timing of consent. Even with flexible methods, the requirement for parental consent can create friction at a critical moment – what some experts call ‘synchronous consent’ – potentially leading to user drop-off. Many F2P games rely on quick, impulse downloads and immediate engagement to grow their player base, but if a child has to wait for parental consent, they may lose interest or move on to another game. For startups, an additional hurdle is building trust. Established gaming companies may find it easier to get parents to take consent requests seriously, but newer studios might find it harder to obtain that consent.
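One way a studio might soften this friction, sketched below purely as an illustration, is to place the child in a restricted, consent-pending mode that involves no personal data processing until the parent responds, rather than blocking play outright at install. The grace period and mode names are hypothetical design choices.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class ConsentRequest:
    child_account_id: str
    parent_contact: str                       # email or phone supplied at sign-up
    requested_at: datetime = field(default_factory=datetime.utcnow)
    granted: bool = False

def session_mode(request: ConsentRequest, now: Optional[datetime] = None) -> str:
    """Decide what the child can access while parental consent is outstanding.

    'restricted' here means offline or local play with no personal data
    processing; which features a studio actually allows in that mode is a
    design and legal judgment the Draft Rules do not spell out.
    """
    now = now or datetime.utcnow()
    if request.granted:
        return "full"
    if now - request.requested_at > timedelta(days=7):
        return "locked"      # consent never arrived; cut off access after a grace window
    return "restricted"
```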
Beyond these operational challenges, there is also the fundamental question of whether the person giving consent is actually the parent. Children can easily bypass restrictions by using a parent's device to grant themselves permission. Moreover, research highlights concerns about the effectiveness of parental consent, especially in rural India, where children often have greater digital literacy than their parents.
(iii) Verifying the Age of the Person Giving Consent:
The Draft Rules appear to assume that platforms already possess reliable identity and age information about parents, but what qualifies as ‘reliable’ remains unclear – can it include basic details like email addresses, phone numbers, or device information? Or does it require more comprehensive identity and age verification data? This ambiguity presents a challenge, particularly for smaller businesses that may not have access to extensive user data – something even larger companies might struggle with.
On the other hand, integrating with digital locker services or ID verification systems would require significant technical infrastructure, including secure API connections, encryption protocols, user verification flows, and database management for storing verification status. This demands both technical expertise and financial investment, which many F2P startups may struggle to secure. The F2P model thrives on minimal friction in user onboarding, and introducing identity verification could create barriers, particularly for casual games and smaller studios where players may be unwilling to go through lengthy verification steps.
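As a rough sketch of the ‘database management for storing verification status’ piece, a platform might retain only a minimal record that verification happened, rather than the underlying ID documents. Whether such a record meets the reliability standard is precisely the open question raised above, and all names below are hypothetical.

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class ParentVerificationRecord:
    """Minimal record that a verification happened, without the raw ID data.

    Storing only a salted hash of the reference returned by the verification
    provider is one possible data-minimization choice; whether it counts as
    'reliable' under the Draft Rules is exactly the open question above.
    """
    parent_reference_hash: str   # salted hash of the provider's verification token
    method: str                  # e.g. "digital_locker", "payment_card", "self_declaration"
    verified_at: datetime

def make_record(provider_token: str, salt: str, method: str) -> ParentVerificationRecord:
    digest = hashlib.sha256((salt + provider_token).encode("utf-8")).hexdigest()
    return ParentVerificationRecord(digest, method, datetime.utcnow())
```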
2. Tracking and Monitoring Restrictions
The Act imposes a general prohibition on tracking, behavioral monitoring, and targeted advertising to children. However, the Draft Rules provide specific purpose-based and entity-level exemptions where these activities are permitted. These exemptions cover scenarios such as preventing detrimental effects on children's well-being, among others. But the exemption schedule does not clearly define what constitutes a ‘detrimental effect’, creating uncertainty about its scope. Companies may need to collect certain data for legitimate safety and user experience purposes, but without a clear definition, it becomes difficult to determine what is permissible. For instance, is collecting data to optimize game difficulty levels or recommend age-appropriate content considered a way to prevent harm? What about tracking user behavior to detect addiction patterns or excessive spending? This lack of clarity forces businesses to make judgment calls that could later be questioned by regulators. Moreover, if platforms are forced to serve content without monitoring how it is consumed, they may struggle to surface relevant and educational content for users while inadvertently making it easier for bad actors to push low-quality or harmful material.
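Until the scope of ‘detrimental effect’ is clarified, one practical mitigation, sketched below with hypothetical names, is to tag every child-related collection event with the purpose claimed and the exemption relied upon, so that the judgment call is at least documented if a regulator later asks.

```python
from dataclasses import dataclass
from datetime import datetime

# Purposes the studio believes fall within the exemption schedule; this
# mapping is an internal assumption that legal review should confirm.
CLAIMED_EXEMPT_PURPOSES = {
    "age_appropriate_recommendations": "preventing detrimental effect on well-being",
    "excessive_play_detection": "preventing detrimental effect on well-being",
    "spend_limit_monitoring": "preventing detrimental effect on well-being",
}

@dataclass
class ChildDataEvent:
    account_id: str
    data_category: str    # e.g. "session_length", "purchase_amount"
    purpose: str
    exemption_basis: str
    recorded_at: datetime

def record_event(account_id: str, data_category: str, purpose: str) -> ChildDataEvent:
    """Refuse to log child data for any purpose not mapped to a documented exemption."""
    if purpose not in CLAIMED_EXEMPT_PURPOSES:
        raise ValueError(f"No documented exemption basis for purpose: {purpose}")
    return ChildDataEvent(
        account_id=account_id,
        data_category=data_category,
        purpose=purpose,
        exemption_basis=CLAIMED_EXEMPT_PURPOSES[purpose],
        recorded_at=datetime.utcnow(),
    )
```

A gate like this does not resolve the legal ambiguity, but it turns an implicit judgment call into an auditable record.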
The Draft Rules also fail to recognize that such safety- and development-oriented monitoring is fundamentally different from data collection for commercial purposes. It is not about maximizing monetization; rather, it is about creating a more effective and supportive environment for children’s development. The Draft Rules currently lack carveouts for such positive use cases, failing to distinguish between data collection that serves a child’s best interests and tracking for purely commercial gain.
3. Rules for Persons with Disabilities (PWDs)
The Draft Rules categorize PWDs into two groups: individuals with impairments who, even with support, are deemed unable to make legally binding decisions, and individuals with specific conditions such as autism or cerebral palsy, among others. For data processing, the Draft Rules require verifiable consent from the individual’s lawful guardian. Companies must conduct due diligence to verify this guardianship, ensuring that the guardian was appointed by a court, designated authority, or local committee.
This approach raises legitimate concerns, particularly regarding its ableist assumptions. Disability exists on a spectrum, and many individuals may require varying levels of support while still retaining decision-making agency. This also applies to conditions such as autism and cerebral palsy – these are not uniform disabilities, and it is not correct to assume that all individuals with these conditions require a guardian’s consent to access basic internet services. This framework also conflicts with established Indian case law and international human rights treaties, which emphasize autonomy and decision-making capacity. Indian courts have generally rejected the assumption that disability alone implies incapacity.
Beyond these conceptual issues, the Draft Rules also create practical challenges for businesses. Companies would have to verify official court orders or legal documentation appointing guardians. There is no centralized digital system in India for verifying guardianship orders, which are typically issued as paper documents by different courts across the country. Developing systems to recognize and validate legal documents from multiple states would be challenging without clear implementation guidance or centralized verification mechanisms.
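In the absence of a centralized registry, companies will likely fall back on manual review of the documents a guardian uploads. The sketch below shows the kind of internal record such a review might produce; the fields and the list of accepted appointing bodies are illustrative assumptions, not requirements drawn from the Draft Rules.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class GuardianshipDueDiligence:
    """Internal record of a manual guardianship check.

    With no centralized registry to query, this assumes a human reviewer
    inspects the uploaded order and records its source; the field names and
    accepted appointing bodies are illustrative assumptions.
    """
    data_principal_id: str
    guardian_name: str
    appointing_body: str     # "court", "designated_authority", or "local_committee"
    order_reference: str     # case/order number as it appears on the document
    order_date: date
    reviewed_by: str         # staff member who performed the manual check
    review_passed: bool
```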