DPDP Act and Children's Data: What Every Business Must Know About Section 9

By Divya Oberoi | DPDP | 2026-03-04

Section 9 of the DPDP Act 2023 imposes strict obligations on businesses processing children's data — including verified parental consent and a ban on behavioural tracking. Here's what EdTech, gaming, and digital businesses in India need to do.

Why Children's Data Is the Highest-Risk Category Under DPDP

India has one of the world's largest populations of internet users under the age of 18. They use educational apps, play online games, watch videos, and interact on social media, generating enormous volumes of personal data along the way. The Digital Personal Data Protection (DPDP) Act 2023 recognises this vulnerability. Section 9 is dedicated entirely to children's data and places some of the strictest obligations in the entire Act on businesses that process it.

If your product, platform, or service has users under 18, even a small percentage of them, this section applies to you.

What Does Section 9 Actually Say?

Section 9 of the DPDP Act lays down three non-negotiable rules for any Data Fiduciary processing a child's personal data.

1. Verifiable Parental Consent Is Mandatory

Before processing any personal data of a child (defined as anyone below 18 years), you must obtain verifiable consent from a parent or lawful guardian. This is not a simple checkbox. The Act requires that:

- The identity of the parent or guardian is verified through a reliable mechanism
- The consent is specific to the purpose of the data processing
- General or blanket consent is not acceptable

This is a significant departure from how most digital products work today, where a child can simply enter a birth date (or lie about it) and gain full access.

2. No Behavioural Monitoring or Tracking

Section 9 explicitly prohibits tracking, behavioural monitoring, or profiling of children. This means:

- No tracking browsing patterns to build user profiles
- No collecting usage data for personalisation algorithms
- No monitoring activity to create engagement scores or behavioural predictions

For EdTech companies that rely on "adaptive learning" powered by usage analytics, this creates a real compliance challenge. The learning algorithms that track how a student interacts with content may fall squarely under this prohibition.
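To make the shape of these obligations concrete, here is a minimal sketch of how a Data Fiduciary might encode Section 9's rules, purpose-specific verified parental consent plus outright prohibitions on tracking, profiling, and targeted advertising, as a pre-processing gate. All field names and purpose labels are hypothetical, and this is an illustration of the logic, not legal advice or a compliance implementation.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ConsentRecord:
    guardian_verified: bool  # parent/guardian identity confirmed via a reliable mechanism
    purpose: str             # the specific purpose the guardian consented to


# Purposes Section 9 bars for children outright, regardless of consent
# (tracking, behavioural monitoring/profiling, targeted advertising).
PROHIBITED_FOR_CHILDREN = {"behavioural_tracking", "profiling", "targeted_advertising"}


def may_process(age: int, purpose: str, consent: Optional[ConsentRecord]) -> bool:
    """Return True only if processing this purpose for this user is permitted
    under the Section 9 rules as described in this article (illustrative only)."""
    if age >= 18:
        return True  # Section 9 applies only to children (below 18)
    if purpose in PROHIBITED_FOR_CHILDREN:
        return False  # no consent can authorise these for a child
    # Otherwise: verifiable parental consent, specific to this exact purpose.
    return (
        consent is not None
        and consent.guardian_verified
        and consent.purpose == purpose
    )
```

Note that the prohibited purposes short-circuit before the consent check: the point of the structure is that, for children, a valid consent record does not unlock tracking, profiling, or targeted advertising.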
3. No Targeted Advertising Directed at Children

You cannot serve targeted or personalised advertisements to users identified as children. This includes:

- Interest-based advertising using collected data
- Retargeting campaigns based on a child's browsing history
- Sponsored content personalised using behavioural signals

Contextual advertising (ads based on the content being viewed, not on the user) may still be permissible, but the line is narrow.

Who Needs to Worry About Section 9?

If you think this only applies to children's apps, think again. Section 9 applies broadly:

- EdTech platforms: learning apps, online tutoring, school management systems
- Gaming companies: mobile games, console platforms, in-game purchases
- Social media: any platform where under-18 users can create accounts
- E-commerce: if children can browse and purchase (or influence purchases)
- OTT and streaming platforms: children's content sections
- Healthcare apps: paediatric health trackers, mental health tools for teens
- Schools and educational institutions: student information systems

Even if your product is not "designed for children," if children use it, you are responsible.

The Age Verification Challenge

One of the most debated aspects of Section 9 is how to verify a user's age in the first place. The Act mandates parental consent for children but does not prescribe a specific age verification method. Practical approaches businesses are considering include:

- Government ID verification: linking to Aadhaar or other identity systems for parents
- Credit card or UPI verification: using a payment method as a proxy for adult identity
- AI-based age estimation: using facial analysis (which raises its own privacy concerns)
- Teacher or school verification: for EdTech platforms operating through institutional channels
- Self-declaration with parental email confirmation: the lightest approach, but one that may not meet the "verifiable" standard

The government is expected to issue rules clarifying acceptable methods.
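Since no single method is mandated yet, one pragmatic pattern is to layer several verification channels, try the strongest first, and record which one actually succeeded for each guardian. The sketch below illustrates that fallback ordering; the verifier functions are hypothetical stubs standing in for real ID, payment, or email checks, not any prescribed mechanism.

```python
from typing import Callable, Optional


# Hypothetical verifier stubs, ordered strongest to weakest assurance.
# Real implementations would call an identity service, a payment rail,
# or an email-confirmation flow; here each just reads a flag.
def verify_government_id(guardian: dict) -> bool:
    return guardian.get("id_checked", False)


def verify_payment_method(guardian: dict) -> bool:
    return guardian.get("upi_or_card_verified", False)


def verify_email_confirmation(guardian: dict) -> bool:
    return guardian.get("email_confirmed", False)


VERIFIERS: list[tuple[str, Callable[[dict], bool]]] = [
    ("government_id", verify_government_id),
    ("payment_method", verify_payment_method),
    ("email_confirmation", verify_email_confirmation),
]


def strongest_verification(guardian: dict) -> Optional[str]:
    """Return the name of the strongest verification method that succeeds
    for this guardian, or None if no method could verify them."""
    for name, check in VERIFIERS:
        if check(guardian):
            return name
    return None
```

Recording the method name alongside the consent record is the useful part of this design: when the government notifies rules on acceptable mechanisms, you can immediately see which existing consents were obtained through a method that still qualifies and which need to be re-collected.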
Until then, businesses should implement the most robust mechanism feasible for their context.

Penalties for Getting It Wrong

Non-compliance with Section 9 obligations can attract penalties of up to ₹200 crore per instance under the DPDP Act's penalty framework. Given that the Data Protection Board of India will have broad enforcement powers, businesses processing children's data without proper safeguards face significant financial exposure.

Beyond fines, there is reputational damage. A news headline about a company mishandling children's data can be far more harmful than the penalty itself, especially for consumer-facing brands.

Section 9 Compliance Checklist for Businesses

Here is a practical, step-by-step approach to achieving compliance.

Step 1: Identify Whether Children Use Your Platform

Audit your user base. Check registration data, usage patterns, and any indicators that suggest users under 18 are present. Even if you don't explicitly collect age data, look at the nature of your product and its likely audience.

Step 2: Implement Age Gating

Add an a