The Aakhya Weekly #178 | Drawing the Age Line Online
In Focus: Where Childhood Meets the Algorithm
Parent WhatsApp groups across India are abuzz with forwards about social media bans for under-16s. Messages bounce between relief and scepticism: some parents welcome the firmness of such decisions elsewhere in the world, while others question their feasibility in the Indian context. Underneath these conversations lies a deeper concern: whether governments are finally stepping in where platforms and self-regulation have failed.
Australia has taken a landmark step by becoming the first country in the world to enforce a nationwide ban preventing individuals under the age of 16 from holding accounts on major social media platforms. This significant legislative intervention, formally enacted through the Online Safety Amendment (Social Media Minimum Age) Act, 2024, received Royal Assent on 10 December 2024 and came fully into force on 10 December 2025.
The law marks a departure from incremental online safety reforms. Rather than refining consent mechanisms or expanding takedown powers, Australia has chosen a clear, age-based prohibition treating social media exposure as a developmental risk that warrants preventive regulation during childhood.
Why the State Stepped In
The primary goal of this ban is to address growing concerns about the impact of social media on children’s mental health, online safety, and data privacy. The move was informed by mounting evidence linking early and intensive social media use among adolescents with adverse outcomes, including higher rates of depression, obesity, and sleep disruption. Oversight of the ban has been entrusted to the eSafety Commissioner, reinforcing a regulator-led approach rather than reliance on courts or post-hoc litigation. The stated objective is twofold: to provide immediate protection to children and to compel platforms to redesign systems that have historically prioritised engagement over safety.
India’s Approach to Child Online Safety
India’s regulatory posture toward child online safety has evolved along a different trajectory. Rather than imposing access restrictions, Indian policy has focused on data protection, intermediary responsibility, and parental consent as primary safeguards. This approach is shaped by India’s scale, constitutional jurisprudence, and the role digital platforms play in education, communication, and economic participation.
While there is no dedicated law restricting minors’ access to social media, the legal architecture increasingly acknowledges children as a vulnerable category requiring enhanced protection within digital ecosystems.
DPDP Act and Child Data Protection
The Digital Personal Data Protection Act, 2023, is the cornerstone of India’s child online safety framework. It defines a “child” as any individual below 18 years of age, effectively setting a higher threshold for digital protection than many global counterparts. Section 9 of the Act mandates that data fiduciaries, including social media platforms, obtain verifiable parental consent before processing a child’s personal data.
It further prohibits behavioural tracking, targeted advertising, and any form of data processing that may harm a child’s well-being. Non-compliance attracts significant penalties, up to ₹200 crore, signalling the State’s intent to move beyond voluntary compliance. However, the DPDP Act is fundamentally a data governance statute, not an access regulation law. It assumes that children will continue to engage with digital platforms and seeks to mitigate harm by regulating how their data is collected, processed, and monetised.
Challenges in India’s Context
Translating child safety objectives into enforceable outcomes presents several structural and policy challenges.
Absence of Explicit Age-Based Access Rules: Without a clear statutory minimum age for social media access, responsibility is shifted largely onto parents and guardians. This approach leaves platform design choices and algorithmic risks insufficiently addressed at a systemic level.
Unclear Age Verification Standards: The DPDP framework does not prescribe specific age-verification methods. This regulatory gap may result in weak self-declaration practices or, alternatively, overly intrusive verification demands that raise privacy and exclusion concerns.
Digital Literacy and Socio-Economic Gaps: Parental consent regimes assume a baseline level of digital literacy and access. In practice, uneven awareness and capacity across socio-economic groups risk deepening digital exclusion rather than enhancing protection.
Verification of Parent–Child Relationships: Establishing the authenticity of parent or guardian consent at scale is operationally complex. In the absence of interoperable systems and clear guidelines, false or proxy consents could undermine the intent of child safeguards.
Scale and Enforcement Capacity: With millions of minors online, ensuring consistent compliance across platforms presents a significant logistical challenge, requiring regulatory capacity far beyond complaint-driven or reactive enforcement models.
Ambiguity in Key Legal Standards: Broad terms such as “well-being” and “appropriate safeguards,” while well-intentioned, lack operational clarity. Without detailed regulatory guidance, enforcement risks becoming inconsistent and uneven across platforms.
Towards Safer Digital Childhoods
Rather than adopting a binary ban, India’s policy challenge is to design a child-first digital governance framework that is enforceable at scale, constitutionally sound, and technologically realistic.
Graduated Access and Child-Safe Platform Design: Mandate age-appropriate platform design with graduated access for minors. This includes default limits on algorithmic recommendations, restrictions on virality, removal of engagement-maximising nudges, and built-in time caps, ensuring children’s participation is developmentally appropriate rather than commercially optimised.
Binding Platform Accountability Beyond Data Protection: Platform obligations must extend beyond data handling to systemic risk mitigation. Mandatory child-impact assessments, periodic algorithmic audits, and regulator-approved safety-by-design standards would shift accountability upstream, compelling platforms to proactively address harms rather than react to violations after damage occurs.
Privacy-Preserving Age Assurance Framework: A nationally coordinated, risk-based age assurance framework is required. Such a framework should set minimum reliability standards, prohibit excessive identity collection, enable interoperable verification methods, and balance child protection with constitutional privacy and inclusion concerns.
Treat Child Online Safety as Digital Public Infrastructure: Child online safety should be embedded into the digital public infrastructure agenda. This includes integrating digital safety education into school curricula, funding nationwide awareness campaigns, and supporting parents through simple, standardised consent and oversight tools.
Strengthened Regulatory and Enforcement Architecture: Effective protection demands institutional capacity. A dedicated child digital safety unit within existing regulators, supported by transparent reporting, penalty escalation, and compliance monitoring, would reduce reliance on courts and ensure consistent, preventive enforcement across platforms.
These measures reflect an emerging policy consensus: safeguarding children online requires reshaping platform behaviour and regulatory incentives, not merely shifting responsibility onto parents or users.
India at a Policy Crossroads
Australia’s pioneering social media ban for minors has ignited a global conversation, with nations like Denmark, Norway, and France exploring similar restrictions. The Madras High Court has specifically urged the Indian Union Government to consider adopting a law akin to Australia’s, emphasising the vulnerability of children online and the need for stricter protections.
While some models prioritise an outright ban with strict enforcement on platforms, India’s DPDP Act focuses on comprehensive data protection requiring verifiable parental consent for data processing. Both approaches reflect a global consensus on the urgent need to protect children online, but they differ significantly in their mechanisms and potential implications.
For India, the policy choice is not binary. Importing the Australian model wholesale would require a fundamental shift from a consent-centric data protection regime to a platform liability and access-control framework. The more viable path may lie in a hybrid approach combining stronger platform accountability, clearer age-based design obligations, and enhanced regulatory oversight, without undermining access to essential digital services.
Top Stories of the Week
Govt Proposes Aadhaar-like ID for EV Batteries
The Ministry of Road Transport and Highways has proposed an Aadhaar-like unique identification system for electric vehicle (EV) batteries to improve traceability, transparency and recycling. Under draft guidelines, battery producers and importers will be required to assign a 21-character Battery Pack Aadhaar Number (BPAN) to every battery placed in the market and upload lifecycle-related data on a central portal.
The BPAN will track batteries from manufacturing through use to recycling or disposal, with new IDs required for recycled or repurposed batteries. EV batteries, which account for 80–90% of India’s lithium-ion battery demand, have been prioritised under the framework, which is proposed to be developed through the Automotive Industry Standards Committee to enable stakeholder consultation and regulatory alignment.
Ministry of Textiles Signs MoUs with 15 States to Strengthen Textile Data Systems under Tex-RAMPS
The Ministry of Textiles has signed MoUs with 15 States and Union Territories to enhance the coverage, quality, and credibility of textile-related statistics and research across the country. These MoUs are part of the Textiles Focused Research, Assessment, Monitoring, Planning and Start-Up (Tex-RAMPS) scheme and aim to establish a robust operational framework for strengthening textile data systems. By enhancing data collection, analysis, and monitoring mechanisms, the initiative aims to address longstanding gaps in textile sector statistics and support evidence-based policymaking.
Under the scheme, the Ministry will provide a yearly grant of ₹12 lakh per State/UT to support institutional and structural reforms in textile statistics. To extend the initiative’s reach to the grassroots level, an additional ₹1 lakh per district per year will be allocated, linked to the formulation and implementation of district-level action plans. Through these measures, the Ministry aims to reinforce the Textiles Statistical System and ensure more reliable, comprehensive, and timely data for the sector.
A Few Good Reads
Harsh V. Pant argues that Donald Trump’s Venezuela intervention shows how “America First” quickly turns into familiar U.S. military and power politics.
Shashi Tharoor emphasizes that India must legally enforce the right to disconnect to safeguard workers’ health, dignity, and long-term productivity in an always-on digital economy.
Nagesh Kumar contends that Budget 2026 must decisively address India’s chronic underinvestment in research and development.
Pratishtha Bagai writes about gig workers finding income and recognition online: “Social media has emerged as a parallel income stream, an extension of their primary gig, because it lowers entry barriers and rewards visibility.”
Chetan Aggarwal writes on India’s space economy entering a new strategic era shaped by policy reform, private participation, and emerging global competition.