Behind The Screen: Safeguarding Childhood in The Age Of Social Media
- THE GEOSTRATA

In the digital age, under the constant gaze of prying eyes that surveil every move, the smartphone screen, once meant for communication, has turned into a black hole whose algorithms do not merely reflect who people are but actively shape them.
Illustration by The Geostrata
Psychologist Erik Erikson, in his theory of psychosocial development, framed adolescence, roughly 12 to 18 years of age, as the stage of identity versus role confusion and the formative period for shaping the idea of self. At this stage, the media has shifted from being a means of information to a driving force in moulding identities, values and cognitive thinking.
Humans are social beings, and everything we consume, particularly in this formative stage, makes us who we are. That puts the focus on the haunting reality of social media, which no filter can blur.
The Netflix original series Adolescence, which dives into the darker side of social media, explored how young, impressionable minds today forge identities driven by peer pressure and the content they are exposed to. Social media has stopped being just a space for entertainment and has become an identity-shaping force for adolescents.
The Molly Rose Foundation, a UK charity, has accused TikTok and Instagram of targeting teens with suicide and self-harm content: almost 97% of the Instagram Reels and 96% of the TikTok videos it examined were found to be harmful. This has moved beyond a national concern and calls for global action.
THE SCREEN OVER THE PLAYBOOK
Children stand as the most at-risk participants in cyberspace. While the internet opens room for education, creativity and global connection, it also exposes naive minds to the brutality of the digital world through cyberbullying, online exploitation and misinformation.
A 2017 UNICEF report found that 1 in 3 internet users worldwide is a child, and in the absence of strong protective frameworks and parental guidance, these children remain susceptible to harm.
Even in advanced economies the crisis persists, with more than half of known child sexual abuse webpages hosted in EU countries.
The detrimental effects of social media on body image have put child protection in the limelight: a study indexed by the National Library of Medicine concluded that excessive exposure to social media content has contributed to body dysmorphia and a surge in eating disorders, including anorexia and bulimia. The quality and type of content consumed, rather than the quantity, is the predominant force steering these negative effects.
The vulnerabilities of children online are universal and carry broad implications; the approaches adopted to mitigate them, however, differ considerably between the Global North and the Global South, shaped by varying levels of infrastructure, cultural constraints and the legal provisions prevalent in each country.
The Global North recognises the high stakes of online ecosystems, with children no longer passive consumers of digital media, and has called for legal intervention. A staggering 26.5% of American teenagers reported having experienced cyberbullying as of March, while 73% of girls and young women reported receiving unwanted sexual content.
According to a study published in JAMA, young people exposed to cyberbullying are at a 50% increased risk of suicidal thoughts compared with their peers.
These alarming figures have led nations such as Australia, the UK and the US to institute legislative mechanisms for child protection.
GLOBAL RESPONSE TO THE SHARED CRISIS
In the United States, Section 230 of the Communications Decency Act of 1996 leaves the review of content largely to the social media platforms themselves, shielding them from liability for user-generated material while allowing them to moderate it. The Children’s Online Privacy Protection Act (COPPA) focuses on child protection rights, addressing the privacy and safety concerns of children online.
However, the landmark law faces its fair share of criticism for weak enforcement: companies the FTC has pursued under COPPA typically settle to avoid lengthy prosecution and rarely face the full force of the law, prioritising profit over consumer safety.
The UK’s Online Safety Act, enacted in 2023, imposes new legal duties on social media platforms to safeguard users from harmful and illegal content. Ofcom, designated as the independent regulator for online safety under the Act, plays the pivotal oversight role.
Platforms are legally obliged to protect children online, with Ofcom acting as watchdog. The Act further requires platforms to use highly effective age-assurance methods to prevent children from accessing self-harm content and pornography. With stringent penalties, including fines of up to 10% of worldwide revenue and, in extreme cases, applications to block a platform’s services, the message is clear: child protection is no longer optional but a legal necessity.
Social media platforms themselves maintain a spectrum of internal policies that define and moderate harmful content differently; some adopt aggressive takedown policies, while others rely on user reporting mechanisms. The concern remains: should private entities with profit motives be trusted with the public good?
INDIA'S DIGITAL LIMITATIONS
India faces uneven law enforcement, a lack of digital literacy and cultural constraints. The Digital Personal Data Protection Act, 2023 defines a child as an individual who has not attained the age of 18 years. According to the latest NCRB data, cybercrime cases against children surged from 232 in 2018 to 1,832 in 2022.
The DPDP Act, which received assent on 11 August 2023, requires parental consent before any personal data of a child is processed and places prime focus on limiting targeted advertising aimed at children. It nevertheless carries inherent constraints, including the central government’s liberty to grant exemptions to data fiduciaries under prescribed conditions.
If certain companies are deemed safe by the government, the Act’s requirements of parental consent and its ban on targeted advertising and tracking could be lifted for them. These flexibilities for data fiduciaries could shape the Act’s reach and its future outcomes. Compared with developed nations, India also continues to struggle to regulate online sexual abuse.
Section 67B of the IT Act, 2000 prohibits child pornography but only briefly addresses child grooming, a gap partially remedied by Section 11 of the POCSO Act, 2012; however, neither the IT Act nor the POCSO Act addresses online solicitation that leads to in-person meetings.
Countries such as the UK, Norway and Australia criminalise such grooming regardless of intent; the provision is still missing from Indian law.
Further, the POCSO Act, 2012 does not effectively address deepfake and AI-generated images, unlike jurisdictions such as the EU, Norway and the UK, which have laws criminalising such material irrespective of intent. India still needs concerted, collaborative efforts to create a comprehensive and stringent framework, as children’s safety is at stake.
India needs to first recognise child grooming as a criminal offence, with stringent penalties that may extend to imprisonment in serious cases. Further, it must build on the DPDP Act and the IT Act, 2000 to standardise the age limit across all social media platforms, and curate an age-appropriate design code that automatically bans targeted advertising, giving primacy to child safety and the protection of personal details.
India should also launch a digital sensitisation programme across cities of all tiers to raise awareness about harmful content, alongside a centralised national online safety guidance portal setting out the imperative of child safety and a national complaint centre to ensure swift redressal of violations of children’s rights to safety and privacy in the digital age.
CONCLUSION
In an increasingly interdependent world, child protection rights, while imperative, follow divergent enforcement mechanisms that differ by territory. Cross-border jurisdictional conflict, where content posted in one country harms users in another, remains the greatest challenge of the digital age: there is no umbrella international regulation, and cultural variations mean nations define ‘harmful content’ differently.
Child safety is no longer merely a national concern but a global imperative. The world must stop treating it as optional: nations need to collaborate, promote comprehensive laws, and make big tech companies act responsibly to safeguard the young minds who will define the generations to come.
BY ANANYA SHARMA
TEAM GEOSTRATA