SOCIAL MEDIA REGULATIONS FOR CHILDREN: GLOBAL INITIATIVES
With the rise of digital platforms, social media has become a prominent part of children’s lives. While it provides opportunities for learning and socialization, it also raises concerns regarding children’s safety online.
The Australian Prime Minister recently proposed setting a minimum age for social media use to protect children from online harms. The pandemic significantly increased children’s screen time, heightening awareness of these risks. One emerging concern is “sharenting”: the practice of parents sharing personal information or images of their children online.
Efforts to regulate children’s social media usage are being undertaken globally, with countries developing policies to strike a balance between digital engagement and online safety.
In India, the Digital Personal Data Protection Act (DPDPA) 2023 aims to safeguard children’s privacy on the internet. Similarly, various other nations have implemented laws and policies to regulate children’s internet usage and ensure a safer digital environment.
Social Media and Traditional Media
- Social media: Refers to websites and apps that allow people to create, share, and exchange content in virtual communities (e.g., Facebook, Instagram, LinkedIn).
- Traditional Media: Includes newspapers, magazines, and newsletters. These are forms of print media and are not categorized as social media.
Regulatory Efforts in India
Digital Personal Data Protection Act (DPDPA) 2023: Focuses on protecting children’s online data.
Section 9 lays down specific conditions for processing the personal data of children (persons under 18 years of age); a sketch of how a platform might apply these conditions follows this list:
- Parental Consent: Verifiable consent from a parent or guardian is required before handling a child’s data.
- Child Well-being: Personal data processing must prioritize the child’s welfare.
- Restrictions on Targeting Children: Bans on tracking, behavioral monitoring, and targeted advertising aimed at children.
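The sketch below is a minimal, hypothetical illustration of how a platform might gate its data flows under these conditions; it is not an implementation of the Act, and the `User` fields and helper names are assumptions introduced purely for illustration.

```python
from dataclasses import dataclass

ADULT_AGE = 18  # the DPDPA treats anyone under 18 as a child


@dataclass
class User:
    age: int
    has_verifiable_parental_consent: bool = False


def may_process_personal_data(user: User) -> bool:
    """Adults may be processed normally; children require verifiable
    consent from a parent or guardian (assumed to be recorded on the user)."""
    if user.age >= ADULT_AGE:
        return True
    return user.has_verifiable_parental_consent


def may_target_advertising(user: User) -> bool:
    """Tracking, behavioural monitoring, and targeted advertising
    are barred for children regardless of consent."""
    return user.age >= ADULT_AGE


# The "child well-being" condition is a substantive legal test and is not
# reducible to a boolean check, so it is deliberately omitted here.

# A 15-year-old with parental consent can have data processed,
# but still cannot be shown targeted advertising.
teen = User(age=15, has_verifiable_parental_consent=True)
print(may_process_personal_data(teen))  # True
print(may_target_advertising(teen))     # False
```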
Karnataka High Court Recommendation (2023): Proposed raising the social media age limit to 21 years, citing concerns about social media addiction among schoolchildren and its adverse effects on mental health.
Global Regulatory Efforts
South Korea:
- Introduced the Cinderella Law (Shutdown Law) in 2011, which banned children under 16 from playing online games from midnight to 6 AM to combat internet addiction.
- The law was abolished in August 2021 after proving challenging to enforce.
United States:
- Children’s Online Privacy Protection Act (COPPA), 1998: Requires websites to obtain parental consent before collecting data from children under 13.
- Children’s Internet Protection Act (CIPA), 2000: Mandates that schools and libraries receiving federal funds filter harmful online content.
European Union:
- In 2015, draft EU data protection rules proposed requiring parental consent for children under 16 to access online services.
- The General Data Protection Regulation (GDPR), 2018: Establishes strict data privacy laws across the EU, serving as a model for other nations. It gives users control over their personal information.
United Kingdom:
- Age-Appropriate Design Code: Requires platforms to prioritize children’s safety by implementing stricter default privacy settings (a sketch of such defaults follows below).
- The UK currently sets the digital age of consent at 13, so children under 13 need parental consent for their data to be processed online. A government panel in 2024 recommended raising this age to 16.
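As a concrete, purely hypothetical illustration of “stricter default privacy settings”, the sketch below shows how a platform might choose more protective defaults for accounts belonging to under-18s; the setting names are assumptions for illustration, not requirements quoted from the Code.

```python
def default_privacy_settings(age: int) -> dict:
    """Return hypothetical account defaults in the spirit of the
    Age-Appropriate Design Code: child accounts start with the most
    protective options switched on."""
    is_child = age < 18
    return {
        "profile_visibility": "private" if is_child else "public",
        "geolocation_sharing": False,                   # off by default in this sketch
        "personalised_recommendations": not is_child,   # disabled for children
        "direct_messages_from_strangers": not is_child,
    }


# A 12-year-old's account starts private, with no stranger DMs
# and no personalised recommendations.
print(default_privacy_settings(12))
```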
France:
- July 2023 Law: Mandates that social media platforms block children under 15 from accessing their platforms without parental consent. Non-compliance results in penalties up to 1% of the platform’s global revenue.
China:
August 2023 Restrictions: Limits minors’ daily internet usage based on age:
- 16-18 years: 2 hours
- 8-15 years: 1 hour
- Under 8 years: 40 minutes
Internet access for minors is banned from 10 PM to 6 AM, with exceptions for educational apps. The tiered limits and curfew are illustrated in the sketch below.
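The age tiers and curfew reduce to a simple lookup; the following is a hypothetical sketch of that rule, not the actual mechanism mandated for Chinese platforms, and the function names are assumptions.

```python
from datetime import time


def daily_limit_minutes(age: int) -> int | None:
    """Return the daily screen-time allowance in minutes for a minor,
    or None for adults (18 and over), following the tiers above."""
    if age >= 18:
        return None   # no minor-mode limit
    if age >= 16:
        return 120    # 16-18 years: 2 hours
    if age >= 8:
        return 60     # 8-15 years: 1 hour
    return 40         # under 8 years: 40 minutes


def blocked_by_curfew(now: time) -> bool:
    """Minors' internet access is blocked between 10 PM and 6 AM."""
    return now >= time(22, 0) or now < time(6, 0)


# A 14-year-old gets 60 minutes per day and no access at 11 PM.
print(daily_limit_minutes(14))         # 60
print(blocked_by_curfew(time(23, 0)))  # True
```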
Brazil:
- In April 2023, enacted child data protection laws to regulate how companies handle children’s data, part of a larger Latin American initiative to enhance online safety for minors.
State of Digital Literacy in India
- Digital literacy in India remains low: only 40% of people know basic computer functions, according to the NSSO 2021 report.
- A study in Tier 2 and Tier 3 cities revealed that 80% of children help their parents navigate online platforms, showcasing the digital gap.
- India’s linguistic diversity and widespread device-sharing practices present challenges for consistent digital safety measures.
Reasons for Regulating Social Media Usage for Children
- Safety Concerns: Children are exposed to risks like cyberbullying, online predators, and harmful content, which can impact their well-being.
- Mental Health: Excessive social media use can exacerbate anxiety, depression, and body image issues among children.
- Exposure to Inappropriate Content: Social media may expose young users to pornography or explicit material, shaping unhealthy perceptions of relationships and sexuality.
- Misinformation: Children are vulnerable to misleading information and online propaganda.
- Encouraging Real-Life Connections: Limiting social media use can foster face-to-face interactions and improve children’s social skills.
- Technological Accountability: Advocates argue that tech companies should create safer environments for children rather than leaving responsibility solely to parents.
Challenges Against Banning Social Media for Children
- Enforcement Difficulties: Digital bans are hard to enforce. Children can easily bypass restrictions, as seen in South Korea’s Cinderella Law.
- Parental Burden: Low digital literacy in many households makes it difficult for parents to effectively monitor their children’s online activities.
- Loss of Positive Engagement: Social media offers educational and creative opportunities that could be lost with blanket bans.
- Freedom of Expression: Children have the right to access information and express themselves, and outright bans could infringe upon these rights.
- Benefits of Social Media: Platforms provide spaces for community-building, where children can connect with others, learn, and stay informed about global issues.
Way Forward
- Education and Awareness: Schools should introduce digital literacy programs that teach children about safe internet usage, privacy, and recognizing online risks.
- Warning Labels: Apps should include warnings about the potential mental health risks associated with social media use, similar to the health warnings on cigarette packaging.
- Safe Platform Design: Tech companies need to prioritize child safety by integrating protective features and stronger privacy settings.
- Collaborative Regulation: Governments, educators, and tech firms should work together to create balanced regulations, using frameworks like the UK’s Age-Appropriate Design Code.
- Parental Involvement: Parents should be encouraged to engage in their children’s digital lives by modeling healthy online habits and discussing potential risks and experiences.
Conclusion
Regulating social media usage for children is a complex issue that requires balancing their need for digital engagement with the necessity of safeguarding them from online harms. While various global efforts, including those in India, are striving to address this, more emphasis on digital literacy, collaborative regulation, and parental guidance will be crucial in creating a safe and beneficial online environment for children.