UK considering ban on TikTok, Snapchat and Instagram for children

The Technology Secretary of the UK, Peter Kyle, has proposed a potential ban on popular social media platforms, including TikTok, Snapchat, and Instagram, for children under the age of 16. The proposal forms part of a broader initiative to protect children online, amid growing concerns over harmful content and age-inappropriate material. Kyle warned that if tech companies do not take adequate measures to safeguard young users, stricter regulations, including a ban, may be implemented. The announcement coincides with the upcoming enforcement of the Online Safety Act in 2025, which will introduce new safety obligations for social media companies to protect users, especially minors.

In a recent statement to the Telegraph, Kyle emphasised the need for a proactive approach in enforcing online safety measures. He stressed the importance of collaboration between tech firms and regulatory bodies to uphold the standards outlined in the Act. While indicating a willingness to explore further regulatory actions, Kyle cited discussions with Australian officials regarding similar social media restrictions for under-16s. He expressed a comprehensive commitment to enhancing children’s online safety, stating, “When it comes to keeping young people safe, everything is on the table.”

As preparations for the implementation of the Online Safety Act progress, Kyle has released a set of strategic priorities for Ofcom, the UK's communications regulator. The strategic focus includes promoting "safety by design" principles, fostering transparency among tech companies regarding harmful content, and advocating for digital environments that are resilient to online harms. Ofcom is expected to adapt its regulatory approach to address emerging challenges such as artificial intelligence and to adopt innovative technologies that bolster user safety.

Acknowledging the critical role of technology in shaping young lives, Kyle announced a research initiative to investigate the impact of smartphone and social media usage on children. The move aims to deepen the understanding of potential risks associated with digital platforms and inform future policy decisions. Notably, Ian Russell, chairman of the Molly Rose Foundation, lauded the strategic priorities outlined by Kyle, recognising them as essential steps towards improving online safety. However, he underscored the ongoing need for comprehensive reforms to fortify existing regulations and prioritise harm reduction strategies.

María Neophytou, director of strategy and knowledge at the NSPCC, welcomed the proposed priorities, highlighting their transformative potential in creating safer digital spaces for children. Neophytou emphasised the urgency of tech companies assuming greater responsibility in addressing online harms, particularly cyberbullying, self-harm content, and child exploitation. By championing greater transparency and innovative solutions, the government aims to foster a safer online environment, conducive to the well-being of all users, especially vulnerable children and adolescents.

In response to the government's strategic directives, Ofcom affirmed its commitment to advancing online safety initiatives and enhancing protections for children and adults alike. The regulator expressed readiness to collaborate with stakeholders to achieve a safer digital landscape, underscoring the importance of collective efforts in combating online risks. As the implementation of the Online Safety Act approaches, stakeholders across the tech industry, regulatory authorities, and advocacy groups are poised to play a pivotal role in safeguarding users and fostering a culture of responsible digital engagement.