The European Parliament has issued a strong call for the European Union to adopt strict, bloc-wide age limits for children’s access to social media platforms, video-sharing sites, and AI-driven companion tools. Lawmakers say the move is urgently needed to protect the mental health and wellbeing of young people, who they argue are increasingly exposed to online risks without meaningful oversight from either governments or technology companies. The resolution, which passed with an overwhelming majority of 483 votes in favour, 92 against and 86 abstentions, urges the EU to take concrete steps to address what legislators describe as a rapidly escalating youth mental-health crisis linked to excessive screen time, algorithm-driven content consumption, and manipulative platform features. Although the resolution itself is non-binding, it represents one of the strongest political messages yet from EU lawmakers about the future of children’s digital safety.
The proposal outlines a tiered access system based on age. Under the recommendations, children under 13 would be barred entirely from using social media platforms, video-sharing services, or AI conversational companions. Those aged 13 to 15 would be allowed access only with the explicit consent of a parent or legal guardian. From the age of 16, minors could use these platforms without restriction. Lawmakers say harmonized age rules are essential because current protections vary widely across EU member states, making it difficult to guarantee a consistent level of safety for young users throughout the bloc. At present, European countries set their own age limits under national law, despite existing EU-wide frameworks such as the Digital Services Act. The Parliament argues that this patchwork leaves major gaps in youth safety, particularly around powerful recommendation algorithms, immersive platforms, and AI-powered tools that can mimic emotional interaction.
The momentum for stricter rules has been growing across Europe and internationally. Australia is preparing to introduce what would be the world’s first outright social-media ban for children under 16, while Denmark and Malaysia are considering nationwide restrictions of their own. French President Emmanuel Macron has previously pushed for the EU to ban social media access for children under 15, reflecting a growing sense of political urgency over the scale of online harm. But while many leaders support stricter regulation, it remains unclear how such rules could be enforced in practice, especially since individual EU governments retain authority over setting age limits. Some lawmakers say that without harmonized, EU-wide rules, children will continue to face inconsistent protections and platforms will continue to evade accountability by relying on minimal or ineffective age-verification tools.
During the parliamentary debate, Danish lawmaker Christel Schaldemose — who sponsored the resolution — delivered a pointed warning about the influence of major global tech platforms on young people. She described the current situation as an “experiment” in which companies, including those led by figures such as Elon Musk and Mark Zuckerberg, along with firms linked to China’s Communist Party, enjoy nearly limitless access to children’s attention. Schaldemose argued that platforms have designed their services to keep minors engaged for long periods, using persuasive algorithms and features that far exceed a young person’s capacity for self-regulation. She said these services were never built with children’s safety in mind and that the EU must now step in to end this unregulated exposure.
The resolution also goes beyond age restrictions, calling for the EU to tackle specific features that lawmakers say pose particular risks to minors. Among these are loot boxes — in-game items or rewards that players buy or win at random, often with real money, without knowing in advance what they will receive. Critics say loot boxes resemble gambling and can condition children to spend impulsively. Lawmakers also want to ban engagement-based recommendation algorithms for minors. These algorithms, which prioritize content likely to capture attention, have been widely criticized for amplifying harmful material such as self-harm content, violent imagery, and content promoting unrealistic beauty standards. The Parliament argues that removing these algorithms for minors is essential to preventing the spread of damaging content that disproportionately affects adolescent mental health.
In addition, the resolution urges the EU to adopt laws requiring age-appropriate platform design. This would force companies to rethink everything from interface layouts to notification systems to ensure that products are built with children’s developmental needs in mind. Lawmakers argue that current platform designs often encourage constant engagement, making it difficult for young users to disengage or regulate themselves. The call for age-appropriate design includes safeguards against manipulative or addictive features such as endless scrolling, autoplay functions, or pressure-driven social metrics like public follower counts.
The discussion around youth safety online is part of a much broader global debate. In the United States, companies such as Meta, Google, TikTok, and Snapchat face a wave of lawsuits from states, school districts, and parents who accuse them of contributing to a nationwide mental-health crisis among teenagers. Research cited in various legal filings points to increasing rates of anxiety, depression, body-image issues, and sleep disruption among adolescents, which critics link to prolonged use of social platforms. The EU resolution reflects these global concerns and positions Europe as one of the most proactive political blocs addressing digital risk for minors.
While the new EU resolution does not immediately change any laws, it puts pressure on the European Commission to draft legislation and could influence how national governments shape their own policies over the coming years. The strong parliamentary backing signals that the issue is likely to gain significant political traction. If the Commission decides to act on the recommendations, social-media companies could face major future obligations — including robust age-verification systems, redesigned algorithms, stricter content rules for minors, and enhanced parental-control mechanisms. However, any formal legislation will require negotiation among EU member states, where opinions on digital regulation often differ.
For now, the resolution marks an important step in the EU’s broader strategy to protect young people in digital spaces. It highlights deepening concerns about how social media and digital technologies affect children’s health, safety, and autonomy. Lawmakers emphasized that the goal is not to restrict children unnecessarily, but to ensure they grow up in a digital environment that is safe, fair, and developmentally appropriate. As the political debate continues, the coming months will determine whether the EU moves from strong recommendations to binding law — potentially reshaping how millions of European children engage with the online world.






