
European countries are moving toward restricting children’s access to social media as concerns mount over its impact on mental health, safety, and development.
Just days after France’s lower house approved a bill to ban social media for children under 15, Spain’s Prime Minister Pedro Sánchez pledged to shield young people from what he called the “digital Wild West” online. Lawmakers and experts say excessive exposure to harmful online content is contributing to anxiety, addiction, sleep problems, and emotional distress among minors.
Specialists warn that children are particularly vulnerable because their brains are still developing. Research increasingly links heavy social media use with mental health challenges, especially among teenagers. European Commission President Ursula von der Leyen has also expressed support for setting a minimum age limit across the European Union, similar to Australia’s recent move to restrict social media access to those 16 and older.
Countries considering restrictions
Several European nations are now weighing similar steps:
- France: A bill banning social media for under-15s is moving through parliament.
- Spain: A proposed ban for under-16s is being added to draft legislation.
- Denmark: Political parties agreed to protect youth from online abuse; a law is pending.
- Italy: A proposal includes restrictions on young users and child influencers under 15.
- Greece: Officials say the country is close to introducing a ban.
- Portugal: New legislation would require parental consent for under-16s.
- Austria & UK: Both are reviewing similar measures.
European lawmakers have also recommended that children aged 13–16 should only access social media with parental approval.
One option under discussion is an EU digital identity system that verifies age without revealing private details. Supporters say this could allow platforms to confirm whether a user meets the age requirement while protecting personal data, dw.com reports.
However, youth digital rights groups question whether age-verification systems can truly protect privacy and whether bans alone address deeper problems within platforms.
Some experts argue that restrictions may not fix the root causes of harm. They point to platform designs that encourage addiction — such as endless scrolling, autoplay videos, and algorithms that amplify harmful content. Others note that online dependency does not disappear at age 15 or 16.
Still, supporters of the bans say governments are acting because progress under the EU’s Digital Services Act (DSA) has been slow. The DSA requires major platforms to reduce risks to minors and share data with researchers, but critics say enforcement has not yet produced major visible changes.
The discussion reflects a broader question: how to balance children’s safety, mental health, digital rights, and freedom of expression. Some advocates believe Europe should develop its own social platforms that better reflect European standards and values.
Governments are signaling that parental involvement will become central in children’s digital lives.
Proposed rules would:
- Give parents greater control over when and how children access social media
- Encourage supervision and digital literacy at home
- Reduce exposure to cyberbullying, harmful content, and online predators
- Help families set healthy screen-time boundaries
Parents may need to prepare for new age-verification systems and consent requirements in the near future.
If implemented effectively, the measures could help:
- Lower anxiety and social comparison pressure
- Improve sleep and concentration
- Reduce exposure to inappropriate or violent content
- Decrease risk of online addiction
- Support healthier emotional and cognitive development