The digital space is not safe enough for children and young people, and platforms and regulators should bear the responsibility for fixing it, the European Committee of the Regions (CoR) said in a new statement.
The view comes from an opinion on “The Protection of Youth and Minors in the Digital Sphere”, adopted unanimously at the European Committee of the Regions’ plenary session on 6 May 2026.
The Committee of the Regions said children are facing risks including exposure to incitement to hatred, cyber-bullying that can lead them to withdraw from social life, and disinformation that can affect young people’s participation in democratic processes.
It added that deepfakes and chatbots created by generative AI have brought further risks that existing rules have not yet adequately addressed.
Cities and regions said platform design features contribute to harm, citing opaque recommendation algorithms and interaction mechanisms designed to keep users engaged.
The opinion calls for measures to prohibit or restrict practices that encourage addiction, including “loot boxes” in video games, and for transparency around design mechanisms that promote compulsive use.
The Committee of the Regions said a small number of international providers dominate the market while benefiting from liability rules that leave them with limited direct legal responsibility for content.
It rejected transferring responsibility to minors and also rejected blanket social media bans, saying they would restrict young people’s rights to information, privacy and participation.
Age checks, platform rules and media literacy
A minimum age of 14 for access to certain social media services could be considered where mandatory age verification is in place, combined with enforceable age-appropriate design standards for platforms serving users up to 16 years old, the Committee of the Regions said.
It called for a “safety-by-design” approach for services used by minors, including removing “dark patterns” such as infinite autoplay, manipulative notifications and reward loops, according to the statement.
The opinion also asked for mandatory children’s rights impact assessments for all digital services, and for consistent enforcement of EU rules including the Digital Services Act and the Audiovisual Media Services Directive.
The Digital Services Act sets obligations for online platforms, with stricter requirements for very large online platforms, while the Audiovisual Media Services Directive sets rules for audiovisual media across the EU.
Cities and regions backed age verification systems provided they are proportionate, respect privacy and do not exclude vulnerable groups, and they called on EU member states to implement Audiovisual Media Services Directive rules concerning influencers.
Local and regional authorities should play a leading role in media literacy, with the opinion noting disparities in connectivity, digital skills and access to support services across urban and rural areas.
Glenn Micallef, the European Commissioner for Intergenerational Fairness, Youth, Culture and Sport, told the debate at the plenary session that protecting young people online has become a “societal responsibility” and said work should focus on regulation, prevention and empowerment.