“Can’t be done”. This is what platforms have argued when concerned parents and public authorities have asked them to do more to protect kids online. Turns out, it can be done.
On Wednesday 15 April, the European Commission presented its long-awaited age verification solution, which will allow users to prove their age without exposing personal data. Before its rollout across Member States, however, it must be technically sound, secure and able to withstand attempts to bypass or hack it.
Age verification is a necessary first step to protect minors online, but legislators should do more. They should turn to the playbook that was used to tackle an equally dangerous and addictive product: cigarettes.
For decades, the tobacco industry denied the addictive nature of its products, questioned scientific evidence and targeted young consumers to secure long-term profits. They also shamelessly misused the “freedom” banner (this is a liberal politician writing) to make the use of their drug a personal choice issue.
Only when the evidence became overwhelming, and when public pressure and regulatory action united, did change begin. Now more and more evidence is signaling that social media is bad for you, and especially for teens.
Like smoking, scrolling has serious health consequences. Platforms are designed to capture attention and keep it for as long as possible. Features such as infinite scrolling, personalised recommendations and constant notifications are built around reward mechanisms that encourage repeated use.
Scientific research shows that this is reshaping young people’s brains, particularly in areas linked to emotion, impulse control and social behaviour. 78% of teenagers check their devices hourly, while 46% say they do so constantly. Like smoking, this is not just a habit; it is a compulsion.
Addiction by design
The consequences are visible. Excessive use is linked to higher levels of anxiety, depression and attention disorders. Younger children face additional risks, including delays in emotional and language development. At the same time, many are exposed to harmful content, ranging from cyberbullying to material promoting self-harm or unrealistic body standards.
Like smoking, this is not only an individual issue. It is social. The more people use these platforms, the harder it becomes for others to stay away. The fear of missing out, of being excluded, pulls young users back in.
In that sense, just as there were “passive smokers” in the restaurants of the 90s, there are “passive users” in the schoolyards of today: people affected by an environment they did not fully choose to be part of.
Recent court rulings in the United States underline just how serious this has become. In New Mexico, a jury found Meta liable for misleading users about safety and exposing children to harm, including sexual exploitation and self-harm content. The company was ordered to pay $375 million in penalties.
In California, another jury concluded that platforms including Instagram and YouTube designed products that fostered addictive use and contributed to a teenager’s depression and suicidal ideation. Meta was assigned 70% of responsibility, resulting in $3 million in compensatory damages, while YouTube will bear the rest.
These rulings echo the individual lawsuits brought in the past by smokers suffering from cancer. Those cases culminated in the 1998 Master Settlement Agreement, which required tobacco companies to pay 46 US states more than $200 billion to cover public health costs.
More importantly, these cases imposed strict limits on advertising, particularly towards young people, and funded large-scale public awareness campaigns.
Courts are catching up. Politicians should too. The lesson from the tobacco case is clear: we need to act. Sooner rather than later, and preferably at the European level.
This means proposing a meaningful age limit across the European Union. It means restricting targeted advertising to minors and addressing design features that intentionally prolong use and exploit behavioural vulnerabilities. It also means investing in education and digital literacy, so young people, their parents, caregivers and teachers understand how these platforms work.
The European Commission's Special Panel on child safety online is already working towards such a holistic approach. Its recommendations are expected in July. The Commission should embrace them and act fast.
“Can’t be done” no longer holds in the face of the evidence. We changed the trajectory of Big Tobacco. We can do the same with Big Tech.


