YouTube's homepage video recommendation system may be breaching European Union (EU) law, according to a complaint accusing the American video-sharing platform of manipulative design practices.
A new complaint was filed on Tuesday against Google Ireland with the Belgian telecom regulator (BIPT) by the Brussels-based NGO European Digital Rights Initiative (EDRi).
The organisation alleges that the video recommendation system on users' homepage is intentionally designed to benefit the company over the user.
Concretely, YouTube offers a personalised homepage with recommended videos based on users’ activity.
However, for users who want the alternative homepage option, which shows random popular videos and does not rely on profiling, the platform's interface makes that choice harder to find. In some cases, it even actively discourages it.
For example, the non‑profiling option is hidden, bundled with watch history, and discouraged via warning language about losing personalisation across Google services.
The only way users can have a recommender system without profiling is by turning off YouTube History, where they lose all historical watching data in their Google account.
These practices are known as "harmful design patterns", often called manipulative designs or deceptive patterns. They intentionally (or unintentionally) obstruct users from making their intended choices, or steer them towards choices that are not in their best interests and that benefit the company over the user, according to EDRi.
'Staggering' breaches
In the latest complaint, the US tech giant stands accused of breaching EU law, notably the Digital Services Act (DSA), which gives users in the EU rights over how large technology companies design their platforms and moderate content.
"The staggering non-compliance, in particular of Big Tech companies, with the DSA motivated us to lodge this formal complaint," Jan Penfrat, Senior Policy Advisor at EDRi, told The Brussels Times on Tuesday, after the formal complaint was lodged.
"The law has been in effect for over two years, and there still is so much obvious infringement that we are not surprised if people feel disappointed about the DSA," he continued.

When it comes to moderating user-generated content, YouTube is applying its own terms of service "inconsistently", which in itself can be a compliance issue under the DSA, according to EDRi's Penfrat.
"Our complaint does not focus on individual content moderation decisions, but on the way YouTube pushes its own profiling-based recommender systems at the expense of people's right to use an alternative option," he explains.
Undermining users
YouTube's homepage recommendation system undermines user autonomy and informed choice, which disproportionately impacts vulnerable users (such as children), according to the complaint.
"The DSA is supposed to protect people's digital rights in Europe, including their rights to free expression and access to information, but it can only do that if online platform companies comply and regulators properly enforce it," Penfrat continued.
The NGO expects the Belgian regulator to forward its complaint to the Irish Media Commission, where YouTube’s owner, Google, is based.
"The Media Commission is one of the busiest in Europe due to the many Big Tech firms being established there, so it might take some time for both regulators to come to a decision," said Penfrat.

"But it is crucial for the DSA and the rule of law in Europe that its rules are strictly and swiftly applied. The issue raised in our complaint is neither complex nor hard to prove, and the infringement seems quite obvious. We trust the regulators will recognise that."
The complaint comes as US technology companies, collectively referred to as Big Tech, wield unprecedented influence in the EU through a gargantuan lobbying operation, with spending up 56% since 2021 and over 890 lobbyists active in Brussels. They are seeking changes to EU regulations on AI and privacy.
In 2024, the European Commission asked YouTube, TikTok and Snapchat to provide more information regarding their recommender systems. It was concerned about the platforms' role in amplifying harmful content, including illegal content and hate speech. Another request for information was made last year.
YouTube and EC respond
The European Commission confirmed it was "aware" of EDRi's complaint. "We are in touch with the Belgian Institute for Postal Services and Telecommunications, the Digital Services Coordinator for Belgium, on a regular basis," European Commission spokesperson Thomas Reigner told The Brussels Times.
"Where the Commission has suspicions that a breach of the DSA may occur and it finds that a further investigation is needed, it will not hesitate to take action," he underlined.
The EU executive also confirmed it had sent out requests for information to YouTube in 2024 and also in 2025, having since received and analysed the responses. Last year's request included information on potentially harmful recommendations to children, for example concerning eating disorders.
A YouTube spokesperson, approached for a response to the civil society complaint in Belgium, told The Brussels Times that it provides users with "clear, accessible and meaningful" controls regarding their recommendations.
"Users on YouTube have several options to get recommendations that are not based on profiling, such as through topic channels, the Subscriptions tab and Channels pages. YouTube additionally offers users controls to turn such profiling on or off."
Manipulation concerns
At the heart of these concerns is YouTube's recommendation algorithm, which a 2024 study focused on the US market found failed to protect teenagers from harmful or offensive (e.g. racist) content.
It was also found to expose all users to heightened religious content (particularly on Christianity), and to anti-vaccine content, as well as videos featuring self-proclaimed misogynist influencer Andrew Tate, who is banned from the platform.
"Google controls the algorithm that decides what kind of video content is seen by whom," EDRi's Penfrat explains. "So at least technically, there is nothing that prevents Google's leadership – either on their own or under instruction from a third party – to manipulate the YouTube algorithm for a selected part of the population."
"For example, this could be done for a whole EU Member State where public opinion is more critical of Trump, so that they see more of certain types of content while being blocked from others, when scrolling through recommendations or search results," he concluded.
This article was amended to include comments from YouTube and the European Commission.

