The European Commission is analysing reports of a network which uses the social platform X to distribute child sexual abuse material (CSAM), a Commission spokesperson told The Brussels Times.
The “continuous flood” of X accounts that “hijack” hashtags to share and boost child abuse content was reported by the non-profit Alliance4Europe, which uncovered the alleged network while investigating illegal Russian influence operations on X.
The social media platform, owned by tech billionaire Elon Musk, has been at the centre of concerns over online content moderation in recent years. In 2023, the Commission launched an investigation into X’s procedures to counter the dissemination of illegal content.
During a three-day investigation in July, Alliance4Europe’s researchers identified over 150 accounts posting CSAM videos on X. Under one of the often “hijacked” hashtags, the organisation found that CSAM was posted every one to 10 minutes.
The researchers estimate the illegal operation has been ongoing since at least 17 May. “While no definitive number can be estimated, the number of posts seemed to be in the millions, and the operation continued largely undisturbed,” they wrote in the report.
‘Not surprising’
Although the ease with which CSAM is shared on platforms like X is “shocking”, the report’s findings came as no surprise to child protection groups like the Brussels-based Child Focus.
“We have seen an increase in [CSAM] reports related to social media,” said Child Focus’ policy advisor, Tijana Popovic.
While she noted that the online platforms do not produce the illegal content themselves, Popovic underscored that they also bear responsibility. “The people who are abusing are the offenders, but the offenders are exploiting the vulnerabilities and the weaknesses of the platform designs. So in that sense, the platforms have an enormous responsibility.”
Despite this responsibility, Popovic notes that “a lot of these social media platforms are downsizing their protection measures and content moderation.”
She also argues that the problem is aggravated by the lack of verification when new accounts are registered, which allows illegal content to be amplified easily. This is part of the issue identified with X, according to Saman Nazari, the researcher behind Alliance4Europe’s report.
“It’s the same thing that’s enabling Russian influence operations as well. It’s the ease of creating accounts,” said Nazari.
‘It’s still ongoing’
X says it has a zero-tolerance policy towards any form of child sexual exploitation. According to a statement from 2024, the platform removes “certain media depicting physical child abuse to prevent the normalisation of violence against children.”
The platform has a reporting tool to flag illegal content, as mandated by the EU’s Digital Services Act (DSA). Nazari and his team used the tool to flag the posts they had found.
“That resulted in the posts being taken down, and we saw that a few days later they started taking some action, removing accounts after around a day of them being online,” he explained.
However, X’s countermeasures, which target individual accounts, have had little impact on the flow of CSAM, according to Nazari. “It’s still ongoing. It’s still heavily there. It looks exactly the same as when we initially reported it.”
‘There was no reply’
Nazari recommends that X block the use of temporary email addresses, blacklist IP addresses that create multiple accounts in a short period of time, and require a phone number when accounts are created.
Since the CSAM posts found tend to follow the same patterns, the researcher also argues that X needs more “systemic moderation”: a system that automatically blocks and removes content matching those known patterns.
According to Nazari, an official email was sent to X informing them of the case and offering to work with them to address the issue. “But we did not get any reply. We have in the past had conversations with X,” he said. “So we're a bit surprised that there was no reply this time.”
‘Power lies in implementation’
On Monday, Nazari formally submitted the full case to the Commission, although the researcher had been in contact with the EU institution since the early stages of the research.
“We are very concerned by the alleged presence of a coordinated inauthentic behaviour network distributing child sexual abuse material (CSAM) on X, as reported by Alliance4Europe. We are aware of the report and we are analysing it,” a Commission spokesperson told The Brussels Times.
Nazari believes that the issue will be addressed. “I have trust in the Commission in this matter. I know the Commission team. I know that they're actually genuinely looking into it, if they say that they're looking into it.”
Nonetheless, he warns that tough consequences may be needed: “The first thing we need to do is to speak with [X]. We need to provide evidence, we need to offer to help them to address it,” he said. “But if that doesn’t work, the governments have to take other actions.”
For Popovic, it’s crucial to emphasise that everyone has a role in addressing CSAM, whether by reporting the illegal content or by strengthening the DSA. “There is a European framework. It exists, and its power lies in its implementation.”
The social media platform X no longer has a dedicated press team. The Brussels Times contacted X for comment on the report through the available channels, but had not received a response at the time of publication.
Alliance4Europe is a non-profit focused on protecting and advancing democracy in Europe. The report was made possible through the Counter Disinformation Network and was financed by the Ministry of Foreign Affairs of the Republic of Poland.

