Friday, 04 June 2021
The EU’s action plan to combat disinformation is incomplete and is being outpaced by emerging threats, according to a new special audit report published on Thursday by the European Court of Auditors (ECA).
“Any attempt to maliciously and intentionally undermine or manipulate public opinion represents a grave threat to the EU itself,” said Baudilio Tomé Muguruza, the ECA member responsible for the report, at a press briefing (2 June). “We recommend that the EU’s response to disinformation should be stepped up, and that its coordination be improved.”
The EU’s action plan against disinformation has triggered positive developments, but it has not lived up to all of its promises, according to the auditors. The plan contained relevant measures – for example, debunking and reducing the visibility of misleading content – but it has not been updated or reviewed since 2018, even though disinformation tactics, actors and technology are constantly evolving.
In December 2020, the Commission published the European Democracy Action Plan, which includes anti-disinformation measures, without clarifying exactly how it relates to the 2018 action plan against disinformation. The auditors warn that pursuing similar objectives through different initiatives makes coordination more complex, and increases the risk of inefficiencies.
The auditors have looked at the European External Action Service’s (EEAS’s) strategic communications division and its three task forces – StratCom East, Western Balkans and South – and found that they have improved the EU’s capacity to forecast and respond to disinformation in neighbouring countries.
However, the auditors consider that the mandates and resourcing of these task forces should be reviewed in the light of new emerging threats.
Code of practice
The EU action plan has also targeted the private sector and civil society in the joint fight against disinformation. In October 2018, the European Commission established a code of practice for engagement with online platforms, consisting of voluntary measures.
The Code was signed with Facebook, Google, Twitter and Mozilla as well as trade associations representing online platforms, the advertising industry, and advertisers. During the initial stages of the COVID-19 pandemic, the code of practice led the platforms to give greater prominence to information from authoritative sources.
This was a pioneering approach, according to ECA. But the auditors found that it has been unsuccessful in holding online platforms accountable for their actions, and in inducing them to play a greater role in actively tackling disinformation.
Code of conduct
Another similar code with internet operators was outside the scope of the audit. In May 2016, a “Code of conduct on countering illegal hate speech online” was also agreed with Facebook, Microsoft, Twitter and YouTube. In the following years, Instagram, Snapchat, Dailymotion and TikTok joined the Code of Conduct. Hate speech is fuelled by fake news, disinformation and conspiracy theories.
While there are differences between the internet operators, the regular evaluations of the Code of Conduct show that it delivers positive results. On average, 90% of notifications of illegal hate speech from a network of trusted flaggers are reviewed by the operators within 24 hours, and 71% of the flagged content is removed.
However, there are ways of circumventing the Code of Conduct, which is currently limited to online hate speech that is publicly available. Private messaging platforms such as WhatsApp are not covered by the Code.
Furthermore, a new study published yesterday shows a rise in antisemitic content in French and German on Twitter, Facebook and Telegram during the COVID-19 pandemic. The report concludes that the Code of Conduct, as a self-regulatory initiative, is not enough to counter illegal hate speech online, and draws attention to other EU initiatives.
The audit report ends with a number of recommendations addressed to the EEAS and the Commission, all of which have been accepted by them. They need to improve the coordination and accountability of EU actions against disinformation. Internally, they need to review the operational arrangements of the StratCom division and its task forces in the external action service.
The Commission should also adopt a media literacy strategy that includes combatting disinformation as an integral part.
As regards the code of practice, the Commission should propose additional commitments to the signatories to address weaknesses identified in the evaluations of the code, improve the monitoring of the online platforms’ activities to tackle disinformation by setting meaningful key performance indicators, and establish a procedure for validating the information provided by online platforms.
Update (7 June): A Commission spokesperson added that the Commission recently (26 May) published a Guidance setting out its views on how signatories should strengthen the Code of Practice and create a more transparent, safe and trustworthy online environment.
“We want the signatories to become more effective at demonetising purveyors of disinformation. They should also put in place safeguards for all forms of manipulative behaviour on their platforms, give additional transparency about their algorithmic recommendations and provide researchers with sufficient access to data. For the revised Code, we will devise clear performance indicators that will underpin a comprehensive monitoring framework.”
The Brussels Times