Flagged and Ignored: Testing X’s Response to EU Sanction Violations 

Posted on: 2025-07-23

Executive Summary

  • Researchers identified hundreds of posts violating EU sanctions on the social media platform X.
  • X is categorised as a “Very Large Online Platform” (VLOP) under the Digital Services Act (DSA) and, as such, is legally obligated to mitigate systemic risks on its platform and investigate illegal content reports from users.
  • A sample dataset of 125 clearly sanction-violating posts was reported to X using the “Report EU Illegal Content” form on the platform. These included, for instance, programmes from the Russian state broadcaster RT.
  • Only 57% of the reports of illegal content received acknowledgement receipts, breaching DSA obligations. 
  • Only one of the reported posts was removed, and for the remaining cases, X responded via email, stating that no violation of EU law was found, despite clear evidence to the contrary.
  • Seven responses arrived within two minutes or less, potentially indicating automated review.
  • In the case of content from the sanctioned Russian influence operation Doppelgänger, posts were deleted despite the platform’s initial response claiming no action would be taken.
  • The results of this reporting experiment suggest that X’s current moderation mechanisms are insufficiently equipped, or that the platform is potentially unwilling to enforce sanction-related policies at scale.

Background

Under the current EU sanctions regime against Russia, Russian state media outlets and affiliated figures are banned from broadcasting their content within the European Union. This measure serves to safeguard the EU’s information environment from Russian Government-affiliated actors who actively spread disinformation and conduct information manipulation, especially in the context of Russia’s full-scale invasion of Ukraine. Sanctioned actors are explicitly prohibited from making their content accessible to people in the EU via any platform. 

Under the EU sanctions regime (EU Council Regulation 269/2014 and 833/2014), it is prohibited to offer content hosting services for sanctioned entities. ‘Making available or broadcasting music, video or other content produced by listed persons’ is considered by the European Commission to constitute economic resources provided to those actors, which constitutes a violation of the sanctions regime.

Despite this, X (formerly Twitter) continues to violate these rules, as numerous reports and investigations have already shown. This report follows up on our previous reports to systematically evaluate X’s EU-mandated illegal content reporting system. 

In light of the significant issues identified on X, we conducted an experiment to test how such content is handled when flagged by users. We used X’s reporting tools to flag and report the illegal content directly on the platform. The reports were submitted between July 8 and 9, 2025.

Figure 1 – Screenshot showing the button in a post’s menu on X used to initiate a report.

Disseminating content from sanctioned Russian state media, particularly for commercial gain, has already triggered criminal investigations. In Germany, for instance, a couple is facing prosecution and up to one year in prison for operating a paid service that gave users digital access to several banned Russian TV channels.

Figure 2 – Screenshot of a post by the official Russian Ministry of Foreign Affairs account on X, sharing an excerpt from a documentary produced by state broadcaster RT and providing a link to bypass sanctions and access the full film on Telegram.

On X, hundreds of posts sharing sanctioned Russian state media content were easily identifiable. Some were published by official accounts of sanctioned media outlets, others by Russian diplomatic accounts, by accounts likely operated by the “Social Design Agency,” a Russian firm known as the producer of the well-documented influence campaign “Operation Doppelgänger,” or by anonymous users repeatedly posting such material.

Figure 3 – Screenshot of a post by the official X account of Margarita Simonyan, RT Editor-in-Chief and listed under EU sanctions list, sharing an excerpt from a video produced by state broadcaster RT and providing a link to bypass sanctions and access the full film on Telegram.

Alarmingly, at least 16 of these posts were published by the official X accounts of sanctioned media outlets. Through these posts, the outlets remain accessible to EU audiences, despite having been banned in the EU since 2022. Collectively, these channels have over 2.2 million followers.

The EU’s Digital Services Act outlines the legally binding responsibilities of a category of online service providers designated as “Very Large Online Platforms” (VLOPs). One of their duties is to effectively address illegal content on these platforms, and under the EU sanctions regime against Russia, the content flagged above is indeed illegal. X, falling within the scope of what constitutes a VLOP, has a responsibility to act on this content. 

A Pattern of Inaction

To test X’s enforcement of EU sanctions against Russian-affiliated entities, the following experiment was conducted: out of the hundreds of posts which were identified as sharing content from sanctioned entities, researchers curated a selection of 125 posts that were unequivocally in breach of EU sanctions. These posts were then reported to X using the “Report EU Illegal Content” form on the platform. 

The same reason was given for each report, as shown below.

This content has been produced by an entity listed under the EU sanctions regime targeting Russian entities supporting the invasion of Ukraine, pursuant to Council Regulation (EU) No 833/2014, Article 2f, as amended. 

The European Commission has made it clear in its official guidance that: 

‘Broadcasting and distribution of content by entities listed under Article 2f of Regulation 833/2014 is prohibited. This includes any means of transmission or distribution, such as cable, satellite, internet, apps, and platforms.’

Source: European Commission FAQ on Russia Sanctions 

Consequently, this content is illegal under Union law and should not be accessible in the EU; making it accessible constitutes a violation of the EU sanctions regime.

All the posts flagged contained either material from sanctioned media outlets or included direct promotion of content from such blacklisted entities. In several cases, posts also linked to sanctioned media websites already blocked under EU regulations, or (as was often the case) to websites created to circumvent sanctions and bans to ensure EU access to sanctioned content.

Out of the 125 reports:

  • 1 post was removed
  • 124 received no visible enforcement or follow-up, even when the content clearly violated the EU’s sanctions regime
  • 53 never received an acknowledgement of receipt nor a response
Figure 4 – Distribution of platform responses to 125 user-flagged illegal posts. While one flagged post was removed, the majority either received no acknowledgement or were not acted upon despite receiving a response.
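The acknowledgement rate cited in the executive summary follows directly from these counts. As a quick consistency check, a minimal Python sketch (counts taken from the report):

```python
# Raw counts from the reporting experiment described above.
total_reports = 125
removed = 1
no_acknowledgement = 53   # reports that never received a receipt or response

# Reports that did receive an acknowledgement receipt (a DSA obligation).
acknowledged = total_reports - no_acknowledgement
ack_rate = acknowledged / total_reports

print(f"Acknowledged: {acknowledged}/{total_reports} ({ack_rate:.1%})")
# → Acknowledged: 72/125 (57.6%)
```

Rounded down, this matches the “only 57%” figure in the executive summary.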

The Social Design Agency and Doppelgänger

The long-running Russian influence operation “Doppelgänger” is run by the Social Design Agency (SDA). The SDA is listed under the EU sanctions regime targeting Russian entities taking actions undermining or threatening the territorial integrity, sovereignty and independence of Ukraine (Council Regulation (EU) No 269/2014). 

Ten posts in the dataset were easily attributable to Doppelgänger. They were collected using X’s API, identified by matching them against the operation’s well-known pattern of behaviour, and then flagged to X. 

Doppelgänger’s modus operandi is as follows: the SDA starts by using one account, typically with very few followers, to post the first instance of a manipulative piece of content. A series of other accounts then amplify the content by using it to reply to real people’s posts. These amplifier accounts are usually single-use and abandoned after posting once. 
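The pattern described above lends itself to simple heuristics. As a purely illustrative sketch, assuming hypothetical account metadata (the field names and thresholds below are assumptions for illustration, not X API fields or criteria used in the original investigation):

```python
from dataclasses import dataclass

# Hypothetical, simplified view of an account's observable activity.
@dataclass
class Account:
    followers: int          # follower count at time of posting
    lifetime_posts: int     # total posts ever made by the account
    posts_after_reply: int  # activity observed after the amplifying reply

def looks_like_seed(acc: Account) -> bool:
    # Seed accounts typically have very few followers.
    return acc.followers < 10

def looks_like_amplifier(acc: Account) -> bool:
    # Amplifiers are effectively single-use: almost no history,
    # and abandoned immediately after the one amplifying reply.
    return acc.lifetime_posts <= 2 and acc.posts_after_reply == 0

seed = Account(followers=3, lifetime_posts=1, posts_after_reply=0)
amplifier = Account(followers=0, lifetime_posts=1, posts_after_reply=0)
print(looks_like_seed(seed), looks_like_amplifier(amplifier))  # → True True
```

Real detection would combine such signals with content matching and timing analysis; the point is that the single-use pattern is mechanically recognisable, which is why faster, automated platform responses are plausible.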

All the Doppelgänger posts reported as part of this dataset were published on June 8th and remained online for at least a day. Out of the 10 reported, only one resulted in removal, with the message: “In accordance with applicable law, X is now withholding the reported content in the EU, specifically for the following legal grounds: Scams and Fraud.” It is unclear why X decided to address only one post and why it chose the ‘Scams and Fraud’ category.

While X replied that it declined to act on our reports of the remaining 9 posts, 6 of them were taken down in the following days. By then, the operation had already finished using those accounts and abandoned them outright, so X’s response came too late to have any impact on the success of this Doppelgänger operation. To be effective, reactions would need to be significantly quicker, which is feasible, potentially through automation.

It also bears mentioning that the specificities of the sharing patterns deployed by the Doppelgänger operation were provided to X in September 2024, which should have allowed X to stop the operation from functioning, as was the case with Bluesky, which did take effective action when provided with the same information. 

Transparency Gaps

X does not provide detailed feedback to users on reports related to sanctions violations. All responses were generic, lacking any indication of whether human review took place or whether the post was checked against official sanctions lists. 

Strangely, the platform’s reporting mechanism does include a dedicated section on EU law enforcement, stating that “this form enables an individual or an entity to notify us about content that is illegal under EU law or under the national law of an EU member state, in compliance with Union law.” However, there appears to be no corresponding follow-up process specifically aligned with EU legal requirements.

The process of reporting illegal content on X is also cumbersome and makes it impossible to report issues at the scale at which they appear on the platform: users can only report content from one account at a time, and X blocks users from reporting multiple posts within a short period, redirecting them away from the reporting page after around 5-10 reports. The CAPTCHA in place adds further friction. Lastly, when reporting larger numbers of posts, a confirmation of receipt, which is an obligation under the DSA, does not always arrive. 

In short, X claims to offer a way to report illegal content under EU law, but there is no evidence that it follows up in a manner consistent with its legal obligations.

The EU’s Digital Services Act (DSA) requires VLOP-designated platforms to mitigate systemic risks, which include complying with EU law. This report shows that X’s current moderation mechanisms are insufficient to enforce sanction-related policies at scale, or that X is potentially unwilling to do so.

With geopolitics playing out as much online as on the ground, enforcement failures on platforms like X create dangerous blind spots, allowing sanctioned actors to continue shaping narratives, fundraising, and spreading disinformation.

What began as a firm stance against Russian state disinformation through sanctions has, two years later, devolved into a situation where official Russian diplomatic channels freely broadcast 40-minute-long RT videos on X without consequence, even when reported on the platform by EU citizens.

Key Recommendations:

The EU and member states should enforce current regulations and laws at the EU and national level:

  • To address the spread of content subject to EU sanctions on online platforms
  • To hold platforms accountable for inconsistent and inefficient reporting mechanisms
  • To hold platforms accountable for poor mitigation of systemic risks related to sanctioned Russian entities

The EU and EU member states should support the creation of a pan-European structure for reporting and enforcing sanctions:

  • Creating a civil society sanctions observatory that collects sanction violations and forwards them to a European Commission unit responsible for digital sanctions enforcement. 
  • Making digital sanctions enforcement an EU competency, creating a dedicated unit that also works closely with the Digital Services Act enforcement team. 
  • Developing a common repository of sanction violations, continuously tracking the scale of the issue and the response of platforms, and developing communal resources for platform accountability. 

Authors:

Charles Terroille, Science Feedback.
Saman Nazari, Alliance4Europe.
Ewan Casandjian, Alliance4Europe. 

Commissioned by WeMove Europe

Alliance4Europe’s participation was made possible through funding from the Polish Ministry of Foreign Affairs.

This report was made possible through the Counter Disinformation Network.

The CDN is a collaboration and crisis response platform, knowledge valorisation resource, and expert network, bringing together 58 organisations and over 300 practitioners from OSINT, journalism, fact-checking and academia from 25 countries. The network has been used to coordinate projects on four elections and has produced 80 alerts since its creation in May 2024.

Science Feedback is a non-partisan, non-profit organization dedicated to science education. Our reviews are crowdsourced directly from a community of scientists with relevant expertise. We strive to explain whether and why information is or is not consistent with the science and to help readers know which news to trust.
Please get in touch if you have any comments or think there is an important claim or article that needs to be reviewed.
