According to the materials, EU officials coordinated with major technology platforms to ensure compliance with the Digital Services Act, or DSA, with the stated goal of enforcing content moderation standards ahead of sensitive political moments. The records show this activity occurred in connection with at least eight European elections, including the Dutch elections in 2023 and 2025.
In the run-up to the 2023 Dutch election, the European Commission designated the Dutch Interior Ministry as a “trusted flagger” under the DSA. That designation allowed the ministry to submit priority content takedown requests to social media platforms. Reports from trusted flaggers receive expedited review and carry added weight in platforms’ enforcement decisions.
The documents list categories of political speech flagged for moderation or removal. These include “populist rhetoric,” “anti-government or anti-EU content,” “anti-elite content,” and “political satire.” Other targeted categories include criticism of migration policy, opposition to immigration and refugee programs, content described as Islamophobic, and material labeled “anti-LGBTQI.” References to “meme subculture” were also listed among the areas of concern.
The records show that EU officials framed these efforts as necessary to protect democratic processes and maintain public trust during election periods. However, the flagged content categories largely involved lawful political expression rather than illegal activity or direct incitement.
One major platform reportedly declined to fully cooperate with the requests. That same company is currently facing enforcement actions under the DSA, including a €120 million fine and regulatory scrutiny in multiple EU member states.
French authorities have also carried out raids related to ongoing investigations, according to public reporting.
The internal TikTok policy documents included in the disclosure show that upcoming community guideline updates were explicitly tied to DSA compliance. One internal summary states that the “primary motivation” for the changes was to meet the requirements of the Digital Services Act and ensure enforcement actions were reflected transparently in platform rules. The updates included expanded categories for misinformation, election-related content, and so-called marginalizing speech, with penalties ranging from content removal to algorithmic suppression.
The European Commission has previously said the DSA is intended to combat illegal content and protect users online. The newly released documents, however, show the law was also used as a mechanism to influence content moderation decisions involving political speech shortly before voters went to the polls.