Balkan Insight: At Bulgaria’s Elections, is the EU Facing Another Digital Calamity?

The lesson of the anti-EU backlash following the annulled polls in Romania last year is that election-related social media manipulation must be identified swiftly, long before polling day in Bulgaria.

An illustration of a woman voting in a polling station

This piece was first published in Balkan Insight, BIRN’s flagship English-language website, which provides daily news, analysis and investigations. It is republished here to amplify its message.

The author is Peter Horrocks, the co-founder of Balkan Free Media Initiative, BFMI.

The opinions expressed are those of the author and do not necessarily reflect the views of BIRN.


The decision to call off the Romanian presidential election in December 2024 may have helped to prevent a Russia-supporting candidate from winning, but it was a political and digital disaster for the European Union. A similarly disastrous outcome in Bulgaria must be urgently avoided.

In Romania, evidence emerged too late that a candidate had been massively boosted on social media through illegitimate techniques. The major platforms failed to identify the abuses early enough. By the time the scale of the manipulation became clear, the Romanian Constitutional Court, under strong behind-the-scenes pressure from Brussels, annulled the election.

The backlash was immediate. Nationalists and the Kremlin loudly claimed that the EU had cancelled democracy. Whatever the legal justification, the political damage was severe. The lesson should have been obvious: election-related digital abuse must be identified early and addressed quickly, before institutions are forced to undertake extreme remedies.

As Ursula von der Leyen put it, if there is evidence of foreign interference in elections, “we have to act swiftly and firmly”. That is now the test in Bulgaria.

Bulgaria’s legislative election is taking place in an already fragile political environment. It is the eighth in five years, and public trust in institutions is badly worn down. In such conditions, online manipulation is especially dangerous because it can feed on real public frustration and make manufactured momentum look like genuine political sentiment.

That is why the triggering of the EU’s Digital Services Act Rapid Response System matters. It was activated after evidence of coordinated digital manipulation in Bulgaria’s online space, including suspicious amplification around leading political figures.

The concern is not simply that misleading or false content is circulating. It is that the architecture of visibility itself may be distorted, with certain candidates or narratives artificially elevated through inauthentic activity.

This is exactly the kind of threat the EU says it is ready to confront. But the existence of a mechanism is not the same as an effective response.

An illustration of a protester in Bucharest supporting Romanian presidential candidate Calin Georgescu

Digital platforms must act fast

The first urgent task for the European Commission is to require all Very Large Online Platforms, especially TikTok and Meta, to provide updated Bulgaria-specific election risk assessments. They should be asked what mitigation measures are already in force, how much Bulgarian-language moderation capacity they actually have, and what evidence they have seen of bot networks, manipulated media or suspicious spikes in engagement.

That process should happen in real time, not through slow-moving bureaucratic exchanges. There should be daily reporting, named contact points available around the clock, and response times measured in hours rather than days.

The second task is to insist on concrete mitigations. If there is evidence of coordinated inauthentic behaviour, platforms should reduce algorithmic amplification of suspicious content, introduce friction into abnormal engagement patterns, and rapidly detect and remove bot-driven networks.

Political advertising and paid influencer content should be clearly labelled. Manipulated media should be flagged. Known false narratives should be downranked, while authoritative election information should be made more visible.

But Brussels cannot do this alone. Bulgaria’s own institutions are not yet organised for the challenge. There is no fully developed national structure for monitoring coordinated digital behaviour or for sharing evidence quickly between government, regulators, and platforms.

That has to change immediately. The Bulgarian government should create a national coordination cell, bringing together the relevant regulators, the election commission, the media regulator, the data protection authority, security services, and the Prime Minister’s office. It should also draw on credible outside expertise from civil society, technical analysts, and researchers.

That group should produce a daily risk assessment, feed structured evidence to Brussels, and maintain direct pressure on the platforms. It should also track suspicious financial flows, including covert payments to influencers or citizens being paid to amplify content.

At the same time, public communication matters. The government should warn voters about manipulation tactics and help them identify false or manipulated content, but do so carefully enough to avoid feeding claims of censorship.

Europe says it learnt a lesson from Romania. Bulgaria is where that claim will now be tested. If the evidence is already visible before polling day, waiting until afterwards would be a conscious failure.

The question is no longer whether the EU has a shield. It is whether it is willing to use it in time, and on its own terms. In the end, the standard remains the one that von der Leyen herself set: in the face of election interference, Europe must act “swiftly and firmly”.
