
The Role of AI-Generated Disinformation in Elections

By Manjula Gajanayake

The 2023 Slovak parliamentary election, a keenly contested race, was expected to yield a predictable outcome. Multiple opinion polls had projected a narrow victory for the Progressive Slovakia (Progresivne Slovensko) party. Yet public expectations shifted dramatically after a deepfake audio clip was widely circulated by political rivals.

An election typically spans several weeks of campaigning and costs millions of dollars. In this case, all it took to undermine a massive election campaign was a single fake audio clip lasting 2.13 minutes.

Truth decay

The audio, created in Slovak, allegedly captured a private conversation between Michal Šimečka, Progressive Slovakia’s leader, and a journalist. It suggested that vote buying was a strong possibility, and it was released during the official pre-election silence period. Fact-checkers verified the content and labelled it fabricated, but the speed and reach of the fake outpaced the verification. It is pertinent to note that AI can also create fake fact-checking platforms, further undermining public trust. The political consequences, however, were real: voters turned away from Progressive Slovakia, pushing the party to the Opposition benches.

An election is often the single largest logistical exercise in a year, involving thousands of officials, vehicles, polling stations, ballot papers, observers and security arrangements. Yet a mechanism of this scale can be destabilised by a deepfake: strategically placed “fabricated truths” succeed through speed and anonymity.

Content created by artificial intelligence (AI) can spread quickly across social media platforms, often without a verified source or identity, and at a pace far greater than that of traditional organisations. Such videos, images, and audio clips are produced with advanced AI techniques, such as generative adversarial networks (GANs), to create highly realistic forgeries.

As an election observer, I believe every phase of the electoral process is vulnerable to disinformation and deepfakes, but the most damaging could be the silence period – and Election Day itself. The final 24-48 hours, when formal campaigning has concluded and voters prepare to vote, offer a narrow but sensitive window in which to alter the outcome. This is a critical period when citizens make their final decisions, yet institutions are scarcely able to respond publicly without violating neutrality rules.

In many countries, including Sri Lanka, election laws lack clarity on the actual status of silence periods and on what specifically constitutes a violation of election laws. This creates ambiguity about enforceability and accountability.

Vulnerable phases

Election disinformation and synthetic content can prove lethal during any phase of an election, from the pre-election period to the results phase. These are times when public trust is most fragile. At such moments, election disinformation does not merely undermine political competition; it triggers confusion, unrest, and long-term damage to democratic legitimacy, and may distort the electoral outcome itself.

Consider this scenario on polling day. In a closely contested area, where a particular party or candidate holds a narrow advantage, voters are instructed to travel to an alternative polling station through a fabricated “hot mic” recording created with voice cloning. The outcome itself can be altered.

In razor-thin contests, shifting a handful of votes is all it takes. A case in point is Sri Lanka’s 2024 parliamentary election: the margin between Kalutara District’s winning candidate from the New Democratic Front (NDF) and the next most preferred candidate was a mere 119 votes. In such tightly contested situations, the smallest fabrication may swing votes and the outcome.

Many global organisations are now studying and striving to reduce the impact of AI-generated election disinformation. International IDEA, the National Democratic Institute, The Carter Center, and the European Union Election Observation Missions are among those most active. There are several local organisations, such as Hashtag Generation, which are particularly active in monitoring and countering digital threats to elections.

There is a wide range of technology and research institutions dedicating time and resources to study AI-generated election disinformation, reflecting global recognition of the enormity of the problem.

Among the leading institutions examining the global rise of disinformation are the Stanford Internet Observatory, Oxford Internet Institute, Atlantic Council’s Digital Forensic Research Lab, Brookings Institution, Meedan, The Global Disinformation Index, and the International Fact-Checking Network (IFCN) at Poynter Institute. These organisations have become central to the emerging global infrastructure that studies, tracks, and exposes digital manipulation.

AI-driven disinformation

Many publications within and outside Sri Lanka examine the threat of AI-driven misinformation. One notable work is the 2023 publication ‘Foolproof: Why We Fall for Misinformation and How to Build Immunity’ by Sander van der Linden.

A widely cited idea from the book states: “At a basic cognitive level, we are all susceptible to misinformation.” It argues that misinformation behaves like a virus, “infecting our minds, altering our beliefs, and replicating at astonishing rates.” The psychological impact on voters cannot be easily neutralised or reversed. Once false narratives take root, it is difficult to dislodge them even with factual corrections.

As for Sri Lanka, we have not yet seen AI-generated misinformation shifting electoral outcomes in a decisive manner. This offers no guarantee for the future. What we have witnessed so far includes the spread of false or misleading narratives through public opinion surveys and deliberate misrepresentation of findings by international election observation missions, serving as early warning signs of a much larger risk.

At present, the Election Commission of Sri Lanka (ECSL) has only a few officers dedicated to monitoring cyber-related election threats and is currently building technical expertise. Election observation organisations face similar challenges. Many operate under constrained structures and are heavily dependent on international funding. The combination of limited capacity and increasing technological threat creates a dangerous asymmetry.

In this context, the top priority should be the development and collection of practical ‘verification tools.’ The integrity of future elections may depend on the swift strengthening of defence capabilities and their deployment.

International guidance is now available for election authorities confronting the growing threat of digital manipulation. Examples include UNESCO’s ‘Elections in Digital Times: A Guide for Electoral Practitioners,’ the OSCE/ODIHR ‘Handbook for the Observation of ICT in Elections,’ the Council of Europe’s ‘Guidance Note on Countering the Spread of Online Mis- and Disinformation,’ and International IDEA’s 2024 policy brief, ‘Protecting Democratic Elections through Safeguarding Information Integrity.’ Together, these frameworks offer a collective global playbook for defending democratic processes in the digital age.

Legal safeguards

However, existing legal frameworks offer limited solutions. In Sri Lanka, legal safeguards are inadequate: election campaigns are still governed by legislation introduced in 1980/1981. The existing codes of conduct relating to pre-election periods, and extraordinary gazettes on their own, are insufficient to stem the daily flood of coordinated disinformation.

There are examples of targeted legal interventions elsewhere. In 2019, the US State of Texas enacted Senate Bill 751, which criminalises political deepfake videos intended to mislead voters during the 30-day period immediately preceding an election. This represents a shift from voluntary standards to enforceable legal boundaries.

As for Sri Lanka, two fundamental tasks should be completed. First, massive public campaigns to create nationwide awareness. Second, investment in developing both technical and legal capacity within the Election Commission itself. Only thereafter can meaningful collaboration take place among political parties, tech platforms, media, and civil society organisations. Voters need to be educated to treat everything seen and heard on the Internet with disciplined suspicion.

The public must resist the instinct to believe and share election-related videos and audio clips simply because they are dramatic or viral. Caution is necessary, along with verification of content by credible, independent sources. This is why formal, rapid-response mechanisms are required from institutions such as the Election Commission, the Election Section of the Sri Lanka Police, election observation organisations, and the Telecommunications Regulatory Commission (TRC).

Voters do not always pause to question the information they encounter. Political loyalty and emotional appeal often override critical judgement.

Tell-tale signs of AI-generated content include strange facial artefacts, especially around facial boundaries; unnatural body or limb movement; and overly short, “too perfect” clips or stitched segments. Often there are irregularities in hands and fingers, as well as inconsistent lighting, shadows, reflections, or background elements.

American lawyer Felicia Farber’s recent novel ‘Fake Out’ offers a powerful fictional reflection of this reality. The book deals with the inadequacy of laws to tackle sexting and cyberbullying, particularly with respect to minors, and carries a recurring theme: seeing and hearing can no longer be equated with believing.

Elections were once battles of ideas fought in public squares and through printed matter. Today, these battles take place more often in hidden realms, inside algorithms, artificial voices, and fabricated worlds.

The age of artificial intelligence does not demand blind faith or fear. It demands disciplined thinking, institutional courage, and a new social contract between citizens and information. Because when the last line of defence is no longer the ballot box, but the human mind, the future of democracy will be decided not by what we see, but by what we choose to doubt.

About the author:

Manjula Gajanayake currently serves as the Executive Director of the Institute for Democratic Reforms and Electoral Studies (IRES). An experienced election observer and an activist, he has contributed to strengthening Sri Lanka’s electoral processes through impactful civil society action.

This Op-ed was produced under the CIR– FACTUM Media Fellowship Program and was originally published in Ceylon Today on 21 December 2025.
