
Platform accountability and elections: When global tech platforms collide with democracies

13 Feb 2026 | BY Buddhika Samaraweera

Across the world, elections are no longer fought only on streets, stages, and television studios. They unfold on social media platforms such as Facebook, YouTube, TikTok, and X, which operate outside national borders but command significant influence and, at times, are even capable of altering electoral outcomes.

These platforms are governed by community guidelines written for global audiences and enforced through systems that struggle to grasp local politics, languages, and laws. This mismatch between global platforms and national democratic processes has become a serious challenge worldwide, one that grows especially pronounced during election times.

In countries such as Sri Lanka, where digital platforms play an increasingly influential role in political communication and in shaping public perceptions, but where regulatory capacity is severely limited, the question is no longer whether social media profoundly influences local elections, but whether platforms can be held accountable in ways that protect electoral integrity without narrowing political expression or enabling online censorship. Experiences from Sri Lanka, India, and Bangladesh indicate shortcomings in the existing mechanisms.

Global platforms with local stakes

Social media platforms have, from time to time, introduced election integrity policies to deal with voter suppression, coordinated inauthentic behaviour, deceptive political advertising, and incitement. Although these rules apply uniformly in theory, their enforcement in practice varies across regions.

Election-related disinformation has taken multiple forms across recent election cycles in Sri Lanka, including false claims about voting procedures, smear campaigns, ethnic or religious narratives, and coordinated networks that amplify partisan content during legally mandated silence periods. Experts say these activities mostly occur in local languages through pages, groups, and meme networks that do not formally register as political advertisers and therefore fall outside the platforms' transparency tools.

Silencing women online

During the 2025 Parliamentary Elections, a number of female candidates were subjected to coordinated online attacks that crossed the line from political criticism into intimidation, sexualised falsehoods, and public shaming. Much of the content circulating on social media appeared designed to humiliate and discredit female candidates rather than engage with their views or public work. Videos, images, and posts spread widely across platforms, treating misogyny as a form of political spectacle. In several instances, misleading or fabricated claims remained online even after complaints were made to the authorities, allowing the abuse to continue unchecked.

“Social media platforms responded with standard messages and closed the reports. No explanation, no contact person. In the end, I changed my privacy settings and limited my online presence. The abuse stopped, but there was no sense that anyone had been held accountable for what happened,” Himali Balasooriya, a Local Government Election candidate who was subjected to harassment on social media platforms, told The Daily Morning.

Such cases show how digital platforms amplify harassment and disinformation in ways that directly undermine fair competition and subject candidates to technology-facilitated gender-based violence. When such content spreads unchecked, it shapes voter perceptions long before any correction.

Different behaviour in larger markets?

A social media researcher, speaking to The Daily Morning on the condition of anonymity, claimed that platform behaviour shifts noticeably in larger markets and especially in the Global North, where commercial and political stakes are higher. In India, which represents one of the largest user bases for companies such as Meta, he said that platforms engage more actively with state authorities, civil society actors, and election bodies. “This is not solely the result of stronger regulation but also of the market size. In the US and parts of Europe too, platforms have demonstrated a readiness to rapidly remove or suppress content when it aligns with state interests or heightened political sensitivities. The recent large-scale removal and de-prioritisation of content related to Gaza across major platforms has reflected how decisively companies can act when the pressure is sufficient and comes from countries with wealth and influence.”

A constrained EC

For its part, the National Election Commission (EC) has repeatedly raised concerns over the speed and consistency of platform responses during election periods. EC Chairperson R.M.A.L. Rathnayake, speaking to The Daily Morning, said that social media companies are often slow to act on content flagged as harmful to the electoral process. In some cases, he said, the platforms reject the reasons provided, and in others, no response is received until after polling has ended, by which point remedial action has little to no value. “We rely on civil society groups to identify problematic content and to channel complaints to the relevant platforms and their representatives. Even when a lot of posts are flagged, only a small number may be taken down. During an election, we are the legally mandated authority to decide what constitutes interference with the electoral process. We do it within the relevant legal frameworks, and our determinations should be accepted by the platforms. However, it is not reflected in how moderation-related decisions are made,” he said.

The ability of election authorities to hold social media platforms to account also depends on the resources available to them. In countries where election bodies have adequate financial and institutional capacity, they can engage directly with platform representatives, travel for meetings, deploy technical teams, and follow up more effectively. Explaining the situation in Sri Lanka, Rathnayake said: “We don’t have the financial provisions or logistical capacity to engage with these companies to the optimum. We have to depend on external organisations and civil society groups to communicate with platforms.” Rathnayake added that plans are underway to engage directly with platform representatives through meetings facilitated by partner organisations, including a proposed visit by EC officials to Singapore. He said that they aim to discuss faster escalation pathways and clearer lines of responsibility during the planned meetings.

Disinformation and the “dark” campaign economy

Research into Sri Lanka’s 2024 Elections by digital rights analyst Dr. Sanjana Hattotuwa shows accountability problems that extend beyond delayed takedowns of misleading or harmful content. Speaking to The Daily Morning, he described what he terms a “dark campaign finance ecosystem” operating across social media platforms, in which, he said, political actors bypass formal advertising channels and spending limits by paying cash directly to administrators of popular meme pages, gossip networks, and hyper-partisan accounts. He explained how these influence operations rarely resemble conventional political advertising: “They circulate as seemingly organic content, blend humour with insinuation, and rely on coordinated amplification rather than paid reach. As a result, platform ad libraries and transparency tools fail to capture much of this activity. The withdrawal of a tool used by journalists and researchers to track content spread, CrowdTangle, has further weakened independent oversight.”

Dr. Hattotuwa also pointed to evidence of coordinated networks operating during legally mandated silence periods, when campaigning is prohibited, and continuing to circulate content debunked by fact-checking partners. In many cases, he said, warning labels or distribution limits are not applied to such content, which exposes a gap between the platforms’ stated trust and safety frameworks and the realities of political manipulation within Sri Lanka’s information environment.

Improved but inconsistent accountability 

The People’s Action for Free and Fair Elections (PAFFREL), which has monitored elections in Sri Lanka for the past few decades, states that platform accountability has improved at times but remains inconsistent, driven more by stakeholder pressure than by clear obligation. “Platforms have taken action against election misinformation and disinformation mainly when there has been strong pressure from civil society and other stakeholders. I can’t say that they didn’t cooperate. They did, particularly during the Presidential and Parliamentary Elections. Facebook, TikTok, and X developed working arrangements with election observation groups. There were agreed mechanisms to flag harmful content, and those helped remove harmful content, but what they did was not sufficient and did not continue,” PAFFREL Executive Director Rohana Hettiarachchi told The Daily Morning.

Although about 80 per cent of the flagged content was taken down during silence periods, he said that responses were weaker during the rest of the election period. “Much content stayed online for several days, and some were not taken down at all. At times, they were not willing to prioritise compliance with the country’s election laws. It is true they are international companies, but they still need to comply with a country’s election-related laws. In some countries, they follow those laws more closely than they do here.” He added that the regulation of disinformation on social media is essential as it directly affects how people make decisions that shape the country’s future. “During elections, this has a direct impact on how people vote. Imagine an artificial intelligence-generated video of a well-known candidate saying that they have withdrawn from the race or they support another candidate on the morning of Election Day. People will believe it. Even if it is corrected later, the damage is already done.”

Hettiarachchi further said: “Social media companies can’t say that they lack resources. They are profit-making businesses. Even during elections, they earn a lot of money through ads. They should be able to predict future trends and take precautionary measures. At the very least, they should accept the observations made by the election authorities when those fall within the national legal framework. They cannot escape responsibility.”

Influencers, informality, and blurred lines

A social media influencer who created political content during a recent election, speaking to The Daily Morning on the condition of anonymity, described how informal payments are used to bypass platform safeguards. “I was approached through private messages and paid in cash. There were no contracts, no disclosures, and no labels saying that it was political content. The idea was to make it look like my personal opinion,” he said. He added that similar arrangements were common across meme pages and short-form video creators, particularly in the final weeks before polling.

The impact of online claims is immediate for voters. One voter, speaking to The Daily Morning, recalled encountering repeated posts on social media during the 2019 Presidential Election claiming the Sri Lanka Podujana Peramuna’s candidate, Gotabaya Rajapaksa, held dual citizenship at the time. “This information prompted hesitation and second thoughts about whether to cast a vote for him. Even though the issue was eventually clarified and he had renounced his United States citizenship, the uncertainty lingered long enough to influence how voters like me assessed our choices.”

Inside the fact-checking response

A leading fact-checking platform in Sri Lanka, Fact Crescendo, states that the most common forms of election-related disinformation it encountered during the recent polls were misleading claims aimed at undermining public trust in the electoral process itself. Speaking to The Daily Morning, its Editor Pavithra Sandamali said that these included false narratives questioning voting and counting procedures, attacks on the credibility of election officials, fabricated quotes attributed to political leaders, and edited or misleading video clips shared out of context. “We also flagged a surge in fake opinion polls and survey results, falsely attributed to both local and internationally recognised bodies, including election monitoring organisations such as the European Union (EU). Much of this content was amplified through coordinated campaigns on platforms such as Facebook, WhatsApp, and TikTok, using emotionally charged messaging around ethnicity, voter suppression, and alleged secret political deals to influence voter perceptions.”

“As a third-party fact-checking partner of Meta, Fact Crescendo’s work is limited to Meta-owned platforms, where responses have been relatively effective. Once a fact-check is published, misleading content is labelled, its distribution is reduced, and users are notified with links to verified information. Repeated violations can also affect a page’s reach and monetisation. However, similar enforcement mechanisms are largely absent on other major platforms such as YouTube, X, and TikTok, allowing false narratives to continue circulating even after they have been debunked elsewhere,” she added.

She also highlighted several challenges faced by fact-checkers, including limited resources, the sheer volume of misinformation during election periods, and the time required for careful verification using official sources. She said that the need for detailed, evidence-based fact-checks can slow responses at moments when rapid intervention is critical. “This situation is compounded by low public awareness of fact-checking initiatives and recent cuts by global platforms to trust and safety teams, which have weakened content moderation support. To address these issues, there should be expanded third-party fact-checking partnerships across all major platforms, faster election-specific response systems, stronger collaboration with the election authorities, and greater investment in user awareness and media literacy.”

A South Asian conundrum?

Similar concerns were raised by Miraj Chowdhury, a digital rights researcher based in Bangladesh, who said that platforms fail to meaningfully engage with civil society in smaller markets. “In my experience, YouTube does not engage with civil society at all. I have never seen them coming to us to discuss election-related issues. TikTok engages to some extent. Facebook is more transparent and has more policies around elections, but their enforcement remains inadequate. Also, automated systems often fail in local languages such as Sinhala, Tamil, and Bangla. Even where trusted partner networks exist, these organisations are small and lack the resources needed during elections.” Commenting further on platform accountability to The Daily Morning, he added that it also depends heavily on market size. “Larger markets such as the US receive more attention and stronger responses, even if there are no strict laws. However, even if smaller countries have laws in place, they will not receive an adequate response. Platforms should have more effective responses and they should be in line with local contexts. Strengthening civil society organisations, improving researcher access, and properly resourcing the election authorities are very important to improve platform accountability.”

The risk of overcorrection

Even though calls for stronger regulation of misinformation and disinformation on social media are growing, the related risks should also be taken into account. For instance, the Online Safety Act, No. 09 of 2024 aims to address harmful online content, but has drawn criticism over issues ranging from vague definitions to the composition of its oversight body and other fault lines. Several parties, including political parties and civil society groups, have warned that such laws may be used to suppress the legitimate rights of the people. The challenge is to avoid replacing platform accountability with state overreach. Electoral integrity depends on trust in platforms as well as in regulators. Any framework or mechanism aimed at enhancing accountability should therefore be based on clear standards, due process, and respect for fundamental rights.

What accountability could look like

When contacted by The Daily Morning, attorney-at-law Thineth Korasagalla, who works on digital safety, cybersecurity, and online harms, said that experiences from other regions show that stronger platform accountability requires practical, election-specific measures rather than broad promises. Digital platforms, he said, should be required to publish country-level transparency reports during election periods, detailing how much content is flagged, how long removals take, the outcomes of appeals, and how enforcement is carried out across different languages. Without such disclosures, he said, it is difficult to assess how platforms actually respond to election interference on the ground. He further pointed out that meaningful oversight is impossible without access to platform data. The removal of tools such as CrowdTangle, he said, has weakened the ability of researchers and civil society groups to monitor harmful trends and hold platforms accountable. Referring to developments in the EU, Korasagalla said that mandated researcher access to platform data, with proper privacy safeguards, is increasingly recognised as necessary and could be adapted to other countries as well. On political advertising, he said that existing rules fail to address informal and indirect campaigning. Disclosure requirements, he added, should extend beyond paid advertisements to include influencer-driven political content and coordinated networks of pages that operate outside formal ad systems.

The Daily Morning sought comments from social media companies, including Meta and TikTok, via publicly listed email addresses; however, no responses were received at the time of publication.

This story was produced under the CIR– FACTUM Media Fellowship Program and was originally published in The Morning on 13 February 2026.
