Bot Warfare and the 13th National Elections: Did the Interim Government Drop the Ball?
“Bot bahini” emerged as a full-fledged electoral tool during the 13th National Elections of Bangladesh, held on 12 February 2026. The term refers to automated networks of fake social media accounts that have become increasingly prominent in Bangladesh. Before, during, and after the elections, these accounts spread disinformation about candidates, defamed individuals, and engaged in coordinated cyberbullying.
One case stands out: Bobby Hajjaj’s wife, Rashna Imam, was targeted with sustained online abuse. Personal attacks and disinformation campaigns significantly shaped voter perceptions. Even today, distinguishing fake news from authentic information remains a daunting task. In this context, how successful was the Interim Government in protecting candidates—or their families—from online harm, especially when women are dragged into the toxic arena of cyber-politics?
The Interim Government clearly fell short in protecting candidates in the digital sphere. BGD e-GOV CIRT (Bangladesh e-Government Computer Incident Response Team) appeared largely ineffective during the election period. Where institutional responsibility for monitoring is weak, bot networks gain a structural advantage. The result is reputational damage to candidates, distorted public discourse, and increased voter confusion. The ICT sector’s response to bot networks must evolve, particularly with upcoming city council elections in mind.
This article focuses on the following themes:
- Why AI-Driven Misinformation Scales Faster During Elections
- Consequences for Candidates, Voters, and Trust
- A Practical Solution Agenda for Future Elections
Why AI-Driven Misinformation Scales Faster During Elections
Generative AI has significantly reduced the expertise and time required to produce convincing false content—including fabricated images, manipulated videos, and synthetic audio that imitates real voices. At the same time, platform recommender systems tend to amplify content that drives engagement, especially when coordinated networks push identical narratives across multiple groups and pages (UNESCO and UNDP, 2025; World Economic Forum, 2024).
The speed gap is decisive. False information can reach large audiences within minutes, while verification and official responses may take hours or even days. This asymmetry allows bot networks to manufacture a false sense of consensus, where repetition substitutes for evidence. Empirical research on organized social media manipulation shows that political actors and intermediaries across many countries deploy fake accounts, automation, and strategic communication tactics to shape public opinion (Bradshaw and Howard, 2019).
Consequences for Candidates, Voters, and Trust
In an online interview with Dhaka Stream (31 January 2026), Rashna Imam described how she was targeted for campaigning on behalf of her husband. She reported that bot bahini networks launched personal attacks against her on Facebook, undermining her dignity and questioning her right to participate in political campaigning as a woman. Her case is far from isolated. These attacks typically follow a structured cycle that produces three tiers of damage.
First, they shift elections away from substantive issues toward personality-driven attacks, insinuations, and harassment-based narratives.
Second, they create widespread voter confusion. When individuals are exposed to high volumes of conflicting claims, they may delay decision-making, switch preferences, or disengage altogether—particularly when corrections fail to achieve the same visibility as the original falsehoods (World Economic Forum, 2024).
Third, they erode institutional credibility. When voters perceive authorities as either incapable or unwilling to secure the information space, trust in the electoral process—and in credible sources of information—deteriorates. This concern is not unique to Bangladesh.
Misinformation and disinformation, including synthetic media, have been identified by the World Economic Forum as among the most significant near-term global risks, with the potential to destabilize social cohesion and democratic systems (World Economic Forum, 2024).
A Practical Solution Agenda for Future Elections
Effective responses do not require blanket censorship. Instead, they must be fast, coordinated, transparent, and grounded in due process. A realistic solution package includes five key elements:
Joint election integrity cell
Establish a coordination cell led by the Election Commission, bringing together focal points from the Ministry of Posts, Telecommunications and Information Technology, the ICT Division, the telecom regulator (BTRC), cybercrime law enforcement units, and government public information services. The cell should operate through a unified workflow: detection, verification, escalation, and public correction.
On-record platform cooperation
Replace informal requests with formalized escalation mechanisms and service-level expectations for addressing coordinated inauthentic behavior, synthetic media, and mass harassment. Meta’s reporting on coordinated inauthentic behavior (CIB) highlights the value of structured threat analysis and repeatable enforcement, but domestic stakeholders must be equipped to submit timely and well-documented signals (Meta, 2025).
Shareable, rapid public clarification
Disseminate concise corrections in the same formats used by voters—visual posts, short videos, and brief messages—while maintaining a publicly accessible fact-checking portal. This approach helps narrow the time gap between rumor and verification (UNESCO, 2025).
Transparency and accountability
Publish regular integrity reports summarizing key misinformation trends and response actions, while safeguarding sensitive investigations. Transparency reduces speculation about selective enforcement and strengthens deterrence.
Pre-election civic resilience
Conduct public awareness campaigns on common manipulation tactics, including indicators of AI-generated content, in collaboration with media and civil society. International guidance emphasizes resilience-building alongside the protection of freedom of expression (UNESCO and UNDP, 2025).
The need for coordinated, multi-stakeholder approaches to electoral information integrity in Bangladesh has also been emphasized in recent discussions involving platforms, civil society, and public institutions. These efforts point toward a viable pathway for preparedness and collaboration ahead of elections (UNESCO, 2026).
The core issue is clear: when governance is slow and fragmented, bot networks thrive. Ensuring election integrity in an AI-driven information environment requires operational capacity—early detection of coordinated behavior, rapid institutional response, and clear communication with voters without resorting to excessive control.
Formal cross-agency coordination and structured platform engagement can reduce exposure to misinformation while preserving open democratic debate and public trust.
We hope that women like Rashna Imam receive justice—both offline and online.
References
- Bradshaw, S., & Howard, P. N. (2019). The global disinformation order: 2019 global inventory of organised social media manipulation. Project on Computational Propaganda, Oxford Internet Institute. https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/12/2019/09/CyberTroop-Report19.pdf
- Meta. (2025). Integrity reports: Third quarter 2025. Meta Transparency Center. https://transparency.meta.com/reports/integrity-reports-q3-2025/
- UNESCO. (2026, January 22). UNESCO Dhaka brings together digital platforms, civil society to strengthen electoral information integrity. https://www.unesco.org/en/articles/unesco-dhaka-brings-together-digital-platforms-civil-society-strengthen-electoral-information
- UNESCO. (2025, June 11). New UNESCO–UNDP issue brief highlights the impacts of AI on freedom of expression and elections. https://www.unesco.org/en/articles/new-unesco-undp-issue-brief-highlights-impacts-ai-freedom-expression-and-elections
- World Economic Forum. (2024). The global risks report 2024. https://www3.weforum.org/docs/WEF_The_Global_Risks_Report_2024.pdf
