A new kind of election battlefield
Nepal’s 2026 House of Representatives election is being fought on familiar streets and village squares, but also on algorithmically curated feeds and closed messaging groups. Compared with previous cycles, far more voters now receive political information first through their phones. This shift has created opportunities for new voices, but it has also lowered the cost of spreading false or hateful narratives at scale.
During the 2022 elections, the Election Commission experimented with limited cooperation with major platforms. Meta’s political ad library started covering Nepal, and some misleading posts were taken down after complaints. In 2026, the environment is more complex. Cheap generative AI tools can produce convincing voice clones, synthetic images and videos that are difficult for ordinary viewers to distinguish from reality.
What the Election Commission is doing
In response, the Election Commission has taken several steps. The Election Code of Conduct 2082 includes explicit provisions against deepfakes, fake accounts, digital propaganda and hate speech. The commission has created an Election Information Dissemination and Coordination Centre, within which an Information Ethics Promotion Unit screens online content.
By late January 2026, this unit had already identified 302 items of harmful content and forwarded them to relevant agencies for action under election rules, cyber law and press regulations. A special social media monitoring cell with representatives from the Nepali Army, Nepal Police, Armed Police Force and National Investigation Department now works alongside district-level police cyber cells. These structures are designed to detect coordinated campaigns early and prevent them from undermining trust in the voting process.
Yet serious challenges remain. Monitoring capacity is limited compared with the volume of content. Much of the enforcement still depends on cooperation from global technology companies whose priorities may not always align with Nepal’s electoral calendar.
Civil society and media step into the gap
Recognising these limits, digital rights groups and media organisations have tried to fill some of the gaps. A coalition of 20 NGOs has urged the commission and government to maintain a public dashboard listing problematic content and the actions taken against it, arguing that transparency is essential for public trust. They are also calling for clearer guidelines to ensure that takedowns target genuinely harmful content rather than critical journalism.
At the same time, major newsrooms have partnered with independent fact-checkers to monitor viral claims on social media. Under new agreements, specialist teams will verify fast-spreading rumours about candidates, vote counting and security incidents, then publish corrections in simple language across web, television and radio. This complements more formal enforcement by regulators and offers voters accessible tools to evaluate what they see online.
Balancing integrity and free expression
The central dilemma is how to protect election integrity without shrinking democratic space. Heavy-handed blocking of content can backfire, feeding conspiracy theories that authorities are hiding the truth. Yet a purely hands-off approach can allow lies, hate speech and AI-generated hoaxes to drown out genuine debate.
A balanced approach requires three ingredients. First, clear and narrowly tailored rules that focus on behaviour rather than political viewpoints, such as bans on fabricated evidence, incitement to violence and impersonation. Second, transparent processes: when posts are removed or accounts sanctioned, authorities and platforms should give reasons and allow for appeals. Third, investment in digital literacy so that citizens themselves can question and cross-check extraordinary claims.
What voters and parties can do now
In the weeks before March 5, the most immediate responsibility lies with parties, candidates and voters. Parties that benefit in the short term from anonymous smear campaigns or deepfakes risk long-term damage to their own credibility and to democratic institutions they may later need to rely on.
Voters, for their part, can adopt simple habits: pause before sharing emotionally charged content, look for original sources, check whether reputable outlets are reporting the same story and be wary of posts that play on fear or identity-based resentment. When in doubt, they can consult verified fact-checks or official information channels from the Election Commission and trusted media.
If regulators, media and citizens cooperate on these lines, Nepal’s 2026 election can demonstrate that even in the age of AI, a small democracy can defend both electoral integrity and open debate.
