Lies, Damn Lies, and Diplomacy: Misinformation as a Tool of Statecraft

Across the shifting landscape of global politics, facts bend and twist under the weight of rival agendas. Global trade is no longer limited to goods; states now trade choreographed realities. In that subterranean theatre, half-truths and outright fabrications are briefed as “official facts”, their purpose not merely to deceive but to anchor diplomatic leverage and lead allies down false paths of trust.

Illustration by The Geostrata


Echoing the society George Orwell depicted in his novel 1984, this article investigates the methodological role of misrepresented facts and selective testimony as deliberate instruments of statecraft.


THEORIES OF MISINFORMATION IN THE CONTEXT OF IR


In order to understand why a country would deliberately utilise tools like misinformation, it is crucial to connect political theory with the practical aspects of international relations. A viable explanation comes from Neoclassical Realism, which reminds us that the international system remains a competitive and uncertain arena.


In such environments, states leverage doubt as a strategic resource. By spreading misleading narratives, they seek to bend public sentiment abroad, making adversaries unsure of their own institutions.

The falsehood then serves a dual purpose: weakening the adversary’s political cohesion while buying the sponsor time to advance its own motives and strategic agendas. It is also important to consider that states find it difficult to respond to these misinformation campaigns, as accusing another government of orchestrating them can mean admitting to concealed weaknesses, such as shortcomings in cybersecurity infrastructure.


Disclosure of such information can also expose the target state to criticism at home, as its domestic audience compares its earlier perception of the truth with the new evidence.


This combination produces an uncertainty loop in which decision-makers are required to calculate the political cost at home against the possible strategic advantage of asserting a resilient image abroad. The task is rendered yet more complex by the contemporary information environment.


In the age of social-media algorithms and trends, these campaigns now command reach that spans communities and is used to widen gaps in national unity.


This may also fuel “conspiracy theories”, which are engineered to normalise distrust and shape narratives, all with the objective of controlling public perception. As a result, the power of states is increasingly determined by their ability to mould belief, rather than by the traditional levers of military or economic force.


CASES WHERE THE ART OF MISINFORMATION MIGHT BE SHAPING REALITIES


In contemporary times, the world has witnessed a rapid increase in state-sponsored misinformation campaigns aimed at influencing electoral outcomes, redirecting public attention and exacerbating existing international crises.


According to the latest Global Risks Report published by the World Economic Forum, disinformation and misinformation rank among the highest near-term threats.

The report reiterates that such actions, when used as a weapon of statecraft, can corrode trust and deepen inter-nation divisions, striking at the core of international relations. There are many examples to further elaborate on this statement.


Since 2022, Russian-aligned cyberespionage units have attacked international organisations across multiple European countries and the United States, including penetrating humanitarian operations in Ukraine and employing generative artificial intelligence to create pro-Kremlin audiovisual propaganda.


Fueled by increasingly persuasive artificially generated sound and images, bot accounts and invented personas flood social media with deceptive material, while recent research shows that problematic engagement with social media feeds can widen susceptibility to fake news, enabling misinformation to spread.


The 2024 Romanian elections offered some revealing figures: machine-learning models found that TikTok’s recommendation sequences promoted far-right political content at nearly three times the rate of all moderate and centrist viewpoints combined.


Around the same time, Telegram funnelled the same Kremlin-crafted message to Romanian inboxes, revealing in detail the precise way that hostile state actors leverage digital ecosystems to mount lightning-targeted influence campaigns.

In 2025, the European Union imposed sanctions on several individuals and six organisations for perpetrating destabilising activity.


A new study details how over 1,200 pieces of misleading or completely false content were disseminated to residents of Cyprus, Greece and Malta via social media between January and June 2024, highlighting the broad geographic reach and tactical diversity of these campaigns.


What began as a response to a single crisis has become a permanent instrument, slipping easily from region to region and into everyday life, using hybrid human-machine systems to distort perception, probe society’s weak points and widen its fractures.


INDIA'S OWN EXPERIENCE WITH MISINFORMATION


India’s history of military confrontation has become a case study in the power of manufactured narratives, peaking during the May 2025 India-Pakistan flare-up. The sequence began when Indian forces struck installations in Pakistan-Occupied Kashmir as part of Operation Sindoor on May 7, 2025.


Indian officials described the strikes as precision attacks on militant infrastructure intended to reduce cross-border terrorism, a move quickly distorted by Indian and international channels.

Broadcasts claimed sweeping airstrikes on nuclear targets, inflated air-to-air kill counts, and fabricated incursions into Karachi and Lahore.


The visuals were rarely authentic: pre-rendered battle simulations, AI-generated drone and satellite imagery, and manipulated audio of allegedly captured air force chiefs delivering false confessions blurred the line between fact and fiction, creating a spectacle that deeply impacted audiences.


Pakistan’s media outlets, not to be outdone, paraded fabricated triumphs, painting their neighbour as the unprovoked aggressor: their broadcasts thrummed with emotionally laced visuals, many absurdly far from the truth.


Rising misinformation bred unyielding mistrust and nearly closed off any avenue for diplomatic negotiation. India’s Ministry of Home Affairs issued advisories warning broadcasters that inflated headlines endangered emergency operations and threatened public calm.


Platforms like X, Facebook, and Instagram became battlegrounds for this conflict; their algorithms fanned every deliberate falsehood. 

This shows how such actions have become a primary mode of modern warfare, with misinformative content achieving unprecedented reach. In 2025, countries find themselves in a dilemma, walking a very fine line between violating freedom of expression and regulating campaigns that may threaten national security.


THE ROAD AHEAD


With misleading content increasingly integral to statecraft, the burden on policymakers and diplomats is steadily increasing. Classic diplomatic methods, relying on formal negotiation, transparent dialogue, and codified protocols, are routinely jeopardised by swift, unverifiable accounts that race across social media at a pace never before seen.


There is an urgent need to counter such situations with new, innovative methods that fuse technology and policy and promote cooperative action at a global scale. In response to the persistent wave of hybrid threats, governments now recognise the necessity of building advanced cybersecurity frameworks.


These frameworks must integrate detection technologies capable of identifying and countering fabrications at the point of origin, thus preventing false narratives from attaining virality.

Other measures include encouraging media literacy and awareness among citizens, helping them resist false content and avoid amplifying its spread.


As the online battlefield grows, the way forward demands the international community’s collective vigilance. Through empowerment, education and awareness, vulnerability to misinformation can be transformed into resilience, and democratic institutions can be safeguarded in a high-tech, interconnected world.


BY ANANYA SHUKLA

TEAM GEOSTRATA