On Cognitive Warfare: The Anatomy of Disinformation

Abstract: Cognitive warfare entails narrowing the execution of warfare down to the cognitive dimension. While presented as a new notion, cognitive warfare as a concept articulates the essence of warfare, namely changing an opponent’s attitude and will – and hence their cognition. Although the concept is not new, the resurgence in attention and relevance stems from the inception of cyberspace (and social media), combined with advances in cognitive psychology. This renewed focus is particularly evident in the use of disinformation in influence operations.

Problem statement: How is disinformation used to influence the cognition of other geopolitical actors?

So what?: Societies need to be aware of the dangers of cognitive warfare and become acquainted with its techniques. However, cognitive warfare alone will not win wars; its effectiveness is maximised when it is combined and synchronised with other instruments of state power.

“War is thus an act of force to compel our enemy to do our will.”[1]

A Notion En Vogue

“Cognitive warfare” appears to be the latest fad in the security realm.[2] Cognitive threats refer to activities directly affecting human cognition without inflicting prior physical force or coercion.

Cognitive warfare can be understood as a part of hybrid warfare – another en vogue notion. Hybrid warfare is the use of all instruments, in all domains, to affect all dimensions (physical, virtual and cognitive). The core challenge in employing hybrid warfare lies in synchronising these capabilities and operations, including cognitive warfare.

The People’s Republic of China’s (PRC) policy of “three warfares”,[3] using public opinion, psychological, and legal means to achieve victory,[4] is a contemporary example of cognitive warfare; in this respect, China’s goal is to directly influence its opponent’s mind and break its resistance without fighting.

Cognitive threats influence the human mind by using informational means such as words, narratives, and pictures. While influencing human cognition can be benign – persuasion, for instance – it can also be malign or manipulative in nature. During the Cold War, the Soviets and the US used manipulative cognitive techniques to attain their goals under their respective doctrines of Active Measures and Political Warfare.[5] Fabricated messages, false data, or outright disinformation were often used to evoke human cognitive biases and heuristics, influencing and manipulating deliberation and decision-making processes. Hence, if cognitive warfare – perhaps under a different name – is not new, why is more attention being paid to these activities? Moreover, if disinformation appears to be the weapon of choice for affecting human cognition, we have to ask how it works.

Cognitive War or Warfare?

War is an act of force intended to compel an opponent to fulfil one’s will. Compelling an opponent can be achieved through military means but could also be inflicted through diplomatic, economic, or informational methods. In fact, Sun Tzu argued that subjugating the enemy’s army without fighting is the true pinnacle of excellence.[6]

A distinction can therefore be made between war and warfare. War – especially in the legal sense – is an armed conflict between two states (or state-like entities). Conversely, warfare is the act of subjugating other parties – foe, friend or neutral – by any means available. Forced transnational migration, as witnessed between Belarus and Poland,[7] can be seen as a form of warfare, inducing or coercing a policy change. It follows that cognitive warfare is the art of inducing the other actor to accept one’s will by using, and focusing on, the cognitive dimension.

While all forms of warfare affect the opponent’s will – whether directly, or indirectly in the wake of a kinetic attack – cognitive warfare should be set apart from traditional conceptions. Cognitive warfare is not about territory or dominance over resources; it is a conflict between different perceptions and beliefs, or even a clash of civilisations or cultures.

Is Cognitive Warfare a New Phenomenon?

In the cognitive dimension, threats, or even warfare, are not new; nor is the battle over perceptions. This was evident as far back as the Peloponnesian War, and again in the Thirty Years’ War, the Spanish Civil War, and more recently during the Cold War. Each of these clashes had a clear cognitive dimension related to clashes in worldviews, belief systems, and religions. However, cognitive threats and warfare are prevalent today due to two developments: the emergence of cognitive psychology, and – magnifying the first – the inception of cyberspace.[8]

Cyberspace forms part of the information environment, which states have always used in the quest for influence. The inception of cyberspace has not only added new layers to the information space; more importantly, the virtual layers, virtual objects (data), and virtual personas have also unlocked the information environment. Whereas in the past, information and influence operations were executed using cumbersome methods including pamphlets, bribery, or radio broadcasts, they are mainly accomplished today with social media.[9] A tweet or direct message can reach the capillaries of society at the speed of light. Moreover, cyberspace is conducive to creating specific virtual images (deepfakes, virtual reality), memes, virtual personas (Facebook, X [formerly Twitter] or Instagram accounts) or social media communities. It can spread information in an unfiltered, viral, and contagious way that is not limited by national boundaries.

The emergence of academic research in cognitive psychology after the Second World War highlighted the fact that our brain is a neural network governed by heuristics and biases.[10] It also became apparent that this knowledge could be used as an instrument to influence people. According to the Russian notion of Reflexive Control,[11] humans are prone to respond in a predetermined manner when subjected to specific information in a conditioned environment (time pressure, data overload). Reflexive Control – the primary technique in Active Measures doctrine – aims to find strategic advantages in the information environment by deception, provocation, subversion, and spreading disinformation.[12] With this mechanism, Russia influences target audiences subconsciously by exploiting cognitive biases, namely the limitations in human information-processing capacity.[13]

The combined developments of cyberspace and cognitive psychology can be misused to influence liberal democracies, which have open societies and rules to protect individual rights and freedoms. Liberal democracies openly discuss problems to find common ground. This openness allows authoritarians to abuse the transparent ecosystem by injecting malign data to incite discord and sow distrust. The very elements that are essential to liberal democracies (freedom of speech, distribution of power, independent media) are simultaneously their greatest vulnerability. Moreover, paradoxically, the only way to counter an attack on these core elements appears to be to violate our own values. How can we resolve this self-inflicted conundrum?

Malicious actors can exploit these features of cognitive psychology and the internet, since social media favours sensationalist content, irrespective of source or factuality. Actors on social media can deliberately manipulate and amplify negative messages by sharing misleading, deceptive or incorrect information that the audience perceives as genuine. They exploit platform algorithms to distribute fake and exaggerated news, and sharing is further amplified by automated bots that repeat the news over and over.[14]
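
To make the amplification mechanism concrete, the following is a minimal, deliberately simplified Python sketch. The post names, the size of the bot network and the engagement-only ranking rule are illustrative assumptions, not a description of any real platform’s algorithm.

```python
# Illustrative toy model: a small pool of automated accounts repeatedly
# re-shares one item, inflating the engagement signal the feed sorts by.
posts = {  # post id -> engagement count from genuine users
    "local_news": 40,
    "policy_explainer": 35,
    "fabricated_story": 5,   # little organic interest
}

N_BOTS = 30          # assumed size of the automated bot network
SHARES_PER_BOT = 3   # each bot re-shares the item several times

# Every bot share counts as engagement, exactly like a genuine share would.
posts["fabricated_story"] += N_BOTS * SHARES_PER_BOT

# A feed that ranks purely by engagement now places the fabricated story
# first, irrespective of source or factuality.
feed = sorted(posts, key=posts.get, reverse=True)
print(feed)  # ['fabricated_story', 'local_news', 'policy_explainer']
```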

Is Cognitive Warfare Effective?

Scholars including Arquilla, Ronfeldt, Stiennon, and Stone predicted that the next war would probably be one in which cyberspace and social media activities would support kinetic actions – or even vice versa, with cyber operations playing the lead role.[15] This idea gained traction following the interference in the 2016 US presidential election and the 2017 NotPetya attack on the Ukrainian fiscal system, both committed by Russian state(-backed) actors. While the war in Ukraine (or the armed conflict between Israel and Hamas) is not a war dominated by cyber operations, the persistent conflict in cyberspace is, nonetheless, the largest cyber operation witnessed so far.[16]

In the war in Ukraine, both Russian and Ukrainian state and non-state actors are engaged in intelligence activities (through cyberspace or otherwise),[17] undermining critical infrastructure via cyberattacks, and conducting digital influence operations.[18] Oddly enough, the most effective operations are not the cyberattacks on critical infrastructure but the cognitive activities that use cyberspace as a vector. Whereas most cyberattacks during the Russian-Ukrainian war are labelled as mere hindrances,[19] the synchronised effect of digital influence operations is strategically important for both sides. Russian actions influence domestic populations, and pro-Russia narratives sow discord and undermine Ukrainian morale. The most effective cognitive activities, however, are Ukrainian operations aimed at Western audiences, which have been strategically important in gaining support for the Ukrainian cause.[20]

The Anatomy of Disinformation – How Does It Work?

Words have an effect, whether persuasive, coercive or manipulative. Cognitive activities aim to influence human cognition without threatening or imposing kinetic force. The ultimate goal of cognitive warfare is “directly interfering with or subconsciously controlling the enemy’s brain”.[21] This would enable an operator to “induce mental damage, confusion, and hallucinations in the enemy, forcing them to lay down their arms and surrender”.[22]

Not all influence operations employ subconscious methods, nor are they malign per se. Coercive influence operations cut short or circumvent the targeted audience’s deliberate understanding and autonomous decision-making process, forcing them to consciously make an ‘unwilling’ choice. The targeted audience is well aware of the coercive action but is left with no other options. Persuasive influence operations, by contrast, aim to change the weighting and number of options available to targeted audiences, so that they make a voluntary (or willing) choice that benefits the influencer. The PRC’s public opinion warfare is an example of a persuasive influence operation, using media outlets including China Global Television Network (CGTN) and the Global Times to paint a benign but framed picture of the PRC.[23]

While persuasive and coercive influence operations use rational and conscious techniques, manipulative influence operations use subconscious and covert techniques, subverting or usurping the autonomous decision-making process. Disinformation is an often-used technique for deflecting target audiences into making reflexive and biased judgements based on cognitive and social heuristics, rather than on rational deliberations.[24]

Disinformation is inherently deceptive; it uses heuristics and biases to lure the target audience away from rational decision-making processes in favour of what Petty and Cacioppo call the peripheral route.[25] The peripheral route is invoked by luring the target audience towards realistic, socially divisive topics (such as poverty, racial issues, or police violence) and then impairing their ability to process incoming data related to that topic by linking it to subconscious biases. The information – or rather disinformation – used for this purpose is framed and adjusted to the target audience, which is left incapable of verifying or making sense of incoming data due to data overload or lack of time. In the 2016 US election, posts targeting religious areas in the US were disseminated, claiming that candidate Clinton endorsed gender equality or even stating that she was lesbian herself. Deeply religious people may disapprove of gender equality or same-sex marriage, and hence they would likely anchor Clinton to those negative sentiments. Likewise, posts were shared claiming that the Pope endorsed Trump.[26]

Disinformation – in contrast to malinformation (hate speech or trolling) or misinformation (unintentionally misleading data) – is intentionally misleading information intended to achieve, or contribute to, a strategic aim. Disinformation has different guises. First, it can be deliberately false or fabricated in order to deceive. Second, disinformation can occur when the content and context of a message are not congruent. Suppose French President Macron were to deliver an official speech in a foreign language or during a Sesame Street broadcast. The content might be correct, but the message would still be misleading due to its incompatibility with the context. Following this rationale, a commercial or advertisement in which the message is “framed”, delivered by a cartoon polar bear or even intentionally misleading, should not be considered disinformation, since the context is congruent with the content of the message.
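
The distinctions drawn in this section can be restated in a few lines of illustrative Python; the function and fields below are hypothetical, chosen purely to encode the definitions above, not an established classification scheme.

```python
from dataclasses import dataclass

@dataclass
class Message:
    content_is_true: bool    # is the content factually accurate?
    context_congruent: bool  # do content and context match (speaker, setting)?
    intent_to_mislead: bool  # does the sender deliberately mislead?
    strategic_intent: bool   # does the message serve a strategic aim?

def classify(msg: Message) -> str:
    """Hypothetical helper restating the article's taxonomy."""
    misleading = (not msg.content_is_true) or (not msg.context_congruent)
    if misleading and msg.intent_to_mislead and msg.strategic_intent:
        return "disinformation"
    if misleading and not msg.intent_to_mislead:
        return "misinformation"
    return "other (e.g. malinformation, or benign framing such as advertising)"

# The Macron example: accurate content, incongruent context, deliberate intent.
print(classify(Message(True, False, True, True)))    # -> disinformation
# An honest mistake: false content spread without any intent to mislead.
print(classify(Message(False, True, False, False)))  # -> misinformation
```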

Protection Against Disinformation

Countering disinformation is challenging, not only in practical terms but also legally and ethically. Awareness-raising and digital hygiene are beneficial for augmenting societal resilience, especially when disinformation campaigns are difficult to attribute. There is, however, an underlying tension for open and transparent democratic systems. On the one hand, there is a desire to halt or counter disinformation campaigns by hostile actors such as Russia; on the other hand, ‘maintaining the values that Western democracy is built upon – of freedom of information and expression – is paramount to preserving the legitimacy’ of our democratic institutions.[27] Violating these values will be portrayed as an act of hypocrisy and could be further exploited by hostile actors. An example is the EU’s 2022 ban on Russia Today (RT) and Sputnik, which sparked protests, first and foremost, from associations of European journalists, since it violated the foundational principles and freedoms expressed in international human rights law, such as the European Convention on Human Rights, to which the Russian Federation was also a party.

Hellman and Wagnsson have categorised four avenues for countering Russian information warfare,[28] based on whether to engage or not, and on whether to focus inwards on domestic audiences or outwards on foreign audiences. The resulting avenues are confronting, blocking, naturalising, and ignoring. This template can also be used to assess countering disinformation more generally. Banning RT and Sputnik is an example of blocking: stopping the dissemination of disinformation from Russian outlets promoting the pro-Russian war narrative at a time when EU members were expressing support for Ukraine. The PRC’s ‘block information’, part of its confrontational actions,[29] is another example of this avenue. Blocking aims to actively protect one’s own population.

A more passive method could be awareness campaigns against malign disinformation on social media, or educational packages for secondary school students. The result would be that disinformation is ignored. This trend was noticeable during the 2018 US midterm elections and the 2020 US presidential election: after the revelations about Russian interference during the 2016 presidential election, the electorate was no longer naive about the messages being spread on social media platforms.

A state could also focus on the source of the disinformation, or on what the receiver perceives as disinformation. A benign option – naturalising, in Hellman and Wagnsson’s terms – is to amplify the core values of one’s own society and persuade other states to adopt them. Numerous Western states, for instance, advocate individual human rights in states that emphasise collective human rights.[30]

More assertively, one could confront the disinformation directly and fight fire with fire. Covert activities by armed forces, and especially by the state’s intelligence services, could play a role in such a confrontational counter-disinformation policy. This implies that state agencies would operate below the threshold of (armed) force, often within the jurisdiction of another state – which could have legal ramifications, since these activities may go well beyond traditional espionage.

Paradoxically, many Western states have solid legal and legitimising frameworks for deploying kinetic military force, yet they are highly reluctant to deploy security forces in the cognitive realm and in areas where effects remain below the force threshold.

Conclusion and Reflection

While the nature of cognitive warfare is age-old, the development of cognitive psychology and the inception of cyberspace have given cognitive warfare a more comprehensive range and increased its effectiveness. Cognitive warfare can be a useful instrument in the hybrid toolbox of state or state-like actors. However, strife, competition, or even war – armed conflict between states – can and will only be won if kinetic, informational, and cognitive warfare act in unison. These elements must have a unity of purpose and be synchronised. Cognitive warfare alone will not win the war.

Disinformation is an essential technique for waging cognitive warfare, as it directly affects human cognition. Disinformation – as frames, narratives or images with a deliberately misleading context or content – uses subconscious manipulation of the human brain by appealing to heuristics and biases that circumvent the rational decision-making process.

So what? What does this mean for future cognitive warfare and disinformation operations? While numerous developments and challenges come to mind, three stand out:

1.    First, new technologies (such as Artificial Intelligence large language models, including ChatGPT) will act as generators of disinformation. While the anatomy of disinformation is based on the workings of our neural networks, disinformation can also be produced algorithmically. Big Data analysis can predict behaviour based on correlations instead of causality, invoking a near-deterministic model of human behaviour (see the sketch after this list). Future disinformation will not be crafted by cunning humans but by sheer computational power, making it even more powerful but at the same time elusive and uncontrollable.

2.    Second, there is an influx of private and non-state actors. If disinformation is a tool for gaining strategic advantages, one might assume that state actors will instigate it. While state actors’ resources are almost unlimited, their number is limited, and their actions are constrained by legal and ethical boundaries reflecting their ideologies and cultures. With the influx of non-state actors, the number of actors has increased exponentially. If they are willing and able to make use of new technologies, they may become an increasingly large set of actors in conflict and war that is largely unconstrained by international law – examples of which have already been witnessed during Russia’s recent campaign in Ukraine.

3.    Finally, the threat in the cognitive dimension lies in the asymmetry between worldviews, especially between liberal-democratic and authoritarian ones – between states or even within a state. Diverging worldviews should not be problematic in a healthy society; their existence reflects the democratic core value of freedom of expression. They can become problematic, however, if groups in society are locked in social media bubbles that are no longer connected.
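
Point 1 can be made concrete with a toy, fabricated-data sketch (assuming NumPy and scikit-learn are available; the variables and numbers are invented for illustration): a classifier trained on a behaviour that merely correlates with the true driver still predicts the outcome, with no causal link required.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1_000
age = rng.integers(18, 80, size=n)

# Fabricated world: younger users share memes; older users vote. Meme-sharing
# does not cause voting behaviour; it is merely correlated with age.
shares_memes = ((age < 40) & (rng.random(n) < 0.9)).astype(float)
votes = ((age > 45) & (rng.random(n) < 0.8)).astype(int)

# A model trained on the correlated proxy alone still predicts turnout far
# better than chance, without any causal mechanism.
model = LogisticRegression().fit(shares_memes.reshape(-1, 1), votes)
print(f"accuracy: {model.score(shares_memes.reshape(-1, 1), votes):.2f}")
```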

While war and warfare have been the prerogative of armed forces for eons, perhaps the real conundrum is what kind of role those forces will have in the era of cognitive warfare.


Dr Peter B.M.J. Pijpers is Associate Professor of Cyber Operations at the Netherlands Defence Academy in Breda and a Researcher at the Amsterdam Centre for International Law (University of Amsterdam). A Colonel in the Netherlands Army, he has been deployed four times to mission areas, including Iraq and Afghanistan, and was seconded to the European External Action Service for three years. Dr Pijpers has (co-)authored articles on the legal and cognitive dimensions of influence operations in cyberspace and how armed forces can manoeuvre in the information environment. See also Orcid ID 0000-0001-9863-5618. The author can be reached at b.m.j.pijpers@uva.nl. The views expressed in this article are solely those of the author.


[1] Carl von Clausewitz, On War, ed. Michael Howard and Peter Paret (Princeton University Press, 1989), 75. 

[2] Nino Tsikhelashvili, “Cognitive Warfare Through Reflexive Control Strategy In Georgia,” The Defence Horizon Journal, no. September (2023).

[3] Peter Mattis, “China’s ‘Three Warfares’ in Perspective,” War on the Rocks, 2023.

[4] Koichiro Takagi, “The Future of China’s Cognitive Warfare: Lessons from the War in Ukraine,” War on the Rocks, 2022.

[5] Thomas Rid, Active Measures: The Secret History of Disinformation and Political Warfare (London: Profile Books, 2020); Linda Robinson et al., Modern Political Warfare: Current Practices and Possible Responses (RAND Corporation, 2018).

[6] Ralph D. Sawyer, Sun Tzu: Art of War (Westview Press, 1994).

[7] Shaun Walker, “Beatings, Dog Bites, and Barbed Wire: Life and Death on the Poland-Belarus Border,” The Guardian, 2023, https://www.theguardian.com/world/2023/oct/02/beatings-dog-bites-and-barbed-wire-life-and-death-on-the-poland-belarus-border.

[8] Alicia Wanless and Michael Berk, “The Changing Nature of Propaganda,” in The World Information War: Western Resilience, Campaigning, and Cognitive Effects, ed. T. Clack and R. Johnson, 1st ed. (Routledge, 2021), 66.

[9] Buddhika B. Jayamaha and Jahara Matisek, “Social Media Warriors: Leveraging a New Battlespace,” Parameters 48, no. 4 (2019): 11–23.

[10] Buster Benson, “Cognitive Bias Cheat Sheet,” Better Humans, 2016. See also the Cognitive Bias Codex: https://www.sog.unc.edu/sites/www.sog.unc.edu/files/course_materials/Cognitive%20Biases%20Codex.pdf.

[11] Timothy L. Thomas, “Russia’s Reflexive Control Theory and the Military,” The Journal of Slavic Military Studies 17, no. 2 (2004): 237–56, 238-243.

[12] Andrew Radin, Alyssa Demus, and Krystyna Marcinek, “Understanding Russian Subversion: Patterns, Threats, and Responses,” no. February (2020), 2-3; United States Senate Committee on Intelligence, “Report on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election – Volume 2: Russia’s Use of Social Media,” vol. 2, 2019, 12-13.

[13] Johan E. Korteling, Anne-Marie Brouwer, and Alexander Toet, “A Neural Network Framework for Cognitive Bias,” Frontiers in Psychology 9 (2018), 2.

[14] Samuel C. Woolley and Philip N. Howard, “Political Communication, Computational Propaganda, and Autonomous Agents: Introduction,” International Journal of Communication 10 (2016), 1.

[15] John Arquilla and David Ronfeldt, “Cyberwar Is Coming,” in In Athena’s Camp: Preparing for Conflict in the Information Age, ed. John Arquilla and David Ronfeldt (RAND, 1997), 79–89; John Stone, “Cyber War Will Take Place!,” Journal of Strategic Studies 36, no. 1 (2013): 101–8; Richard Stiennon, There Will Be Cyberwar (IT-Harvest Press, 2015); Alec Ross, “Will the Next War Be a Cyberwar,” Policy Review, 2019, 16–19.

[16] Digital Security Unit, “Special Report: Ukraine – An Overview of Russia’s Cyberattack Activity in Ukraine,” Microsoft, 2022; Office for Budget Responsibility, Fiscal Risks and Sustainability, 2022, 49-50.

[17] Matthias Schulze and Mika Kerttunen, “Cyber Operations in Russia’s War against Ukraine,” SWP Comments, 2023.

[18] Cyber Peace Institute, “Cyber Dimensions of the Armed Conflict in Ukraine,” 2023.

[19] With the exception of the ViaSat attack on February 23, 2022. It is not stated whether cyberattacks (including wiperware) might have had a strategic impact, but at the time of writing, this does not appear to be the case. See Kraesten L. Arnold et al., “Assessing the Dogs of Cyberwar: Reflections on the Dynamics of Operations in Cyberspace during the Russo-Ukrainian War,” in Reflections on the Russian-Ukrainian War, ed. Maarten Rothman, Lonneke Peperkamp, and Sebastiaan Rietjens (Leiden University Press, 2024) (forthcoming).

[20] Amber Brittain-Hale, “Clausewitzian Theory Of War In The Age,” The Defence Horizon Journal, no. December (2023).

[21] Takagi, “The Future of China’s Cognitive Warfare: Lessons from the War in Ukraine.”

[22] Idem.

[23] Paul Charon and Jean-Baptiste Jeangène Vilmer, “Chinese Influence Operations: A Machiavellian Moment,” 2021, 29-31; Emilio Iasiello, “China’s Three Warfares Strategy Mitigates Fallout From Cyber Espionage Activities,” Journal of Strategic Security 9, no. 2 (2016), 52-56.

[24] See e.g., Johan E. Korteling, Maaijke Duistermaat, and Alexander Toet, “Subconscious Manipulation in Psychological Warfare,” 2018; Robert B Cialdini, Influence: The Psychology of Persuasion, Rev. ed. (New York: Harper, 2007).

[25] Richard E. Petty and John T. Cacioppo, “The Elaboration Likelihood Model of Persuasion,” Advances in Experimental Social Psychology 19 (1986), 126.

[26] Renée DiResta et al., “The Tactics & Tropes of the Internet Research Agency,” New Knowledge, 2018.

[27] Aiden Hoyle and Peter B.M.J. Pijpers, “Stemming the Narrative Flow: The Legal and Psychological Grounding for the European Union’s Ban on Russian State-Sponsored Media,” Defence Strategic Communications 11, no. Autumn (2022): 51–80, https://doi.org/10.2139/ssrn.4220510.

[28] Maria Hellman and Charlotte Wagnsson, “How Can European States Respond to Russian Information Warfare? An Analytical Framework,” European Security 26, no. 2 (2017): 153–70, 157-158.

[29] Charon and Jeangène Vilmer, “Chinese Influence Operations: A Machiavellian Moment.”

[30] Government of the Netherlands, “Joint Statement on Behalf of 47 Countries in the UN Human Rights Council on the Human Rights Situation in China” (2022); He Zhipeng, “The Chinese Expression of the International Rule of Law,” Social Sciences in China 38, no. 3 (2017): 175–88, 180-181.
