
New Problems in Hybrid Warfare: Cyber Meets Cognition

Abstract: Hybrid warfare encompasses the area of adversarial relations between war and peace. In this space, questions have emerged about how cyber action, which involves the subversion of confidentiality, integrity, and availability of data, intersects with information operations (also known as propaganda or influence). While definitions of these phenomena remain imprecise and emergent, terms such as social and cognitive cyber security are gaining currency amongst scholars and practitioners.

Problem statement: How are cyber techniques used to disseminate information designed to influence publics, elites, and leaders?

So what?: The most open societies are likely the most vulnerable to data manipulation and information operations. The community of democratic states, namely those populating the OECD or the NATO alliance and its Pacific analogues, must erect defences against malign information influence delivered through cyberspace.


Conflict in Cyberspace

Strategic thinkers have been pondering conflict in cyberspace for three decades. In the 1990s, Arquilla and Ronfeldt argued that “Information is becoming a strategic resource that may prove as valuable and influential in the post-industrial era as capital and labour have been in the industrial age”.[1] Since then, the militaries of the United States and its Western allies have cashed in a peace dividend at the Cold War’s end, waged a war on terror in the aftermath of 9/11, and have now entered a renewed period of great power competition, primarily marked by the rise of the People’s Republic of China (PRC). Through all this, scholars and practitioners have debated the role of information and computing technologies in the calculus of power.[2]

Writing on power a decade ago, Joseph Nye described it as resting on three legs: military, economic, and soft power.[3] This is not far removed from where Carr rested his pillars of power in 1939, substituting “power over opinion” for Nye’s more recently coined “soft power” term.[4] For all this consistency over time on what power is, Western societies have a tough time identifying and measuring this third leg of power.

The Cyber-Information Nexus

If there is an unanticipated externality of the rise of massively networked computing on a global scale, it is cybersecurity – the security of cyberspace, itself a construct of science fiction[5] and a theoretical vehicle of the earliest thinkers on robotics and forms of machine intelligence.[6] An interconnected, worldwide computational infrastructure, cyberspace can be a vehicle both for malicious behaviours undertaken through it and for attacks made upon it.[7] Cybersecurity is a desired end state. The National Institute of Standards and Technology, an agency of the U.S. Commerce Department, defines cybersecurity as:

Prevention of damage to, protection of, and restoration of computers, electronic communications systems, electronic communications services, wire communication, and electronic communication, including information contained therein, to ensure its availability, integrity, authentication, confidentiality, and nonrepudiation.[8]

For this work, cybersecurity is a socio-technical activity in which computer systems are protected from subversion or manipulation – activities often labelled “hacking”.[9] In the last thirty years, it has morphed from a curiosity into a significant area of military activity, described by the U.S. Department of Defense (DoD) as a conflict domain alongside land, sea, air, and space. The field has enormous taxonomies of vulnerabilities, attacks, defences, and other related phenomena. It is the intersection of computing and illicit, criminal, belligerent, and militarily hostile behaviour.[10]

Cybersecurity is largely a product of technological innovation for both offence and defence. The field is heavily preoccupied with attacks, which have definitional beginnings stemming more from intellectual contributions in cryptography and computing than international security.[11] Every attempt to subvert a system, or violate the five terms in NIST’s definition above, is considered an attack. As time passes, practitioners are learning that cyberattacks are seldom decisive.[12] On their own, they can be – but rarely are – a form of coercive action.[13] Consider two cases, the first involving a cyber-kinetic hack and the second involving multiple incidents of data confidentiality compromise coupled with disinformation campaigns.

The first case, well-known and subject to much revisionist re-evaluation, is the Stuxnet cyberattack on the Industrial Control System (ICS) for Iran’s nuclear enrichment infrastructure.[14] As best we can tell, in 2007, when the George W. Bush administration considered options for dealing with Iran’s advancing effort to construct a nuclear weapon, diplomatic efforts were considered insufficient, while air strikes by Israel and the US appeared too risky.[15] Cyberattacks on Iran’s centrifuges were weighed as a third option and eventually employed (along with covert action against Iran’s nuclear scientists and engineers).[16] Four years after Stuxnet became public information, Iran agreed to curtail its nuclear programme, agreeing to a Joint Comprehensive Plan of Action (JCPOA) in July 2015.[17] The cyberattack against Iran’s nuclear programme demonstrated technical superiority but was very narrow in scope. Stuxnet made enriching uranium harder for Iran by sowing some degree of chaos in the machines doing that job.[18] It did not deliver the JCPOA, but it certainly demonstrated a degree of technical sophistication not found elsewhere in achieving a major military-diplomatic goal.


Russia undertook a very different cyber campaign during the 2016 US presidential election.[19] While systems were subverted, email accounts “hacked”, and information purloined from servers, these cyber operations were components of a broader online influence campaign.[20] Cyberattacks against the Democratic National Committee and the Clinton campaign hardly matched the technical novelty of Stuxnet, but still had impact.[21] Emails purloined from the campaign and its chairman’s personal email account were made public by a false leaker, Guccifer 2.0, acting as a stand-in for elements of Russia’s intelligence services.[22] Guccifer was not alone in doing this work. The leaks were a small part of a larger influence campaign to disrupt the 2016 election.[23] Representative Jackie Speier summarised the events in a hearing: “We basically have the brightest minds of our tech community here, and Russia was able to weaponise your platforms to divide us, to dupe us and to discredit democracy.”[24] A combination of leaked information and social media advertisements damaged the Clinton campaign.[25] As with the JCPOA, the cyber actions undertaken by Russia in the 2016 election may not have been definitive in producing the outcome, but they likely shaped it to a degree.

To summarise, various forms of cyberattack diminished Iran’s nuclear enrichment programme and the Clinton campaign. They both exposed how computerised tools and information resources could be subverted to produce an unexpected outcome. They are cases in which forms of information power (or smart power) influenced the outcome of events.

While we have narrow definitions of cyber security and cyberattack, the broader set of phenomena involving cyberattacks must also be considered. A cyberattack is an accepted tool in the repertoire of covert action.[26] Iran’s centrifuges were tampered with by malicious software, but agents or operatives of foreign powers also assassinated some of its nuclear scientists.[27] That targeted violence may also have influenced government decision-making in Tehran. Similarly, the cyberattacks on the Clinton campaign no doubt mattered, but so did Russian propagandists’ words and advertising buys, which formed part of an information strategy designed to affect opinion.


Today, we are contending with both narrow and broad cyber securities. Consider again attacks on cyberspace as well as those delivered through it. An attack on cyberspace may be anything from a denial-of-service campaign to a kinetic effect produced through a subverted process control computer. An attack coming through cyberspace may deliver messages that delegitimise politicians, purloin sensitive data, or confuse citizens. In narrow cyber security, there is a generally accepted offence-dominant bias in favour of the attacker.[28] We do not know whether this offence-dominant bias extends to the broader forms of action undertaken through cyberspace.

Interdependencies Between Cyberspace and Information

Some years ago, I wrote “[C]yberspace is a reflection of the human condition”.[29] This claim stands at odds with the military doctrinaires who see cyberspace as the first human-made domain in which forces fight wars.[30] That contention, in turn, is hard to square with William Gibson’s description of cyberspace as a consensual hallucination.[31] Applying a comprehensive definition to cyberspace remains challenging. It is science fiction that became technological fact. Even the once fanciful concept of a noosphere appears to be more tangible.[32]

The linkages between the global digital infrastructure we call cyberspace and the information space of news, microblogs, short videos, and all manner of other images and text seem inextricable at this point. All the information that we consider news comes in packets. Flows of such information are subject to disruption in cyberspace. A decade ago, the website of The New York Times was knocked offline when the paper’s domain name registrar was compromised by malicious hackers calling themselves the Syrian Electronic Army (SEA).[33] This disruption came at a critical point in Syria’s civil war, after the Assad regime used nerve agent chemical weapons against domestic insurgents. Months earlier, the SEA reputedly compromised the Twitter account of the Associated Press and then sent a tweet: “Breaking: Two explosions in the White House and Barack Obama is injured[.]”[34] An ancillary result of that action was a “flash crash” that briefly erased some $136 billion in value from US equity markets. From that knee-jerk reaction, we learned the significant degree to which automated securities trading algorithms were linked to social media.[35]


It is reasonable to argue that cyberspace and the information space are largely intertwined. For this reason, the U.S. Department of Defense’s labelling of cyberspace as a warfighting “domain” and information as an “environment” might have been a hasty decision.[36] However, there is more. Emergent concerns in cyber security mention AI and even neurocognitive hacking.[37] We remain concerned with the hacking of information resources, but must also reckon with forms of cyber-information-influence activity designed to alter human processing of information resources by denying, disrupting, or otherwise manipulating them. So much of the information we consume, guided by internet searches or social media prompts, will be computer-controlled or mediated. That can be hacked too.

The Cyber-Information-Influence Nexus

Put into print more than 80 years ago, E.H. Carr’s conceptualisation of information power has advanced from delivery of international propaganda by the mass media of his time to tailored messages delivered via cyberspace today.[38] We see exemplars of information power in the operations to subvert democratic elections, spread disinformation, and incite violence against different ethnic or political groups. Cyberspace is the de facto medium of transmission for contemporary information operations.

What has arisen in the last decade is a set of information and cyber operations mounted by states increasingly hostile to Western democracies. These nations, which include the People’s Republic of China, Russia, North Korea, and Iran (CRNKI), utilise information and computing technologies (ICTs) for espionage, political influence, economic destabilisation, and industrial sabotage. Western powers also use ICTs for espionage and covert action; the differences arise in information controls.

The CRNKI states have created enormous infrastructure for information controls. They exchange technology and tradecraft for isolating themselves from the rest of the world’s information ecosystem.[39] The rest of the world varies widely on online information controls. The U.S. and the other members of the Organisation for Economic Co-operation and Development (OECD) largely embrace freedom of speech for online activity. This is not a universal norm, as more than 50 countries have either convicted or incarcerated citizens for their speech online.[40] States reside on a continuum of information and internet freedoms, and those with the greatest degree of freedom may be the most vulnerable to the malign influence of information.


A framework of understanding for cyber-information-influence may be found in contemporary advertising. While ads were once distributed in print publications or broadcast to wide audiences in television and radio programming, their transmission has been revolutionised by the internet. Facebook and Google, rebranded Meta and Alphabet, respectively, have generated enormous profits from the capacity of their platforms to issue precisely targeted advertisements to individuals based on their interests and online activity.

The messages pushed to individuals’ devices, including televisions, personal computers, mobile telephones, and wristwatches, can be designed to influence beliefs. The aspiration to convince people to believe particular ideas is nothing new, but the Dick Tracy wristwatch is. This global constellation of internet devices, the äppärät for most of humanity, attracts an enormous amount of human attention.[41] Communicating ideas to äppärät is at the centre of a cyber-information-influence strategy. Returning to the relentless drive for advertising as a vehicle for Silicon Valley revenue, there is a certain irony that one of the two most popular mobile phone operating systems is largely designed to facilitate Google’s AdWords mobile advertising platform.

Our technical understanding of protocols, software, and hardware for delivering messages to computerised devices is relatively solid. Much remains to be learned when it comes to the efficacy and dynamics of the human-machine interface, however.[42] How do we know what ideas will take hold with individuals? Which individuals will influence their peers to believe such ideas? To what extent is a cyber-information-influence campaign effective at drawing societal attention and gaining acceptance among a large audience? The answer may be found in research programmes in cognitive warfare, for which multiple views are emerging.

Researchers writing for a NATO scientific meeting define it thus: “Cognitive warfare is … an unconventional form of warfare that uses cyber tools to alter enemy cognitive processes, exploit mental biases or reflexive thinking, and provoke thought distortions, influence decision-making and hinder action, with negative effects, both at the individual and collective levels.”[43] Similarly, a pair of researchers posit that “Cognitive warfare is specific to the domestic information environments of … states … and takes as its overarching goal to undermine or shape domestic political processes by changing mindsets”.[44] Both definitions point to Carr’s third leg of power, that over opinion. There may be a historical record of how propaganda and hybrid warfare have worked together, but what is relatively new is the computerisation of information.


“Cognitive warfare is not new. Weaker parties in an asymmetric conflict have manipulated information and ideas to convince stronger opponents to not fight … What is new is the extent to which technologies enable cognitive warfare – resulting in the delegitimization of governments by sowing discord and creating division in order to compel acceptance of political will.”[45]

At the beginning of the First Intifada in 1987, Palestinians shifted their methods from violent terrorist activity to futile acts of stone-throwing against well-armed Israeli soldiers and police. The First Intifada’s stone-throwers represent asymmetric victories of the (somewhat) non-violent or the ideologically driven, attempting to find support for their cause in a display of weakness. The overwhelming force employed by Israel’s security forces only served to generate greater sympathy for the Palestinians. Conversely, the terror bombings of the Second Intifada were antithetical to the sympathy created by the futile resistance of its predecessor. These were both insurgencies tied to traditional media, especially television. The tableau of contemporary media is electronic, computerised, and persistent. Mobile devices, social media, and constant connection likely alter human cognition.[46] They change the methods of cognitive warfare, but the aim is still the same: to change how others think and feel about a particular people, cause, or issue.

At hand is how computing may change the discovery, presentation, and exchange of information in politics. A preferable term to cognitive warfare may be computerised political-cognitive influence (computational propaganda is also useful). What is important to recognise is that this exercise of power largely falls inside Carr’s “power over opinion” category or Nye’s “soft power”. The questions our discipline needs to ask now are: Do cognitive techniques work in statecraft, and how may we measure their effectiveness? Answers may be found in the areas of information and neuroscience, as well as psychology and computing.[47] Obviously, the more we learn about how digital devices affect our minds and perceptions, the more we know about how ICTs can influence beliefs and opinions.


Where from Here?

In Western democracies, an enormous amount of effort is expended in the media and news on offering opinions and exercising persuasion. It is puzzling why some groups or states resist, often at great cost, while others capitulate with relative ease. Why has Ukraine stood fast against Russia’s most recent invasion, launched in February 2022?[48] Why did the Afghan government left behind by the US-led international force collapse in days?[49]

What can either of those examples tell us about a war over Taiwan? In understanding hybrid conflict involving information and cognition, we seek to know which tools of information power may produce the desired outcomes for those who wield them. In understanding the linkage between computing, information, and influence, we remain in the earliest of days. Hacking systems remains relatively easy. Hacking publics, states, and even alliances is a far more challenging task. Cyber-information-influence tools that are simple in their employment and predictable in their effects are most likely quite a way off. That may not be all bad.

Note: This paper is a synthesis of materials produced for the early October 2023 Cyber Power Symposium on Hybrid Conflict/Warfare held by the European Centre of Excellence for Countering Hybrid Threats. An earlier draft was presented at the International Studies Association sub-conference at the US Air Force Academy in late October 2023.


Chris Bronk is an associate professor at the University of Houston and director of its cybersecurity graduate program. He has conducted research on the politics and diplomacy of cyberspace; critical infrastructure protection; propaganda and disinformation; counter-terrorism; and cybersecurity. He has served as both a Foreign Service Officer and Senior Advisor at the U.S. Department of State. The views expressed in this article are the author’s alone and do not represent those of the University of Houston or the State of Texas.


[1] John Arquilla and David Ronfeldt, “Cyberwar is coming!,” Comparative Strategy 12, no. 2 (1993): 141-165.

[2] James Der Derian, “The question of information technology in international relations,” Millennium 32, no. 3 (2003): 441-456.

[3] Joseph S. Nye Jr, The future of power, PublicAffairs, 2011.

[4] Edward Hallett Carr, The twenty years’ crisis, 1919-1939: Reissued with a new preface from Michael Cox, Springer, 2016.

[5] Katie Hafner and John Markoff, Cyberpunk: outlaws and hackers on the computer frontier, revised. Simon and Schuster, 1995.

[6] Norbert Wiener, Cybernetics or Control and Communication in the Animal and the Machine, MIT Press, 2019.

[7] Ronald J. Deibert and Rafal Rohozinski, “Risking security: Policies and paradoxes of cyberspace security,” International Political Sociology 4, no. 1 (2010): 15-32.

[8] Cybersecurity, National Institute of Standards and Technology, https://csrc.nist.gov/glossary/term/cybersecurity.

[9] Ben Buchanan, The cybersecurity dilemma: Hacking, trust, and fear between nations, Oxford University Press, 2016.

[10] Chris Bronk, “Cybersecurity,” in: David J. Galbreath and John R. Deni, eds. Routledge Handbook of Defence Studies, London, New York: Routledge, 2018.

[11] Willis Ware, Security controls for computer systems, Rand Corp. Tech. Rep, 1970.

[12] Thomas Rid, Cyber war will not take place, Oxford University Press, USA, 2013.

[13] Brandon Valeriano, Benjamin M. Jensen and Ryan C. Maness, Cyber strategy: The evolving character of power and coercion, Oxford University Press, 2018.

[14] Erik Gartzke, “The myth of cyberwar: bringing war in cyberspace back down to earth,” International Security 38, no. 2 (2013): 41-73.

[15] Kim Zetter, Countdown to zero day: Stuxnet and the launch of the world’s first digital weapon, Crown, 2015.

[16] William Tobey, “Nuclear scientists as assassination targets,” Bulletin of the Atomic Scientists 68, no. 1 (2012): 61-69.

[17] Ardavan Khoshnood, “The Attack on Natanz and the JCPOA,” BESA Center Perspectives Paper 1,997 (2021).

[18] Ralph Langner, “Stuxnet: Dissecting a cyberwarfare weapon,” IEEE Security & Privacy 9, no. 3 (2011): 49-51.

[19] Andy Greenberg, Sandworm: A new era of cyberwar and the hunt for the Kremlin’s most dangerous hackers, Anchor, 2019.

[20] Siyu Lei, Silviu Maniu, Luyi Mo, Reynold Cheng and Pierre Senellart, “Online influence maximization,” in: Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 645-654. 2015.

[21] Andy Greenberg, Sandworm: A new era of cyberwar and the hunt for the Kremlin’s most dangerous hackers, Anchor, 2019.

[22] Eric Lipton, David E. Sanger and Scott Shane, “The perfect weapon: How Russian cyberpower invaded the US,” The New York Times 13 (2016).

[23] Michael Buratowski, “The DNC server breach: who did it and what does it mean?,” Network Security 2016, no. 10 (2016): 5-7.

[24] https://www.nytimes.com/2017/11/01/us/politics/russia-2016-election-facebook.html.

[25] Gabi Siboni and David Siman-Tov, The superpower cyber war and the US elections, Institute for National Security Studies (INSS), 2016.

[26] Chris Bronk, “Getting creative on what will do: cyber espionage, conflict and covert action,” Conflict and Covert Action (2016).

[27] Sharon Weinberger, “Murders unlikely to slow Iran’s nuclear efforts: experts say international sanctions are the best way to stall the weapons programme,” Nature 481, no. 7381 (2012): 249-250.

[28] Rebecca Slayton, “What is the cyber offense-defense balance? Conceptions, causes, and assessment,” International Security 41, no. 3 (2016): 72-109.

[29] Chris Bronk, Cyber threat: The rise of information geopolitics in U.S. national security, Bloomsbury Publishing USA, 2016.

[30] Glenn Alexander Crowther, “The cyber domain,” The Cyber Defense Review 2, no. 3 (2017): 63-78.

[31] William Gibson, Burning chrome, Hachette UK, 2017.

[32] Pierre Teilhard de Chardin, The future of man, Image, 2004.

[33] Christine Haughney and Nicole Perlroth, “Times Site Is Disrupted in Attack by Hackers,” The New York Times, August 27, 2013.

[34] Max Fisher, “Syrian hackers claim AP hack that tipped stock market by $136 billion. Is it terrorism?,” The Washington Post, April 23, 2013.

[35] Ilya Zheludev, Robert Smith and Tomaso Aste, “When can social media lead financial markets?,” Scientific Reports 4, no. 1 (2014): 4213.

[36] Gian Piero Siroli, “Considerations on the cyber domain as the new worldwide battlefield,” The International Spectator 53, no. 2 (2018): 111-123.

[37] Kim Hartmann and Christoph Steup, “Hacking the AI – the next generation of hijacked systems,” in: 2020 12th International Conference on Cyber Conflict (CyCon), vol. 1300, 327-349. IEEE, 2020 and John J. Heslen, “Neurocognitive hacking: A new capability in cyber conflict?,” Politics and the Life Sciences 39, no. 1 (2020): 87-100.

[38] Edward Hallett Carr, The twenty years’ crisis, 1919-1939: Reissued with a new preface from Michael Cox, Springer, 2016.

[39] Andrea Kendall-Taylor and David Shullman, Navigating the deepening Russia-China partnership, Center for a New American Security, 2021.

[40] Adrian Shahbaz and Allie Funk, Freedom of the Net: The Global Drive to Control Big Tech, Freedom House, 2021.

[41] Gary Shteyngart, Super sad true love story: A novel, Random House Trade Paperbacks, 2011.

[42] Ben Shneiderman, Software psychology: Human factors in computer and information systems (Winthrop computer systems series), Winthrop Publishers, 1980.

[43] B. Claverie, B. Prébot, N. Buchler and F. Du Cluzel, “Cognitive Warfare: The Future of Cognitive Dominance,” in: First NATO Scientific Meeting on Cognitive Warfare (France), 21 June 2021.

[44] Oliver Backes and Andrew Swab, “Cognitive Warfare: The Russian Threat to Election Integrity in the Baltic States,” Cambridge: Belfer Center for Science and International Affairs, 2019.

[45] Laurie Fenstermacher, David Uzcha, Katie Larson, Christine Vitiello and Steve Shellman, “New perspectives on cognitive warfare,” in: Signal Processing, Sensor/Information Fusion, and Target Recognition XXXII, vol. 12547, 162-177. SPIE, 2023.

[46] Henry H. Wilmer, Lauren E. Sherman and Jason M. Chein, “Smartphones and cognition: A review of research exploring the links between mobile technology habits and cognitive functioning,” Frontiers in Psychology 8 (2017): 605.

[47] Andreea Stoian-Karadeli and Daniel-Gabriel Dinu, “Securing the Mind: The Emerging Landscape of Cognitive Warfare,” Redefining Community in Intercultural Context (2023): 26.

[48] Timothy Snyder, “Ukraine holds the future: The war between democracy and nihilism,” Foreign Affairs. 101 (2022): 124.

[49] Jennifer Bick Murtazashvili, “The collapse of Afghanistan,” Journal of Democracy 33, no. 1 (2022): 40-54.
