Abstract: Although subduing the opponent’s will has been the pinnacle of warfare since Sun Tzu, the notion of cognitive warfare has gained traction with the possibility of influencing the opponent directly via cyberspace and social media. Influence operations via cyberspace entail swaying public opinion, manipulative psychological warfare, and lawfare. The use of law as an instrument of power to affect perception and cognition is possible because of ongoing legal disputes about how to apply (international) law to cyberspace. States can cherry-pick, or even assertively exploit, variations in interpretations of international law to pursue or defend their national interests as a means of cognitive warfare.
Problem statement: Can states use legal ambiguity as an instrument of power to further their national interests?
So what?: Law is exploited to affect the cognition of target audiences. To tackle this, states first need to raise awareness of cognitive influencing and align their NATO/EU positions against these aggressors. We must recognize that technological developments outpace legal absorptive capacity, and remain cognizant that law is used as an instrument of power. New laws must not reinforce authoritarian practices, but nor should they accentuate Western dominance.
Influence the will: An introduction
In May 2024, Annalena Baerbock, German Federal Minister for Foreign Affairs, attributed a cyberattack on the German Social Democratic Party (SPD) to APT 28, an agent of the GRU, the Russian military intelligence service.[1] The attack, probably a spear-phishing campaign, was part of a broader effort to undermine the June 2024 European Parliament (EU) elections. Similarly, NATO’s North Atlantic Council expressed concern as it witnessed subversive and undermining cyberattacks against the Baltic states, Poland and the United Kingdom.
Elections are precarious periods for democracies; they are conceptual seams where a society moves from one set of elected lawmakers to another. In any system, whether organizing a military campaign or welding a heating system, seams are vulnerable. Liberal democracies are more vulnerable – as there are more seams in a democratic system – than authoritarian states, where there is often no genuine division, let alone a change, of power.
Influencing the people’s will through elections was long part of the game plan in the bipolar Cold War. Soviet Active Measures and American Political Warfare both encompassed election interference, persuading or manipulating the cognition of foreign audiences and political leaders so as to put in place a government aligned with Soviet or US interests respectively.
Although subduing the opponent’s will has been the pinnacle of warfare since Sun Tzu, the notion of cognitive warfare has gained traction with the growth of cyberspace and the possibility of influencing opposing audiences directly via social media. Cyberspace is a human-made domain that has added three layers to the existing information environment: the hardware itself; the virtual persona we use to communicate online; and the data and protocols that make communication possible.[2] These additional layers provide new target surfaces that state and non-state actors will want to protect or use to engage with others.
The dawn of cyberspace has enabled three cyber-related categories of activities: digital intelligence gathering (espionage) through scanning or copying of data confined to virtual repositories; subversive digital influence operations;[3] and digital undermining.[4] The latter are cyberattacks in the virtual dimension that undermine cyberspace itself with binary code, modifying or manipulating data and degrading or destroying hardware or protocols, resulting in virtual and/or physical effects. Digital influence operations, by contrast, use cyberspace as a vector (without affecting it) to target the (human) cognitive dimension of groups or audiences, using content, words, memes and footage as “weapons”.[5] Apart from large state-supported operations such as Stuxnet in the past, most cyberattacks witnessed in Ukraine and Gaza have had a limited impact. Conversely, state-level influence operations, including Russian interference in the 2016 US presidential election, did have strategic effects.[6]
Apart from activities in cyberspace, the wars in Ukraine and Gaza have witnessed the emergence of new actors and technologies. Non-state actors, including Anonymous, Microsoft and Elon Musk, play a role in these conflicts without becoming belligerent parties, and artificial intelligence is used in targeting systems in the Gaza war.[7] These topics raise not only operational and ethical but also legal questions – for example, whether international humanitarian law (IHL) applies to DDoS attacks by a non-state actor, or whether Article 49 of Additional Protocol I covers cyberattacks.[8]
Using or exploiting states’ varying interpretations of (international) law can even be used as an instrument of power to affect perception and cognition. This form of “lawfare”[9] can be a tool for influencing the cognition of target audiences through cyberspace. States can cherry-pick or assertively exploit the variations in interpretations of international law to pursue or defend their national interests as a means of cognitive warfare.
What is cognitive warfare?
From a security or military perspective the cognitive domain is the pinnacle of warfare. Thinkers such as Thucydides and von Clausewitz argue that the essence of warfare is to subdue an enemy – ensuring that the opposing actor (willingly or unwillingly) becomes convinced that it should change its behaviour and act in accordance with our will.
In the past the cognitive domain was influenced by physical acts and therefore indirectly by the (threat of the) destruction of armies or capitals. With the inception of cyberspace and the increased knowledge of cognitive psychology,[10] today’s cognitive warfare also directly targets the mind, using influence and information operations and psychological warfare – hence, warfare without the use of kinetic force. Cognitive activities can be applied to persuade our conscious mind. However, their focus is on exploiting our subconscious mind,[11] the main driver of our behaviour: biases; heuristics; intuition; and emotions.
As a conceptual notion cognitive warfare cannot easily be defined. In a research paper by Cluzel it is compared to hacking the minds of individuals to “erode the trust that underpins every society”, which includes the use of neuroscience and technology.[12] Hung and Hung argue that information warfare is a subset of cognitive warfare,[13] and influence operations are merely the cyber-related elements of information warfare. Others argue the opposite, stating that “cognitive warfare has absorbed information warfare”.[14] In both cases there is a shift from controlling the media (information) to controlling the brain (cognition).
NATO’s proposed definition is “deliberate, synchronised military and non-military activities throughout the continuum of competition designed to affect audience attitudes, perceptions and behaviours to gain, maintain and protect cognitive superiority”.[15] Other definitions of cognitive warfare argue that it is a strategy that focuses on altering how a target population thinks, and thus how it acts. Alternatively, they claim that “in cognitive warfare, the ultimate aim is to alter our perception of reality and deceive the brain in order to affect our decision-making”.[16] In all definitions and descriptions of cognitive war, trust and truth are the primary targets.[17]
Cognitive warfare via cyberspace
With the growth of cyberspace our societies have become more digitalized, and so has warfare. The potential and actual impact of cyberactivities is widely debated. Although some scholars argue that cyberwarfare equates to regular warfare, a more common view is that most cyberoperations will not reach the threshold of war. Labelling cyberoperations therefore benefits from examining the effects they may have rather than the act itself.[18]
A recent example of large-scale cyberactivities is the Russia–Ukraine war. There have been more than 3,500 attacks since the start of the invasion in February 2022.[19] Various actors, including states, have undertaken these attacks. However, 95 per cent can be labelled as DDoS, defacements, or hack (and leak) operations, and some 90 per cent of these were executed by non-state actors. DDoS and defacements are what Gartzke and Lindsay categorize as hindrances or nuisances,[20] neither causing “death and destruction” nor directly supporting a military campaign. Although some cyberattacks have supported operational-level military or diplomatic campaigns, including digital espionage or severe wiperware attacks, none with a severe strategic impact (similar to a cyber Pearl Harbour) has been registered.
Despite the scale, the impact of cyberspace activities in the Russia–Ukraine war appears marginal, possibly due to Ukrainian resistance and resilience (supported by firms such as Microsoft) and faltering Russian operations. There are notable exceptions, however, as some cyberoperations have served their purpose. First, on the eve of the invasion Russia attacked the Viasat satellite internet network, imposing a digital blackout on Ukrainian forces. Second, Ukrainian president Zelenskyy’s fervent online strategic communication with foreign parliaments has resulted in diplomatic support and the supply of funds, military systems and ammunition.
Contrary to undermining cyberattacks, digital influence operations can have strategic effects. While influence operations are not inherently malign, they are intended to affect the understanding and autonomous decision-making of individuals or groups, consciously or, preferably, subconsciously. Ultimately, cognitive warfare via influence operations in cyberspace aims not at the destruction of humans but at the “reformatting” of the target audience with values, morality, and an understanding of good and evil in line with the wishes of the attackers.[21]
Since the annexation of Crimea, pro-Russian state and non-state actors have conducted cyber-enabled disruptive propaganda and disinformation campaigns to create an information environment with opposing views and perceptions.[22] The main purpose of Russian “information confrontation”[23] operations is to demoralize the Ukrainian population and drive a wedge between Ukraine and its Western allies. Influence operations are also used to target domestic Russian audiences. The narratives used are Western Russophobia, the “denazification and demilitarization” of Ukraine, and the endemic corruption within the Ukrainian government.[24] Ukraine similarly exploits social media. Since the invasion President Zelenskyy has addressed his population online and maintained the morale of his troops, positively affecting the cognitive dimension of both friend and foe.[25] International support is Ukraine’s lifeline and is thus both a centre of gravity and an Achilles’ heel.[26]
Influence operations, especially manipulative ones, are inherently deceptive and use heuristics and biases, luring the target audience away from a rational decision-making process in favour of what Petty and Cacioppo call the peripheral route.[27] The peripheral route is invoked by using a socially divisive topic to distract a targeted audience, impairing their ability to process incoming data due to the emotional or provocative sentiment attached. Hung and Hung make a similar assessment, arguing that cognitive warfare uses two dimensions: psychological techniques (how our brain works) based on heuristics and repeated stimulation; and the cognitive handling of external information. To influence humans, a gap (or “free energy”) needs to exist – or to be created – between prior predictions and incoming stimuli; in effect, the target audience needs to start to doubt, which is in line with the Russian information confrontation approach.[28]
Western democracies are more vulnerable to manipulative influence operations as an element of cognitive warfare – and hence to Russian information confrontation – because of their open societies, built on the freedoms of speech and the press and the right to vote and be elected. These notions, embedded in the principles of legality and legitimacy, go hand in hand with the trust people have in the government, judges and traditional (often written) media.
Western democracies are largely free to discuss and absorb incoming stimuli, create new ideas, innovate, fail, and learn. This contrasts with authoritarian states, which attempt to suppress incoming (foreign) stimuli, information and new ideas to ensure that the population’s inoculated perception (or prior beliefs) remains aligned with the (state-controlled) information environment and is not distorted by (false or factual) evidence that would change the prior belief and create doubt.
Legislation in cognitive warfare
Alongside Russia’s information confrontation, the Chinese Three Warfares doctrine is another example of cognitive warfare. This doctrine, governed mainly by the Chinese Communist Party’s (CCP) United Front Work Department[29] and the People’s Liberation Army,[30] aims to maintain the CCP’s political power and “control the prevailing discourse and influence perceptions to advance China’s interest”.[31] To suppress incoming stimuli and propagate a benign image of the People’s Republic of China (PRC), diasporas are dissuaded from voicing dissenting opinions, and the internet and social media are frequently censored domestically.[32] The Three Warfares doctrine entails not only a persuasive and manipulative dimension but also a legal one, aimed at changing the attitude and thus the behaviour of targeted audiences – at home or abroad.[33]
Persuasive public opinion warfare, or media warfare, aims to shape “targeted audiences through information derived and propagated by mass information channels”, both traditional (television, newspaper, movies) and on the internet.[34] Public opinion warfare is related to shaping (online) public opinion to transmit a consistent message to the targeted audience in a way favourable to Chinese positions.[35]
Whereas public opinion warfare focuses on framing or highlighting some aspects of the truth while neglecting others, often with a pinch of humour, psychological warfare is more manipulative. Psychological warfare involves using information to pressure an opponent and “create damaging or deleterious habits and ways of thinking, to reduce its will to resist, and perhaps even to induce defeatism and surrender”.[36] Psychological warfare uses a variety of techniques, including intimidation, religious interference,[37] dissuasion, manipulation and deception.[38]
Interestingly, the Chinese Three Warfares are applicable in all conflict phases (from peace to war), using diverging legal interpretations to influence others. Legal warfare is designed “to justify a course of action”,[39] forging a normative environment favourable to China. The PRC’s legal warfare, which echoes Western debates on lawfare,[40] is a tool of non-kinetic warfare that exerts influence on an actor’s behaviour to achieve strategic ends. Successful legal warfare limits others’ freedom of movement while expanding the PRC’s freedom of action.[41]
The Three Warfares is not one specific CCP policy; its effectiveness lies in being a society-wide endeavour. When addressing foreign audiences, Three Warfares activities use the PRC’s entire media landscape, so that different sources and versions reiterate and reinforce a given message. Outlets include media channels (CGTN), cultural institutes (Confucius Institutes), Chinese exchange students,[42] diaspora communities, think tanks and the Chinese diplomatic network.[43]
Law as an instrument of warfare
The PRC’s legal warfare exploits the ambiguity in international law related to new developments, a discourse that is not new. Nuclear weapons and aeroplanes were introduced after the Laws of Armed Conflict (IHL) were conceived. However, as (international) law is based on principles including military advantage, distinction, proportionality and necessity, not on specific situations or techniques, the law will still apply. In practice a discourse will start on how to apply the existing international law to the new development – for example, in the United Nations Group of Governmental Experts or the Open-Ended Working Group.[44]
On the one hand, as international law is based on principles from which rules are derived, it has always been the purpose of the body of international law to provide legal room for manoeuvre so that generic rules can be applied to specific situations or new developments.[45] On the other, new developments can pose challenges, not least due to the speed of (technological) change, including artificial intelligence,[46] human enhancement, drones and cyberspace. This parallax causes uncertainty about how to apply the law. In cyberspace, for instance, there is a debate about whether sovereignty – a legal obligation in traditional international law – is both a rule (obligation) and a principle, or merely a principle of law; the latter is the UK position. This is not a semantic discussion: if sovereignty is a principle – and hence not an obligation – it cannot be violated. The Articles on State Responsibility hold that an internationally wrongful act constitutes a breach of a primary rule of law (an obligation) that can be attributed to a state. If sovereignty is breached by a state that does not regard it as an obligation, any redress or countermeasure may itself violate international law, and a dispute could escalate into a conflict.
Another source of ambiguity is whether cyberspace is itself part of the territory of a state and thus subject to its laws. In many Western views territory includes the soil, the territorial sea and the air column above them, not space in general or the virtual aspects of cyberspace – the zeros and ones.[47] In this sense cyberspace’s virtual dimension is borderless. In many authoritarian states the totality of cyberspace is linked to the control of territorial integrity. Hence, the PRC argues that it has digital sovereignty over cyberspace “on its soil”, while Western states only have territorial control of the hardware on their soil.
Moreover, while Western states argue that international law supersedes national law, the Russian constitution gives national law priority over international law. Conversely, the PRC uses international law to underline its claims in the South China Sea, for example,[48] and disputes the Western view that only natural (not artificial) islands can ground a territorial claim.
Finally, for the PRC there is no clear distinction between war and peace. Under the Three Warfares, these forms of “warfare” commence before any actual military engagement and are conducted to shape and prepare the battlefield and its participants; all of them are applicable across the spectrum of war and peace.
How to counter the use of lawfare
The use of law as an instrument of power to affect perception and cognition is possible because of ongoing legal disputes and states’ varying interpretations of how to apply (international) law to cyberspace. To counter cognitive warfare effectively, it is critical to understand the aggressor’s intent before responding. NATO and EU states must raise public awareness of possible foreign cognitive warfare activities, including lawfare, and align common positions within the alliances. Finally, a discourse on whether new law is required remains valid.
First, states, especially liberal democracies, must understand that Chinese and Russian cognitive warfare differ in intent and depth. Russian activities are intended to sow confusion by disseminating information that conflicts with or confronts existing knowledge – for example, the firehose of falsehoods that followed Russia’s downing of MH17. Russian cognitive and influence operations can be seen as a blunt instrument affecting audiences in foreign states, with no intention other than to confuse, sow discord and undermine trust in democratic foundations. Although Russia exploits the variances in interpretations of international law, it would prefer to ignore it altogether.
Conversely, Chinese activities are subtle and clearly intend to uphold or improve foreign audiences’ benign image of the PRC. The PRC relies on international law but favours a renegotiation of its foundations because, according to the PRC, the current body of international law is a reflection of Western interests. In countering the cognitive activities of Russia or the PRC, the intent of the aggressor needs to be considered. The worst mistake would be to assess the cognitive act in accordance with Western standards.
Raising awareness is (generally) an effective means to counter cognitive warfare. US citizens were unaware of the impact foreign actors’ social media campaigns could have in the run-up to the 2016 presidential election – a naivety that had largely vanished by the 2018 mid-term elections. Free access to education is pivotal, as are school programmes on the advantages and dangers of an open, free (and hence unfiltered) internet.
Besides raising awareness, coalition alignment can also blunt foreign cognitive warfare: formulating a common position and forming a common bloc among NATO/EU member states and partners such as Japan and Australia. Adversaries will exploit the seams in these coalitions, especially when there is no common rationale, as we currently see in the fragile alignment – and hence increased friction – among NATO/EU member states’ varying positions on the Ukraine war.[49]
Most international legal scholars argue that the current law is sufficient and that refinement is needed only in how to apply it, for which more state practice and legal statements (opinio iuris) are required. There is a danger that this is wishful thinking: it will be a real challenge to align the diverging opinions of states, whether these are sound legal opinions or reflections of political pragmatism. Some states are already entrenched, or have seen the benefits of using law as an instrument of power – during UN/OEWG sessions, for example.
Moreover, new developments (AI, quantum computing) are more complex than those of the past, and international law can no longer keep pace. EU lawmakers remain unable to grasp fully the potential and danger of developments such as AI, though they correctly see the need for legislation. The result is laws that above all reflect the consensus-building of the legislative process but are highly ambiguous in content, in turn fuelling legal cherry-picking and hence the use of law as an instrument of power – a devil’s dilemma.
Dr Peter B.M.J. Pijpers is an associate professor of cyber operations at the Netherlands Defence Academy, a researcher at the University of Amsterdam Centre for International Law, and a non-resident fellow at the University of South Florida Global and National Security Institute. Dr Pijpers has published on the legal and cognitive dimensions of influence operations in cyberspace, and how armed forces can manoeuvre in the information environment. See also Orcid ID 0000-0001-9863-5618. The author can be contacted at [email protected]. The views contained in this article are the author’s alone and do not represent those of the Netherlands Defence Academy.
[1] APT means an advanced persistent threat (usually a state (financed) cyber actor); the GRU is the Russian military intelligence service. See Marcel Rosenbach and Christophe Schult, ‘Baerbocks Digitaldetective decken russische Lügenkampagne auf’, Der Spiegel, 26 January, 2024, https://archive.ph/2024.01.26-114242/https://www.spiegel.de/politik/deutschland/desinformation-aus-russland-auswaertiges-amt-deckt-pro-russische-kampagne-auf-a-765bb30e-8f76-4606-b7ab-8fb9287a6948.
[2] Peter B.M.J. Pijpers, ‘Careful what You Wish for: Tackling Legal Uncertainty in Cyberspace’, Nordic Journal of International Law Volume 92, Issue 3 (2023): 397–399.
[3] Andreas Krieg, Subversion: The Strategic Weaponization of Narratives, 2023.
[4] Peter B.M.J. Pijpers and Kraesten L. Arnold, ‘Conquering the Invisible Battleground’, Atlantisch Perspectief Volume 44, Issue 4 (2020): 11–14; Paul A.L. Ducheine, Peter B.M.J. Pijpers, and Kraesten L. Arnold, ‘The “Next” War Should Have Been Fought in Cyberspace, Right?’, in Beyond Ukraine, Debating the Future of War, eds Tim Sweijs and Jeff Michaels (Hurst Publishers, 2024); Paul A.L. Ducheine, Jelle van Haaster, and Richard van Harskamp, ‘Manoeuvring and Generating Effects in the Information Environment’, in Winning Without Killing: The Strategic and Operational Utility of Non-Kinetic Capabilities in Crisis – NL ARMS 2017, ed. Paul A.L. Ducheine and Frans P.B. Osinga, 2017.
[5] Miranda Lupion, ‘The Gray War of Our Time: Information Warfare and the Kremlin’s Weaponization of Russian-Language Digital News’, Journal of Slavic Military Studies, Volume 31, Issue 3 (2018): 329–330; Calder Walton, ‘What’s Old Is New Again: Cold War Lessons for Countering Disinformation’, Texas National Security Review, Fall 2022.
[6] Ellen Nakashima, ‘Pentagon Launches First Cyber Operation to Deter Russian Interference in Midterm Elections’, The Washington Post, 2018, https://www.washingtonpost.com/world/national-security/pentagon-launches-first-cyber-operation-to-deter-russian-interference-in-midterm-elections/2018/10/23/12ec6e7e-d6df-11e8-83a2-d1c3da28d6b6_story.html.
[7] Yuval Abraham, ‘“Lavender”: The AI Machine Directing Israel’s Bombing Spree in Gaza’, +972 Magazine, April (2024), https://www.972mag.com/lavender-ai-israeli-army-gaza/.
[8] Article 49.1. of the 1977 Additional Protocol (1) to the Geneva Conventions states: ‘“Attacks” means acts of violence against the adversary, whether in offence or in defence.’
[9] Orde F. Kittrie, Lawfare: Law as a Weapon of War, (Oxford University Press, 2016), 4–8.
[10] Francois du Cluzel, ‘Cognitive Warfare’ (Innovation Hub, 2021), 12.
[11] Cornelus van der Klaauw, ‘Cognitive Warfare’, The Three Swords Volume 39 (2023): 99.
[12] Francois du Cluzel, ‘Cognitive Warfare’, 7.
[13] Tzu-chieh Hung and Tzu-wei Hung, ‘How China’s Cognitive Warfare Works: A Frontline Perspective of Taiwan’s Anti-Disinformation Wars’, Journal of Global Security Studies Volume 7, Issue 4 (2020): 2–4.
[14] Russtrat, ‘Cognitive Warfare: War of a New Generation’, Institute of Russian Strategies, 24 December 2021, https://russtrat.ru/en/analytics_/24-december-2021-2228-7813.
[15] NATO Cognitive Warfare Concept, version of 17 April 2024, Supreme Allied Command Transformation.
[16] Cornelus van der Klaauw, ‘Cognitive Warfare’, 100.
[17] Alonso Bernal et al., ‘Cognitive Warfare: An Attack on Truth and Thought’, NATO & Johns Hopkins, 2020; Francois du Cluzel, ‘Cognitive Warfare’, (Innovation Hub, 2021), 8–9.
[18] Michael N. Schmitt, Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations, 2nd ed., (Cambridge University Press, 2017).
[19] CyberPeaceInstitute, ‘Cyber Dimensions of the Armed Conflict in Ukraine’ (2023), https://cyberconflicts.cyberpeaceinstitute.org.
[20] Jon Lindsay and Erik Gartzke, ‘Coercion through Cyberspace: The Stability-Instability Paradox Revisited’, in The Power to Hurt: Coercion in Theory and in Practice, 2016, 179–203.
[21] Russtrat, ‘Cognitive Warfare: War of a New Generation’.
[22] Todd C. Helmus et al., Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe (Rand Corporation, 2018), 7–25.
[23] Michelle Grisé et al., Russian and Ukrainian Perspectives on the Concept of Information Confrontation, Rand Research Report, 2022, 5–10.
[24] Tine Molendijk, ‘Morale and Moral Injury among Russian and Ukrainian Combatants’, in Reflections on the Russian-Ukrainian War, eds Maarten Rothman, Lonneke Peperkamp, and Sebastiaan Rietjens, (Leiden University Press, 2024), 99–106.
[25] The story of a Ukrainian fighter pilot, ‘the Ghost of Kyiv’, went viral online. Another occurrence concerned the bold response of Ukrainian troops defending Snake Island after Russia’s Black Sea Fleet flagship ‘The Moskva’ demanded their surrender or the attack on the Kerch bridge.
[26] Paul A.L. Ducheine, Peter B.M.J. Pijpers, and Kraesten L. Arnold, ‘The “Next” War Should Have Been Fought in Cyberspace, Right?’, 101–104.
[27] Richard E. Petty and John T. Cacioppo, ‘The Elaboration Likelihood Model of Persuasion’, Advances in Experimental Social Psychology 19 (1986): 126.
[28] T.S. Allen and A.J. Moore, ‘Victory without Casualties: Russia’s Information Operations’ Parameters Volume 48, Issue 1 (2018): 60.
[29] Marcel Angliviel de la Beaumelle, ‘The United Front Work Department: “Magic Weapon” at Home and Abroad’, China Brief Volume 17, Issue 9 (2017).
[30] But not solely: the ministry of State Security, the Taiwan Affairs office and the Central Committee of the Party (international liaisons, propaganda and the United Front work department) are involved, to name only a few.
[31] Pieter Zhao, ‘Chinese Political Warfare: A Strategic Tautology? The Three Warfares and the Centrality of Political Warfare within Chinese Strategy’, The Strategy Bridge, August (2023), https://thestrategybridge.org/the-bridge/2023/8/28/chinese-political-warfare-a-strategic-tautology.
[32] Alina Polyakova and Chris Meserole, ‘Exporting Digital Authoritarianism: The Russian and Chinese Models’, Policy Brief, Democracy and Disorder Series, 2019, 1–22, 2–6.
[33] Albert Zhang, ‘Gaming Public Opinion Influence Operations’, ASPI Policy Brief no. 71 (2023).
[34] Dean Cheng, Cyber Dragon: Inside China’s Information Warfare and Cyber Operations, Praeger, 2017, 51–53; Peter Mattis, ‘China’s ‘Three Warfares’ in Perspective’, in War on the Rocks, 2023.
[35] See e.g.: CGTN Official, ‘Samarkand, Listed by UNESCO as a World Heritage Site’, X (Twitter), 2023, https://twitter.com/cgtnofficial/status/1707625764412440805?s=43&t=7eecH6cep1ONZNAMcRFBlw.
[36] Dean Cheng, Cyber Dragon: Inside China’s Information Warfare and Cyber Operations, 44–45.
[37] Tzu-chieh Hung and Tzu-wei Hung, ‘How China’s Cognitive Warfare Works: A Frontline Perspective of Taiwan’s Anti-Disinformation Wars’, 4.
[38] Paul Charon and Jean-Baptiste Jeangène Vilmer, ‘Chinese Influence Operations: A Machiavellian Moment’, IRSEM, 49–51; Nadine Yousif, ‘MP Michael Chong Urges US–Canada Cooperation on China Interference’, BBC News, 2023, https://www.bbc.com/news/world-us-canada-66791749.
[39] Emilio Iasiello, ‘China’s Three Warfares Strategy Mitigates Fallout from Cyber Espionage Activities’, Journal of Strategic Security Volume 9, Issue 2 (2016): 56.
[40] Aurel Sari, ‘Hybrid Threats and the Law: Concepts, Trends and Implications’, 2020, 10–12; Bret Austin White, ‘Reordering the Law for a China World Order: China’s Legal Warfare Strategy in Outer Space and Cyberspace’, Journal of National Security Law & Policy Volume 11, Issue 2 (2021): 435–88.
[41] Charon and Jeangène Vilmer, ‘Chinese Influence Operations: A Machiavellian Moment’, 51–55.
[42] Pieter Zhao, ‘Chinese Political Warfare: A Strategic Tautology? The Three Warfares and the Centrality of Political Warfare within Chinese Strategy’, The Strategy Bridge, August (2023), https://thestrategybridge.org/the-bridge/2023/8/28/chinese-political-warfare-a-strategic-tautology.
[43] Rush Doshi and Robert D. Williams, ‘Is China Interfering in American Politics?’, Lawfare, October (2018).
[44] United Nations General Assembly, ‘Final Substantive Report’, Open-Ended Working Group on Developments in the Field of Information and Telecommunications in the Context of International Security, 2021.
[45] See e.g. the so-called Martens Clause in the preamble of the 1899 Hague Convention of the Law and Customs of War on Land.
[46] Todd C. Helmus, ‘Artificial Intelligence, Deepfakes, and Disinformation: A Primer’, Rand Perspective, July (2022); Adrian Agenjo, ‘Lavender Unveiled: The Oblivion of Human Dignity in Israel’s War Policy on Gaza’, Opinio Juris, April (2024): 1–5, http://opiniojuris.org/2024/04/12/lavender-unveiled-the-oblivion-of-human-dignity-in-israels-war-policy-on-gaza/.
[47] Michael N. Schmitt, ‘Wired Warfare 3.0: Protecting the Civilian Population during Cyber Operations’, International Review of the Red Cross (Cambridge University Press, 1 April 2019).
[48] National Institute for South China Sea Studies, ‘A Legal Critique of the Award of the Arbitral Tribunal in the Matter of the South China Sea Arbitration’, Asian Yearbook of International Law 24 (2020).
[49] Vladimir Soldatkin and Anita Komuves, ‘Hungary’s Orban talks Ukraine peace with Putin, stirring EU outcry’, Reuters, https://www.reuters.com/world/europe/hungarys-orban-says-no-position-negotiate-between-ukraine-russia-2024-07-05/.