Disinformation In Cognitive Warfare, FIMI, Hybrid Threats

Abstract: Cognitive Warfare, Foreign Information Manipulation and Interference (FIMI), Hybrid threats – how does disinformation strategically target vulnerabilities within democratic institutions? Rather than being competitive, these concepts are complementary and pertain to distinct societal domains. A nuanced understanding of disinformation in varied conceptual, social, and strategic contexts equips policymakers and defence agencies to craft cohesive, context-specific responses, effectively countering disinformation’s impact on democratic systems.

Problem statement: How can the functional differences of disinformation in Cognitive Warfare, Foreign Information Manipulation and Interference, and Hybrid Threats be utilised to design more adequate responses?

So what?: Various disinformation functions exploit different vulnerable points in democratic systems and institutions. In reaction to these threats, institutions have designed different explanatory concepts to inform their responses. These different routes of inquiry need to be utilised to form a more comprehensive understanding of the threat landscape and to design integrated responses that ensure interoperability between concepts and institutions.

Fake news, propaganda, conspiracy theories, disinformation, manipulation: abstract concept illustration.

Source: shutterstock.com/Skorzewiak

Non-Kinetic and Below the Threshold of War

Disinformation is a significant element in concepts like Cognitive Warfare,[1] Hybrid Threats[2] and Foreign Information Manipulation and Interference (FIMI).[3] The common denominator of these three concepts is that adversarial actors use them to harm democratic societies while relying on non-kinetic means to remain under the threshold of war. Consequently, disinformation and its functionality, a vital means in all three concepts, are not treated uniformly across the literatures on FIMI, Cognitive Warfare, and Hybrid Threats.

Disinformation, misinformation, and other information disorders have received significant societal and scientific attention since at least 2016.[4] As a stand-alone tactic, disinformation can be used by private actors for monetary gain or the pursuit of other goals. Governmental or non-governmental political actors can use it to change public opinion, pursue political goals or exert political power. State actors, civilian or military, may also weaponise disinformation tactics to influence or destabilise the political and democratic systems of adversaries.[5]

Disinformation is mainly distributed in the digital realm, but it is not limited to it. Misleading physical flyers, strategically placed in mailboxes at the right place and time, can potentially yield a more enduring impact than the incessant daily flow of digital information. In the digital realm, disinformation tactics include disseminating false or misleading information, overt or covert propaganda, and promoting conspiracy theories. Tactics can also include the harassment of individuals or groups through trolling, the creation of fake grassroots movements (astroturfing), and the use of artificial intelligence (AI) to produce audio-visual deepfakes. Finally, actors can engage in the theft and release of information through phishing, hacking and doxing. Open, liberal democratic societies are particularly susceptible to this type of interference.[6]

With more regulated media environments, societies like Singapore possess greater means to combat disinformation in their respective information environments, making them less vulnerable to the effects of unchecked disinformation.[7] However, the potential for misuse is high. Depending on their implementation, approaches like Singapore’s Protection from Online Falsehoods and Manipulation Act (POFMA) might be incompatible with the core democratic principles of freedom of expression and free speech inherent to liberal democratic societies. Still, the Singaporean approach aims to police the relevant information environment and impose on the digital realm the same rules that have applied to the non-digital information space for decades. This contrasts starkly with the Chinese approach, which largely disconnects the Chinese digital sphere from the rest of the global internet.[8] The result is an information environment that state entities do not just police but completely control and censor, preventing free speech and any kind of genuine democratic discourse.[9] King et al. noted that Chinese censorship explicitly targets the suppression of social mobilisation, rendering it nearly impossible to disseminate genuine or spurious news within this system.[10] Consequently, it can be posited that the impact of disinformation on China’s tightly controlled information environment is relatively minimal. However, genuine democratic discourse is not possible under such conditions.

State and non-state actors increasingly rely on digital public spheres and social media platforms to conduct disinformation campaigns to influence political processes.[11] Such campaigns are often regarded as information warfare, seeking to destabilise democracy in targeted states.[12]

One of the most well-known instances of state interference in democratic processes in recent years is the Russian interference in the 2016 U.S. presidential election in favour of the Republican candidate, Donald J. Trump.[13] The details of Russian activities were documented in the report by Special Counsel Robert Mueller.[14] The report documents that Russia interfered through operations conducted by the Internet Research Agency (IRA). The IRA conducted a complex social media campaign with a so-called “information warfare” approach to manipulate (online) public discourse to benefit the campaign of Donald J. Trump. A second path of interference named in the report consisted of coordinated cyber intrusions into the computer infrastructure of Hillary Clinton’s campaign as well as other institutions of the Democratic Party related to the campaign, which led to the theft and leaking of internal campaign information.[15] The effectiveness of the Russian interference is hard to quantify. However, the efforts seem to have fallen on fertile ground. For example, Allcott and Gentzkow found that the average U.S. adult read and recalled one or more fake news articles regarding the 2016 election, with a higher exposure to pro-Trump articles.[16]

Other notable examples of disinformation-driven interference in democratic decision-making processes include the 2016 “Brexit” referendum in the United Kingdom (UK), the French presidential and Kenyan general elections of 2017, the Brazilian general elections of 2018 and 2022, and the 2020 U.S. presidential election, among others.[17] It is important to note that disinformation campaigns are not only employed to influence elections and electoral outcomes. They are also employed to weaken democratic systems by discrediting political actors, democratic institutions, societal groups and minorities and by exploiting a given society’s wedge issues. A striking example of such operations is the longtime presence of Russia Today (RT) in many European countries.[18] RT is one of the world’s largest state propaganda organisations.[19] Its primary goal is to legitimise the Kremlin’s agenda and defend the Russian state by manipulating discourses about Russia in foreign media landscapes.[20] In 2022, the EU banned RT, Sputnik, and other outlets to prevent the further spread of Russian disinformation through media channels.[21] In practice, however, readers and outlets can bypass the sanctions.[22], [23]

Disinformation and its effects continuously threaten liberal democratic systems worldwide. Disinformation is a pivotal tactic within cognitive warfare, FIMI, and hybrid threats, three concepts that share a nexus of attributes. A comprehensive investigation into the role of disinformation within these contexts is therefore imperative: given the varied means and ends, such scrutiny promises a deeper understanding of the threat disinformation poses to liberal democratic systems.

Disinformation and Deliberative Democracy

Disinformation finds fertile ground in liberal democratic societies characterised by advanced digitalisation. This is largely because of a confluence of factors: AI, which can effortlessly generate text, images, sound, and video; the pervasive use of social media; and the relentless 24-hour news cycle. In such environments, disinformation can quickly spread, intentionally or otherwise. In political and public discourse, the concepts of disinformation, misinformation, and “fake news” are often inadequately defined and sometimes used synonymously. The term fake news in particular proves ill-suited for research endeavours: given its high degree of politicisation, it holds little analytical value, as it lacks a precise definition and often functions as an empty signifier.[24]

This paper uses the definition of disinformation by Wardle and Derakhshan, which the European Commission has adopted.[25] It divides information disorders into three concepts: disinformation, misinformation and malinformation. Disinformation is defined as “demonstrably false or misleading information which is conceived, presented and disseminated for economic gain or deliberate deception of the public and is likely to cause public harm”.[26] Misinformation is also false information, but crucially, it is disseminated without harmful intent; examples include content with false connections or misleading information. The intent behind dissemination is thus what functionally distinguishes disinformation from misinformation.[27] Lastly, malinformation is genuine information deliberately disseminated to inflict harm; examples include harassment and hate speech using leaked or stolen information or media. Analytically, disinformation proves to be the most practical concept. It accurately pinpoints the direction of influence and is easy to measure. It effectively captures what we aim to study when investigating the harmful activities of adversaries within democratic systems.
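
Read operationally, the taxonomy rests on two axes: whether content is false or misleading, and whether it is disseminated with intent to deceive or harm. The following Python sketch encodes this reading; it is illustrative only, and the function and field names are assumptions of this example, not part of the cited framework.

```python
from enum import Enum

class InfoDisorder(Enum):
    DISINFORMATION = "disinformation"   # false content, intent to deceive or harm
    MISINFORMATION = "misinformation"   # false content, no harmful intent
    MALINFORMATION = "malinformation"   # genuine content, deployed to inflict harm
    NONE = "no information disorder"    # genuine content, no harmful intent

def classify(is_false_or_misleading: bool, harmful_intent: bool) -> InfoDisorder:
    """Place a piece of content in the taxonomy using its two defining axes."""
    if is_false_or_misleading:
        return InfoDisorder.DISINFORMATION if harmful_intent else InfoDisorder.MISINFORMATION
    return InfoDisorder.MALINFORMATION if harmful_intent else InfoDisorder.NONE

# Example: genuine leaked e-mails released to harass a campaign -> malinformation.
assert classify(is_false_or_misleading=False, harmful_intent=True) is InfoDisorder.MALINFORMATION
```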

It is important to note that disinformation is not a new phenomenon. It has been used throughout human history to manipulate opinions, decisions, and behaviour. The specific tactics of deliberately spreading falsehoods for monetary, strategic, or political gain have been constantly adapted as political and social realities and, more importantly, available technologies have changed and evolved. Before the advent of powerful media technologies, the spread of disinformation relied on word of mouth and a small number of valuable handmade documents and artefacts that were not widely available. The transformation began with Gutenberg’s printing press around 1450, which revolutionised information distribution. Early on, the German Renaissance artist Lucas Cranach the Elder pioneered a workshop (the “Cranach-Werkstatt”) that produced large numbers of commissioned paintings and prints, many of which advanced the political agendas of royal courts and the Reformation movement.[28] Mass production cut costs, enabling broader dissemination of purpose-built imagery. These instances illustrate technology’s pivotal role in information propagation and the evolution of disinformation strategies.

Contemporary societies are defined by information technology, rapid information exchange, and interconnectivity. Therefore, it is crucial to assess the potential impact of disinformation in these information environments.[29] This work adopts a deliberative perspective on democracy to illustrate the effects of disinformation in a given democratic system. Deliberative democracy prioritises rational and comprehensive public discourse to reach collective decisions. It revolves around the notion that individuals should participate in transparent dialogue, exchange various viewpoints and strive for consensus in reaching informed and equitable policy decisions. Deliberative democracy aims to address the limitations of traditional representative democracy by promoting citizen involvement and encouraging reflective discussions.

The systemic approach to deliberative democracy expands on this idea by exploring the broader impact of deliberation on the political system.[30] It evaluates how deliberative processes affect public opinion, policy results, and democratic institutions. The approach emphasises the importance of diverse participation, connects deliberation to decision-making, prioritises transparency and accountability, and considers the surrounding socio-political context. By focusing on both micro-level deliberation and macro-level systemic impacts, the systemic approach provides a more comprehensive understanding of how deliberative processes contribute to informed and effective democratic governance.

In a systemic view of deliberative democracy, multiple public spaces of deliberation, as well as empowered spaces, exist.[31] The sum of public spaces forms the wild space of deliberation. Empowered spaces are characterised by their capacity to make collectively binding decisions. The different spaces are connected through diverse transmission mechanisms that convey the arguments, ideas, claims and justifications that characterise discourse.[32] Because transmission mechanisms can alter the arguments, ideas, claims and justifications they forward (filtering, shaping, or contesting them), disinformation can influence the entire deliberative system. McKay and Tenove describe three functions of deliberative systems, each targeted by a specific form of disinformation: “The epistemic function promotes the likelihood that opinions and decisions will be informed by facts and logic, the ethical function promotes mutual respect among citizens, and the democratic function promotes inclusion and equal opportunities for participation”.[33] First, corrosive falsehoods spread by disinformation campaigns harm democratic deliberation by fostering misperceptions and discrediting credible sources. This corrosion of credibility leads to epistemic cynicism and disrupts informed decision-making by distorting preferences and opinions.[34] Algorithmic gatekeeping and viral spread amplify false claims, undermining a deliberative system’s epistemic role. Second, moral denigration involves disrespectful language and false accusations against opponents, undermining the mutual respect and speech norms essential for deliberation.[35] In the digital domain, disinformation exploits fake accounts, bots, and promotions to create fake citizen support, distorting social group beliefs and deepening political polarisation. Third, unjustified inclusion, facilitated by fake or algorithmic accounts and foreign interference, disrupts democratic discussions and leads to pervasive inauthenticity.[36] It devalues content by genuine participants, eroding trust in and the authenticity of discourse. This practice can marginalise certain groups within the digital discourse, causing internal exclusion and unequal treatment, for example, when a given group is depicted negatively in identity-politics disinformation narratives.[37] These three forms of disinformation specifically target the epistemic, ethical and democratic functions of deliberative systems. They may severely harm democratic legitimacy if used effectively.
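
To keep the three form-to-function pairings straight, they can be condensed into a small lookup structure. This is merely an illustrative summary of McKay and Tenove’s model; the dictionary layout and the “harm” phrasings are assumptions of this sketch, not terminology fixed by the source.

```python
# Disinformation forms mapped to the deliberative function they primarily degrade,
# condensing McKay and Tenove's three-function model of deliberative systems.
DISINFO_TARGETS = {
    "corrosive falsehoods": {
        "function": "epistemic",
        "harm": "misperceptions, discredited sources, epistemic cynicism",
    },
    "moral denigration": {
        "function": "ethical",
        "harm": "eroded mutual respect and speech norms",
    },
    "unjustified inclusion": {
        "function": "democratic",
        "harm": "inauthentic participation, internal exclusion of groups",
    },
}
```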

The distribution of increasing quantities of disinformation, misinformation, and malinformation in deliberative democratic systems undermines the core functions of those systems.[38] The effect is not constrained to one area of a system but quickly spreads throughout its entirety. It takes hold because the deliberative quality of the system decreases, which in turn reduces the system’s ability to deliver on its core functions (epistemic, ethical, and democratic). Reduction in this context does not refer to a single piece of disinformation but to the consequences of the persistent flow and consumption of disinformation, including decreased trust in democratic institutions, state entities, journalism, or the system as a whole. This represents a particular vulnerability of open, interconnected democratic systems: if one area of the system is compromised, the repercussions can extend throughout the entire system.

Disinformation in Different Contexts

As mentioned earlier, deliberative democratic systems are interlinked. Thus, disinformation in one sector of society can exert widespread effects, potentially yielding negative repercussions across the entire system and contributing to its overall weakening. Hence, it is paramount to delve into the three explanatory concepts. Each concept focuses on distinct spheres of society, and through their examination and the three corresponding strands of literature, a comprehensive understanding of the influence of disinformation on various facets of society can be attained. According to the discursive institutionalist viewpoint, the distinct institutional backgrounds associated with each concept account for the visible conceptual differences. Discursive institutionalism analyses the impact of ideas and discourses on the formation and evolution of institutions.[39] This perspective underscores the connection between ideas and institutions, demonstrating how shared understandings between stakeholders impact the design and efficacy of political frameworks. It shows that discourses not only reflect but also exert influence on the rules and norms governing political processes and outcomes. The same may hold for scientifically informed concepts developed in the interest of a particular institution.

This work adopts a discursive institutionalist perspective on the differences and commonalities of the three concepts. All three share a common trait: they are inherently ambiguous. Strategic objectives are pursued through overt and covert operations that take place below the threshold of war. In cognitive warfare and hybrid threats, this threshold may be breached by certain means, even as most operations remain below the threshold of open conflict.[40]

It can be challenging to separate the three concepts from one another clearly. They overlap in many attributes and are, in part, subordinate to one another. Especially when a far-reaching definition of hybrid threats is considered, FIMI and cognitive warfare can be seen as subordinate concepts. It is essential to recognise that the three concepts are not competing to provide the most widely adopted conceptual framework in a given research area. Instead, they can be understood as means for specific institutions, practitioners and researchers to make sense of and investigate specific areas of society and to design responses. The concepts are therefore not in competition but complement each other and, in sum, provide a more complete picture of the situation.

To compare the three concepts effectively, four relevant variables for comparison were established. The first variable considers the objective of disinformation within the concept. Disinformation employs diverse tactics that serve different goals and can target various sectors of society; this variable therefore evaluates the intended effect of disinformation within the analysed concept. The second variable assesses the primary target of disinformation tactics. It enhances our understanding of societal vulnerabilities and aids in identifying which state entity should lead in mitigating a given attack. The third variable is the primary means of disinformation, referring to the specific tactic employed to achieve the defined goal against the defined target. The fourth variable considers the tactics beyond disinformation that are synchronised with it.
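
As an illustration, the four variables can be expressed as a simple record type, here populated with abbreviated entries anticipating the analysis below. This is a sketch for orientation under the stated comparison framework; the class and field names are assumptions of this example, not terms from the cited literature.

```python
from dataclasses import dataclass

@dataclass
class DisinfoProfile:
    concept: str
    objective: str          # variable 1: intended effect of disinformation
    primary_target: str     # variable 2: societal sector primarily attacked
    primary_means: str      # variable 3: main disinformation tactic
    synchronised_with: str  # variable 4: non-disinformation tactics used alongside

PROFILES = [
    DisinfoProfile(
        "Hybrid Threats",
        "influence decision-making processes to align with the agent's goals",
        "whole of society and state institutions",
        "propaganda, fake news, astroturfing, strategic leaks",
        "cyberattacks, economic coercion, political subversion",
    ),
    DisinfoProfile(
        "FIMI",
        "erode the target's values, procedures, and political processes",
        "values, procedures, and political processes",
        "deceptive or manipulative behaviour",
        "coordination with hybrid threat methods and strategies",
    ),
    DisinfoProfile(
        "Cognitive Warfare",
        "disrupt individual or group cognition to gain advantage",
        "defence capabilities, military personnel, societal cohesion",
        "traditional, existing, and emerging technology vectors",
        "other instruments of power (DIME/MIDFIELD)",
    ),
]
```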

Cognitive Warfare

The novel concept of Cognitive Warfare is closely associated with NATO and its allies, since it is linked to institutional goals such as cognitive superiority, layered resilience and defence against information and hybrid tactics,[41] as well as to the Warfare Development Agenda.[42] NATO’s institutional context is unique. Its activities aim to fulfil three tasks: deterrence and defence, crisis prevention and management, and cooperative security. The instruments of power used by states and alliances can be modelled using the DIME[43] or MIDFIELD[44] schemes. According to NATO’s Strategic Concept 2022, Russia and the People’s Republic of China (PRC) are considered hostile actors in the information space.[45] This has made the information domain more contested, since a new hostile actor, the PRC, has been added to the fray alongside Russia. NATO faces several adversarial threats in the information and cognitive domains, which must be mitigated adequately.

In the 2020 NATO Warfighting Capstone Concept (NWCC), the alliance acknowledged the crucial role of the cognitive domain and, with it, the rise of cognitive warfare. In the 2021 NATO Warfare Development Agenda, achieving cognitive superiority was identified as the most pressing challenge. Cognitive superiority is defined as “The state of possessing and applying faster, deeper and broader understanding and more effective decision-making than adversaries (proposed term)”.[46] Regarding the cognitive domain in the near future, NATO’s Allied Command Transformation (ACT) outlines that “The Alliance must seize the initiative by recognising and contesting the threat of persistent cognitive attacks (with the resulting risk to NATO’s MIoP) and take proactive steps to shape the cognitive dimension.”[47] Collaboration with NATO partner nations, non-NATO organisations, and civil society has been mooted as one way to improve NATO’s position in the cognitive dimension. Such collaboration would improve information flow and build cognitive and societal resilience.[48] This indicates that there is no competition among institutions and concepts but rather a need for coordination to establish common responses that mitigate the problem.

The cognitive warfare concept is still under development. However, there are a few working definitions. NATO ACT published the following working definition: “Activities conducted in synchronisation with other Instruments of Power, to affect attitudes and behaviour by influencing, protecting, or disrupting individual and group cognition to gain advantage over an adversary”.[49] Alternatively, Backes and Swab claim that “Cognitive warfare is a strategy that focuses on altering how a target population thinks – and through that how they act”,[50] which is not unlike the approach of Hung and Hung, who describe cognitive warfare as “[c]ontrolling others’ mental states and behaviours by manipulating environmental stimuli”.[51]

The common thread among the definitions is the objective of achieving a cognitive effect that results in changes to the behaviour of individuals or groups to align with the adversary’s goals. All disinformation capitalises on the cognitive susceptibilities of its recipients, leveraging preexisting apprehensions or convictions that render them receptive to misleading information. This necessitates the adversary’s profound comprehension of the prevailing sociopolitical intricacies and the strategic timing and methods required to infiltrate and exploit these vulnerabilities effectively.[52]

The coordinated application of modern cyber techniques associated with information warfare, the human factor, and the manipulative aspects found in psychological operations (PSYOPS) represents both the challenge and the innovation. This integrated approach poses a significant threat to democratic societies. The activities involved frequently rely on a distorted representation of reality, often aided by digital manipulation, to promote an agenda. New communication technologies have opened up endless possibilities, paving the way for innovative methods and goals.

The landscape of informational aspects in cognitive warfare can be categorised according to NATO ACT:

  • Traditional Vectors and Enablers: Communication domains of broadcast and print mass media have long been mediums for propaganda and disinformation. An influx of diverse news outlets, coupled with the digitisation of communication, has complicated the assessment of information authenticity. Corporate, state, and political actors have also harnessed misleading narratives to pursue their ideological aspirations. Interpersonal engagement, characterised by trust-based information dissemination within close-knit circles, plays a pivotal role in information digestion, occasionally enabling authority figures to exploit this trust for manipulation.[53]
  • Existing Technology Vectors and Enablers: Integrating technology into cognitive warfare introduces both opportunities and challenges. Social media serves as a platform for disseminating information and influence; adversarial governments and extremist groups exploit it to foster distrust and convey false narratives. Meanwhile, the long-term cognitive impact of social media and digital communication tools is still emerging: studies indicate that these platforms can significantly alter cognitive processes by overwhelming attention spans and memory functions. Big data’s amalgamation with AI allows targeted campaigns and manipulation based on individual digital footprints. Augmented reality and wearable devices gather personal data, rendering individuals susceptible to influence when in vulnerable states. Gaming and encrypted communication platforms, used by billions globally, can be exploited for extremist propaganda and recruitment, and encrypted communication platforms pose challenges to situational understanding.[54]
  • Emerging Technology Vectors and Enablers: Emerging technologies usher in a new era of possibilities and perils. Synthetic media, encompassing deepfakes and generative AI, leverages advanced AI to create hyper-realistic fabricated content capable of instigating political tensions and chaos. AI transforms the defence landscape, offering advantages while accelerating threats. The Metaverse, an immersive virtual reality, presents new dimensions for interaction and data collection, fostering potential cyber threats. AI-generated avatars blur the distinction between virtual personas and reality, augmenting manipulation risks.[55]

Cognitive warfare’s evolution entails an intricate interplay of traditional, existing, and emerging vectors and enablers. They introduce unprecedented vulnerabilities that demand a thorough understanding to mitigate potential adversities.

Hybrid Threats

The conceptual configuration of hybrid threats is constantly changing and adapting.[56] Hybrid threats combine conventional and unconventional means and tactics of conflict or warfare to achieve a specific objective; they differ from traditional forms of conflict in that they do not rely solely on military force.[57] Hybrid threats include actions taken by either state or non-state entities. These actions are strategically designed to undermine or damage a target by influencing its decision-making processes, operating across domains and levels. The simultaneous and coordinated use of various means, with an emphasis on the non-military, is a functional core of the concept. Hybrid conflict is a complex and dynamic form of conflict that uses a combination of tactics, including cyber-attacks, propaganda, disinformation campaigns, economic coercion, and political subversion, to achieve objectives without resorting to traditional forms of warfare, which can be costly and risky.[58] Hybrid threats exploit and worsen preexisting vulnerabilities in target societies, such as social divisions, economic weaknesses, and political instabilities.

Disinformation plays a significant role in hybrid threats, where it constitutes one category among various informational tools. The goal of disinformation in hybrid threats is to undermine trust in institutions, erode social cohesion, and create a sense of chaos and uncertainty. The target audiences are often unaware of these illegitimate uses of digital media by foreign actors and their proxies, and this lack of awareness can make it easier for an aggressor to achieve its objectives. Disinformation can also be used to divert attention from other activities, such as military operations or cyber-attacks. Within hybrid threats, disinformation can take many forms, including fake news, propaganda, and conspiracy theories. It is designed to exploit existing social, political, and economic fault lines and to create new ones, and it is used in a coordinated manner alongside a range of tools focused on different areas and vulnerabilities of society.

Foreign Information Manipulation and Interference (FIMI)

The concept of FIMI has been developed by EU institutions, particularly the European External Action Service (EEAS), in response to threats posed by Russian disinformation campaigns. The concept is thoroughly embedded in EU policy and doctrine.[59] The EU defines FIMI as “a mostly non-illegal pattern of behaviour that threatens or has the potential to negatively impact values, procedures, and political processes. Such activity is manipulative, intentional, and coordinated for effect. Actors of such activity can be state or non-state actors, including their proxies inside and outside of their own territory”.[60] The definition clearly follows a behaviour-centred approach, a unique attribute of the concept, since “behaviour” is more challenging to operationalise than a narrow concept such as the spreading of false information.

The EEAS acknowledges that FIMI possesses significant conceptual overlaps with disinformation but also describes the concept as being simultaneously “narrower and broader”.[61] “Narrower” refers to FIMI encompassing only actions taken by adversaries outside the EU; European domestic disinformation is not included in the concept. Since FIMI is incorporated into EU doctrine, it can solely address activities initiated outside the Union, as internal actions would fall under the auspices of a different department. “Broader” indicates the focus on behaviour: conceptually, FIMI does not require adversaries or threat actors to disseminate false or misleading information. Instead, it encompasses deceptive and manipulative conduct in general. Behaviour in FIMI is characterised by the use of manipulative tactics to negatively impact values, procedures, and political processes. To achieve their objectives, FIMI actors intentionally and in a coordinated manner employ a variety of tactics, including disinformation, propaganda, fake news, and social media manipulation.

Nevertheless, disinformation is one key component of FIMI.[62] It refers to creating, presenting, and disseminating false or misleading information to deceive the public or cause public harm. Disinformation can be used to manipulate information environments and interfere in democratic processes, and it significantly threatens the values, procedures, and political processes of the EU and its member states. FIMI actors use disinformation to achieve objectives ranging from influencing public opinion to destabilising governments. The critical distinction between the behaviour-centred concept of FIMI and the narrower, somewhat dichotomous concept of disinformation is that the distribution of disinformation is not a necessary precondition for FIMI; less clearly defined manipulative actions are sufficient.

Analysis

The three concepts, hybrid threats, Foreign Information Manipulation and Interference, and Cognitive Warfare, display significant overlap in many attributes, particularly regarding the role of disinformation. At first glance, the variance of disinformation’s role within these concepts may seem low. However, each concept is focused on radically different contexts and develops its explanatory power in different applications. Thus, although the characteristics of disinformation in the three concepts initially appear alike, they differ significantly, as they are designed to target distinct societal areas, ultimately leading to varied outcomes. The following table provides an overview of the role of disinformation in the three concepts. Its content is based on recent literature and aims to highlight conceptual and functional differences.

Table: The Role of Disinformation in Hybrid Threats, Foreign Information Manipulation and Interference, and Cognitive Warfare; Source: Author.

Although the definitions and objectives of disinformation may differ slightly within each concept, they all aim to exploit weaknesses and undermine decision-making processes to achieve strategic goals. In the context of hybrid threats, disinformation aims to influence various decision-making processes to align with the agent’s goals. FIMI conceptualises the goal of eroding the target’s values, procedures, and political processes. Cognitive warfare aims to disrupt individual or collective cognition to gain a tactical advantage over an adversary.

These concepts also have distinctive target scopes. Hybrid threats exploit systemic vulnerabilities across society as a whole and in state institutions. FIMI, by contrast, targets values, procedures, and political processes. As conceived by NATO ACT, cognitive warfare emphasises defence capabilities and military personnel: disrupting decision-making processes, fracturing societal cohesion, weaponising identity and narratives, and influencing the will to fight.

The analysis of primary disinformation methods across hybrid threats, FIMI, and cognitive warfare shows significant diversity in their strategic approaches. Disinformation in hybrid threats utilises various methods, including state propaganda, social media disinformation, fake news dissemination, bot and troll astroturfing, and strategic leaks from hacking operations. The FIMI concept, by contrast, places significant importance on deceptive or manipulative behaviour, recognising its crucial role in shaping perceptions. Disinformation in cognitive warfare operates within a continuum of technological epochs, utilising traditional vectors like mass media, existing technology vectors such as social media and big data, and emerging technologies like synthetic media and artificial intelligence. This examination highlights the ever-changing nature of disinformation tactics as they adapt to evolving communication landscapes and strategic goals.

The methods associated with hybrid threats, FIMI, and cognitive warfare demonstrate different strategic considerations within their respective frameworks. Hybrid threats utilise various approaches, including political and economic mechanisms, cyberattacks, intelligence operations, unconventional warfare strategies, terrorism tactics, criminal networks, energy and resource manipulation, and diplomatic pressure. The measures applied in the FIMI context can be coordinated with hybrid threat methods and strategies; such coordination reinforces the argument that the three concepts are sector-specific and complement each other. Within cognitive warfare, the associated means span diverse domains: traditional vectors comprise kinetic force, the involvement of corporate, state, and political actors, and interpersonal engagement, while emerging technology vectors open up further possibilities, such as the integration of neuroweapons, underscoring the multifaceted nature of the strategic landscape.

Despite differences in nuance, all three concepts describe attempts to exploit vulnerabilities and influence strategic decisions for gain. Hybrid threats primarily target society and institutions. FIMI manipulates values, politics, and procedures. Cognitive warfare focuses on cognitive effects in the military domain. Disinformation tactics vary, from classic propaganda to advanced integrated campaigns, highlighting the multifaceted nature of disinformation strategies within evolving strategic landscapes.

Conclusion

Disinformation poses a persistent threat to democratic societies. Disinformation tactics extend beyond disseminating individual pieces of deliberately false information on social media. Rather, these tactics are sophisticated, continuous, adapted to specific situations, and used with other methods to maximise impact.

Within the context of NATO and the EU and their associated concepts of cognitive warfare and FIMI, it is crucial to recognise the necessity for coordinated approaches in addressing the actions of adversaries in the information environment. The democratic world’s vulnerability to disinformation requires a cohesive and collaborative response. It is therefore counterproductive to see the three concepts as competing; rather, they represent the diverse approaches needed to address society as a whole and to develop responses that mitigate disinformation.

From a discursive institutionalist perspective, these three concepts (in particular cognitive warfare and FIMI) serve the institutions that formulated them as guidelines for understanding threats and formulating adequate responses for their mitigation. Hence, as argued earlier, they are not in competition but complement each other, each finding relevance in different sectors of society.

Thus, we can argue that cognitive warfare is intended for military or defence applications and potentially for use in contexts of warfare. FIMI, on the other hand, is intended to mitigate disinformation threats for a democratic institution like the EU, which is not primarily involved in defence activities and, therefore, prioritises approaches firmly outside the realm of warfare. As a concept, hybrid threats primarily serve as a defence and strategic framework to counter coordinated attacks that pose broader societal risks and may jeopardise society at large. In conclusion, perceiving cognitive warfare, FIMI, and hybrid threats as complementary rather than competing explanatory frameworks enables a new level of inquiry.

This approach allows institutions to investigate the threats that pertain to their specific sector of society. When these analyses are combined, a more comprehensive threat landscape for a given society or alliance can be produced. Applied to disinformation, this means that the impact of disinformation can be evaluated more accurately when sector-specific analyses are combined and sector-specific attack vectors are given special consideration.

Finally, disinformation is a versatile tactic encompassing many different means. When disinformation tactics are used to attack a democratic system, the consequences are difficult to anticipate. Analysing potential threats from different conceptual viewpoints is therefore an important step towards designing adequate responses that protect citizens, societies and democratic institutions.


Christoph Deppe, M.A., is a researcher and lecturer at the University of the Federal Armed Forces Hamburg. His research interests include Disinformation, Democratic Institutions, Parliaments and Quantitative Text Analysis. His academic fields include Political Science, Comparative Political Science, and Democracy Research. The views contained in this article are the author’s own and do not represent the views of Helmut Schmidt University/University of the German Federal Armed Forces Hamburg.


[1] Gitanjali Adlakha-Hutcheon et al., “Advancing Towards a Common Understanding of Cognitive Warfare for Science and Technology, and Identifying Future Research Trajectories” (NATO Science & Technology Board, March 28, 2023); Bernard Claverie and Francois du Cluzel, “The Cognitive Warfare Concept” (NATO ACT Innovation Hub, 2021), https://www.innovationhub-act.org/sites/default/files/2022-02/CW%20article%20Claverie%20du%20Cluzel%20final_0.pdf; Francois du Cluzel, “Cognitive Warfare” (NATO ACT Innovation Hub, 2021), https://www.innovationhub-act.org/sites/default/files/2021-01/20210122_CW%20Final.pdf.

[2] Gregory F. Treverton et al., “Addressing Hybrid Threats” (Stockholm: Swedish Defence University, 2018), https://www.hybridcoe.fi/publications/addressing-hybrid-threats/.

[3] European Union External Action, “1st EEAS Report on Foreign Information Manipulation and Interference Threats” (Bruxelles: European Union External Action, February 7, 2023), https://www.eeas.europa.eu/eeas/1st-eeas-report-foreign-information-manipulation-and-interference-threats_en.

[4] Deen Freelon and Chris Wells, “Disinformation as Political Communication,” Political Communication 37, no. 2 (March 3, 2020): 145–56, https://doi.org/10.1080/10584609.2020.1723755.

[5] Amos C. Fox, “Russian Hybrid Warfare: A Framework,” Journal of Military Studies, no. 2021 (2021), https://doi.org/10.2478/jms-2021-0004; Peter Pomerantsev, “Authoritarianism Goes Global (II): The Kremlin’s Information War,” Journal of Democracy 26, no. 4 (2015): 40–50, https://doi.org/10.1353/jod.2015.0074; Peter Pomerantsev, Nothing Is True and Everything Is Possible (FABER FABER, 2017); Flemming Splidsboel Hansen, “Russian Hybrid Warfare: A Study of Disinformation,” Research Report (DIIS Report, 2017), https://www.econstor.eu/handle/10419/197644; Christopher Whyte, “Cyber Conflict or Democracy ‘Hacked’? How Cyber Operations Enhance Information Warfare,” Journal of Cybersecurity 6, no. 1 (2020), https://doi.org/10.1093/cybsec/tyaa013.

[6] Edda Humprecht, “Where ‘Fake News’ Flourishes: A Comparison across Four Western Democracies,” Information, Communication & Society 22, no. 13 (2019): 1973–88, https://doi.org/10.1080/1369118X.2018.1474241; Spencer McKay and Chris Tenove, “Disinformation as a Threat to Deliberative Democracy,” Political Research Quarterly, 2020, 1065912920938143, https://doi.org/10.1177/1065912920938143; Gordon Ramsey and Sam Robershaw, “Weaponising News: RT, Sputnik and Targeted Disinformation” (London: Policy Institute, King’s College London, 2019), https://www.kcl.ac.uk/policy-institute/research-analysis/weaponising-news; Claire Wardle and Hossein Derakhshan, “Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making” (Strasbourg: Council of Europe, 2017), https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html.

[7] Shashi Jayakumar, Benjamin Ang, and Nur Diyanah Anwar, “Fake News and Disinformation: Singapore Perspectives,” in Disinformation and Fake News, ed. Shashi Jayakumar, Benjamin Ang, and Nur Diyanah Anwar (Singapore: Springer, 2021), 137–58, https://doi.org/10.1007/978-981-15-5876-4_11.

[8] James Griffiths, The Great Firewall of China: How to Build and Control an Alternative Version of the Internet (Bloomsbury Publishing, 2021), https://www.bloomsbury.com/us/great-firewall-of-china-9781350265318/.

[9] Margaret E. Roberts, Censored: Distraction and Diversion Inside China’s Great Firewall (Princeton University Press, 2018), https://doi.org/10.2307/j.ctvc77b21.

[10] Gary King, Jennifer Pan, and Margaret E. Roberts, “How Censorship in China Allows Government Criticism but Silences Collective Expression,” The American Political Science Review 107, no. 2 (2013): 326–43.

[11] Edda Humprecht, “Where ‘Fake News’ Flourishes: A Comparison across Four Western Democracies,” Information, Communication & Society 22, no. 13 (2019): 1973–88, https://doi.org/10.1080/1369118X.2018.1474241; Spencer McKay and Chris Tenove, “Disinformation as a Threat to Deliberative Democracy,” Political Research Quarterly, 2020, 1065912920938143, https://doi.org/10.1177/1065912920938143; Gordon Ramsey and Sam Robershaw, “Weaponising News: RT, Sputnik and Targeted Disinformation” (London: Policy Institute, King’s College London, 2019), https://www.kcl.ac.uk/policy-institute/research-analysis/weaponising-news; Claire Wardle and Hossein Derakhshan, “Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making” (Strasbourg: Council of Europe, 2017), https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html.

[12] Edda Humprecht, “Where ‘Fake News’ Flourishes: A Comparison across Four Western Democracies,” Information, Communication & Society 22, no. 13 (2019): 1973–88, https://doi.org/10.1080/1369118X.2018.1474241.

[13] Robert S. Mueller, “Report on the Investigation into Russian Interference in the 2016 Presidential Election” (Washington D.C.: U.S. Department of Justice, 2019); Office of the Director of National Intelligence, “Assessing Russian Activities and Intentions in Recent US Elections,” Intelligence Community Assessment (Washington, D.C: Office of the Director of National Intelligence, January 6, 2017), https://www.dni.gov/files/documents/ICA_2017_01.pdf.

[14] Idem.

[15] Idem.

[16] Hunt Allcott and Matthew Gentzkow, “Social Media and Fake News in the 2016 Election,” Journal of Economic Perspectives 31, no. 2 (May 2017): 211–36, https://doi.org/10.1257/jep.31.2.211.

[17] Patrícia Rossini, Camila Mont’Alverne, and Antonis Kalogeropoulos, “Explaining Beliefs in Electoral Misinformation in the 2022 Brazilian Election: The Role of Ideology, Political Trust, Social Media, and Messaging Apps,” Harvard Kennedy School Misinformation Review, May 16, 2023, https://doi.org/10.37016/mr-2020-115; Leonardo Avritzer, Eliara Santana, and Rachel Bragatto, eds., Eleições 2022 e a reconstrução da democracia no Brasil, 1a edição (Belo Horizonte, MG: Autêntica, 2023); Sangwon Lee and S Mo Jones-Jang, “Cynical Nonpartisans: The Role of Misinformation in Political Cynicism During the 2020 U.S. Presidential Election,” New Media & Society, August 22, 2022, 14614448221116036, https://doi.org/10.1177/14614448221116036; Patrick Mutahi and Brian Kimari, “Fake News and the 2017 Kenyan Elections,” Communicatio 46, no. 4 (2020): 31–49, https://doi.org/10.1080/02500167.2020.1723662; Fabrício H. Chagas-Bastos, “Political Realignment in Brazil: Jair Bolsonaro and the Right Turn*,” Revista de Estudios Sociales, 2019, https://doi.org/10.7440/res69.2019.08; Wardle and Derakhshan, “Information Disorder”; Emilio Ferrara, “Disinformation and Social Bot Operations in the Run Up to the 2017 French Presidential Election,” First Monday, July 31, 2017, https://doi.org/10.5210/fm.v22i8.8005.

[18] Christoph Deppe and Gary S. Schaal, “Vernetzte Desinformationskampagnen: Der Fall Nawalny,” #GIDSresearch, May 10, 2022, https://doi.org/10.24405/14603.

[19] Shuang Xie and Oliver Boyd-Barrett, “External-National TV News Networks’ Way to America: Is the United States Losing the Global ‘Information War’?,” International Journal of Communication 9, no. 0 (January 5, 2015): 18.

[20] Mona Elswah and Philip N Howard, “‘Anything That Causes Chaos’: The Organizational Behavior of Russia Today (RT),” Journal of Communication 70, no. 5 (October 1, 2020): 623–45, https://doi.org/10.1093/joc/jqaa027; Xie and Boyd-Barrett, “External-National TV News Networks’ Way to America”; Ilya Yablokov, “Conspiracy Theories as a Russian Public Diplomacy Tool: The Case of Russia Today (RT),” Politics 35, no. 3–4 (2015): 301–15, https://doi.org/10.1111/1467-9256.12097.

[21] Council of the EU, “EU Imposes Sanctions on State-Owned Outlets RT/Russia Today and Sputnik’s Broadcasting in the EU,” March 2, 2022, https://www.consilium.europa.eu/en/press/press-releases/2022/03/02/eu-imposes-sanctions-on-state-owned-outlets-rt-russia-today-and-sputnik-s-broadcasting-in-the-eu/.

[22] WDR and Petra Blum, “Russische Propaganda: Sanktionen oft wirkungslos,” tagesschau.de, August 24, 2023, https://www.tagesschau.de/investigativ/wdr/russland-propaganda-newsfront-eu-usa-sanktionen-ukraine-100.html.

[23] Even though the EU has taken necessary steps to remove RT and Sputnik from the visible media environment, the organizations remain active and continue to distribute content through websites, which are still accessible via a VPN connection and in chat groups on platforms such as Telegram and others.

[24] Claire Wardle and Hossein Derakhshan, “Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making” (Strasbourg: Council of Europe, 2017), https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html.

[25] Idem.

[26] Europäische Kommission, “Bekämpfung von Desinformation Im Internet: Ein Europäisches Konzept” (Brüssel: Europäische Kommission, April 26, 2018), 4, https://eur-lex.europa.eu/legal-content/DE/TXT/?uri=CELEX%3A52018DC0236.

[27] Spencer McKay and Chris Tenove, “Disinformation as a Threat to Deliberative Democracy,” Political Research Quarterly, 2020, 1065912920938143, https://doi.org/10.1177/1065912920938143.

[28] Julia Carrasco, ed., Bild Und Botschaft: Cranach Im Dienst von Hof Und Reformation (Heidelberg: Morio-Verl., 2015), http://deposit.d-nb.de/cgi-bin/dokserv?id=5031056&prov=M&dok_var=1&dok_ext=htm; Sebastian Dohe and Veronika Spinner, “Cranachs Bilderfluten,” Konstellationen (Publikationsreihe) 3 (2022), https://doi.org/10.26013/ksw.pub_00000646.

[29] Gary S. Schaal and David Strecker, “Die politische Theorie der Deliberation: Jürgen Habermas,” in Politische Theorien der Gegenwart, ed. André Brodocz and Gary S. Schaal (Wiesbaden: VS Verlag für Sozialwissenschaften, 1999), 69–93, https://doi.org/10.1007/978-3-322-97432-7_4.

[30] Jane Mansbridge et al., “A Systemic Approach to Deliberative Democracy,” in Deliberative Systems: Deliberative Democracy at the Large Scale, ed. Jane Mansbridge and John Parkinson, Theories of Institutional Design (Cambridge: Cambridge University Press, 2012), 1–26, https://doi.org/10.1017/CBO9781139178914.002.

[31] John S. Dryzek, “The Forum, the System, and the Polity: Three Varieties of Democratic Theory,” Political Theory 45, no. 5 (2017): 610–36, https://doi.org/10.1177/0090591716659114.

[32] Selen A. Ercan, Carolyn M. Hendriks, and John Boswell, “Studying Public Deliberation after the Systemic Turn: The Crucial Role for Interpretive Research,” Policy & Politics, April 01, 2017, 1–36, https://doi.org/10.1332/030557315X14502713105886.

[33] Spencer McKay and Chris Tenove, “Disinformation as a Threat to Deliberative Democracy,” Political Research Quarterly, 2020, 1065912920938143, https://doi.org/10.1177/1065912920938143, 2.

[34] Idem.

[35] Idem.

[36] Idem.

[37] Marco Bastos and Johan Farkas, “‘Donald Trump Is My President!’: The Internet Research Agency Propaganda Machine,” Social Media + Society 5, no. 3 (July 2019): 205630511986546, https://doi.org/10.1177/2056305119865466.

[38] For a definition, see Wardle and Derakhshan, “Information Disorder.”

[39] Vivien A. Schmidt, “Discursive Institutionalism: The Explanatory Power of Ideas and Discourse,” Annual Review of Political Science 11, no. 1 (2008): 303–26, https://doi.org/10.1146/annurev.polisci.11.060606.135342; Vivien A. Schmidt, “Taking Ideas and Discourse Seriously: Explaining Change through Discursive Institutionalism as the Fourth ‘New Institutionalism,’” European Political Science Review 2, no. 1 (March 2010): 1–25, https://doi.org/10.1017/S175577390999021X.

[40] Mikael Weissmann, “Hybrid Warfare and Hybrid Threats Today and Tomorrow: Towards an Analytical Framework,” Journal on Baltic Security 5, no. 1 (June 01, 2019): 17–26, https://doi.org/10.2478/jobs-2019-0002.

[41] NATO, “NATO 2022 Strategic Concept,” June 29, 2022, https://www.nato.int/nato_static_fl2014/assets/pdf/2022/6/pdf/290622-strategic-concept.pdf; Dale F. Reding and Bryan Wells, “Cognitive Warfare: NATO, COVID-19 and the Impact of Emerging and Disruptive Technologies,” in COVID-19 Disinformation: A Multi-National, Whole of Society Perspective, ed. Ritu Gill and Rebecca Goolsby, Advanced Sciences and Technologies for Security Applications (Cham: Springer International Publishing, 2022), 25–45, https://doi.org/10.1007/978-3-030-94825-2_2; NATO, “NATO Warfighting Capstone Concept” (Norfolk, VA: NATO ACT, 2021), https://www.act.nato.int/our-work/nato-warfighting-capstone-concept/.

[42] NATO ACT, “Cognitive Warfare: Beyond Military Information Support Operations,” NATO ACT (blog), May 09, 2023, https://www.act.nato.int/article/cognitive-warfare-beyond-military-information-support-operations/.

[43] DIME: Diplomatic, Informational, Military, Economic.

[44] MIDFIELD: Military, Informational, Diplomatic, Financial, Economic, Law, Development.

[45] NATO, “NATO 2022 Strategic Concept,” June 29, 2022, https://www.nato.int/nato_static_fl2014/assets/pdf/2022/6/pdf/290622-strategic-concept.pdf, 5.

[46] NATO ACT, “Cognitive Warfare: Beyond Military Information Support Operations,” NATO ACT (blog), May 09, 2023, https://www.act.nato.int/article/cognitive-warfare-beyond-military-information-support-operations/, B-2.

[47] Ibid., 5.

[48] Ibid., 6.

[49] Ibid., 3.

[50] Oliver Backes and Andrew Swab, “Cognitive Warfare: The Russian Threat to Election Integrity in the Baltic States” (Harvard Kennedy School Belfer Center for Science and International Affairs), 8, accessed February 03, 2023, https://www.belfercenter.org/publication/cognitive-warfare-russian-threat-election-integrity-baltic-states.

[51] Tzu-Chieh Hung and Tzu-Wei Hung, “How China’s Cognitive Warfare Works: A Frontline Perspective of Taiwan’s Anti-Disinformation Wars,” Journal of Global Security Studies 7, no. 4 (December 01, 2022): 1, https://doi.org/10.1093/jogss/ogac016.

[52] Francois du Cluzel, “Cognitive Warfare” (NATO ACT Innovation Hub, 2021), https://www.innovationhub-act.org/sites/default/files/2021-01/20210122_CW%20Final.pdf.

[53] NATO ACT, “Cognitive Warfare: Beyond Military Information Support Operations,” NATO ACT (blog), May 09, 2023, https://www.act.nato.int/article/cognitive-warfare-beyond-military-information-support-operations/, 19.

[54] Ibid., 20-21.

[55] Ibid., 22-23.

[56] European Commission, “JOINT COMMUNICATION TO THE EUROPEAN PARLIAMENT AND THE COUNCIL Joint Framework on Countering Hybrid Threats a European Union Response” (2016), https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52016JC0018.

[57] Gregory F. Treverton et al., “Addressing Hybrid Threats” (Stockholm: Swedish Defence University, 2018), https://www.hybridcoe.fi/publications/addressing-hybrid-threats/; Christopher S. Chivvis, “Understanding Russian ‘Hybrid Warfare’” (RAND Corporation, March 22, 2017), https://www.rand.org/pubs/testimonies/CT468.html; Amos C. Fox, “Russian Hybrid Warfare: A Framework,” Journal of Military Studies, no. 2021 (2021), https://doi.org/10.2478/jms-2021-0004.

[58] Gregory F. Treverton et al., “Addressing Hybrid Threats” (Stockholm: Swedish Defence University, 2018), https://www.hybridcoe.fi/publications/addressing-hybrid-threats/.

[59] European Commission, JOINT COMMUNICATION TO THE EUROPEAN PARLIAMENT AND THE COUNCIL Joint Framework on countering hybrid threats a European Union response; European Commission, “COMMUNICATION FROM THE COMMISSION TO THE EUROPEAN PARLIAMENT, THE COUNCIL, THE EUROPEAN ECONOMIC AND SOCIAL COMMITTEE AND THE COMMITTEE OF THE REGIONS Tackling Online Disinformation: A European Approach” (2018), https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52018DC0236; European Commission, “European Democracy Action Plan,” November 25, 2021, https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/new-push-european-democracy/european-democracy-action-plan_en; European Union External Action, “A Strategic Compass for Security and Defence – For a European Union That Protects Its Citizens, Values and Interests and Contributes to International Peace and Security,” March 24, 2022, https://www.eeas.europa.eu/eeas/strategic-compass-security-and-defence-0_en.

[60] European Union External Action, “1st EEAS Report on Foreign Information Manipulation and Interference Threats,” 25.

[61] Ibid., 25.

[62] Idem.; European Union External Action, “2021 StratCom Activity Report – Strategic Communication Task Forces and Information Analysis Division” (Bruxelles, 2021), https://www.eeas.europa.eu/eeas/2021-stratcom-activity-report-strategic-communication-task-forces-and-information-analysis_en; European Union External Action, “2022 Report on EEAS Activities to Counter FIMI,” February 07, 2023, https://www.eeas.europa.eu/eeas/2022-report-eeas-activities-counter-fimi_en.

[63] Gregory F. Treverton et al., “Addressing Hybrid Threats” (Stockholm: Swedish Defence University, 2018), https://www.hybridcoe.fi/publications/addressing-hybrid-threats/, 10.

[64] European Union External Action, “2021 StratCom Activity Report – Strategic Communication Task Forces and Information Analysis Division” (Bruxelles, 2021), https://www.eeas.europa.eu/eeas/2021-stratcom-activity-report-strategic-communication-task-forces-and-information-analysis_en, 2.

[65] NATO ACT, “Cognitive Warfare: Beyond Military Information Support Operations,” NATO ACT (blog), May 09, 2023, https://www.act.nato.int/article/cognitive-warfare-beyond-military-information-support-operations/, 3.

[66] Gregory F. Treverton et al., “Addressing Hybrid Threats” (Stockholm: Swedish Defence University, 2018), https://www.hybridcoe.fi/publications/addressing-hybrid-threats/, 10.

[67] NATO ACT, “Cognitive Warfare: Beyond Military Information Support Operations,” NATO ACT (blog), May 09, 2023, https://www.act.nato.int/article/cognitive-warfare-beyond-military-information-support-operations/, 28-29.

[68] European Union External Action, “1st EEAS Report on Foreign Information Manipulation and Interference Threats,” 26.

[69] NATO ACT, “Cognitive Warfare: Beyond Military Information Support Operations,” NATO ACT (blog), May 09, 2023, https://www.act.nato.int/article/cognitive-warfare-beyond-military-information-support-operations/, 19.

[70] Idem.
