Abstract: Hybrid warfare operations embrace an “anything that gets results” strategy, including significant information operations. Western democracies need to better understand the information operations undertaken against them, which will require more rigorous observation, monitoring and measurement of the malign political campaigns conducted against them via the internet.
Problem statement: How can information influence operations in hybrid conflict be detected, tracked and countered?
So what?: The most open societies are probably the most vulnerable to data manipulation and information operations. The community of democratic states must erect defences against malign information influence delivered through cyberspace.
A transformation in information power
More than eighty years ago the British diplomat, journalist and academic Edward Hallett Carr declared in The Twenty Years’ Crisis that power could be exerted in three areas – military, economic and information.[1] Substituting his own term, soft power, for Carr’s power over opinion, Joseph Nye produced a similar assessment six decades later.[2] While practitioners and scholars may agree that information power is important, borrowing from Simon, one must ask, “to what extent have the operational tools of observation and measurement been provided us?”[3] The task at hand for scholars and practitioners of the geopolitical information environment is to identify how burgeoning sources of information may be processed and analysed by the novel computational methods referred to as artificial intelligence (AI).
To what extent have the operational tools of observation and measurement been provided us?
What makes for information awareness in hybrid conflict?
Resilient, accurate situational awareness of hybrid threats depends on observation and measurement in each sub-area of the hybrid arena, which blends “the lethality of state conflict with the fanatical and protracted fervour of irregular warfare.”[4] Such observation translates to monitoring many different types of activity undertaken by an adversary. Governments and other actors have created all manner of observation and measurement capacities, from social media and banking systems to computer networks and reconnaissance satellites. This new form of interstate conflict is set apart from our fading memories of the Cold War in that where data were once difficult to find, there is often now an overabundance of them.[5] New issues arise, however: data must be of sufficient quality to measure phenomena, and that measurement is a key step towards situational awareness.[6]
Computing has given humankind a greater capacity to assign quantitative measures to all manner of phenomena. Mobile computing devices provide sensor data from images to geolocation.[7] At the outset of the February 2022 invasion of Ukraine, images of military action, largely taken from mobile devices, flooded social media.[8] Open-source intelligence (OSINT) analysts, mostly amateurs, sifted through online video and images of combat to generate a picture of the military action.[9]
As for combining inputs at a strategic level and then translating them to operational action, the most important issues will be the accuracy of the information inputs from all sources and the timeliness of their analysis. An example of success in this area is the Ukrainian missile attack on the port of Berdiansk in March 2022.[10] Russia released a propaganda video of its operations at the seaport that allowed accurate Ukrainian targeting of Russian amphibious ships there. The Ukrainian missile attack then sank one of the ships and badly damaged two others.[11] This form of OSINT may be highly useful; its incorporation into a rapid, task-oriented intelligence analysis enterprise, however, presents challenges – not least the potential for disinformation by a wary enemy.
The intelligence picture available to government, industry and individuals today differs greatly from what it was during the last period of major power competition, which ended with the demise of the Soviet Union.[12] The enormous technological advances in information and computing technologies (ICTs) have completely overhauled the craft of intelligence. Foreign agents can be recruited in chat rooms rather than back alleys. Overhead intelligence, once the province of superpowers, is now available commercially by download over the internet. There is no need to break open filing cabinets when computers may be electronically compromised, and contents pilfered by actors half a world away. A bonanza of sorts exists for the collectors of intelligence. However, for those from whom intelligence is being collected an acknowledgement of the huge value of their “digital exhaust”[13] comes only after those data are translated to action – from online censorship to artillery bombardment. The communications revolution represents a double-edged sword for high-technology societies and their high-technology militaries.
There is no need to break open filing cabinets when computers may be electronically compromised, and contents pilfered by actors half a world away.
There is no question that mobile smartphones, which perform the role of everything from calculators and cameras to media studios and flashlights, have made an enormous impact on humanity.[14] The number of cell phone subscriptions surpassed the global population sometime between 2015 and 2020.[15] Sweden’s Ericsson, a principal builder of the technological infrastructure that runs mobile communication, contends that some 60 per cent of the planet’s population have “äppärät” smartphones.[16] Between 2023 and 2024 the amount of data travelling between these devices and other pieces of ICT infrastructure grew by 25 per cent.[17]
On the downside, these devices may be tracked, monitored and targeted by technologies that scan the electromagnetic spectrum, inspect dataflows on backbone networks and compromise apps and operating system software with hacking tools. On the battlefields of the Russo-Ukrainian War they have been shown to be a huge liability. Presence on cell phone networks along the frontlines of that conflict and others is a common trigger for attack – and has been for more than a decade.[18] That Russian small unit commanders tack mobile phones to the walls of bunkers when they are found among frontline troops, as they did in one viral instance, is a vivid illustration of the vulnerability the technology creates for military units.
One of the more surprising developments of the Russo-Ukrainian War is the utility of commercial internet and cellular technology on the battlefield. That artillery fires are called in via a Starlink satellite modem is but one of the unforeseen developments of that conflict. Keeping tabs on the activities identified as hybrid or “grey zone” conflict requires incorporating information from multiple platforms and systems.[19] Included in an ontology of hybrid conflict are: propaganda operations, principally undertaken online; official declarations and press reports; computer network attack and defence activity; information about military movements and exercises; and economic data (e.g. buying up fuels to prepare for war or manipulating markets to create an asymmetric advantage). As it was in the early days of the Cold War, the goal for states facing acute security issues and responsibilities is to avoid surprise.[20] Avoiding surprise today means growing the capacity to analyse the flood of data we call intelligence.
Measuring hybrid influence and action
At a time when the hyperbole regarding artificial intelligence could hardly be stronger, the human capacity to understand information remains constrained by attention and time. It would take a single person 200,000 years to read the amount of information on the world wide web alone. The good news for prospective hybrid warfare analysts is that not everything needs to be read, and what does can be accomplished by organizations of professionals. Analytics teams can monitor variables relevant to information operations, but the question is how.[21] The answer is tripartite, involving (a) identifying key variables; (b) baselining what we may call “normal” activity; and (c) weighting those variables in a machine learning algorithm that processes collected data. A framework for observing change in the exertion of information power may emerge from this.
The good news for prospective hybrid warfare analysts is that not everything needs to be read, and what does can be accomplished by organizations of professionals.
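To make the tripartite approach concrete, the sketch below, written in Python, pairs a handful of named variables with a baseline of “normal” activity and a set of weights that fold them into a single anomaly score. The variable names, sample figures and weights are illustrative assumptions only, not a description of any fielded monitoring system.

```python
# Minimal sketch of the tripartite framework: (a) key variables, (b) a baseline
# of "normal" activity, (c) weights combining the variables into one score.
# All names, figures and weights are illustrative assumptions.
import numpy as np

# (a) Variables an analytics team might track daily for a topic of interest.
VARIABLES = ["post_volume", "new_account_share", "coordinated_repost_rate"]

# (b) Baseline estimated from a "quiet" reference period (rows = days).
baseline_history = np.array([
    [1200, 0.05, 0.02],
    [1150, 0.06, 0.03],
    [1300, 0.04, 0.02],
    [1250, 0.05, 0.02],
])
mu = baseline_history.mean(axis=0)
sigma = baseline_history.std(axis=0) + 1e-9  # guard against division by zero

# (c) Weights reflecting how diagnostic each variable is judged to be.
weights = np.array([0.2, 0.3, 0.5])

def influence_score(observation: np.ndarray) -> float:
    """Weighted z-score of an observation against the baseline."""
    z = (observation - mu) / sigma
    return float(np.dot(weights, z))

# A day on which posting volume surges and coordination indicators spike.
today = np.array([4100, 0.22, 0.11])
print(f"anomaly score: {influence_score(today):.1f}")  # large positive -> flag for review
```

In a real system the weights would themselves be learned from labelled examples rather than set by hand, which is where the machine learning enters.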
Understanding hybrid conflict involves the incorporation of manifold areas of knowledge. Much of this is encompassed in what contemporary Western military theorists call the information environment.[22] Setting bounds to that environment is daunting. It is large, much like the physical environment in which it is constructed. Much of the information now exchanged and absorbed by people is digital, which implies enormous streams and repositories of data. The challenge lies in locating those sources that may better illuminate the exertion of power in the international system. The information dimension of international relations has been studied through news analysis,[23] public declarations,[24] leadership analysis and related political psychology,[25] and, for some time now, internet communications and interactions.[26] Thanks to the continued durability of Moore’s Law in the growth of computing power, the mechanisms for enquiry in these areas may be re-engineered in light of technological advances.[27]
In the information environment of hybrid warfare, a bridge must be constructed between technical capacity and social response. Advertising may offer a shortcut to valuing information power in international competition and conflict.[28] Technology has revolutionized the advertising industry. With the arrival of ubiquitous computing, advertisements delivered by internet companies such as Alphabet (Google) and Meta (Facebook) target individuals rather than audiences.[29] Spending on digital political advertising in the US is projected to reach almost $3.5 billion in the 2024 election cycle, while traditional political advertising spending (TV, radio, print, etc.) is still far larger, at some $7.9 billion. The total, some $12 billion, represents an increase of nearly a third over the 2020 election cycle. Most of that growth is in what the advertising industry calls “digital”,[30] which is a pathway to discovering information power variables.
The largest growth area for political advertising spending is in what the advertising industry calls connected television, which is video delivered over the internet.[31] Services from Alphabet, Amazon, Netflix and traditional media companies like Disney deliver these advertisements to viewers. They appear in a burgeoning flood of video, as some 20 days’ worth of video is uploaded to Alphabet’s YouTube service every minute. In this exponentially growing video archive, propagandists deliver their messages to the public abroad.[32] Interestingly, the Russian government recently throttled its citizens’ access to the service.[33] It appears that some states regard both video content and the advertisements surrounding it as a potential threat. These categories of digital data should also be tracked by those who observe hybrid conflict.
On X (formerly Twitter), Telegram and Meta’s Instagram, many variables, including the metadata produced by those platforms, must be followed by those who track practitioners of active measures.[34] Government officials and political candidates make use of these internet platforms to communicate their messages.[35] Propagandists are somewhat less upfront about how they spread their narrative views but work with the same technologies.[36] Where once practitioners of active measures covertly published magazines and newsletters, they now create online news and opinions,[37] often with assistance from Large Language Models (LLMs).[38] Situational awareness for hybrid conflict translates to effective monitoring of sources of information designed to influence beliefs. Such monitoring will probably need to be undertaken for the foreseeable future. Information power still appears to be relevant.
Where once practitioners of active measures covertly published magazines and newsletters, they now create online news and opinions, often with assistance from Large Language Models (LLMs).
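What “following many variables, including the metadata” can mean in practice is easier to see as a data structure. The record below is a hypothetical Python sketch; the field names are assumptions chosen for illustration and do not correspond to any platform’s actual API schema.

```python
# Hypothetical record of platform metadata an observer might retain per post.
# Field names are illustrative assumptions, not any platform's API schema.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class PostMetadata:
    platform: str                   # e.g. "X", "Telegram", "Instagram"
    posted_at: datetime             # publication timestamp
    account_created: datetime       # when the posting account was registered
    follower_count: int
    share_count: int
    outbound_domain: Optional[str]  # domain of any linked "news" site

    def account_age_days(self) -> float:
        return (self.posted_at - self.account_created).total_seconds() / 86400

# A young, low-follower account amplifying an off-platform article at scale is
# one of many weak signals worth logging for later aggregate analysis.
post = PostMetadata(
    platform="X",
    posted_at=datetime(2024, 4, 17, 12, 0, tzinfo=timezone.utc),
    account_created=datetime(2024, 3, 30, tzinfo=timezone.utc),
    follower_count=42,
    share_count=1800,
    outbound_domain="example-news.invalid",
)
print(post.account_age_days() < 30 and post.share_count > 1000)  # True -> worth a closer look
```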
How does influence work in the hybrid contest?
To understand whether influence operations work, consider the example of Russia’s attempts to isolate Ukraine and deprive it of Western support. Until the US Congress voted to approve a major round of assistance to Ukraine in April 2024, Russian propaganda helped hold up US legislative action on the provision of military aid to Ukraine for months. A recent Breitbart headline, “Exclusive: [House Speaker] Johnson’s top policy advisor is former lobbyist… Clients have corporate interest in Ukraine War”, exemplifies information operations in which pro-Russia arguments are camouflaged in an anti-corporate narrative.[39] Sacked Fox News commentator Tucker Carlson interviewed Vladimir Putin in Russia and lingered to film segments in which he called Moscow “so much nicer than any city in my country”.[40] One long-serving Republican member of the US Congress chastised his own caucus for introducing Russian propaganda talking points as fact into the chamber’s deliberations.[41]
Hybrid conflict-oriented propaganda also takes aim at both the national politics and the militaries of targeted countries.[42] In practice this means that soldiers’ and publics’ ideological commitment to particular missions may itself become the object of tremendous political propaganda. False reports of violence by German soldiers serving in Lithuania may be but the tip of the iceberg in the anti-NATO digital propaganda undertaken by Russia.[43] Perhaps the best indicator of its effectiveness is the presence of neo-fascistic elements in NATO militaries and their willingness to work against their own services due to malign foreign information influence propagated across cyberspace.
False reports of violence by German soldiers serving in Lithuania may be but the tip of the iceberg in the anti-NATO digital propaganda undertaken by Russia.
Although not a military conflict, the covid-19 pandemic likewise opened the doors for propagandists, including those in the US, to manipulate publics online.[44] False narratives fooled the naive and intellectually impressionable. In some cases the cost was their lives. Hybrid conflict indicators abound in the information environment, but their presence does not necessarily provide a forecast of future military conflict or covert action. Connecting the dots on information operations in a conflict that may pass from the “grey zone” to significant hostilities is required for early warning and efforts at peace. That also means that the mere bellicosity of rhetoric between two states does not necessarily add up to open conflict. Now toned down, the war of words between Japan and South Korea spoke to an old animosity but not a renewed conflict.
AI’s role in making sense of a sea of data
ICTs have transformed society, particularly through the rapid proliferation of information. Perhaps the most important observation made in preparing this essay is how often one encounters the belief that AI answers all questions, removing the need for critical thinking.[45] That belief could have devastating effects as we learn more about how AI performance can be biased, and how that bias can be influenced.[46]
A tremendous computational capacity for the sensemaking of digital information is at hand. The technologies to process information can be incredibly useful in bringing order to the chaos of the information environment.[47] For example, BERT, a transformer-based language model, can be fine-tuned to detect online propaganda.[48] For every advance in detecting information operations, however, the propagandists will also innovate. This is the nature of technologically infused statecraft. When divided into sides, players in the international system attempt to leverage innovation for comparative advantage.
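As a rough illustration of how such a tool is applied, the sketch below runs short posts through a BERT-style classifier using the Hugging Face transformers library. The checkpoint name is a placeholder assumption; in practice an analyst would first fine-tune a base model such as bert-base-multilingual-cased on a labelled propaganda corpus.

```python
# Inference sketch with a BERT-style propaganda classifier.
# The checkpoint name below is a hypothetical placeholder; a real pipeline
# would load a model fine-tuned on labelled propaganda/non-propaganda text.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="example-org/bert-propaganda-detector",  # hypothetical checkpoint
)

posts = [
    "NATO soldiers accused of shocking crimes; officials stay silent.",
    "The city council will vote on the new transit budget on Tuesday.",
]

for post in posts:
    result = classifier(post, truncation=True)[0]
    print(f"{result['label']:>14} ({result['score']:.2f}): {post}")
```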
The information components of hybrid conflict can be found, and this can partly be undertaken by computers. That said, AI is no panacea. There is perhaps too much talk about AI by those who may not understand how the technology works today or will evolve. However, the neural network machine learning process we call AI is advancing consistently. Jeff Dean, chief scientist of Google DeepMind, the centre of the company’s AI research and development, has recently asserted that these advances will continue. He observes: “In recent years, I think machine learning has really changed our expectations of what we think of computers being able to do. If you think back 10 or 15 years ago, speech recognition kind of worked, but it wasn’t really seamless – it made lots of errors. Computers didn’t really understand images from the pixel level of what was in that image. There was a bunch of work in natural language processing, but it wasn’t really a deep understanding of language concepts and multilingual data. But I think we’ve moved from that stage to one where you actually expect computers to be able to see and perceive the world around us in a much better way than they were able to 10 years ago [author’s italics].”[49]
In recent years, I think machine learning has really changed our expectations of what we think of computers being able to do.
While Dean sees tremendous advances in computer reasoning, understanding information influence or other hybrid warfare tactics from data will require sophisticated models. One approach is to simulate society at scale. One research group envisages the employment of High-Definition Cognitive Models representing the mindsets of specific individuals.[50] The challenge with such an approach is to capture the heterogeneous nature of a population and to understand how AI approximation may yield useful observations. Computing advances will continue, but the greater challenge may be structuring and weighting data to construct useful analytical tools. That process remains relatively immature as applied to international relations, let alone to hybrid warfare.
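As a much simpler illustration of why assumptions about a population matter, the toy model below, which is not the cited High-Definition Cognitive Models, nudges simulated agents toward a repeated message only when they are already within a given “openness” of it. The population size, openness values and update rule are all assumptions chosen for clarity.

```python
# Toy bounded-confidence sketch: agents hold opinions in [-1, 1]; a repeated
# message nudges only agents already within `openness` of its position.
# All parameters are illustrative assumptions, not a validated social model.
import random

def simulate(n_agents=1000, openness=0.25, message=1.0, steps=50, seed=7):
    rng = random.Random(seed)
    opinions = [rng.uniform(-1, 1) for _ in range(n_agents)]
    for _ in range(steps):
        opinions = [
            o + 0.1 * (message - o) if abs(message - o) < openness else o
            for o in opinions
        ]
    return sum(opinions) / n_agents

# The assumed openness parameter dominates the outcome, which is exactly the
# modelling challenge the text points to.
for openness in (0.1, 0.3, 0.6):
    print(f"openness={openness:.1f} -> mean opinion {simulate(openness=openness):.2f}")
```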
Growing civilian and diplomatic institutions
Hybrid conflict embraces a repertoire of actions that can produce a maximum effect while simultaneously managing escalatory dynamics. The governments of the West’s democracies employ diplomatic, intelligence and military capabilities to maintain peace and offer early warning in a way not seen before the paired catastrophes of two world wars. In the decades since 1945 those organizations have adapted to manifold threats, from denial and disinformation operations to thermonuclear warfare. Assuring security has required the contributions of many actors availing themselves of new technology and tradecraft for necessary adaptation to the methods of intelligent and motivated adversaries.
That adaptation also extends to alterations in the proverbial “rules of the game” in international relations. Deepfakes, kinetic cyberattacks and transnational criminal-terror syndicates are all realities of the contemporary security environment that would have been labelled science fiction a few decades ago. In addition to featuring new actors and actions, conflict now plays out on a deeply globalized geographic information tableau upon which advantage is sought while escalation is kept in check; a significant challenge remains in directing the attention of computer algorithms to find and analyse these new actors and actions. Hostile and aggressive states use the tools they have at hand. North Korea, for example, has learned how to employ cyber tools to perpetrate the first heist of a national reserve bank.[51] The capacity for innovation in a digitally interconnected world is a source of regular surprise for the community of states seeking a norms-based international order that promotes shared interests and collective security. Staying apprised of that innovation, undertaken by a growing club of authoritarian regimes increasingly willing to collaborate, must be a priority.
The capacity for innovation in a digitally interconnected world is a source of regular surprise for the community of states seeking a norms-based international order that promotes shared interests and collective security.
If there is a defining attribute of our time, it is that societies must cope with torrents of information to make sense of the world they inhabit. The information environment grows exponentially. Tracking what goes on within it will be the job of practitioners in many disciplines who can cooperate in making sense of the perception we call security. Journalists, academics and concerned citizens will be at the vanguard of discovery for hybrid warfare information operations. Governments in the Global West should not get a pass just because these actors are present and capable, however. While military alliances are built on the cooperation of armed forces, Western democracies would be wise to grow civilian and diplomatic institutions for hybrid conflict in the digital domain.
What this will mean is probably a further erosion of institutional or organizational silos related to security. Police, spies, soldiers, corporations and interested citizens of all stripes will contribute to sensemaking in a world marked by hybrid conflicts. How that collaboration will function is a work in its earliest phases. Perhaps the most important question for those who wish to identify and deter the machinations of hybrid warfare is what doing so will cost them in both blood and treasure.
Chris Bronk PhD is an associate professor at the University of Houston’s Hobby School of Public Affairs. He studies the intersection of information and computing technology with international relations. The views contained in this article are the author’s alone and do not represent the views of the University of Houston or the State of Texas.
[1] Edward Hallett Carr, The Twenty Years’ Crisis, 1919–1939: Reissued with a new preface from Michael Cox (Springer, 2016).
[2] Joseph S. Nye, The Future of Power (Public Affairs, 2011).
[3] Herbert A. Simon, ‘Notes on the observation and measurement of political power’, The Journal of Politics Volume 15, Issue 4 (1953): 500–516.
[4] Sub-areas of hybrid conflict can include cyber activity, terrorism, information operations, international crime, and economic activity. Frank G. Hoffman, ‘Hybrid warfare and challenges’, in Strategic Studies (Routledge, 2014), 329–337.
[5] Margret S. MacDonald and Anthony G. Oettinger, ‘Information overload’, Harvard International Review Volume 24, Issue 3 (2002): 44.
[6] Erhard Rahm and Hong Hai Do, ‘Data cleaning: Problems and current approaches’, IEEE Data Engineering Bulletin Volume 23, Issue 4 (2000): 3–13.
[7] Zheng Xu, Lin Mei, Kim-Kwang Raymond Choo, Zhihan Lv, Chuanping Hu, Xiangfeng Luo, and Yunhuai Liu, ‘Mobile crowd sensing of human-like intelligence using social sensors: A survey’, Neurocomputing 279 (2018): 3–10.
[8] Aaron F. Brantly, ‘Ukraine War OSINT Analysis: A Collaborative Student Report’ (2023).
[9] Generating intelligence from social media was defined almost a decade ago. OSINT has been discussed significantly since the 1990s. Laura K. Donohue, ‘The dawn of social intelligence (SOCINT)’, Drake Law Review Volume 63 (2015): 1061.
[10] Chris Bronk, Gabriel Collins, and Dan S. Wallach, ‘The Ukrainian Information and Cyber War’, The Cyber Defense Review Volume 8, Issue 3 (2023): 33–50.
[11] Brent D. Sadler, ‘Applying Lessons of the Naval War in Ukraine for a Potential War with China’, Backgrounder 3743 (2023): 1–13.
[12] Alex Roland and Philip Shiman, Strategic Computing: DARPA and the Quest for Machine Intelligence, 1983–1993 (MIT Press, 2002).
[13] Ronald J. Deibert, Reset: Reclaiming the Internet for Civil Society (House of Anansi, 2020).
[14] Muhammad Sarwar and Tariq Rahim Soomro, ‘Impact of smartphones on society’, European Journal of Scientific Research Volume 98, Issue 2 (2013): 216–226.
[15] World Economic Forum, ‘Charted: There are more phones than people in the world’, https://www.weforum.org/agenda/2023/04/charted-there-are-more-phones-than-people-in-the-world/.
[16] The term “äppärät” is borrowed from Gary Shteyngart’s Super Sad True Love Story, a 2011 novel set in a dystopian near future where mobile devices were all-consuming of human attention. Sounds crazy. Gary Shteyngart, Super Sad True Love Story: A Novel. (Random House Trade Paperbacks, 2011).
[17] This data traffic growth of 25 per cent a year is a staggering statistic and has held true for more than a decade. Fredrik Jejdling, Ericsson Mobility Report 2024. June 2024.
[18] Chris Bronk and Gregory S. Anderson, ‘Encounter battle: Engaging ISIL in cyberspace’, The Cyber Defense Review Volume 2, Issue 1 (2017): 93–108.
[19] Michael J. Mazarr, ‘Mastering the gray zone: Understanding a changing era of conflict’, US Army War College (2015).
[20] Roberta Wohlstetter, Pearl Harbor: Warning and Decision. (Stanford University Press, 1962).
[21] The value of AI technologies for analytic teamwork is in its earliest phases. Lauro Snidaro, ‘ChatGPT Act as an Intelligence Officer’, in 2023 IEEE International Workshop on Technologies for Defense and Security (TechDefense) (IEEE, 2023), 449–454.
[22] Michelangelo Conoscenti, ‘The Military’s Approach to the Information Environment’, in The Routledge Handbook of Discourse and Disinformation (Routledge, 2023), 218–238.
[23] Kalev Leetaru and Philip A. Schrodt, ‘GDELT: Global data on events, location, and tone, 1979–2012’, in ISA Annual Convention, Volume 2, Issue 4 (Citeseer, 2013), 1–49.
[24] Gavan Duffy and Brian Frederking, ‘Changing the rules: A speech act analysis of the end of the Cold War’, International Studies Quarterly, Volume 53, Issue 2 (2009): 325–347.
[25] Margaret G. Hermann and Charles W. Kegley Jr., ‘Rethinking democracy and international peace: Perspectives from political psychology’, International Studies Quarterly Volume 39, Issue 4 (1995): 511–533.
[26] Charli Carpenter and Daniel W. Drezner, ‘International Relations 2.0: The implications of new media for an old profession’, International Studies Perspectives, Volume 11, Issue 3 (2010): 255–272.
[27] Mark S. Lundstrom and Muhammad A. Alam, ‘Moore’s law: The journey ahead’, Science Volume 378, Issue 6621 (2022): 722–723.
[28] Garrett A. Johnson, Randall A. Lewis, and David H. Reiley, ‘When less is more: Data and power in advertising experiments’, Marketing Science Volume 36, Issue 1 (2017): 43–53.
[29] Ritam Dutt, Ashok Deb, and Emilio Ferrara, ‘“Senator, We Sell Ads”: Analysis of the 2016 Russian Facebook Ads Campaign’, in Advances in Data Science: Third International Conference on Intelligent Information Technologies, ICIIT 2018, Chennai, India, 11–14 December 2018, Proceedings 3, 151–168. Springer Singapore, 2019.
[30] Trade press publications can offer some interesting insights. The $12 billion ad spend is an amount roughly the size of Guyana’s GDP. ‘2024 Political Ad Spending Will Jump Nearly 30% vs. 2020’, EMarketer, 11 January, 2024, https://www.emarketer.com/press-releases/2024-political-ad-spending-will-jump-nearly-30-vs-2020/.
[31] Paul Murschetz, ‘Connected television: Media convergence, industry structure, and corporate strategies’, Annals of the International Communication Association Volume 40, Issue 1 (2016): 69–93.
[32] Robert W. Orttung and Elizabeth Nelson, ‘Russia Today’s Strategy and Effectiveness on YouTube’, Post-Soviet Affairs Volume 35, Issue 2 (2019): 77–92.
[33] Alexander Marrow and Gleb Stolyarov, ‘YouTube slowdown in Russia darkens freedom of speech outlook’, Reuters, 8 August 2024. YouTube was blocked by China years ago.
[34] Mylynn Felt, ‘Social media and the social sciences: How researchers employ Big Data analytics’, Big Data & Society Volume 3, Issue 1 (2016): 2053951716645828.
[35] Jason Gainous and Kevin M. Wagner, Tweeting to Power: The Social Media Revolution in American Politics (Oxford University Press, 2014).
[36] Yevgeniy Golovchenko, Cody Buntain, Gregory Eady, Megan A. Brown, and Joshua A. Tucker, ‘Cross-platform state propaganda: Russian trolls on Twitter and YouTube during the 2016 US Presidential Election’, The International Journal of Press/Politics Volume 25, Issue 3 (2020): 357–389.
[37] Thomas Rid, Active Measures: The Secret History of Disinformation and Political Warfare (Farrar, Straus and Giroux, 2020).
[38] Paweł Golik, Arkadiusz Modzelewski, and Aleksander Jochym, ‘DSHacker at CheckThat! 2024: LLMs and BERT for Check-Worthy Claims Detection with Propaganda Co-occurrence Analysis’, (2024).
[39] Wendell Husebø and Matthew Boyle, ‘Exclusive – Mike Johnson’s top policy advisor is former lobbyist: Clients have interest in Ukraine War’, Breitbart, 17 April, 2024.
[40] Dominick Mastrangelo, ‘Tucker Carlson: Moscow “so much nicer than any city in my country”’, The Hill, 13 February, 2024.
[41] Julia Ioffe, ‘McCaul to Action’, Puck, 2 April, 2024, https://puck.news/ukraine-aid-q-and-a-rep-mccaul-on-republican-support-for-bill/.
[42] Christopher Paul and Miriam Matthews, ‘The Russian “firehose of falsehood” propaganda model’, Rand Corporation Volume 2, Issue 7 (2016): 1–10.
[43] ‘Fake news campaign targets German Army’, DW, 16 February 2017.
[44] Chris Bing and Joel Schectman, ‘Special Report: How U.S. Taxpayers Funded a “Global Propaganda” Program to Push Covid-19 Vaccine Abroad’, Reuters, 25 July, 2023.
[45] Claire Su-Yeon Park, Haejoong Kim, and Sangmin Lee, ‘Do less teaching, do more coaching: Toward critical thinking for ethical applications of artificial intelligence’, Journal of Learning and Teaching in Digital Age Volume 6, Issue 2 (2021): 97–100.
[46] Reva Schwartz, Apostol Vassilev, Kristen Greene, Lori Perine, Andrew Burt, and Patrick Hall, Towards a Standard for Identifying and Managing Bias in Artificial Intelligence, Volume 3 (US Department of Commerce, National Institute of Standards and Technology, 2022).
[47] Stephen L. Dorton and Robert A. Hall, ‘Collaborative human-AI sensemaking for intelligence analysis’, in International Conference on Human-Computer Interaction (Cham: Springer International Publishing, 2021), 185–201.
[48] For more information about BERT: Mikhail V. Koroteev, ‘BERT: A review of applications in natural language processing and understanding’, arXiv preprint arXiv:2103.11943 (2021).
[49] Jeff Dean, ‘Exciting Trends in Machine Learning’ (lecture, Rice University, Houston, TX, 13 February 2024).
[50] Michael Bernard, George Backus, Matthew Glickman, Charles Gieseler, and Russel Waymire, ‘Modeling Populations of Interest in Order to Simulate Cultural Response to Influence Activities’, in Social Computing and Behavioral Modeling (Springer US, 2009), 1–8.
[51] Seongjun Park, ‘Evading, Hacking & Laundering for Nukes: North Korea’s Financial Cybercrimes & the Missing Silver Bullet for Countering Them’, Fordham International Law Journal Volume 45 (2021): 675.