Source: shutterstock.com/TSViPhoto

Unravelling Warfare In The Age Of AI

Abstract: Humankind is on the verge of a revolutionary transformation in strategic and military affairs in the form of Artificial Intelligence (AI) based weapon systems, which have the potential to fundamentally transform the character of warfare. AI weapon systems can autonomously acquire, identify, engage and destroy intended targets, and carry out battle damage assessment, in real time. Autonomous AI-based weapon systems thus challenge the notion of meaningful human control in the decision-making process and raise questions about how much decision-making authority should be delegated to machines in warfare.

Problem Statement: Could AI-driven warfare shape a new age of uncontrollable war, resulting in massive collateral damage?

So what?: There is a greater need to regulate AI in the warfighting domain. UN-led dialogue on AI-based weapons, such as lethal autonomous weapon systems, must be strengthened. The dialogue, ongoing since 2014 in forums such as the Convention on Certain Conventional Weapons (CCW) and its Group of Governmental Experts (GGE), should culminate in a formalised legal and ethical framework to govern AI-based weapon systems.


Contextualisation

Throughout military history, warfare has been transformed by innovations such as gunpowder, tanks, aircraft, and nuclear weapons. Today, we are witnessing a transformative shift driven by AI-powered autonomy in weapon systems. AI-based systems can impact all levels of warfare, from intelligence, surveillance and reconnaissance to decision-making through lethal autonomous weapon systems (LAWS). LAWS can engage targets based on predetermined criteria or in response to specific situations, thus reducing the need for direct human involvement in the decision to use force.[1]

In the last decade, militaries worldwide have invested heavily in AI-based weapon platforms such as LAWS, drone swarms and unmanned sea, land and air-based autonomous vehicles. The Artificial Intelligence in Military Global Market Report 2024 projects that the global military AI market will grow from $8.45 billion in 2023 to $9.86 billion in 2024, a compound annual growth rate of 16.6 per cent.[2] This increasing military spending on AI points to a surge in the acquisition of AI-based weapon systems and reflects a global race to dominate the technology. Consequently, AI has become an integral part of the military strategies of great and middle powers.

Militaries worldwide have heavily invested in AI-based weapon platforms such as LAWS, drone swarms and unmanned sea, land and air-based autonomous vehicles.

For instance, the People’s Republic of China (PRC) has proclaimed in its New Generation AI Plan that “AI is a strategic technology that will lead the future”, setting the goal of becoming the world leader in AI by 2030.[3] The U.S., for its part, has adopted a ‘third offset’ strategy, which promises heavy investment in AI, autonomy, and robotics to sustain its advantage in defence. The United States Department of Defense (DoD) has conducted experiments with micro-drones capable of advanced swarm behaviour such as collective decision-making, adaptive formation flying and self-healing. Asia Times reported in September 2023 that the DoD had launched the Autonomous Multi-Domain Adaptive Swarms-of-Swarms (AMASS) project to develop autonomous drone swarms that can be launched from sea, air, and land to overwhelm enemy air defences.[4]

The global race to become an AI leader, particularly between the U.S. and the PRC, signals AI’s strategic significance in warfighting. However, the absence of specific multilateral regulations on using autonomous systems on the battlefield raises security, legal, ethical, and humanitarian concerns.

AI-based Target Selection in the Israel-Hamas War

The Israeli conduct of the war in Gaza has highlighted the lethality associated with AI-based target selection, acquisition and destruction. A report by The Guardian in December 2023 revealed that Israel uses an AI-based system called Habsora, also known as ‘The Gospel’, to generate more than 100 targets a day.[5] According to the former head of the Israeli Defence Forces (IDF), Aviv Kochavi, human intelligence-based systems could previously identify only around 50 targets in Gaza per year. By June 2024, Israel had damaged or destroyed some 360,000 buildings.[6] In a campaign reliant on AI-based target selection, 40,732 Palestinians, mostly women and children, had been killed and 92,609 civilians injured.[7]

The use of AI-enabled weapons undermines the essence of the Fourth Geneva Convention (1949) on the ‘Protection of Civilian Persons in Time of War’. Authorising a computer-based system to select and strike targets indiscriminately, without distinguishing between combatants and non-combatants, violates International Humanitarian Law (IHL). In February 2024, the Chief Executive of Israel’s tech organisation Startup Nation Central, Avi Hasson, noted that “the war in Gaza has provided an opportunity for the IDF to test emerging technologies which had never been used in past conflicts.”[8]

The war in Gaza has provided an opportunity for the IDF to test emerging technologies which had never been used in past conflicts.

A Challenging Quest

In 2017, the United Nations Office for Disarmament Affairs (UNODA) identified a growing number of states pursuing the development and use of autonomous weapon systems (AWS), a trend that presents the risk of an ‘uncontrollable war’.[9] According to a 2023 study on ‘Artificial Intelligence and Urban Operations’ by the University of South Florida, “the armed forces may soon be able to exploit autonomous weapon systems to monitor, strike, and kill their opponents and even civilians at will.”

South Asian Context

In South Asia, AI-based weapon systems could severely affect security dynamics, given the longstanding disputes between two nuclear-armed neighbours: Pakistan and India. India is actively pursuing AI-based weapons and surveillance systems. In July 2022, the Indian Ministry of Defence organised the ‘AI in Defence’ (AIDef) symposium and exhibition, where Defence Minister Rajnath Singh launched 75 AI-enabled defence products, including robotics, automation tools, and intelligence and surveillance systems.[10]

While Pakistan does not officially acknowledge using AI for military and defence purposes, Islamabad has recently established the Centre for Artificial Intelligence and Computing (CENTAIC) under the auspices of the Pakistan Air Force (PAF).[10] CENTAIC is considered the country’s vanguard of military AI development, enabling the armed forces to integrate AI into their operational and strategic domains.

CENTAIC is considered the country’s vanguard of military AI development, enabling the armed forces to integrate AI into their operational and strategic domain.

Consequently, AI-based autonomous weapon systems act as an intensifying force that may trigger an AI arms race, eroding the traditional boundaries between nuclear and conventional weapon systems. Such a race could prove catastrophic for South Asia, a region already prone to territorial and geopolitical disputes that have produced three major wars between India and Pakistan. Historically, when India pursued nuclearisation and conducted its first test in 1974, Pakistan followed suit and developed its own nuclear weapons; left unregulated by international treaties and arms control measures, an AI arms race could repeat this action-reaction pattern. There is also the threat of non-state actors acquiring AI-based weapons and integrating them into their terror activities. International regulation, coupled with a state monopoly on AI-based weapon systems, could avert these dangers.

Nuclear Attack Over Peace

In January 2024, a group of researchers from four U.S. universities simulated war scenarios using five AI programmes, including models from OpenAI and Meta’s Llama, and found that the models tended to escalate, choosing nuclear attacks over peace with their adversary.[11] The findings of this study are a wake-up call for world leaders and scientists to come together in a multilateral setting to strengthen the UN’s efforts to regulate AI in warfare. UN Secretary-General Antonio Guterres has also highlighted the urgency of addressing this issue in the 2023 ‘New Agenda for Peace’ by underscoring that “there is a necessity to conclude a legally binding instrument to prohibit the development and deployment of autonomous weapon systems by 2026.”[12]

 


Jawad Ali Shah is a Research Officer at the Center for International Strategic Studies Sindh (CISSS), based in Karachi, Pakistan. His areas of research interest are global security and stability, emerging technologies of war, and the great power rivalry. The views contained in this article are the author’s alone.


[1] “The Role of International Law in the Development of Autonomous Weapons Systems,” Emory International Law Review 34, no. 2 (2020): 49-78, https://scholarlycommons.law.emory.edu/cgi/viewcontent.cgi?article=1236&context=eilr.

[2] “Artificial Intelligence in Military Global Market Report,” The Business Research Company, 2024, https://www.thebusinessresearchcompany.com/report/artificial-intelligence-in-military-global-market-report.

[3] “AI at War,” War on the Rocks, April 04, 2023, https://warontherocks.com/2023/04/ai-at-war/.

[4] “US Drone Swarm Program Could Redefine Modern War,” Asia Times, September 14, 2023, https://asiatimes.com/2023/09/us-drone-swarm-program-could-redefine-modern-war/.

[5] Harry Davies, Bethan McKernan, and Dan Sabbagh, “The Gospel: How Israel Uses AI to Select Bombing Targets in Gaza,” The Guardian, December 01, 2023, https://www.theguardian.com/world/2023/dec/01/the-gospel-how-israel-uses-ai-to-select-bombing-targets.

[6] Necva Taştan, “360,000 Buildings Damaged or Destroyed in Gaza Strip: ESCWA,” Anadolu Agency, July 20, 2023, https://www.aa.com.tr/en/middle-east/360-000-buildings-damaged-or-destroyed-in-gaza-strip-escwa/3214148.

[7] “Israel-Hamas War in Maps and Charts: Live Tracker,” Al Jazeera, October 09, 2023, https://www.aljazeera.com/news/longform/2023/10/9/israel-hamas-war-in-maps-and-charts-live-tracker.

[8] “Israel Deploys New Military AI in Gaza War,” France 24, February 10, 2024, https://www.france24.com/en/live-news/20240210-israel-deploys-new-military-ai-in-gaza-war.

[9] “Perspective on Lethal Autonomous Weapon Systems,” UNODA Occasional Papers, no. 30, November 2017, https://front.un-arm.org/wp-content/uploads/2017/11/op30.pdf.

[10] Umaima Ali, “Comparing the AI-Military Integration by India and Pakistan,” Centre for Strategic and Contemporary Research, September 07, 2023, https://cscr.pk/explore/themes/defense-security/comparing-the-ai-military-integration-by-india-and-pakistan/.

[11] Juan-Pablo Rivera, Gabriel Mukobi, Anka Reuel, Max Lamparth, Chandler Smith, and Jacquelyn Schneider, “Escalation Risks from Language Models in Military and Diplomatic Decision-Making,” in The 2024 ACM Conference on Fairness, Accountability, and Transparency, 836-898.

[12] “A New Agenda for Peace,” United Nations Department of Political and Peacebuilding Affairs, 2023, https://dppa.un.org/en/a-new-agenda-for-peace.
