Lethal Autonomous Weapons’ Conundrum and the State of Play

The world stands at the cusp of a technological breakthrough in both the civil and military fields. While some states pursue weapons “modernisation” programmes to extend the life of their arms technology into the next century, others insist on balance and stability while maintaining a measured pace of strategic defence buildup. Recent conflicts, such as those between Armenia and Azerbaijan and between Russia and Ukraine, have demonstrated the use of autonomous weapons in the form of drones. These developments in weapons technology call for an informed debate on the risks, concerns, and implications for global and regional stability. The induction of Lethal Autonomous Weapons (LAWS) will profoundly affect military planning and international strategic stability.

LAWS are a “special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy it without manual human control of the system”. The cyber domain is at the forefront of this revolution, with advances in military technologies such as facial recognition and computer vision, autonomous navigation in crowded environments, cooperative autonomy, and swarming. The merits and demerits of LAWS must be weighed when probing the stakes involved in their continued development or deployment, with emphasis on the technical, military, legal, and ethical issues raised by the weaponisation of increasingly autonomous technologies. Advanced disruptive technologies such as directed energy weapons (DEWs), which lie in the domain of future technologies, coincide with the fourth industrial revolution now underway.

The impact of LAWS on both conventional and strategic domains has significant implications for global and regional strategic stability. Three core aspects underscore the technicalities and risks associated with LAWS.


The first aspect involves two technical viewpoints. The first holds that these systems’ built-in technical efficacy offers precise targeting and reduces the human factor on the battlefield. The other view focuses, more realistically, on the risks of unintentional escalation, miscalculation, misperception, and global unrest. Moreover, poorly secured weapons carry the danger of easy access by non-state actors for malicious ends.

The second aspect is the threat LAWS pose to the global and regional strategic stability calculus. Military engagement would change dramatically and intensify in the nuclear sphere if a crisis involved autonomous and Artificial Intelligence (AI) elements comprising high-precision tactical and hypersonic weapons with novel warheads. Similarly, unmanned spacecraft, low-orbit surveillance, and satellite communication systems are employed in space, while cyber weaponry and automated hacking systems are becoming more common. These trends concern nuclear and non-nuclear states alike because of their likelihood of enhancing vulnerabilities in controlling escalation-prone crises.

The third aspect concerns the high-impact risks arising from the dramatically reduced time allotted for strategic decision-making within military Command, Control, and Communications (C3) and Intelligence, Surveillance, and Reconnaissance (ISR) systems. The main disadvantage of human oversight of a computer is that the human intellect takes longer to analyse a situation and reach the correct conclusion; yet that same deliberation becomes an advantage when it comes to averting devastating catastrophes.

In this regard, the Pentagon’s Project Maven, COMPASS, and Diamond Shield are just a few of the many military programmes that aim to use supercomputers for data analysis and scenario development for political and military leadership. These entail the risk of time-bound decision-making with limited human control over strategic decisions, which rest on machine learning and mathematical algorithms rather than human judgement. The development and employment of LAWS can therefore turn prevailing military balances between countries into imbalances, which not only leaves states vulnerable in responding to a military crisis but also compels them into an arms race. In the Eastern European security framework, for example, emerging technologies appear to be an additional obstacle to neutralising conflict dynamics; the same can be said of the North-South Korean issue on the Korean Peninsula.

In the present state of play, LAWS are in their initial development phase, with significant gaps remaining around definitions, risks, concerns, aspirations, and objectives, as well as around their possible effects on strategic stability and nuclear risk in regional contexts and between great powers.

The High Contracting Parties (HCPs), the states that have signed or ratified the treaty, are deeply divided over regulation versus prohibition of LAWS. This divide has produced four groups within the Convention on Certain Conventional Weapons (CCW), where an additional protocol for LAWS has been under discussion.

The first group consists of states that consider LAWS a new tool for stability and the promotion of responsible state behaviour. They believe these technologies offer speed, accuracy, and flawless coordination, serving as a powerful “force multiplier” that helps reduce collateral harm. The second group advocates a pre-emptive ban because LAWS cannot grasp the changing features of actual warfare, such as crisis escalation, decision-making, information processing, precautions, proportionality, the chain of command, and target identification, selection, and engagement. The third group emphasises building consensus and common ground on key concepts and definitions before deciding on regulation or a ban. The fourth and largest group is the Non-Aligned Movement (NAM), which calls for a legally binding instrument stipulating prohibitions and regulations on such weapons to ensure meaningful human control over the critical functions of a weapon system.

There are several regulatory challenges to building a framework for LAWS. The nearly decade-long global regulatory initiatives surrounding LAWS have reached an impasse: the debates recur annually, yet a regulatory framework has failed to materialise.

Lethal autonomous robots (LARs) were the subject of a ground-breaking UN report in 2013, and a few years later the conversation about LAWS began at the CCW and was formalised in a Group of Governmental Experts (GGE). The GGE on LAWS has met annually since 2017, but by the end of its mandate in 2021 it had been unable to agree on a normative and operational framework. Although the Sixth Review Conference of the CCW extended the mandate in 2021 to continue discussions, there were no noteworthy breakthroughs in 2022. The meetings continued into 2023, the latest being held in May.

Thus, LAWS still await formal definitional clarity and the establishment of a regulatory process. The GGE on LAWS, convened under the CCW, is the multilateral venue where the issue is debated. Its consensus-based approach, which requires every HCP in the group to concur before any aspect of the GGE process can advance, is one gap due to which the impasse may persist.

Moreover, each HCP at the GGE has its own national agenda and interests regarding LAWS. For states such as Russia, South Korea, Israel, and the United States, which are presently investigating, producing, testing, deploying, and/or trading these systems, the absence of any rules may provide a favourable environment. On the other side are states that may wish for a strong regulatory framework to be built quickly, whether because they lack the resources or national interest to compete, because they are at risk from an adversary’s use of LAWS, or both. The few guiding principles agreed so far are predicated on the lowest common denominator so as to be acceptable to opposing HCPs. As a result, the GGE largely remains an exclusive forum where LAWS are continually discussed at the policy level, around fundamental issues such as definitions and concepts of autonomy, meaningful human control, and technical and ethical questions.

In conclusion, arms control regimes and mechanisms are tools and processes that serve the objectives of peace, the reduction of risks and tensions, crisis management, and conflict resolution. Although discussions on LAWS originated in the UN Human Rights Council (UNHRC) and some members have insisted on returning them there, many governments prefer to seek a regulatory balance between military necessity and humanitarian considerations at a security-oriented body such as the GGE.

Globally, assessing the strategic significance of LAWS highlights several aspects that directly and indirectly shape the security and strategic discourse. Strategic experts broadly agree that LAWS are a double-edged sword. On the one hand, the AI underlying LAWS could enhance nuclear command and control, early warning, ISR, and the physical security of nuclear capabilities, among other areas, and so improve states’ sense of security. On the other, these technological advances may raise concerns about the effectiveness of second-strike capabilities, an apprehension that may prompt more assertive nuclear postures and thereby escalate the risk of nuclear threats.

For the foreseeable future, LAWS will be part of military capabilities, making the global and regional strategic security calculi more complex. Between states that share borders or confront one another with aggressive approaches, bilateral and multilateral dynamics may deteriorate further. Defence and security experts and forums are therefore required to examine these emerging issues to maintain regional and global strategic stability.

Huma Rehman

Huma Rehman is a Senior Defense and Foreign Affairs Analyst and a former Fellow of the Middlebury Institute for International Studies (MIIS) Monterey, California, US. She can be reached at hoomarehman9@gmail.com and on Twitter @HumaRehman1.
