
Report of the Monday Morning Meeting on “Role of Lethal Autonomous Weapons Systems in Modern Day Warfare”

December 22, 2025 @ 8:00 am - 5:00 pm

Ms. Meghna Pradhan, Research Analyst, spoke on “Role of Lethal Autonomous Weapons Systems in Modern Day Warfare” at the Monday Morning Meeting held on 22 December 2025. Mr. Rohit Sharma, Research Analyst, moderated the meeting. The scholars of the Institute attended the discussion.

Executive Summary

The presentation examined the growing operational relevance of Lethal Autonomous Weapon Systems (LAWS) and the strategic, ethical, and legal challenges arising from their integration into modern warfare. LAWS were defined as weapon systems that, once activated, can independently select and engage targets without further human intervention. A key emphasis of the presentation was the absence of a universally accepted definition of LAWS, with autonomy understood as a spectrum rather than a fixed threshold. Consequently, current debates focus less on specific technologies such as artificial intelligence and more on the functional effects and operational consequences of delegating lethal decision-making authority to machines.

Detailed Report

Mr. Rohit Sharma, moderator, highlighted why LAWS have become a critical concern for contemporary militaries and policymakers. Central to this is the compression of decision-making time in combat environments, particularly in high-speed and highly contested domains such as air defence, missile interception, and electronic warfare.

Ms. Meghna Pradhan observed that by accelerating the Observe–Orient–Decide–Act (OODA) loop, autonomous systems fundamentally alter the tempo of warfare, reducing opportunities for human deliberation and increasing the risk of inadvertent escalation. This compression also shifts lethal authority from human operators to machine-driven processes, raising profound questions about accountability, command responsibility, and escalation control. Ongoing conflicts were characterised as real-world testing grounds where these systems are being deployed, refined, and evaluated even as international regulatory frameworks remain unsettled.

A conceptual clarification between automatic, automated, and autonomous systems formed an important analytical foundation of the presentation. Automatic systems operate on basic trigger–response mechanisms, while automated systems execute predefined rules without discretion. Autonomous systems, by contrast, possess a degree of discretion within programmed parameters, effectively substituting human judgement rather than merely human execution. This distinction was presented as central to understanding why LAWS represent a qualitative shift in warfare, rather than a continuation of existing automation trends. Autonomy, in this sense, marks a transition from machines assisting human decisions to machines shaping or executing those decisions independently.

The presentation further situated LAWS within the OODA loop to illustrate how autonomy increasingly permeates all stages of military decision-making. While machines have long contributed to observation and orientation through sensors and data processing, LAWS become particularly consequential at the decision and action stages, where lethal force is authorised and applied. Models of human control, ranging from human-in-the-loop to human-out-of-the-loop, were discussed to demonstrate the varying degrees of human involvement in autonomous systems. The concept of “meaningful human control” was identified as a central yet unresolved issue, with no agreed standard for what constitutes sufficient human oversight, especially when operational timelines are reduced to seconds or milliseconds.

The enabling technological ecosystem underlying LAWS was presented as broader than artificial intelligence alone. Autonomy is derived from a combination of sensor fusion, decision algorithms, edge computing, secure communications, human–machine interfaces, and integrated command-and-control networks. Importantly, the presentation stressed that LAWS do not inherently require artificial intelligence, and that many autonomous functions can be achieved through rule-based or hybrid systems. At the same time, these enabling technologies also support civilian and non-lethal military applications, complicating regulatory efforts that seek to restrict autonomy without inhibiting broader technological development.

A significant portion of the presentation addressed the risks associated with LAWS. These include the potential for unlawful harm arising from faulty data, sensor degradation, or biased datasets, as well as the challenge of attributing responsibility when autonomous systems cause unintended damage. The “black box” nature of learning systems was highlighted as a particular concern, as it limits transparency regarding how decisions are made once systems are deployed in dynamic battlefield environments. Cyber vulnerabilities and the rapid proliferation of autonomous technologies to non-state actors were identified as additional destabilising factors. Nevertheless, the presentation also acknowledged potential operational advantages, including force multiplication, reduced exposure of personnel to hostile environments, enhanced endurance in extreme terrains, and, under certain conditions, the possibility of reducing collateral damage through precision targeting.

The applicability of International Humanitarian Law (IHL) to LAWS was examined in depth. The presentation reaffirmed the widely accepted position that IHL applies fully to all weapon systems, including autonomous ones. However, it underscored persistent disagreements over whether LAWS can reliably comply with core IHL principles such as distinction, proportionality, precautions in attack, military necessity, humanity, and accountability. Particular scepticism was expressed regarding the ability of machines to make context-sensitive judgements, such as distinguishing civilians from combatants in complex environments or assessing proportionality where military advantage must be weighed against potential civilian harm.

The global governance debate on LAWS was analysed through the ongoing discussions under the UN Convention on Certain Conventional Weapons (CCW), particularly within the Group of Governmental Experts (GGE). The presentation traced the evolution of these discussions from early informal consultations to the adoption of guiding principles in 2019, followed by increasing polarisation between advocates of a legally binding prohibition and those favouring regulation within existing legal frameworks. Current negotiations focus on issues such as defining LAWS, clarifying the scope of prohibitions, determining acceptable levels of autonomy, and deciding whether outcomes should be binding or political in nature. The year 2026 was identified as a critical juncture for determining the future direction of the CCW process.

India’s position on LAWS was presented as cautious and calibrated. India maintains that existing IHL frameworks are sufficient to address the challenges posed by autonomous systems and argues against premature bans that could constrain technological development. The emphasis remains on retaining human responsibility, regulating the use and effects of systems rather than the underlying technologies, and avoiding parallel regulatory processes outside the CCW. The presentation also outlined India’s evolving capabilities, including the use of loitering munitions, air defence systems, and autonomous subsystems across multiple platforms, while noting that these do not yet meet contested definitions of fully autonomous weapon systems.

In conclusion, the presentation argued that LAWS are already an operational reality rather than a future contingency. The core unresolved issue remains the extent to which lethal decision-making can be delegated to machines without undermining legal accountability, ethical restraint, and strategic stability. While there is broad agreement that IHL applies, consensus on definitions, thresholds of autonomy, and governance mechanisms remains elusive. The presentation concluded that a calibrated, IHL-based regulatory approach within the CCW framework currently offers the most pragmatic path forward, with forthcoming negotiations likely to shape the future trajectory of autonomous warfare.

The question-and-answer session reflected the depth of conceptual and policy-level contestation surrounding lethal autonomous weapon systems. Participants raised concerns regarding the ambiguity of autonomy thresholds, particularly the blurred line between decision-support systems and autonomous targeting systems, with reference to contemporary operational examples. Questions focused on accountability and state responsibility, especially in scenarios where learning systems evolve in unpredictable ways once deployed, and whether responsibility should rest with operators, commanders, developers, or the deploying state. The discussion also engaged with the feasibility of meaningful human control under compressed decision-making timelines, noting that in high-speed missile defence or drone interception scenarios, decisions are effectively pre-delegated to machines.

Participants debated whether machines could, in some contexts, reduce human bias and error, while others emphasised that opacity, data bias, and cyber vulnerabilities introduce new risks rather than eliminating existing ones. The session further examined divergent state positions on regulation, highlighting how major military powers favour non-binding political commitments grounded in existing International Humanitarian Law, while smaller states advocate clearer prohibitions to mitigate asymmetries in technological capability. Overall, the discussion underscored that while there is broad agreement on the applicability of IHL, fundamental disagreements persist over definitions, accountability, and the governance of autonomy, reinforcing the view that consensus on LAWS remains politically and technically elusive.

The Report was prepared by Ms. Khyati Singh, Research Analyst, North America and Strategic Technologies Centre, MP-IDSA.
