Delegation of the European Union to Colombia

30/08/2024 | Press release

Group of Governmental Experts on emerging technologies in the area of Lethal Autonomous Weapons Systems - EU statement

Group of Governmental Experts on emerging technologies in the area of Lethal Autonomous Weapons Systems

Geneva, 26-30 August 2024

EU Statement on risk mitigation measures

Risk mitigation measures

  • In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a determination must be made whether its employment would, in some or all circumstances, be prohibited by international law.
  • An accountability framework should be established by States engaged in the life cycle of emerging technologies in the area of LAWS.
  • Risk assessments and mitigation measures should be part of the design, development, testing and deployment cycle of emerging technologies in any weapons system. Tailored risk mitigation measures should be adopted and implemented across the entire life cycle of the system.
  • During the design, development, testing, deployment, and use of lethal autonomous weapons systems, States must consider, as appropriate, risks such as civilian casualties and take precautions to minimize the risk of harm to civilians and civilian objects. Other types of risks must also be considered, including but not limited to the risk of unintended engagements, the risk of loss of control of the system, the risk of diversion to unauthorized users, including terrorist groups, and the risk of acquisition by terrorist groups, taking into account relevant ethical principles.
  • Risk mitigation measures could include:
      • rigorous testing and evaluation to inform an assessment of how the weapon system will perform in the anticipated circumstances of its use;
      • legal reviews and the sharing of best practices;
      • sufficient understanding, depending on the role and level of responsibilities, of the system's way of operating, its effect and likely interaction with the environment;
      • readily understandable human-machine interfaces and controls;
      • comprehensive and systematic training of personnel;
      • deactivation mechanisms, where appropriate;
      • establishing doctrine and procedures;
      • circumscribing weapons use through rules of engagement;
      • and assessment by the users.

Bias in the algorithms used in lethal autonomous weapons systems

  • The EU recognises the critical role that data plays for AI-based technologies. Social biases that could have an impact on emerging technologies, for example through gender, age, racial, and disability bias in algorithms, should also be given due consideration.
  • Risk assessment and mitigation measures should prevent unintended bias in the development and use of the weapon system, including bias related to gender, ethnicity, age, and disability.
  • Requirements to mitigate risks associated with bias in the algorithms used in lethal autonomous weapons systems could include:
      • comprehensive testing and reviews to identify and correct potential biases;
      • training of operators;
      • rigorous documentation of datasets used in LAWS, in conformity with national legislation;
      • testing of algorithmic models against benchmarks that evaluate their operation against potential bias;
      • as appropriate, transparency on how datasets are acquired and handled, in conformity with national legislation;
      • particular care and specific measures with regard to the integrity, veracity and quality of data, in conformity with national legislation.