Challenges posed by autonomous weapons systems and ways to address them: a perspective from Latin American academia and civil society

This is a summary of a document that emerged from a collaborative effort with experts who are members of ECPAT Guatemala (María Eugenia Villarreal); Perú por el Desarme (Gisela Luján); the Feminist AI Research Network – Latin America Chapter (Paola Ricaurte, Mexico/Ecuador; Mariana Díaz and Wanda Muñoz, Mexico); and Anderson Henao (Colombia) and Jesús Martínez (El Salvador), experts in the rights of persons with disabilities and international humanitarian law.

The purpose of this summary is to encourage further reflection at both the national and international levels on the challenges inherent to autonomous weapons systems, underlining the urgent need to begin negotiations to establish a binding legal framework in a truly inclusive and representative forum. The full original text in Spanish, with references, can be found here.

These contributions constitute our response to the invitation extended by the United Nations Office for Disarmament Affairs, which called on States, international and regional bodies, the ICRC, civil society, and academia, among others, to provide the Secretary-General with opinions and perspectives “on ways to address the challenges and concerns that arise with respect to lethal autonomous weapons systems from humanitarian, legal, security, technological, and ethical perspectives, and on the human role in the use of force”.

Our report consists of two sections: 1) Challenges and concerns with respect to AWS from humanitarian, legal, and ethical perspectives; and 2) Ways to address these challenges.

1. Challenges and concerns with respect to autonomous weapons systems

1.1. Autonomous weapons will have a disproportionate impact on groups and populations with marginalised identities and characteristics, particularly women, Afro-descendant/racialised persons, indigenous/native peoples, children, and persons with disabilities, among others.

  • There are several examples from the civilian sector showing that emerging technologies not only pose risks, but have already caused harm and violated human rights.
  • The use of these technologies in weaponry will likely cause disproportionate damage to the aforementioned populations.
  • Understanding the difficulties and the differentiated negative impact of artificial intelligence systems is critical to analysing autonomous weapons systems, since these are the types of problems that could be replicated with the use of AI and emerging technologies in the military sector.
  • Additionally, it is important to consider the risk of transferring autonomous technologies to law enforcement, which could contribute to racial profiling in surveillance and even to political repression.

1.2 Autonomous weapons will increase the barriers to accessing justice and compensation for victims of violations of human rights and International Humanitarian Law.

  • The characteristics of autonomous weapons systems, including the lack of predictability and explainability of emerging technologies and of artificial intelligence applications, among others, will further hinder accountability, remedies, compensation and, more generally, access to justice for persons with disabilities, one of the groups most affected by conflict and among those facing the greatest barriers to accessing justice.
  • Remote warfare already has a disproportionate impact on certain groups. Not knowing when or where an attack will occur, or who might be a target, affects different groups in different ways, and those effects are exacerbated for persons with a combination of marginalised identities and characteristics.

1.3 Autonomy in weapons systems is increasing and is already being deployed, as the case of Israel in Gaza shows.

  • Emerging technologies are already having a specific negative and differentiated impact in conflict zones. Similarly, autonomy in targeting and attack decisions is increasing.
  • The most recent and flagrant case is unfolding in the context of the destruction of Gaza by Israel (echoing the words of Francesca Albanese, Special Rapporteur on the situation of human rights in the Palestinian territories occupied since 1967).
  • During the first months of the campaign, the Israeli army used 25,000 tons of explosives (equivalent to two nuclear bombs) on countless buildings, many of which were identified using artificial intelligence. What is more, the Israeli government’s use of AI-driven technology has led to attacks against 11,000 targets in Gaza since the beginning of the most recent conflict on October 7, 2023.
  • Two highly troubling examples are the Habsora (“Gospel”) and Lavender systems, which use AI and automation to identify and generate targets en masse.

2. Ways of addressing the challenges and concerns of autonomous weapons systems

2.1 International Humanitarian Law and International Human Rights Law apply to autonomous weapons systems, and a legally binding instrument specifically addressing autonomy in weapons systems is needed.

  • Currently, there is no legally binding international framework specific to such systems that ensures meaningful human control over the use of force. This is a serious legal vacuum for two reasons: a) there is no way to prevent the development and use of weapons with autonomy in the critical functions of selecting and engaging targets; and b) it makes it difficult for victims (affected persons, their families, and communities) to demand accountability, guarantees of non-repetition, and compensation for damages.
  • From our perspective, the only credible way to address autonomy in weapons systems is through the adoption of a new legally binding instrument. The fundamental goal would be to regulate the autonomy of weapons systems in keeping with International Human Rights Law, International Humanitarian Law, and International Criminal Law.

2.2 Characteristics of the legally binding instrument needed to respond to the challenges of autonomous weapons systems.

  • A legally binding instrument on autonomous weapons must include clear prohibitions and regulations, aim to maintain meaningful human control over the use of force, and include effective implementation, monitoring, and accountability measures.
  • This regulatory instrument must prohibit those weapons systems that: a) would delegate targeting and attack decisions to autonomous functions; b) would target human beings and civilian infrastructure; and c) would profile humans as targets.
  • Regulations must also address autonomy in the other functions of weapons systems.
  • Said instrument must recognize the differentiated and disproportionate impact that these weapons would have on different population groups.
  • Additionally, the impact of these weapons on the environment must be considered, particularly in terms of extensive, lasting, serious, and irreversible damage.

2.3 Characteristics of the forum where said instrument should be negotiated.

  • United Nations General Assembly (UNGA) Resolution A/C.1/78/L.56 is clear evidence of the majority opinion (146 States in favour) on the “urgent need for the international community to address the challenges and concerns raised by autonomous weapons systems”.
  • This majority voice could only make itself heard in a democratic and participatory space such as the UNGA.
  • However, this has not been possible in the Convention on Certain Conventional Weapons (CCW), where the topic of autonomous weapons has been addressed for more than a decade, among other factors because that forum allows the exercise of a de facto veto under the guise of consensus.
  • It is necessary to shift the deliberations on autonomous weapons to other forums, particularly the UNGA, whose rules facilitate more equal participation of a greater number of countries.
  • Recent regional meetings (Costa Rica, Trinidad and Tobago, the Philippines, Sierra Leone) show that it is possible to make progress in inclusive forums, and that it is important to create spaces that truly allow for, reflect, and value diverse perspectives.
  • Furthermore, it is essential that all forums on autonomous weapons take specific measures to ensure the meaningful, free, and informed participation of civil society in all its diversity, particularly organisations representative of marginalised groups.
  • It is necessary to advocate for the inclusion of military and defence topics in UN work on artificial intelligence and other technologies, as well as within the framework of other human rights conventions and regional bodies.

Concluding thoughts

  • Allowing more time to elapse before beginning negotiations for a legally binding instrument on autonomous weapons systems that ensures meaningful human control over the use of force and prohibits those systems that attack human beings only benefits highly militarised countries and military industries, which continue to develop, test, and deploy such technologies in the absence of any regulatory framework.
  • We consider it unacceptable that a minority of countries is able to block the beginning of such negotiations, a delay that is already negatively affecting those who face the consequences of increased autonomy in these weapons, as is currently happening in Gaza.
