Today we are joining our voices to address an urgent humanitarian priority. The United Nations and the International Committee of the Red Cross (ICRC) call on States to establish specific prohibitions and restrictions on autonomous weapon systems, to shield present and future generations from the consequences of their use. In the current security landscape, setting clear international red lines will benefit all States.
Autonomous weapon systems – generally understood as weapon systems that select targets and apply force without human intervention – pose serious humanitarian, legal, ethical and security concerns.
Their development and proliferation have the potential to significantly change the way wars are fought and contribute to global instability and heightened international tensions. By creating a perception of reduced risk to military forces and to civilians, they may lower the threshold for engaging in conflicts, inadvertently escalating violence.
We must act now to preserve human control over the use of force. Human control must be retained in life-and-death decisions. The autonomous targeting of humans by machines is a moral line that we must not cross. Machines with the power and discretion to take lives without human involvement should be prohibited by international law.
Our concerns have only been heightened by the increasing availability and accessibility of sophisticated new and emerging technologies, such as robotics and artificial intelligence, that could be integrated into autonomous weapons.
The very scientists and industry leaders responsible for such technological advances have also been sounding the alarm. If we are to harness new technologies for the good of humanity, we must first address the most urgent risks and avoid irreparable consequences.
This means prohibiting autonomous weapon systems that function in such a way that their effects cannot be predicted. For example, allowing autonomous weapons to be controlled by machine learning algorithms – fundamentally unpredictable software that writes itself – is an unacceptably dangerous proposition.
In addition, clear restrictions are needed for all other types of autonomous weapons, to ensure compliance with international law and ethical acceptability. These include limiting where, when and for how long they are used, the types of targets they strike and the scale of force used, as well as ensuring the ability for effective human supervision, and timely intervention and deactivation.
Despite the increasing reports of testing and use of various types of autonomous weapon systems, it is not too late to take action. After more than a decade of discussions within the United Nations, including in the Human Rights Council, under the Convention on Certain Conventional Weapons and at the General Assembly, the foundation has been laid for the adoption of explicit prohibitions and restrictions. Now, States must build on this groundwork, and come together constructively to negotiate new rules that address the tangible threats posed by these weapon technologies.
International law, particularly international humanitarian law, prohibits certain weapons and sets general restrictions on the use of all others, and States and individuals remain accountable for any violations. However, without a specific international agreement governing autonomous weapon systems, States can hold different views about how these general rules apply. New international rules on autonomous weapons are therefore needed to clarify and strengthen existing law. They will be a preventive measure, an opportunity to protect those who may be affected by such weapons, and essential to avoiding terrible consequences for humanity.
We call on world leaders to launch negotiations of a new legally binding instrument to set clear prohibitions and restrictions on autonomous weapon systems, and to conclude such negotiations by 2026. We urge Member States to take decisive action now to protect humanity.