GENEVA: Countries will meet at the United Nations on Monday to revive efforts to regulate the kinds of AI-controlled autonomous weapons increasingly used in modern warfare, as experts warn that time is running out to put guardrails on the new lethal technology.
Autonomous and artificially intelligent weapons systems are already playing a greater role in conflicts from Ukraine to Gaza. And rising defence spending worldwide promises to provide a further boost for burgeoning AI-assisted military technology.
However, progress on establishing global rules governing their development and use has not kept pace, and internationally binding standards remain virtually non-existent.
Since 2014, countries that are party to the Convention on Certain Conventional Weapons (CCW) have been meeting in Geneva to discuss a potential ban on fully autonomous systems that operate without meaningful human control, and to regulate others.
UN Secretary-General Antonio Guterres has set a 2026 deadline for states to establish clear rules on the use of AI weapons. But human rights groups warn that consensus among governments is lacking.
Alexander Kmentt, head of arms control at Austria's foreign ministry, said that must change quickly.
"Time is really running out to put in some guardrails so that the nightmare scenarios that some of the most noted experts are warning of don't come to pass," he said.
Monday's meeting of the UN General Assembly in New York will be the body's first dedicated to autonomous weapons.
Though not legally binding, diplomatic officials hope the consultations will increase pressure on the military powers that resist regulation over concerns that rules could blunt the technology's battlefield advantages.
Campaign groups hope the meeting, which will also address critical issues not covered by the CCW, including ethical and human rights concerns and the use of autonomous weapons by non-state actors, will push states to agree on a legal instrument.
They see it as a crucial litmus test of whether countries can bridge their divisions ahead of the next round of CCW talks in September.
"This issue needs clarification through a legally binding treaty. The technology is moving too fast," said Patrick Wilcken, Amnesty International's researcher on military, security and policing.
"The idea that you would not want to rule out the delegation of life-or-death decisions ... to a machine seems extraordinary."
Arms race
The New York talks come after 164 states backed a 2023 General Assembly resolution calling on the international community to urgently address the risks posed by autonomous weapons.
While many countries support a binding global framework, the United States, Russia, China and India prefer national guidelines or existing international law, according to Amnesty.
"We have not been convinced that existing law is insufficient," said a US Pentagon spokesperson, adding that autonomous weapons could pose less risk to civilians than conventional weapons.
The governments of India, Russia and China did not respond to requests for comment.
In the absence of regulation, autonomous systems are proliferating.
The Future of Life Institute has tracked the deployment of around 200 autonomous weapons systems in Ukraine, the Middle East and Africa.
Russian forces, for example, have deployed about 3,000 Veter kamikaze drones in Ukraine, capable of autonomously detecting and engaging targets, according to its data.
Ukraine, meanwhile, has used semi-autonomous drones in the conflict. The Ukrainian government declined to comment.
Israel has used AI systems to identify targets in Gaza. Its mission in Geneva said it supported multilateral discussions and uses data technologies in full compliance with international law.
Human Rights Watch, however, said crucial questions of accountability under international law remain unresolved, and warned in a report last month that unregulated autonomous weapons pose a range of threats to human rights and could trigger an arms race if left unchecked.
And activists such as Laura Nolan of Stop Killer Robots worry that there is currently little to ensure that defence companies develop AI-driven weapons responsibly.
"We do not generally trust industries to self-regulate ... There is no reason why defence or technology companies should be any more worthy of trust," she said.
Published in Dawn, May 13, 2025