The Vatican has told a UN meeting in Geneva that the use of "killer robots" and other lethal autonomous weapons systems (LAWS) that use artificial intelligence violates international treaties because innocent civilians could be erroneously targeted.
The Holy See addressed the Group of Governmental Experts on Lethal Autonomous Weapons Systems of the Convention on Certain Conventional Weapons on Aug. 4.
"The use of swarms in urban areas could lead to high risks for civilians," the statement said on the debate centering around the potential pitfalls stemming from the weaponization of artificial intelligence.
"If functioning without any direct human supervision, such systems could make mistakes in identifying the intended targets due to some unidentified 'bias' induced by their 'self-learning capabilities' developed from a limited set of data samples."
The Vatican, particularly its mission in Geneva, has warned for years against the use and development of LAWS, or so-called killer robots. These include military drones, uncrewed vehicles and tanks, and artificially intelligent missiles, Catholic News Service reported.
The Vatican mission said lethal autonomous weapons systems could violate international humanitarian conventions and treaties, emphasizing the need for "interpretation, good faith, and prudential judgment" during armed combat.
"These aspects are, in part, informed by and based on the evolving context of operations, for which the human person is irreplaceable," the statement said.
"In addition to the concerns expressed by several delegations, there is an emerging awareness of these issues also among prominent scientists, engineers, researchers, military personnel, ethicists, and the larger civil society community," noted the Catholic Church statement.
"There are increasing instances of employees and entrepreneurs objecting on ethical grounds to certain projects dealing with the weaponization of artificial intelligence."
The use of advanced weaponry, devoid of human reason when applying the principles of "distinction, proportionality, precaution, necessity and expected military advantage" during combat, could lead to violations of established rules of engagement, the Vatican said.
At the same meeting, the International Committee of the Red Cross recommended that states adopt new, legally binding rules to regulate autonomous weapon systems to ensure that sufficient human control and judgment is retained in the use of force.
"Worryingly, the use of artificial intelligence and machine learning software to control the critical functions of selecting and applying force is being increasingly explored, said the Red Cross.
Such force would aggravate the already difficult task that users have in anticipating and limiting the effects of an autonomous weapon system.
"The ICRC recommends that States adopt new, legally binding rules to regulate autonomous weapon systems to ensure that sufficient human control and judgement is retained in the use of force."
The Group of Governmental Experts meeting is taking place Aug. 3-13.