Killer Robots Aren’t Science Fiction. A Push to Ban Them Is Growing.
By Satariano, Cumming-Bruce and Gladstone, Dec. 17, 2021, The New York Times
It may have seemed like an obscure U.N. conclave, but a
meeting this week in Geneva was followed intently by experts
in artificial intelligence, military strategy, disarmament
and humanitarian law.
The reason for the interest? Killer robots — drones, guns and
bombs that decide on their own, with artificial brains,
whether to attack and kill — and what should be done, if
anything, to regulate or ban them.
Once the domain of sci-fi films like the “Terminator” series
and “RoboCop,” killer robots, more technically known as Lethal
Autonomous Weapons Systems, have been invented and tested at
an accelerated pace with little oversight. Some prototypes
have even been used in actual conflicts.
The evolution of these machines is considered a potentially
seismic event in warfare, akin to the invention of gunpowder
and nuclear bombs. This year, for the first time, a majority
of the 125 nations that belong to an agreement called the
Convention on Certain Conventional Weapons, or C.C.W., said
they wanted curbs on killer robots. But they were opposed by
members that are developing these weapons, most notably the
U.S. and Russia.
The group’s conference concluded on Friday with only a vague
statement about considering possible measures acceptable to
all. The Campaign to Stop Killer Robots, a disarmament group,
said the outcome fell “drastically short.”
What is the Convention on Certain Conventional Weapons?
-------------------------------------------------------
The C.C.W., sometimes known as the Inhumane Weapons Convention,
is a framework of rules that ban or restrict weapons considered
to cause unnecessary, unjustifiable and indiscriminate suffering,
such as incendiary explosives, blinding lasers and booby traps
that don’t distinguish between fighters and civilians. The
convention has no provisions for killer robots.
What exactly are killer robots?
------------------------------
Opinions differ on an exact definition, but they are widely
considered to be weapons that make decisions with little or
no human involvement. Rapid improvements in robotics, A.I.
and image recognition are making such armaments possible.
The drones the U.S. has used extensively in Afghanistan,
Iraq and elsewhere aren't considered robots because they are
operated remotely by people, who choose targets and decide
whether to shoot.
Why are they considered attractive?
---------------------------------
To war planners, the weapons offer the promise of keeping
soldiers out of harm’s way and making faster decisions than a
human would, by giving more battlefield responsibilities to
autonomous systems like pilotless drones and driverless tanks
that independently decide when to strike.
What are the objections?
-------------------------
Critics argue it's morally repugnant to assign lethal decision-
making to machines, regardless of technological sophistication.
How does a machine differentiate an adult from a child, a
fighter with a bazooka from a civilian with a broom, a hostile
combatant from a wounded or surrendering soldier?
“Fundamentally, autonomous weapon systems raise ethical
concerns for society about substituting human decisions about
life and death with sensor, software and machine processes,”
Peter Maurer, the president of the International Committee of
the Red Cross and an outspoken opponent of killer robots, told
the Geneva conference.
In advance of the conference, Human Rights Watch and Harvard
Law School’s International Human Rights Clinic called for steps
toward a legally binding agreement that requires human control
at all times.
“Robots lack the compassion, empathy, mercy, and judgment
necessary to treat humans humanely, and they can't understand
the inherent worth of human life,” the groups argued in a
briefing paper to support their recommendations.
Others said autonomous weapons, rather than reducing the risk
of war, could do the opposite — by providing antagonists with
ways of inflicting harm that minimize risks to their own
soldiers. “Mass-produced killer robots could lower the
threshold for war by taking humans out of the kill chain and
unleashing machines that could engage a human target without
any human at the controls,” said Phil Twyford, New Zealand’s
disarmament minister.
Why was the Geneva conference important?
----------------------------------------
The conference was widely considered by disarmament experts
to be the best opportunity so far to devise ways to regulate,
if not prohibit, the use of killer robots under the C.C.W.
It was the culmination of years of discussions by a group of
experts who'd been asked to identify the challenges & possible
approaches to reducing the threats from killer robots. But
the experts couldn't even reach agreement on basic questions.
What do opponents of a new treaty say?
-------------------------------------
Some, like Russia, insist that any decisions on limits must
be unanimous — in effect giving opponents a veto.
The U.S. argues that existing international laws are sufficient
and that banning autonomous weapons technology would be
premature. The chief
U.S. delegate to the conference, Joshua Dorosin, proposed a
nonbinding “code of conduct” for use of killer robots — an
idea that disarmament advocates dismissed as a delaying tactic.
The American military has invested heavily in A.I., working
with the biggest defense contractors, including Lockheed Martin,
Boeing, Raytheon and Northrop Grumman. The work has included
projects to develop long-range missiles that detect moving
targets based on radio frequency, swarm drones that can identify
and attack a target, and automated missile-defense systems,
according to research by opponents of the weapons systems.
The complexity and varying uses of A.I. make it more difficult
to regulate than nuclear weapons or land mines, said Maaike
Verbruggen, an expert on emerging military security technology
at the Centre for Security, Diplomacy and Strategy in Brussels.
She said a lack of transparency about what different countries
are building has created “fear and concern” among military
leaders that they must keep up.
“It’s very hard to get a sense of what another country is
doing,” said Verbruggen, who is working toward a Ph.D. on the
topic. “There's a lot of uncertainty, and that drives military
innovation.”
Franz-Stefan Gady, a research fellow at the International
Institute for Strategic Studies, said the “arms race for
autonomous weapons systems is already underway and won’t be
called off any time soon.”
Is there conflict in the defense establishment about killer robots?
-------------------------------------------------------------------
Yes. Even as the technology becomes more advanced, there has
been reluctance to use autonomous weapons in combat because of
fears of mistakes, said Gady.
“Can military commanders trust the judgment of autonomous
weapon systems? Here the answer at the moment is clearly ‘no’
and will remain so for the near future,” he said.
The debate over autonomous weapons has spilled into Silicon
Valley. In 2018, Google said it wouldn't renew a contract with
the Pentagon after thousands of its employees signed a letter
protesting the company’s work on a program using A.I. to
interpret images that could be used to choose drone targets.
The company also created new ethical guidelines prohibiting
the use of its technology for weapons and surveillance.
Others believe the U.S. is not going far enough to compete
with rivals.
In October, the former chief software officer for the Air Force,
Nicolas Chaillan, told the Financial Times that he had resigned
because of what he saw as weak technological progress inside
the American military, particularly the use of A.I. He said
policymakers are slowed down by questions about ethics, while
countries like China press ahead.
Where have autonomous weapons been used?
--------------------------------------
There aren't many verified battlefield examples, but critics
point to a few incidents that show the technology’s potential.
In March, U.N. investigators said a “lethal autonomous
weapons system” had been used by government-backed forces in
Libya against militia fighters. A drone called Kargu-2, made
by a Turkish defense contractor, tracked and attacked the
fighters as they fled a rocket attack, according to the report,
which left unclear whether any human controlled the drones.
In the 2020 war in Nagorno-Karabakh, Azerbaijan fought
Armenia with attack drones & missiles that loiter in the air
until detecting the signal of an assigned target.
What happens now?
-----------------
Many disarmament advocates said the outcome of the conference
had hardened what they described as a resolve to push for a
new treaty in the next few years, like those that prohibit
land mines and cluster munitions.
Daan Kayser, an autonomous weapons expert at PAX, a Netherlands-
based peace advocacy group, said the conference’s failure to
agree to even negotiate on killer robots was “a really plain
signal that the C.C.W. isn’t up to the job.”
Noel Sharkey, an A.I. expert and chairman of the International
Committee for Robot Arms Control, said the meeting had
demonstrated that a new treaty was preferable to further
C.C.W. deliberations.
“There was a sense of urgency in the room,” he said, that
“if there’s no movement, we’re not prepared to stay on this treadmill.”
https://www.nytimes.com/2021/12/17/world/robot-drone-ban.html