The Economist Asia, January 2018

SPECIAL REPORT
THE FUTURE OF WAR
experts and NGOs from the Campaign to Stop Killer Robots, which wants a legally binding international treaty banning LAWs, just as cluster munitions, landmines and blinding lasers have been banned in the past.

The trouble is that autonomous weapons range all the way from missiles capable of selective targeting to learning machines with the cognitive skills to decide whom, when and how to fight. Most people agree that when lethal force is used, humans should be involved in initiating it. But determining what sort of human control might be appropriate is trickier, and the technology is moving so fast that it is leaving international diplomacy behind.

To complicate matters, the most dramatic advances in AI and autonomous machines are being made by private firms with commercial motives. Even if agreement on banning military robots could be reached, the technology enabling autonomous weapons will be both pervasive and easily transferable.

Moreover, governments have a duty to keep their citizens secure. Concluding that they can manage quite well without chemical weapons or cluster bombs is one thing. Allowing potential adversaries a monopoly on technologies that could enable them to launch a crushing attack because some campaign groups have raised concerns is quite another.

As Peter Singer notes, the AI arms race is propelled by unstoppable forces: geopolitical competition, science pushing at the frontiers of knowledge, and profit-seeking technology businesses. So the question is whether and how some of its more disturbing aspects can be constrained. At its simplest, most people are appalled by the idea of thinking machines being allowed to make their own choices about killing human beings. And although the ultimate nightmare of a robot uprising in which machines take a genocidal dislike to the human race is still science fiction, other fears have substance.

Nightmare scenarios
Paul Scharre is concerned that autonomous systems might malfunction, perhaps because of badly written code or because of a cyber attack by an adversary. That could cause fratricidal attacks on their own side's human forces or escalation so rapid that humans would not be able to respond. Testing autonomous weapons for reliability is tricky. Thinking machines may do things in ways that their human controllers never envisaged.

Much of the discussion about "teaming" with robotic systems revolves around humans' place in the "observe, orient, decide, act" (OODA) decision-making loop. The operator of a remotely piloted armed Reaper drone is in the OODA loop because he decides where it goes and what it does when it gets there. An on-the-loop system, by contrast, will carry out most of its mission without a human operator, but a human can intercede at any time, for example by aborting the mission if the target has changed. A fully autonomous system, in which the human operator merely presses the start button, has responsibility for carrying through every part of the mission, including target selection, so it is off the loop. An on-the-loop driver of an autonomous car would let it do most of the work but would be ready to resume control should the need arise. Yet if the car merely had its destination chosen by the user and travelled there without any further intervention, the human would be off the loop.

For now, Western armed forces are determined to keep humans either in or on the loop. In 2012 the Pentagon issued a policy directive: "These [autonomous] systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force. Persons who authorise the use of, direct the use of, or operate, these systems must do so with appropriate care and in accordance with the law of war, applicable treaties, weapons-systems safety rules and applicable rules of engagement."
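The in-the-loop, on-the-loop and off-the-loop distinction amounts to a simple taxonomy of who may intervene and who selects targets. As a minimal sketch (the class and function names here are illustrative, not drawn from any real weapons or vehicle system):

```python
from enum import Enum

class ControlMode(Enum):
    IN_THE_LOOP = "in"    # a human makes each decision, like the Reaper operator
    ON_THE_LOOP = "on"    # the machine acts; a human supervises and can abort
    OFF_THE_LOOP = "off"  # a human merely presses the start button

def human_may_intervene(mode: ControlMode) -> bool:
    """A human can redirect or abort the mission in all but the fully autonomous case."""
    return mode in (ControlMode.IN_THE_LOOP, ControlMode.ON_THE_LOOP)

def machine_selects_targets(mode: ControlMode) -> bool:
    """Only a fully autonomous system carries through target selection on its own."""
    return mode is ControlMode.OFF_THE_LOOP
```

On this sketch, the Pentagon's 2012 directive amounts to a requirement that fielded systems stay in a mode where `human_may_intervene` is true.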
That remains the policy. But James Miller, the former under-secretary of defence for policy at the Pentagon, says that although America will try to keep a human in or on the loop, adversaries may not. They might, for example, decide on pre-delegated decision-making at hyper-speed if their command-and-control nodes are attacked. Russia is believed to operate a "dead hand" that will automatically launch its nuclear missiles if its seismic, light, radioactivity and pressure sensors detect a nuclear attack.

Mr Miller thinks that if autonomous systems are operating in highly contested space, the temptation to let the machine take over will become overwhelming: "Someone will cross the line of sensibility and morality." And when they do, others will surely follow. Nothing is more certain about the future of warfare than that technological possibilities will always shape the struggle for advantage.