One Congressman’s Crusade to Save the World From Killer Robots

Robots could one day keep troops out of combat, but opponents say deploying them will have dangerous consequences.

National Journal
Alex Brown
July 17, 2014, 4:15 p.m.

If a robot soldier commits a war crime, who is held accountable?

You can’t punish a collection of parts and coding algorithms. But can you blame a human commander, who gave a legal order only to see the robot carry it out incorrectly? And what about the defense manufacturers, which are often immune from the kind of lawsuits that would plague civilian outfits if their products cost lives?

The culpability question is one of a host of thorny moral dilemmas presented by lethal robots. On the one hand, if effective, robot soldiers could replace ground troops and prevent thousands of American casualties. And robots aren’t susceptible to many of the weaknesses that plague humans: exhaustion, sickness, infection, emotion, indecision.

But even if robot warriors can keep American lives out of danger, can they be trusted with the complicated combat decisions now left to human judgment?

Rep. Jim McGovern thinks not.

The Massachusetts Democrat is part of a crusade for an international ban on killer robots — machines that can decide without human input whom to target and when to use force.

The only way to stop killer robots, said McGovern and a series of panelists he assembled for a Capitol Hill briefing this week, is to ban them before they even exist. As with drones, once one country gets a killer robot, it’s only a matter of time before everyone else is racing to catch up. And despite some countries’ commitment to evaluating the technology responsibly, good intentions never won an arms race.

“The only thing harder than getting a ban in place is getting a ban in place after something is developed,” McGovern said.

McGovern is racing technology, but he believes he has time: He thinks it will take another two to three decades before the technology becomes available.

McGovern’s Tuesday panel is part of an ongoing effort by anti-robot activists to raise awareness about the issue. They hope lawmakers will share their concerns and join their push for a worldwide ban. “The U.S. should show leadership on this,” said Human Rights Watch’s Steve Goose. “If the U.S. were able to get out in front … it would lead the way for many other nations.”

So why is it so important that robots never see the battlefield? For some of the panelists, the issue is a moral one. “Do we really want to establish a precedent where it’s OK for machines to take the lives of human beings?” said Dr. Peter Asaro, a founder of the International Committee for Robot Arms Control.

For most, though, the chief worry is judgment, and humans’ innate ability to read context. “Soldiers have to rely on intention or subtle clues,” said Bonnie Docherty, an arms expert at Human Rights Watch and a lecturer at Harvard Law. “We have serious concerns that a fully autonomous weapon could ever reach that level.”

Especially on battlefields where soldiers aren’t always wearing distinguishing uniforms, the ability to recognize the actions of other humans becomes important. Even in cases where a robot can tell friend from foe, it might have trouble recognizing whether the enemy is surrendering or wounded.

Media depictions like Terminator have anthropomorphized warrior robots, which “implies a level of cognitive ability that these machines do not have,” said Paul Scharre, who has worked on the Defense Department’s autonomous-weapon policies. “Images from science fiction are not very accurate or very helpful.”

Killer robots won’t look like humans, and they probably won’t act like them either. “What [robots] really lack is a meaningful understanding of context and situation,” Asaro said. “It’s hard to believe that a machine could be making those kinds of meaningful choices about life and death.”

Other concerns include the possibility of malicious hackers taking over a robot army. And then there’s the possibility of a “flash war” starting over a mistake. If one robot malfunctions and fires, robots on the other side could return fire automatically, starting a conflict at the speed of circuitry before a human could intervene.

The arms-race worry is very real, Asaro said. Unlike nuclear weapons, which require extreme technical sophistication, killer robots won’t be hard to replicate. “Once these technologies are in existence, they’ll proliferate widely,” he said. “There are even software systems that could be implemented through the Internet.”

Despite all these concerns, robot advocates say the rush to ban the technology outright is ill-conceived. While preaching caution on development, they also say it’s important to test the systems’ limits before crafting policy.

They fear a ban based on imaginings of an android toting a machine gun could interfere with lifesaving technologies like rapid-response air-defense missiles. And while context recognition remains a huge challenge, advocates say it’s at least worth exploring whether robot warriors can actually reduce civilian casualties in some circumstances.

There’s also the challenge of enforcement. Even if a ban were enacted, it would be hard to tell whether a drone fired a missile on its own, or whether any given weapons system was operating under the commands of a human or of an algorithm.

It’s not that anyone has killer-robot plans just yet. In fact, the panelists agreed the U.S. has been thoughtful and responsible in approaching the issue. The Defense Department even issued a policy statement on the technology in late 2012 that established a five- to 10-year moratorium on developing killer robots.

But an American stand-alone policy might not be enough. According to Scharre, at least 23 countries have joined the race to build armed drones. It’s not hard to imagine a similar push to build machines that could replace combat soldiers — with or without U.S. involvement.

Meanwhile, the issue will only get trickier. We won’t make the jump from a flesh-and-blood soldier to a T-1000 overnight, but some combat systems could gradually phase in more and more autonomy.

Some robots will have in-the-loop systems, where human operators monitor actions and can override at any point. The longer-term prospect is an out-of-the-loop robot, one that carries out missions with minimal supervision and no possibility for human control.
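The distinction comes down to a single gate: whether a human approval sits between the machine’s recommendation and the weapon’s action. A minimal, purely illustrative sketch of the two architectures — all names here are hypothetical, not drawn from any real weapons system:

```python
from dataclasses import dataclass


@dataclass
class Engagement:
    """One proposed engagement, as seen by the targeting algorithm."""
    target_id: str
    machine_decision: bool  # what the algorithm recommends


def in_the_loop(engagement: Engagement, human_approves: bool) -> bool:
    # A human operator monitors the action and can override at any point:
    # the weapon fires only if the machine recommends it AND the human agrees.
    return engagement.machine_decision and human_approves


def out_of_the_loop(engagement: Engagement) -> bool:
    # No possibility for human control: the machine's decision is final.
    return engagement.machine_decision


# The same machine recommendation, with and without a human veto:
e = Engagement(target_id="contact-7", machine_decision=True)
print(in_the_loop(e, human_approves=False))  # human veto blocks fire -> False
print(out_of_the_loop(e))                    # machine alone fires -> True
```

The sketch makes the panelists’ point concrete: removing the `human_approves` argument is a one-line change, which is why advocates worry that autonomy could phase in gradually rather than arrive all at once.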

Panelists agreed that the best chance for a ban will probably come wrapped in language other than “robot ban.” They hope to persuade countries to agree to something in more positive language — that their autonomous weapons will have a human operator monitoring them and ready to take over at any time.

Regardless of just what is allowed, it’s important that militaries know where to draw the line before they have the technology to build killer robots. A treaty “frees up weapons developers to know what they’re allowed to do,” Scharre said.

As robots get more complex — and better able to read and respond to human cues — it’s likely some advocates will argue they deserve a more prominent place in combat. But for McGovern and his allies, such weapons would have to meet a challenge they now deem impossible: Can you build a robot not only with a brain but with a soul?
