One Congressman’s Crusade to Save the World From Killer Robots

Robots could one day keep troops out of combat, but opponents say deploying them will have dangerous consequences.

National Journal
Alex Brown
July 17, 2014, 4:15 p.m.

If a robot soldier commits a war crime, who is held accountable?

You can’t punish a collection of parts and coding algorithms. But can you blame a human commander, who gave a legal order only to see the robot carry it out incorrectly? And what about the defense manufacturers, which are often immune from the kind of lawsuits that would plague civilian outfits if their products cost lives?

The culpability question is one of a host of thorny moral dilemmas presented by lethal robots. On the one hand, if effective, robot soldiers could replace ground troops and prevent thousands of American casualties. And robots aren’t susceptible to many of the weaknesses that plague humans: exhaustion, sickness, infection, emotion, indecision.

But even if robot warriors can keep American lives out of danger, can they be trusted with the complicated combat decisions now left to human judgment?

Rep. Jim McGovern thinks not.

The Massachusetts Democrat is part of a crusade for an international ban on killer robots — machines that can decide without human input whom to target and when to use force.

The only way to stop killer robots, said McGovern and a series of panelists he assembled for a Capitol Hill briefing this week, is to ban them before they even exist. Much like drones, once someone gets a killer robot, it’s only a matter of time before everyone else is racing to catch up. And despite some countries’ commitment to evaluating the technology responsibly, good intentions never won an arms race.

“The only thing harder than getting a ban in place is getting a ban in place after something is developed,” McGovern said.

McGovern is racing technology, but he believes he has time: He thinks it will take another two to three decades before the technology would be available.

McGovern’s Tuesday panel is part of an ongoing effort by anti-robot activists to raise awareness about the issue. They hope lawmakers will share their concerns and join their push for a worldwide ban. “The U.S. should show leadership on this,” said Human Rights Watch’s Steve Goose. “If the U.S. were able to get out in front … it would lead the way for many other nations.”

So why is it so important that robots never see the battlefield? For some of the panelists, the issue is a moral one. “Do we really want to establish a precedent where it’s OK for machines to take the lives of human beings?” said Dr. Peter Asaro, a founder of the International Committee for Robot Arms Control.

For most, though, the chief worry is judgment, and humans’ innate ability to read context. “Soldiers have to rely on intention or subtle clues,” said Bonnie Docherty, an arms expert at Human Rights Watch and a lecturer at Harvard Law. “We have serious concerns that a fully autonomous weapon could ever reach that level.”

Especially in battlefields where soldiers aren’t always wearing distinguishing uniforms, the ability to recognize actions from other humans becomes important. Even in cases where a robot can tell friend from foe, it might have trouble recognizing if the enemy is surrendering or is wounded.

Media depictions like Terminator have anthropomorphized warrior robots, which “implies a level of cognitive ability that these machines do not have,” said Paul Scharre, who has worked on the Defense Department’s autonomous-weapon policies. “Images from science fiction are not very accurate or very helpful.”

Killer robots won’t look like humans, and they probably won’t act like them either. “What [robots] really lack is a meaningful understanding of context and situation,” Asaro said. “It’s hard to believe that a machine could be making those kinds of meaningful choices about life and death.”

Other concerns include the possibility of malicious hackers taking over a robot army. And then there’s the possibility of a “flash war” starting over a mistake. If one robot malfunctions and fires, robots on the other side could return fire automatically, starting a conflict at the speed of circuitry before a human could intervene.

The arms-race worry is very real, Asaro said. Unlike nuclear weapons, which require extreme technical sophistication, killer robots won’t be hard to replicate. “Once these technologies are in existence, they’ll proliferate widely,” he said. “There are even software systems that could be implemented through the Internet.”

Despite all these concerns, robot advocates say the rush to ban the technology outright is ill-conceived. While preaching caution on development, they also say it’s important to test the systems’ limits before crafting policy.

They fear a ban based on the image of an android toting a machine gun could interfere with lifesaving technologies like rapid-response air-defense missiles. And while context recognition remains a huge challenge, advocates say it’s at least worth exploring whether robot warriors can actually reduce civilian casualties in some circumstances.

There’s also the challenge of enforcement. Even if a ban were enacted, it would be hard to tell whether a drone fired a missile on its own, or whether some other weapons system was operating under the commands of a human or of an algorithm.

It’s not that anyone has killer-robot plans just yet. In fact, the panelists agreed the U.S. has been thoughtful and responsible in approaching the issue. The Defense Department even issued a policy statement on the technology in late 2012 that established a five- to 10-year moratorium on developing killer robots.

But an American stand-alone policy might not be enough. According to Scharre, at least 23 countries have joined the race to build armed drones. It’s not hard to imagine a similar push to build machines that could replace combat soldiers — with or without U.S. involvement.

Meanwhile, the issue will only get trickier. We won’t make the jump from a flesh-and-blood soldier to a T-1000, but some combat systems could gradually phase in more and more autonomy.

Some robots will have in-the-loop systems, where human operators monitor actions and can override at any point. The longer-term prospect is an out-of-the-loop robot, one that carries out missions with minimal supervision and no possibility for human control.

Panelists agreed that the best chance for a ban will probably come wrapped in language other than “robot ban.” They hope to persuade countries to agree to something in more positive language — that their autonomous weapons will have a human operator monitoring them and ready to take over at any time.

Regardless of just what is allowed, it’s important that militaries know where to draw the line before they have the technology to build killer robots. A treaty “frees up weapons developers to know what they’re allowed to do,” Scharre said.

As robots get more complex — and better able to read and respond to human cues — it’s likely some advocates will argue they deserve a more prominent place in combat. But for McGovern and his allies, such weapons would have to meet a challenge they now deem impossible: Can you build a robot not only with a brain but with a soul?
