One Congressman’s Crusade to Save the World From Killer Robots

Robots could one day keep troops out of combat, but opponents say deploying them will have dangerous consequences.

National Journal
Alex Brown
July 17, 2014, 4:15 p.m.

If a robot soldier commits a war crime, who is held accountable?

You can’t punish a collection of parts and coding algorithms. But can you blame a human commander, who gave a legal order only to see the robot carry it out incorrectly? And what about the defense manufacturers, which are often immune from the kind of lawsuits that would plague civilian outfits if their products cost lives?

The culpability question is one of a host of thorny moral dilemmas presented by lethal robots. On the one hand, if effective, robot soldiers could replace ground troops and prevent thousands of American casualties. And robots aren’t susceptible to many of the weaknesses that plague humans: exhaustion, sickness, infection, emotion, indecision.

But even if robot warriors can keep American lives out of danger, can they be trusted with the complicated combat decisions now left to human judgment?

Rep. Jim McGovern thinks not.

The Massachusetts Democrat is part of a crusade for an international ban on killer robots — machines that can decide without human input whom to target and when to use force.

The only way to stop killer robots, said McGovern and a series of panelists he assembled for a Capitol Hill briefing this week, is to ban them before they even exist. Much like drones, once someone gets a killer robot, it’s only a matter of time before everyone else is racing to catch up. And despite some countries’ commitment to evaluating the technology responsibly, good intentions never won an arms race.

“The only thing harder than getting a ban in place is getting a ban in place after something is developed,” McGovern said.

McGovern is racing technology, but he believes he has time: He thinks it will be another two to three decades before the technology is available.

McGovern’s Tuesday panel is part of an ongoing effort by anti-robot activists to raise awareness about the issue. They hope lawmakers will share their concerns and join their push for a worldwide ban. “The U.S. should show leadership on this,” said Human Rights Watch’s Steve Goose. “If the U.S. were able to get out in front … it would lead the way for many other nations.”

So why is it so important that robots never see the battlefield? For some of the panelists, the issue is a moral one. “Do we really want to establish a precedent where it’s OK for machines to take the lives of human beings?” said Dr. Peter Asaro, a founder of the International Committee for Robot Arms Control.

For most, though, the chief worry is judgment, and humans’ innate ability to read context. “Soldiers have to rely on intention or subtle clues,” said Bonnie Docherty, an arms expert at Human Rights Watch and a lecturer at Harvard Law. “We have serious concerns that a fully autonomous weapon could ever reach that level.”

Especially on battlefields where soldiers aren’t always wearing distinguishing uniforms, the ability to read other humans’ actions becomes important. Even in cases where a robot can tell friend from foe, it might have trouble recognizing whether an enemy is surrendering or wounded.

Media depictions like Terminator have anthropomorphized warrior robots, which “implies a level of cognitive ability that these machines do not have,” said Paul Scharre, who has worked on the Defense Department’s autonomous-weapon policies. “Images from science fiction are not very accurate or very helpful.”

Killer robots won’t look like humans, and they probably won’t act like them either. “What [robots] really lack is a meaningful understanding of context and situation,” Asaro said. “It’s hard to believe that a machine could be making those kinds of meaningful choices about life and death.”

Other concerns include the possibility of malicious hackers taking over a robot army. And then there’s the possibility of a “flash war” starting over a mistake. If one robot malfunctions and fires, robots on the other side could return fire automatically, starting a conflict at the speed of circuitry before a human could intervene.

The arms-race worry is very real, Asaro said. Unlike nuclear weapons, which require extreme technical sophistication, killer robots won’t be hard to replicate. “Once these technologies are in existence, they’ll proliferate widely,” he said. “There are even software systems that could be implemented through the Internet.”

Despite all these concerns, robot advocates say the rush to ban the technology outright is ill-conceived. While preaching caution on development, they also say it’s important to test the systems’ limits before crafting policy.

They fear a ban based on imaginations of an android toting a machine gun could interfere with lifesaving technologies like rapid-response air-defense missiles. And while context recognition remains a huge challenge, advocates say it’s at least worth exploring whether robot warriors can actually reduce civilian casualties in some circumstances.

There’s also the challenge of enforcement. Even if a ban were enacted, it would be hard to tell whether a drone fired a missile on its own, or whether another weapons system was acting on the commands of a human or an algorithm.

It’s not that anyone has killer-robot plans just yet. In fact, the panelists agreed the U.S. has been thoughtful and responsible in approaching the issue. The Defense Department even issued a policy statement on the technology in late 2012 that established a five- to 10-year moratorium on developing killer robots.

But an American stand-alone policy might not be enough. According to Scharre, at least 23 countries have joined the race to build armed drones. It’s not hard to imagine a similar push to build machines that could replace combat soldiers — with or without U.S. involvement.

Meanwhile, the issue will only get trickier. We won’t make the jump straight from a flesh-and-blood soldier to a T-1000, but some combat systems could gradually phase in more and more autonomy.

Some robots will have in-the-loop systems, where human operators monitor actions and can override at any point. The longer-term prospect is an out-of-the-loop robot, one that carries out missions with minimal supervision and no possibility for human control.

Panelists agreed that the best chance for a ban will probably come wrapped in language other than “robot ban.” They hope to persuade countries to agree to something in more positive language — that their autonomous weapons will have a human operator monitoring them and ready to take over at any time.

Regardless of just what is allowed, it’s important that militaries know where to draw the line before they have the technology to build killer robots. A treaty “frees up weapons developers to know what they’re allowed to do,” Scharre said.

As robots get more complex — and better able to read and respond to human cues — it’s likely some advocates will argue they deserve a more prominent place in combat. But for McGovern and his allies, such weapons would have to meet a challenge they now deem impossible: Can you build a robot not only with a brain but with a soul?
