FBI's Facial Recognition Software Could Fail 20 Percent of the Time


Newly released documents show the FBI's facial-recognition software could fail 20 percent of the time. (iStock)

The Federal Bureau of Investigation's facial-recognition technology, used to identify individuals and assist in investigations, could fail as often as one in every five times it is used, new documents show.

A 2010 report recently made public by the Electronic Privacy Information Center through a Freedom of Information Act request states that the facial-recognition technology "shall return an incorrect candidate a maximum of 20% of the time." When the technology is used against a searchable repository, it "shall return the correct candidate a minimum of 85% of the time."


"An innocent person may become part of an investigation because the technology isn't completely accurate," said Jeramie Scott, an attorney with EPIC who reviewed the documents, citing the Boston Marathon bombings as an example. "They're pushing it forward even though the technology isn't ready for prime time."

FBI officials could not be reached for comment, perhaps owing to the government shutdown.

The numbers posted by facial-recognition software compare unfavorably with other identification techniques used as part of the bureau's Next Generation Identification program, including fingerprinting, which yields an accurate match 99 percent of the time when searching a database, and iris scans, which return correct matches 98 percent of the time.


Currently, no federal laws limit the use of facial-recognition software by private-sector companies or the government, and it is used in varied applications, from law enforcement to social networks to motor-vehicle departments in several states.

Last year, Sen. Al Franken, D-Minn., expressed concern during a Senate Judiciary subcommittee hearing that facial-recognition technology could be used to violate a person's privacy without their knowledge.

In terms of law enforcement, the failure rate may undermine the FBI's intention to use next-generation technology to identify criminal suspects being sought in active investigations, Scott said.

Documents dug up by EPIC earlier this year revealed a Homeland Security Department initiative known as the Biometric Optical Surveillance System that is working to use computers and cameras to quickly scan crowds and identify people. Though face-scanning technology has improved dramatically in recent years—as demonstrated by Facebook, which for years has featured an "auto-tag" feature that scans uploaded photos—accurately pulling a face from a crowd is still a difficult challenge.


Jerome Pender, who was then the deputy assistant director of the FBI's criminal-justice information services division, told Congress last year that the bureau was planning to update its 2008 Privacy Impact Assessment, which would "address all evolutionary changes" in the use of facial-recognition technology over the past several years. But the FBI has yet to release a revised assessment, Scott said.

This article appears in the October 15, 2013 edition of NJ Daily.
