Mike McCormick

Facing Up To Bad Biometrics

Updated: Sep 27, 2021


The ACLU conducted a controversial experiment last week with Amazon’s popular face matching software Rekognition. They ran photos of every member of Congress against a database of criminal mugshots. Twenty-eight members of Congress matched as potential criminals, many of them people of color. One of the 28, Rep. John Lewis, wrote a letter to Amazon asking them to “address the defects of this technology in order to prevent inaccurate outcomes.”

If ACLU’s intent was to focus congressional attention on the poor state of face biometric products, however briefly, they succeeded. But this has come up before. Last May the Congressional Black Caucus raised concerns to Amazon that local police use of Rekognition in black communities could lead to racial profiling.

Microsoft jumped into the debate last week too, when its president Brad Smith blogged that since face biometric “technology will be deployed more broadly across society, the only way to regulate this broad use is for the government to do so.” Users of Microsoft’s face recognition products include federal agencies such as ICE. So, when a company like Microsoft asks for government regulation, that gets people’s attention.

In the ACLU test, Rekognition demonstrated a false match rate (FMR) of 38% for people of color (versus 5% overall). Congress is a small sample (535 members), but the rate for people of color is dramatically higher than the FMR one expects from a properly tuned biometric system: typically around 1% in verification applications and 5% in identification/surveillance applications.
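For readers who want the arithmetic, here is a minimal Python sketch of how an overall FMR like that 5% figure falls out of the raw counts reported above (28 false matches among 535 members):

```python
# Minimal sketch of the FMR arithmetic. The counts come from the ACLU
# test described above: 28 false matches among 535 members of Congress.

def false_match_rate(false_matches: int, probes: int) -> float:
    """Fraction of probe images that incorrectly matched someone in the database."""
    return false_matches / probes

print(f"Overall FMR: {false_match_rate(28, 535):.1%}")  # ~5.2%
```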

The most likely explanation for an elevated FMR among people of color is that they were underrepresented in the initial training dataset. Biometric matching systems, often based on artificial neural networks, learn to recognize faces by looking at thousands of images. If the training dataset isn’t representative of the general population in some respect (ethnicity, age, gender, facial hair), the resulting system can perform poorly for that subgroup.
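One way to detect this kind of bias is to measure FMR separately for each demographic subgroup and compare. The sketch below assumes a hypothetical evaluation harness that records, for each probe, the subject’s group and whether a false match occurred; it is not any vendor’s API:

```python
# Hedged sketch: auditing a face matcher for subgroup bias.
# The (group, falsely_matched) records are the hypothetical output
# of an evaluation harness, not a real vendor API.
from collections import defaultdict

def fmr_by_group(results):
    """results: iterable of (group, falsely_matched) pairs, one per probe."""
    totals = defaultdict(int)
    false_matches = defaultdict(int)
    for group, falsely_matched in results:
        totals[group] += 1
        false_matches[group] += bool(falsely_matched)
    return {g: false_matches[g] / totals[g] for g in totals}

# A large gap between groups is a red flag that the training
# dataset underrepresented one of them.
```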

Amazon's response doesn't offer a specific explanation of the ACLU results, but rather recommends rerunning the test with the confidence threshold raised from its default of 80% to 99%. Amazon claims 99% is the level recommended in Rekognition's product documentation, yet doesn't explain why the default is set so much lower. Security should always be the default.
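For the curious, that threshold is a single parameter in Rekognition’s API. This boto3 sketch shows one way to raise it using the CompareFaces operation; the bucket and image names are placeholders, and omitting the parameter leaves the 80% default in effect:

```python
# Sketch of raising Rekognition's similarity threshold from the default
# (80) to the 99 Amazon recommends. Bucket and object names are
# placeholders; requires boto3 and AWS credentials with Rekognition access.
import boto3

client = boto3.client("rekognition")

response = client.compare_faces(
    SourceImage={"S3Object": {"Bucket": "my-bucket", "Name": "probe.jpg"}},
    TargetImage={"S3Object": {"Bucket": "my-bucket", "Name": "mugshot.jpg"}},
    SimilarityThreshold=99.0,  # omit this and the service defaults to 80
)

for match in response["FaceMatches"]:
    print(f"Similarity: {match['Similarity']:.1f}%")
```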

Because Rekognition has been widely adopted in US police departments, it’s natural for groups like ACLU to focus on that product. But Amazon isn’t the only company making weak face recognition products. Google’s Photos app reportedly labeled black people as gorillas. And here on this blog I posted my concerns about Apple FaceID in the new iPhone X.

Apple boasts it used a dataset of one million face images to train FaceID’s neural net. How many of those faces were black, Hispanic, or Asian? Among other things, my letter to Apple asked them to “disclose details about neural net training and data set”. I received a polite thank you from Apple, but they provided no information and apparently took no further action.

Biometrics can be implemented well. Systems deployed in industrial and military settings use high-definition sensors, sophisticated liveness testing to prevent spoofs, supervised enrollment, conservative match thresholds, extensive training on large, diverse datasets, plus ongoing adaptation and tuning to minimize FMR. These measures are costly, so consumer-grade solutions tend to be less robust.
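As one concrete example of a “conservative match threshold,” a common tuning step is to choose the threshold empirically from impostor trials so the measured FMR stays at or below a target. A hedged numpy sketch, where impostor_scores is a hypothetical array of similarity scores measured on pairs known to be different people:

```python
# Hedged sketch: choosing a conservative match threshold empirically.
# impostor_scores is a hypothetical array of similarity scores from
# trials where the probe and enrolled identity are different people.
import numpy as np

def threshold_for_target_fmr(impostor_scores, target_fmr=0.01):
    """Return the score above which only ~target_fmr of impostor scores fall."""
    return float(np.quantile(np.asarray(impostor_scores), 1.0 - target_fmr))
```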

Rekognition may in fact be adequate for consumer use. But if Amazon markets it to law enforcement agencies, it should work with them to tune the system for their environments and achieve an acceptable FMR. It should also warn them that face recognition systems are highly fallible: a face match by itself is not admissible legal evidence, or even grounds for arrest. Most important of all, Amazon should fix the software to match faces accurately across all ethnic groups.

This is an area where some light government regulation could be helpful, as Microsoft suggests; for example, rules defining how biometric tools may and may not be used in law enforcement and criminal courts.

If you’re concerned about Amazon marketing Rekognition to government agencies and police departments, you can sign the ACLU petition asking them to stop.


Michael McCormick is an information security consultant, researcher, and founder of Taproot Security.
