
The Limitations And Issues With Facial Recognition Software

We've all seen television shows where police technicians use facial recognition software to swiftly and accurately identify suspects and catch the bad guys. How accurate is the technology in reality? An article in The Guardian newspaper discussed the promises, limitations, and issues with facial recognition software used by law enforcement:

"The software, which has taken an expanding role among law enforcement agencies in the US over the last several years, has been mired in controversy because of its effect on people of color. Experts fear that the new technology may actually be hurting the communities the police claims they are trying to protect... "It’s considered an imperfect biometric," said Clare Garvie, who in 2016 created a study on facial recognition software, published by the Center on Privacy and Technology at Georgetown Law, called The Perpetual Line-Up. "There’s no consensus in the scientific community that it provides a positive identification of somebody"... [Garvie's] report found that black individuals, as with so many aspects of the justice system, were the most likely to be scrutinized by facial recognition software in cases. It also suggested that software was most likely to be incorrect when used on black individuals – a finding corroborated by the FBI's own research. This combination, which is making Lynch’s and other black Americans’ lives excruciatingly difficult, is born from another race issue that has become a subject of national discourse: the lack of diversity in the technology sector... According to a 2011 study by the National Institute of Standards and Technologies (Nist), facial recognition software is actually more accurate on Asian faces when it’s created by firms in Asian countries, suggesting that who makes the software strongly affects how it works... Law enforcement agencies often don’t review their software to check for baked-in racial bias – and there aren’t laws or regulations forcing them to."
