Hacker News

Essex police pause facial recognition camera use after study finds racial bias (https://www.theguardian.com)

47 points by Brajeshwar 1 day ago | 45 comments

bsenftner 1 day ago |

Former author of one of the top 5 facial recognition servers in the world for multiple years running; here's what's going on: the industry has solved this issue, but potential clients are seeking the lowest bidder and picking the newer companies, the nepotistically created, well-connected firms that are not really players, and those companies have terrible implementations. This is not a case of the technology not being there yet; we solved all these racial bias issues 10 years ago. But new companies, with new training sets and new ML engineers who do not know any of the industry's history, are now landing contracts with terrible-quality models, thanks to well-connected sales channels.

OJFord 1 day ago |

This is actually more (socially/ethically/philosophically) interesting than one might assume from the headline: it's not about false positives, it's that the system is more effective (correctly identifies that someone is on a watch-list) for one group than another within a protected characteristic.

So essentially they're pausing the use of it because it works too well for group A / not well enough for group B, potentially leading to disproportionate (albeit correct) arrests of group A.
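To make the distinction concrete, here is a minimal sketch of the metric at issue, the per-group true positive rate; the counts are purely hypothetical placeholders, not figures from the Essex study or the article:

    # Sketch: true positive rate per group, i.e. of people who really ARE on
    # the watch-list, how many the system correctly flags.
    # All numbers are hypothetical placeholders, not study data.

    watchlist_hits = {
        # group: (correct identifications, watch-listed people scanned)
        "group_A": (90, 100),  # hypothetical: 90 of 100 flagged -> TPR 0.90
        "group_B": (60, 100),  # hypothetical: 60 of 100 flagged -> TPR 0.60
    }

    for group, (true_positives, total_on_list) in watchlist_hits.items():
        tpr = true_positives / total_on_list
        print(f"{group}: true positive rate = {tpr:.2f}")

    # The false positive rate (people NOT on the list who get flagged) could
    # be identical for both groups and this disparity would still exist,
    # which is the point being made above.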

blitzar 1 day ago |

> the system was more likely to correctly identify men than women and it was “statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”

Technology has no doubt moved on a lot; however, a lazy Google literature search turns up studies finding the opposite (and with order-of-magnitude errors) as recently as 2020:

> these algorithms were found to be between 10 and 100 times more likely to misidentify a Black or East Asian face than a white face

https://jolt.law.harvard.edu/digest/why-racial-bias-is-preva...

ap99 1 day ago |

> more likely to correctly identify men than women.

> more likely to correctly identify black participants than participants from other ethnic groups.

> AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.

I wonder if they're more worried about putting too many men in prison or too many black people.

ghusto 1 day ago |

> the system was more likely to correctly identify men than women and it was “statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.

I am genuinely unsure what's going on.

My understanding of the article is that the system is problematic because it is more likely to correctly identify black people than "other ethnic groups". Is that right?

glyco about 21 hours ago |

This seems like an easy problem to solve - when the system informs you of a black criminal, just roll dice to ignore them and let them get away.

pingou 1 day ago |

If the suspect is Black, the software should automatically return zero matches in 30% of cases. Problem solved.

moi2388 1 day ago |

“statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.

Great. Wasn’t the problem before always that it couldn’t correctly identify non-white people? It does it accurately now. That is somehow also a problem? It should make more mistakes?

bloqs 1 day ago |

Correlation does not indicate causation

gib444 1 day ago |

Alternative headlines:

Essex police, well aware of all the issues before using it, pause use until expected bad publicity dies down

Or

Essex police chosen as the force to take some flak for the issues while other forces steam ahead