47 points by Brajeshwar 1 day ago | 45 comments
OJFord 1 day ago |
So essentially they're pausing the use of it because it works too well for group A / not well enough for group B, potentially leading to disproportionate (albeit correct) arrests of group A.
blitzar 1 day ago |
Technology has no doubt moved on a lot; however, a lazy Google literature search turns up studies finding the opposite (and with order-of-magnitude errors) as recently as 2020:
> these algorithms were found to be between 10 and 100 times more likely to misidentify a Black or East Asian face than a white face
https://jolt.law.harvard.edu/digest/why-racial-bias-is-preva...
ap99 1 day ago |
> more likely to correctly identify black participants than participants from other ethnic groups.
> AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.
I wonder if they're more worried about putting too many men in prison or too many black people.
ghusto 1 day ago |
I am genuinely unsure what's going on.
My understanding of the article is that the system is problematic because it is more likely to correctly identify black people than "other ethnic groups". Is that right?
moi2388 1 day ago |
Great. Wasn’t the problem before always that it couldn’t correctly identify non-white people? Now it identifies them accurately, and that is somehow also a problem? Should it make more mistakes?
gib444 1 day ago |
Essex police, well aware of all the issues before using it, pause its use until the expected bad publicity dies down.
Or:
Essex police chosen as the force to take some flak for the issues while other forces steam ahead.