Hacker News

Innocent woman arrested in false-positive in facial recognition system in Brazil

An innocent woman was arrested today (while working in an informal job, without any personally identifiable documentation on her) in Rio de Janeiro. The automated facial recognition system (FRS) identified her as another woman, wanted for murder - it turns out the actual criminal is _already in jail_, and the police organization operating the FRS had not yet been informed due to delays between the systems.

This is the second day of operation of the system, according to the press. However, a restricted version of the same FRS was used for 15 days during the Carnaval festivities with no alarming failures and reportedly resulted in a few arrests.

It has now been deployed in 25 locations in Copacabana. (Anecdotally, I live here and can't tell you where they are - the equipment, cameras, etc. must be well hidden, or I'm terrible at spotting them.)

Source, in Portuguese: "Facial recognition fails in its second day and innocent woman confused with criminal is arrested" https://oglobo.globo.com/rio/reconhecimento-facial-falha-em-segundo-dia-mulher-inocente-confundida-com-criminosa-ja-presa-23798913

Related previous submission (with zero comments): https://news.ycombinator.com/item?id=19074434

93 points | posted by gota 4 months ago | 47 comments
drtillberg said 4 months ago:

And in other news, 25% of all stranger identifications by human eyewitnesses also were found to be erroneous.[1]

[1] https://californiainnocenceproject.org/issues-we-face/eyewit...

gota said 4 months ago:

I agree that automated systems applied to the same task will (eventually) beat human performance by a lot, but there are 2 issues that make this less relevant in this case.

The first relates to this application of facial recognition targeting poor people. This woman had no documentation on her. I'd wager this is really common in Brazil. How could she possibly prove she is _NOT_ the person the software with 99.999..% accuracy says she is?

If the actual criminal wasn't already in jail, would she be set free in the same day she was arrested?

The second issue is perhaps more universal to other automations, but it certainly compounds the first one: _scale_. Human cops can't check every person they see against the entire database of wanted people.

A 25% false-positive rate from every cop yields a hell of a lot fewer errors than a 0.00001% rate from an automated system applied to every crowd, everywhere, all the time.

droithomme said 4 months ago:

> 99.999% accuracy

OK, so for a given facial profile, 0.001% of people passing in a crowd will be incorrectly identified as a match. That's 1 in 100,000 people who pass by will be incorrectly matched against a particular profile, correct? How many people walk past this camera? 1000 in a day? How many cameras are deployed in an area? How many facial profiles does it have in its database that it is checking against?

It seems to me that if you have a lot of faces you are looking for and a big crowd, or even a modest crowd, then you are going to get a lot of false positives. Now if the wanted person walks past the camera then it has a chance to get its identification correct. If they don't walk past that camera there's no chance to get its identification correct. A given streetcorner with a camera where 1000 people walk past in a day may be in a city with 5 million people, only one of whom is the wanted person, and who might never pass that streetcorner.

Given all this, if a match is triggered, what are the actual odds that the person is the right person? Is it more than 1 in 100? More than 1 in 1000? Does an officer have probable cause for an arrest if there is a 0.1% chance someone has done something? Do they have reasonable suspicion for a detention? Is the officer justified in shooting and killing the person who has had a facial match if they attempt to arrest the person and the person runs away?
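droithomme's back-of-the-envelope reasoning is the classic base-rate problem, and it can be made concrete with a short sketch. All the numbers below (watchlist size, daily foot traffic, true-positive rate) are illustrative assumptions for the sake of the arithmetic, not figures from the article:

```python
# Base-rate sketch of the argument above. All inputs are assumptions.
false_positive_rate = 0.001 / 100   # "99.999% accuracy" -> 0.001% FP per comparison
watchlist_size = 1000               # assumed number of wanted profiles in the database
daily_passersby = 1000              # assumed foot traffic past one camera per day

# Each passerby is effectively compared against every profile on the
# watchlist, so expected false alarms per camera per day:
expected_false_alarms = daily_passersby * watchlist_size * false_positive_rate
print(expected_false_alarms)  # 10.0 false alarms per camera, per day

# Probability that a triggered match is actually the right person (Bayes),
# assuming at most one wanted person walks past per day and the system
# catches them 99% of the time:
true_positive_rate = 0.99
p_match_is_correct = true_positive_rate / (true_positive_rate + expected_false_alarms)
print(round(p_match_is_correct, 2))  # ~0.09: under a 10% chance the match is right
```

Under these (made-up but plausible) numbers, a "99.999% accurate" system still produces an alert stream where roughly 9 out of 10 matches are innocent people, which is exactly the probable-cause question the comment raises.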

EForEndeavour said 4 months ago:

Maybe an unexpected benefit of AI-augmented law enforcement is that police, lawyers, and judges will be required to learn (more) applied statistics.

dragonwriter said 4 months ago:

> How could she possibly prove she is _NOT_ the person the software with 99.999..% accuracy says she is?

She couldn't prove it any more than if the source of the erroneous identification were a human looking at a wanted picture and matching her against it.

Any reasonable criminal justice system incorporates the fact that innocent people will be arrested because it's impractical to impose high proof requirements for arrest, but at the same time will provide due process and have high proof requirements for conviction.

If a justice system has a problem on either of those dimensions, it needs to be addressed urgently, independently of the use of facial recognition as an input to arrest decisions.

frogpelt said 4 months ago:

Scalability is definitely the problem here.

said 4 months ago:
[deleted]
gtirloni said 4 months ago:

For the first issue, carry your documents. I've been taught that since I first got my ID at 12yo.

My understanding is that the automation is to verify people who are needed by law, right? They don't voluntarily hear a beep and go into the jail. There are police officers that still have to arrest them and check their documents.

The situation is so bad in Rio that society simply will have to deal with any deficiencies in the automated system. It will still be better than the current reality.

The fact that police officers are probably not giving people of color enough flexibility in situations like these is beyond the scope of the automation system.

EDIT: Carrying your documents is not a popular opinion among the US audience, I understand. Still, I find it very reasonable and I think the situation that happened to this woman is being blown out of proportion.

NikolaNovak said 4 months ago:

Indeed, the notion of carrying your documents is very culturally dependent, ranging from a rational decision to a quasi-religious, fervently held belief.

Coming from Eastern Europe originally, I carry my ID everywhere as a matter of habit in Canada; but in turn get very very very nervous and uncomfortable if it's taken from me (for example, at hotels in some countries, or at the border for a quick check).

droithomme said 4 months ago:

The US tries not to be a papers please police state where it's a crime not to carry papers with you and present them on demand. We consider systems like that to be totalitarian and despotic based on our past experiences observing and waging war against hostile oppressive nations with such systems.

foofoo55 said 4 months ago:

> We consider systems like that to be totalitarian

Who is "we"? Non-citizens of the USA are required to carry their immigration status documents with them, [1] and checkpoints within the 100-mile border zone (where 2/3 of the population lives [2]) are one place where they must be presented. If you are a US citizen without proof, and are stopped, and the official thinks you are lying, I believe you are going to have A Bad Day.

[1] https://www.aclu.org/blog/immigrants-rights/immigrants-right...

[2] https://www.aclu.org/other/constitution-100-mile-border-zone

wcoenen said 4 months ago:

> If you are a US citizen without proof, and are stopped, and the official thinks you are lying, I believe you are going to have A Bad Day.

e.g. US citizen Valeria Alvarado was shot and killed when she panicked and tried to get away from plainclothes CBP agents.

AdrianB1 said 4 months ago:

Well, in most cases "we" means the permanent legal inhabitants, not tourists. Legislation is not created by tourists.

facorreia said 4 months ago:

Permanent legal residents are required to carry their IDs.

patrick5415 said 4 months ago:

He/she is not making this up. This is pretty much exactly how I think. When somebody says, “don’t want to be arrested by a bot? Just carry your papers!”, that sounds like absolutely evil tyranny to me.

said 4 months ago:
[deleted]
facorreia said 4 months ago:

The US requires permanent residents to carry their green cards at all times.

xphilter said 4 months ago:

What’s the point of bringing that up? They’re not mutually exclusive. The difference though is that the accused is granted the right to test the veracity of the witness. Will the accused in cases like this be allowed to challenge the code? How can a poor person (who will undoubtedly be the one affected by this issue) afford that? And even if they can, the truth will be muddied by “experts” and a judge without any training or knowledge about facial recognition.

SiempreViernes said 4 months ago:

What is there to challenge, other than that you look like someone else?

xphilter said 4 months ago:

Or, as one example, that the algorithm was never tested on people of color and so the false positive rate is higher?

gtirloni said 4 months ago:

You're assuming the "algorithm" has the same faulty memory as witnesses. There are recorded images that can be triple checked by humans before any verdict is given (in the case of a conviction).

A judge is not simply going "oh sorry, the system tells me it was you and I can't verify that. Gotta trust it".

xphilter said 4 months ago:

You’re equally assuming plenty. If someone is arrested based on a false positive, they would likely be interrogated by the police. Would it be so crazy for that person to admit to a crime they didn’t commit? There are plenty of hypotheticals. My point is that there need to be strong checks against the use of facial recognition. Citizens should see the algorithm, the training data, and the results. The public should test the software, rather than a poor criminal defendant.

said 4 months ago:
[deleted]
mikece said 4 months ago:

As long as biometrics alone are the basis for an arrest, we'll see more and more of this. I'm curious what the accuracy score for the biometric source "evidence" was on which she was arrested, and what the match score was between her face and the source evidence.

gtirloni said 4 months ago:

She wasn't arrested. She was detained and taken to the police station and her family brought her documents to prove she wasn't the other woman they were after.

Retric said 4 months ago:

Being detained is in the same category as being arrested. She was forcibly taken into custody only to be released without any form of compensation for lost time.

In many countries people can be held for significant amounts of time without being formally charged with any crime.

gtirloni said 4 months ago:

Similar category but very different outcomes, rights, etc.

https://criminal-law.freeadvice.com/criminal-law/arrests_and...

typenil said 4 months ago:

You're making a pedantic argument and backing it up with a link specific to the US court system. What the US Supreme Court says has no bearing in Brazil.

gtirloni said 4 months ago:

No, I'm not. The Brazilian supreme court says pretty much the same thing.

EForEndeavour said 4 months ago:

I feel you're missing the point entirely: to the person who was forcibly taken into custody by police and likely didn't have a clear and confident understanding of their rights (a fair assumption for the general populace, I'd say), it doesn't matter what you call it.

gtirloni said 4 months ago:

Of course it's upsetting. I don't think anyone is making the case that it's just a regular day and you shouldn't complain. The automation system needs to improve, that's blatantly obvious. And the police officers need to learn to work with it.

This is what happened (detention): "I was working and the police arrived. They confused me with someone else they were looking for and I had to go to the police station and prove I wasn't that person. What a day."

Not this (arrest): "I was working and the police arrived. They confused me with someone else they were looking for and I had to go to the police station and prove I wasn't that person. They still didn't believe me and I'm in a jail cell waiting for a judge to hear my case. I don't think I'll be home for a few days."

brandonmenc said 4 months ago:

> She wasn't arrested. She was detained

Good thing no one is ever harmed while being detained but not yet arrested. /s

kabwj said 4 months ago:

Being detained doesn’t equal being beaten up.

DanBC said 4 months ago:

> She wasn't arrested.

> She was detained and taken to the police station

What's that if it's not an arrest?

gtirloni said 4 months ago:

That's a detention.

An arrest is when you're charged with a crime. It's a much more serious situation.

ainar-g said 4 months ago:

I severely dislike public facial recognition systems, but wouldn't it basically be the same if a police officer mistook her for someone else based on a sketch or a composite? The scale is obviously different though.

dv_dt said 4 months ago:

With a sketch or composite, the identifying individual bears full responsibility for making the match. With a recognition system, the software externalizes that responsibility, at best leaving the individual the responsibility to reject the match. That setup inherently gives the recognition system a strong tendency to override individual judgement.

droithomme said 4 months ago:

> the software externalizes the responsibility

The arresting officer still has the responsibility though to decide whether or not to use deadly force against the person should they resist the false arrest.

mikece said 4 months ago:

Human cops don't have implicit faith in the accuracy of their human colleagues. But if a COMPUTER lists you as a biometric match for a person of interest/suspect, then clearly you did _something_...

jolmg said 4 months ago:

Right. Many people believe that computers can't make mistakes. That belief isn't true, but you'll still have a hard time convincing them that the computer, despite identifying you, is mistaken.

duxup said 4 months ago:

And humans are inclined to believe CSI type magic.

srameshc said 4 months ago:

Pardon my ignorance, but isn't this a human rights issue? Can the UN or other organizations not question the governments who are in such a rush to deploy these systems despite all the evidence that they aren't reliable yet?

mateus1 said 4 months ago:

Yes it is.

But under the current authoritarian Brazilian president, the term "human rights" has been demonized.

Rio de Janeiro's Governor has advocated for snipers to shoot-to-kill people identified as criminals in the favelas. It's horrible.

said 4 months ago:
[deleted]
meruru said 4 months ago:

I'm not concerned with false positives. That's something the legal system can expect and handle properly. My issue with this kind of technology is the potential for abuse by incumbent powers. What happens when the government starts using this to track whistleblowers?

sudoaza said 4 months ago:

In Brazil, simply being preto (Black) can get you in trouble with the military police; I can only imagine it has gotten much worse under Bolsonaro.

magwa101 said 4 months ago:

There's a first: wrong person apprehended.