anigbrowl 6 hours ago

It is an AI error, but also an error on the part of the cops, the prosecutors, the judge, and the county sheriff (who is responsible for the jail inmates). I hope everyone involved in this travesty is sued into oblivion and unable to hide behind their immunity defenses. Facial recognition should never be the sole basis for a warrant.

idle_zealot 6 hours ago | parent | next [-]

> It is an AI error, but also an error on the part of the cops, the prosecutors, the judge, and the county sheriff

Yes, it's critical to remember that multiple parties can be at fault. In a case like this, it is true that

a) law enforcement misused a tool and demonstrated extreme negligence

b) the judiciary didn't catch this, which suggests systemic negligence there too when it comes to their oversight responsibilities

c) the company selling/providing this AI tool should have known it was likely to be misused and is responsible for damages caused by such predictable usage

We cannot have a just world until our laws and norms result in loss of jobs and legitimacy as punishment for this sort of normalized failure, from all three parties. Immunity is a failed experiment.

recursivecaveat 6 hours ago | parent | prev | next [-]

Even if she were a dead ringer (clearly not the same person to any human who glances at the image), common sense should tell you that among 340,000,000 Americans there are a lot of lookalikes. Clearly there's a kind of stupid belief in the mystic powers of an AI and a callous disregard for the well-being of suspects. No one should be dragged 1000 miles and held for months based on a facial match, especially when exculpatory evidence was easily available.

causal 6 hours ago | parent | prev | next [-]

This x1000. We need to suspend this shared fiction that AI has any agency. Only humans can be responsible. Full stop.

irishcoffee 5 hours ago | parent [-]

ICE detains innocent woman 1200 miles away based on AI

Same comment?

causal an hour ago | parent | next [-]

This question doesn't even make sense. Why wouldn't humans still be the ones responsible? Bot account?

GuinansEyebrows 5 hours ago | parent | prev [-]

respectfully, can you elaborate on why the answer would not be yes? or am i just misreading your comment?

mekoka 5 hours ago | parent | prev [-]

> It is an AI error

The software identified the person as Angela Lipps. According to the court documents, the Fargo detective working the case then looked at Lipps' social media accounts and Tennessee driver's license photo.

In his charging document, the detective wrote that Lipps appeared to be the suspect based on facial features, body type and hairstyle and color.

The software worked exactly as intended. It's a filtering tool that sifts through data for common patterns to provide leads, not matches. It raises a flag on persons of interest. You can be a "match" anywhere between 0 and 100%, and only relative to some specific input (like that overhead picture of the woman at the teller). In that sense, mismatches are within acceptable parameters and have been known to happen.

A "match" is a pronouncement ultimately made by the humans who use the tool, after they've checked out the leads. Someone was asleep at the wheel here.
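
To illustrate the point about leads vs. matches, here's a minimal sketch of what this kind of tool conceptually does. All names, scores, and the threshold are illustrative assumptions, not details of any real vendor's system:

```python
# Hypothetical sketch: a facial-recognition system returns ranked
# candidates with similarity scores relative to one probe image.
# It produces leads to investigate, never confirmed identifications.

def rank_leads(score_by_person, lead_threshold=0.85):
    """Return candidates whose similarity to the probe exceeds a
    threshold, sorted best-first. Every entry is only a lead."""
    leads = [(person, score)
             for person, score in score_by_person.items()
             if score >= lead_threshold]
    return sorted(leads, key=lambda pair: pair[1], reverse=True)

# Among hundreds of millions of people, multiple innocent lookalikes
# will clear any practical threshold -- which is why human
# verification of each lead is the step that actually matters.
scores = {"actual_suspect": 0.97, "lookalike_a": 0.91, "lookalike_b": 0.88}
print(rank_leads(scores))
```

The tool's output is the full ranked list; declaring any one entry a "match" is a human judgment made afterward.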