tony_cannistra 6 hours ago

Completely infuriating, but more of a commentary on the sad state of incompetent power-hungry law enforcement with tools they don't know how to use than the tools themselves.

Though, the question remains: are the tools built in such a way as to deceive the user into a false sense of trust or certainty?

_Some_ of the blame lies on the UX here. It must.

sidrag22 6 hours ago | parent | next [-]

It must land as a human's fault, or this will become more and more of a pattern used to avoid accountability.

paulhebert 6 hours ago | parent [-]

It’s both.

The cops need to be held accountable.

But it’s glaringly obvious that if you build tools like this and give them to the US police this is the outcome you will get. The toolmakers deserve blame too.

throw_m239339 6 hours ago | parent | prev | next [-]

> they don't know how to use than the tools themselves.

No, the tools work perfectly as they were designed to work. The problem is that the tools are flawed.

Ultimately, every single one of these decisions should be approved by a human, who should be responsible for the fuck-up no matter what the consequences are.

> _Some_ of the blame lies on the UX here. It must.

No, the blame lies with the person or the group who approved the usage of these tools without understanding their shortcomings.

jolmg 6 hours ago | parent | next [-]

>> are the tools built in such a way as to deceive the user into a false sense of trust or certainty? _Some_ of the blame lies on the UX here. It must.

> No, the blame lies with the person or the group who approved the usage of these tools without understanding their shortcomings.

The person who approved the tools might've understood, but that doesn't mean the user understands. _Some_ of the reason why the user doesn't understand the shortcomings of the tool might be because of misleading UX.

Pxtl 6 hours ago | parent | prev [-]

I miss the days of earlier AI image-recognition software that would emit a confidence percentage.

New LLM-related AIs are all supremely confident in every assertion, no matter how wrong.

janalsncm 6 hours ago | parent [-]

I don’t know what tool they used, but it was very likely not an LLM. They probably have some database of drivers’ licenses and they ran a similarity search against the surveillance footage. This poor lady happened to be the top match.

Even if it also output a score, that score depends on how the model was trained. And the cops might ignore it anyways.
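The failure mode described above can be sketched in a few lines: a nearest-neighbor search over face embeddings always returns *some* top match, and only an explicit score threshold distinguishes "best of a bad lot" from a confident hit. This is a minimal illustration, not the actual system; the cosine-similarity approach, the threshold value, and all names here are assumptions.

```python
import numpy as np

def top_match(probe, gallery, names, threshold=0.6):
    """Return the best-scoring gallery identity for a probe embedding.

    Without the threshold check, argmax returns a "match" even when
    every score is weak -- the failure mode discussed above.
    """
    # Normalize so dot products become cosine similarities.
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    scores = gallery @ probe
    best = int(np.argmax(scores))
    if scores[best] < threshold:
        return None, float(scores[best])  # no confident match
    return names[best], float(scores[best])

rng = np.random.default_rng(0)
gallery = rng.normal(size=(5, 128))        # 5 enrolled face embeddings (toy data)
names = [f"person_{i}" for i in range(5)]
probe = rng.normal(size=128)               # an unrelated face: all scores near 0
name, score = top_match(probe, gallery, names)
```

With the threshold in place, the unrelated probe is rejected; drop the threshold (or pick it poorly, or have the operator ignore the score) and "person_N" comes back as the answer regardless.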

ImPostingOnHN 6 hours ago | parent | prev | next [-]

> are the tools built in such a way as to deceive the user into a false sense of trust or certainty? _Some_ of the blame lies on the UX here. It must.

Are AI code assist tools built in such a way as to deceive the user into a false sense of trust or certainty? Very much so (even if that isn't a primary objective).

Does any part of the blame lie on the UX if a dev submits a bad change? No, none.

You are ultimately, solely responsible for your work output, regardless of which tool you choose to use. If using your tool wrong means you make someone homeless and car-less, and also kill their dog, then you should be a lot more cautious and perform a lot more verification than the average senior engineer.

tony_cannistra 6 hours ago | parent [-]

I agree with all that. Maybe the word isn't "blame," then. Surely there must be some code, perhaps moral or ethical, but ideally more rigorously enforceable, which ought to prevent the development of intentionally deceptive tools. Sure, you could say this about all software, but software that can cause actual physical harm ought to be held to a higher standard.

ImPostingOnHN 6 hours ago | parent [-]

Yes, unfortunately technology is advancing faster than the average human brain evolves more neurons, so it will only become less comprehensible to the average person.

That's setting aside the tendency for police to hire from the left side of the bell curve to avoid independent thinkers that might question authority, refuse to do bad shit, etc.

hsbauauvhabzb 6 hours ago | parent | prev [-]

Spoken like someone who isn’t built for a sales role at said company.

Sales will sell the dream, who cares if the real world outcomes don’t align?