ricktdotorg 2 days ago

this is exactly the same as people who drive their car into a river because google maps told them to.

heavyset_go 2 days ago | parent | next [-]

If you don't listen to Google Maps and drive into a river, you're going to be left behind.

danielbln 2 days ago | parent [-]

That's why I drive with 10 pounds of paper maps in my car. I won't have any of this newfangled GPS tech atrophying the map-reading skills I've honed so carefully.

defrost 2 days ago | parent [-]

If you're carrying ten pounds of paper maps, you're doing no GPS / no digital maps navigation wrong.

trehalose 2 days ago | parent | prev [-]

If you were driving over an unmarked, unbarricaded bridge that Google Maps had directed you across on a dark and rainy night, are you 100% certain you'd be driving slowly, undistracted, and checking that the bridge hadn't collapsed?

gruez 2 days ago | parent [-]

This analogy doesn't work, because if a bridge exists and has no traffic cones or barriers, you can assume it was probably built by humans and is fit for use (i.e. isn't half built). No such guarantee exists for LLM output, which is wholly generated by AI. If I were in some simulation where the environment was vibecoded by AI, I'd be very careful too.

trehalose 2 days ago | parent [-]

That's kind of what I was trying to say, or at least it goes along with it. This meme of "somebody drove into a river just because Google Maps told them to" is a grossly distorted retelling of a fatal accident. One could twist any tragedy into a glib soundbite about how the dead stupidly trusted other people. The street could collapse under my feet as I'm crossing it and I could drown in the sewer, and people on the internet would laugh about how I dived into the sewer just because a traffic light told me to. There were some cracks in the asphalt, so obviously I should have known it wasn't safe to walk across, but I wasn't thinking for myself.

I suppose part of the reason so many people are so dangerously trusting of LLMs is that they assume that if the LLM was put out there by decently responsible humans (a doubtful but understandable assumption), then the LLM itself should be decently responsible too. The analogy does break down there.