cinntaile 5 hours ago

Don't leave us hanging. What happened?

camillomiller 4 hours ago | parent | next

A CTO sent me a message that opened with:

“Here’s a friendly message that will perfectly convey what you want to say”.

A friend with two PhDs says she has to talk to ChatGPT for all sorts of advice and doesn't feel safe not doing it, "because you know I'm single and don't have a companion to spitball my ideas with". She let ChatGPT decide which route to take to a certain island, and she got stranded because the suggested ferry service didn't exist.

I have more examples. It’s a fucking mind virus.

sigseg1v 4 hours ago | parent

How is the getting-stranded example different from asking on a travel forum how to get somewhere, where an active and well-intentioned user who isn't familiar with your destination answers, gives you wrong instructions, and you get lost?

andrewflnr 3 hours ago | parent | next

The key missing step is where the traveler exercises critical thinking and checks the advice they get. Some people seem to turn that off for LLMs.

shahbaby 4 hours ago | parent | prev | next

Because they aren't probabilistic parrots? If they get it wrong, there's usually an understandable reason behind it.

kibwen 4 hours ago | parent | prev

Because the vast and overwhelming majority of the time, if you ask a question into the ether that nobody has a good answer to, most people will gloss over it and not bother answering, as attested by decades of relatable memes ( https://xkcd.com/979/ ). In contrast, the chatbot is trained to always attempt an answer, and is seemingly disincentivized by its training to just shrug and say "I don't know, good luck fam".

dude250711 4 hours ago | parent | prev

They stop thinking, and they stop verifying the output too.