camillomiller 5 hours ago
A CTO sent me a message that opened with: “Here’s a friendly message that will perfectly convey what you want to say”. A double-PhD friend says she has to talk to ChatGPT for all sorts of advice and can’t feel safe not doing it, “because you know I’m single and don’t have a companion to spitball my ideas”. She let ChatGPT decide which route to take to get to a certain island, and she got stranded because the suggested service didn’t exist. I have more examples. It’s a fucking mind virus.
sigseg1v 4 hours ago | parent
How is the getting-stranded example different from asking on a travel forum how to get somewhere, having an active and well-intentioned user who isn’t familiar with your destination give you wrong directions, and getting lost?
| ||||||||||||||||||||||||||||||||||||||||||||