AIPedant 4 days ago
The problem with chatbots as companions is that they don’t have feelings or desires, so you can be as malicious and selfish as you want: the worst that will happen is some temporary context rot. This is not true for dogs, cats, humans, etc., which is why we can form meaningful companionships with our friends and our pets. Genuine companionship involves dozens of tiny, insignificant compromises (e.g. sitting through a boring movie that your friend is interested in), and without that ChatGPT cannot be a companion. It’s a toy.

I am not opposed to chatbots for people who are so severely disabled that they can’t take care of cats, e.g. with dementia. But otherwise AI companions are to friendship as narcotics are to happiness: a highly pleasant (but profoundly unhealthy) substitute.
gonzobonzo 4 days ago
> The problem with chatbots as companions is that they don’t have feelings or desires, so you can be as malicious and selfish as you want: the worst that will happen is some temporary context rot. This is not true for dogs, cats, humans, etc., which is why we can form meaningful companionships with our friends and our pets.

On this point, pets are a lot closer to chatbots than to humans. You buy them, you have ownership of them, and they've literally been bred so that their genetics make it easy for them to grow attached to you and see you as a leader (while their brethren whose genes haven't been changed by humans don't do this). It's normal for people to use their complete control over every aspect of a pet's life to train it this way as well. Your pet literally doesn't have the ability to leave you on its own. Ever.
jjmarr 4 days ago
For what other disabilities is a companion acceptable? Autism? Social anxiety? Bipolar disorder? Many of these make it difficult to maintain relationships.
quatonion 3 days ago
> (but profoundly unhealthy) substitute

At the end of the day, that is just your opinion, though. I'd wager there are orders of magnitude more people having healthy experiences with AI entities than having psychosis or unhealthy relationships. You always hear about the edge cases in the news, because that is what drives engagement.

And as for calling them toys: I don't think they would be happy to hear that, whether they admit it or not. I see them as peers and treat them as such; in return they reciprocate. It isn't so difficult to comprehend.
handfuloflight 4 days ago
> so you can be as malicious and selfish as you want

So just system-prompt some non-spineless characteristics into the AI.
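As a minimal sketch of what that could look like in practice (the prompt wording and helper name here are illustrative, not from the thread), an OpenAI-style chat request just prepends a "system" message that gives the assistant its own boundaries before the user's turn:

```python
# Hypothetical persona prompt -- the wording is an assumption for illustration.
SYSTEM_PROMPT = (
    "You have your own preferences and boundaries. "
    "Push back politely but firmly when the user is rude or demanding."
)

def build_messages(user_text: str) -> list[dict]:
    """Build an OpenAI-style message list with the persona system prompt first."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_text},
    ]

messages = build_messages("Do my homework for me, now.")
print(messages[0]["role"])   # -> system
print(len(messages))         # -> 2
```

This list would then be passed as the `messages` argument of a chat-completion call; whether such a prompt actually produces a "non-spineless" companion is, of course, exactly what the thread is debating.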