safety1st | 3 days ago
I don't think that using ChatGPT as a therapeutic aid is the root problem here. The root problem is the tendency to anthropomorphize AI. I use ChatGPT to help me work through emotional issues, but I'm simultaneously aware that it's code controlled by a soulless corporation that decidedly does not have my best interests in mind. It can be a research assistant or tell me things I wasn't aware of before. But why would I use the incarnation of corporate evil to affirm me? Why would I even want that? I want the good guys to affirm me, not a robot puppeted by the bad guys.

Anyone who hasn't taken a look at r/MyBoyfriendIsAI should. This is Black Mirror stuff, and these people are absolutely delusional. It's a variant on the crazy cat lady phenomenon, but far worse, because cats are actually pretty cool. OpenAI, Meta, and Google are not cool and cute. They're some of the biggest criminals in our society pretending to be cool and cute. These companies are being dragged through court as we speak for breaking the law and harming their users. What do you think they have planned for you?

As primates, we're wired to anthropomorphize anything that demonstrates human-like characteristics, and then to develop affection for it. That's all this is. A lot of people who don't understand the basics of human psychology are going to be preyed on by these corporations.

The good news is that we've already been through all of this in the last decade with social media. That was the warmup act: these corporations realized they could get people to connect with parasocial technology, upend their lives, and convert themselves into advertising inventory. The LLM masquerading as a human is their pièce de résistance, a far more potent weapon in their hands than the doomscroll ever was.

At least we know. At least we can be ready. The absolute key to all of it is understanding that you're in a relationship with, e.g., the known criminal Meta Corporation, not with your chatbot. All pretensions then fall away.