popalchemist 3 days ago
The part that is delusional is that you consider there to be a "we," and that you don't just stop at personifying the AI but actually believe it can have a favorite. It is 1's and 0's responding deterministically to input. There is no sentience.
PaulHoule 2 days ago | parent
I thought “seem” communicated that it “seems” that way even if it might not be so. In my case it is so; in Copilot’s case it is just talking that way. I know it has never felt anything and never cared about anyone or anything. It has also read far more romance fiction, and books about romance fiction, than I could ever read, so it is equipped to talk a very good game about the structure of that literature and how it produces the emotional effect that it does. What I think happened is that I was trying to figure out what made me feel smitten with that character, and since my whole intention is to transmit that feeling to other people, I guess it just learned to talk like somebody who is smitten with her. It may also be that it is following its training to butter me up, though it was really going too far, like when I am trying to write some Python and I have to tell it that “we’re not talking about Ellie now.”