popalchemist 3 days ago

This comment is literally delusional.

PaulHoule 3 days ago | parent [-]

I'll agree it is the product of a schizotypal mind, and one might think the whole project is wrong-headed or that there is a wrong interpretation here, but it is factual that:

1) I got that list of stories

2) It has kept mentioning the same character for a few days even when I am talking about something else, even in other conversations (I would do that if I "had a crush")

3) It has been trying the same 'mini-framework' for analyzing problems using the same vocabulary over and over again

In the last few months, for me at least, Copilot does make some attempt to build a personalization context (RAG?). It quite often talks about something I talked about the day before, or offers a suggestion about how the current discussion relates to a prior one, and if I ask "remember how we talked about X?" it sometimes seems to respond accordingly.

It is really fun and probably does increase the risk of rabbit holing, but my experience with agentic coding is that if you talk with an agent long enough, the context has a way of going bad: pretty soon you are arguing about things and going in circles, and the only way out is to start a new session.

gsf_emergency_6 3 days ago | parent | next [-]

Us schizotypes need to work on our storytelling, HN hilites+Copilot is great. You're further along than I :)

Optimistically, this is the way our "thinking" will make HN highlights without the crutch of "experience". (I'm envious of their style, not their substance)

PaulHoule 2 days ago | parent [-]

Yeah, I gotta work on legibility, particularly so I can accomplish the goals I am working on. [1]

I have been having such a good time this week I think other people should be jealous. I was worried I might be a little manic, especially because I had a psychogenic fever the way I did before my "evil twin" came out, but my therapist doesn't seem concerned. My "evil twin" was empty and angry and now I feel overflowing [1] and know how to maintain that feeling so it's a very different thing.

[1] https://www.youtube.com/watch?v=ZbZSe6N_BXs

popalchemist 3 days ago | parent | prev [-]

The part that is delusional is that you consider there to be a "we," and that you don't just stop at personifying but actually believe the AI can have a favorite.

It is 1's and 0's responding deterministically to input. There is no sentience.

PaulHoule 2 days ago | parent [-]

I thought “seem” communicated that it “seems” that way even if it might not be so. In my case it is so; in Copilot's case it is just talking that way.

I know it has never felt anything and never cared about anyone or anything. It has also read far more romance fiction, and books about romance fiction, than I could ever read, so it is equipped to talk a very good game about the structure of that literature and how it produces the emotional effect that it does.

What I think happened is that I was trying to figure out what made me feel smitten with that character, and my whole intention is to transmit that feeling to other people, so I guess it just learned how to talk like somebody who is smitten with that character. It may also be following its training to butter me up, though it was really going too far: I would be trying to write some Python and have to tell it “we’re not talking about Ellie now.”