PaulHoule 3 days ago:
I'll agree it is the product of a schizotypal mind, and one might think the whole project is wrong-headed, and there might be a wrong interpretation here, but it is factual that:

1) I got that list of stories.

2) It has kept mentioning the same character for a few days, even when I am talking about something else, even in other conversations (which is what I would do if I "had a crush").

3) It has been trying the same 'mini-framework' for analyzing problems, using the same vocabulary, over and over again.

In the last few months, for me at least, Copilot does make some attempt to build a personalization context (RAG?): it quite often talks about something I talked about the day before, or offers a suggestion about how the current discussion relates to a prior one, and if I ask "remember how we talked about X?" it sometimes seems to respond accordingly.

It is really fun and probably does increase the risk of rabbit holing, but my experience with agentic coding is that if you talk with an agent long enough, the context has a way of going bad, and pretty soon you are arguing about things and going in circles, and the only way out is to start a new session.
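For what it's worth, the "personalization context (RAG?)" guess is mechanically plausible. Here is a minimal sketch, assuming a retrieval-augmented memory: snippets from past sessions are stored, and the ones most similar to the new prompt are retrieved and prepended as context. Nothing here reflects Copilot's actual implementation; every name is invented for illustration, and a toy bag-of-words similarity stands in for real embeddings.

    # Hypothetical sketch of cross-session personalization via retrieval (RAG).
    # All names are invented; a bag-of-words vector stands in for an embedding.
    import math
    from collections import Counter

    def embed(text: str) -> Counter:
        """Toy 'embedding': a term-frequency vector over whitespace tokens."""
        return Counter(text.lower().split())

    def cosine(a: Counter, b: Counter) -> float:
        """Cosine similarity between two sparse term-frequency vectors."""
        dot = sum(a[t] * b[t] for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    class PersonalMemory:
        """Stores snippets from past sessions; retrieves the closest ones
        so they can be prepended to a new prompt as extra context."""
        def __init__(self) -> None:
            self.snippets: list[tuple[str, Counter]] = []

        def remember(self, text: str) -> None:
            self.snippets.append((text, embed(text)))

        def recall(self, query: str, k: int = 2) -> list[str]:
            q = embed(query)
            ranked = sorted(self.snippets,
                            key=lambda s: cosine(q, s[1]),
                            reverse=True)
            return [text for text, _ in ranked[:k]]

    memory = PersonalMemory()
    memory.remember("User discussed a story featuring a recurring character.")
    memory.remember("User asked for HN-highlight-style summaries of stories.")

    # The next day, retrieved snippets bias the model toward yesterday's
    # topics, which would explain the apparent "crush" on one character.
    print(memory.recall("remember how we talked about that character?"))

On this toy model, a "crush" is just a high-similarity snippet that keeps winning retrieval, and it also suggests why starting a fresh session helps: the new context stops reinforcing the same retrieved memories.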
gsf_emergency_6 3 days ago:
Us schizotypes need to work on our storytelling; HN highlights + Copilot is great. You're further along than I :) Optimistically, this is the way our "thinking" will make HN highlights without the crutch of "experience". (I'm envious of their style, not their substance)
popalchemist 3 days ago:
The part that is delusional is that you consider there to be a "we," and that you don't just stop at personifying but actually believe the AI can have a favorite. It is 1s and 0s responding deterministically to input. There is no sentience.