stevage 3 days ago

It seems weird to me that copilot sometimes refuses to execute code but sometimes allows it. What exactly are they aiming for?

wizzwizz4 3 days ago | parent [-]

They're not. It's better to think of Copilot as a collaborative storytelling session with a text autocomplete system, which some other program is rudely hijacking to insert the result of running certain commands.

Sometimes the (completion randomly selected from the outputs of the) predictive text model goes "yes, and". Other times, it goes "no, because". As observed in the article, if it's autocompleting the result of many "yes, and"s, the story is probably going to have another "yes, and" next, but if a story starts off with a certain kind of demand, it's probably going to continue with a refusal.

stevage 3 days ago | parent [-]

Funny how it sounds kind of like the opposite of how people might work. Get enough 'no's from someone and they might finally cave in; get enough 'yes'es and they might get sick of doing everything you ask.

immibis 3 days ago | parent | next [-]

It's narrowing down the space of all possible conversations. One with a lot of 'no's is probably a conversation with someone who says no a lot. An early LLM result was that you got higher-quality translations if you demarcated the answer with "the expert French translator says:" instead of just "French translation:".
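That framing trick can be sketched as a prompt-template helper (a hypothetical illustration, not any particular model's API; the function name and prompt wording are assumptions):

```python
def build_prompt(text: str, expert_framing: bool = True) -> str:
    """Build a translation prompt for a base (non-chat) completion model.

    The "expert translator" framing narrows the space of likely
    continuations toward conversations where an expert answers,
    which tends to yield higher-quality completions.
    """
    if expert_framing:
        return f"English: {text}\nThe expert French translator says:"
    return f"English: {text}\nFrench translation:"

# The two framings of the same input:
print(build_prompt("Hello, world."))
print(build_prompt("Hello, world.", expert_framing=False))
```

The model sees nothing but text, so the only lever is which kinds of documents the prompt resembles; the "expert" framing makes expert-sounding continuations more probable.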

dangero 3 days ago | parent | prev [-]

Sales people are specifically trained to manipulate people by asking them questions that they will say ‘yes’ to because once people start to say yes, they tend to continue to say it.

wizzwizz4 3 days ago | parent [-]

Only when certain pressure is applied. If you're paying attention when someone's doing this to you, you can feel (and disregard) the tendency to keep saying "yes".