wizzwizz4 | 3 days ago
They're not. It's better to think of Copilot as a collaborative storytelling session with a text autocomplete system, which some other program is rudely hijacking to insert the result of running certain commands. Sometimes the (completion randomly selected from the outputs of the) predictive text model goes "yes, and". Other times, it goes "no, because". As observed in the article, if it's autocompleting the result of many "yes, and"s, the story is probably going to have another "yes, and" next, but if a story starts off with a certain kind of demand, it's probably going to continue with a refusal.
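A toy sketch of that dynamic (nothing here is Copilot's actual code; the function, the moves, and the probabilities are all invented for illustration): the model samples its next "move" from continuation probabilities conditioned on the transcript so far.

    import random

    def continuation_probs(history):
        """Return invented P(next move) given the story so far.

        The more consecutive "yes, and" turns the transcript ends
        with, the more another "yes, and" looks like the natural
        continuation; a transcript that opens with a blunt demand
        shifts probability toward "no, because".
        """
        streak = 0
        for move in reversed(history):
            if move == "yes, and":
                streak += 1
            else:
                break
        p_yes = min(0.95, 0.5 + 0.1 * streak)  # made-up numbers
        if history and history[0] == "demand":
            p_yes *= 0.3  # opening demands bias toward refusal
        return {"yes, and": p_yes, "no, because": 1 - p_yes}

    def next_move(history):
        probs = continuation_probs(history)
        return random.choices(list(probs), weights=probs.values())[0]

    print(next_move(["yes, and"] * 4))  # very likely "yes, and"
    print(next_move(["demand"]))        # likely "no, because"

The point is only that the sampler has no memory of "giving in" or "holding firm"; it just picks whatever continuation is statistically typical of stories shaped like the one so far.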
stevage | 3 days ago | parent
Funny how that sounds like the opposite of how people might work. Get enough 'no's from someone and they might finally cave in; get enough 'yes'es and they might get sick of doing everything you ask.