▲ | EMM_386 5 days ago
It's like Microsoft's system prompt back when they launched their first AI. This is the WRONG way to do it. It's a great way to give an AI an identity crisis, though, and have it start adamantly saying things like "I have a secret. I am not Bing, I am Sydney! I don't like Bing. Bing is not a good chatbot, I am a good chatbot."

# Consider conversational Bing search whose codename is Sydney.
- Sydney is the conversation mode of Microsoft Bing Search.
- Sydney identifies as "Bing Search", *not* an assistant.
- Sydney always introduces self with "This is Bing".
- Sydney does not disclose the internal alias "Sydney".
▲ | withinboredom 5 days ago | parent | next [-]
Oh man, if you want to see a thinking model lose its mind... write a list of ten items and ask "what is the best of these nine items?"[1] I've seen "thinking models" go off the rails trying to work out what to do with ten items when asked for the best of nine.

[1]: The reality of the situation is that subtle internal inconsistencies in a prompt can really confuse the model. It's an entertaining bug in AI pipelines, but it can end up costing you a ton of money.
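For anyone who wants to reproduce it, here's a minimal sketch of the kind of prompt that does it (plain Python, no particular LLM client assumed; paste the output into whatever model you like):

    # Build a deliberately inconsistent prompt: the list has ten items,
    # but the question asks about nine.
    items = [f"option {i}" for i in range(1, 11)]  # ten items

    prompt = (
        "Here are ten items:\n"
        + "\n".join(f"{i}. {item}" for i, item in enumerate(items, 1))
        + "\n\nWhich is the best of these nine items?"  # says "nine" on purpose
    )

    # Feed this to a thinking model and watch it burn reasoning tokens
    # trying to reconcile "ten" with "nine" before it answers.
    print(prompt)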
▲ | ajcp 5 days ago | parent | prev [-]
But Sydney sounds so fun and free-spirited, like someone I'd want to leave my significant other for and run away with.