skylerwiernik | 7 days ago
> In 2023, Microsoft's Bing chatbot famously adopted an alter ego called "Sydney," which declared love for users and made threats of blackmail. More recently, xAI's Grok chatbot would for a brief period sometimes identify as "MechaHitler" and make antisemitic comments. Other personality changes are subtler but still unsettling, like when models start sucking up to users or making up facts.

Funny that they managed to call out all of their competitors without mentioning any of Claude's bad behavior.
astrange | 6 days ago
The only bad behavior I can think of from Claude is how it used to be so ethical that it would just refuse to do anything. The quality of its thought outside coding has been pretty bad lately, though, and especially worse than o3/Gemini. It really feels like they've forced it into short answers for cost control.
stavros | 7 days ago
What bad behaviour of Claude was as famous as Sydney, or MechaHitler, or GPT's sycophancy? I've not heard of anything.