astrange 3 days ago
GPT-5 ends every single response with something like:
> If you'd like, I can demonstrate…
or
> If you want…
and that's /after/ I put in instructions not to do it.
Sharlin 3 days ago | parent
It's weird that it does that, given that the leaked system prompt explicitly told it not to.