al_borland 12 hours ago
> part of the pitch was getting the buyer to come to a particular idea "all on their own," then making them feel good about how smart they were.

I can usually tell when someone is leading me like this, and I resent them for trying to manipulate me. I start giving the opposite of the answer they're looking for, out of spite.

I've also had AI do this to me. At the end of it all, I asked why it didn't just give me the answer up front. The topic was a bit of a conspiracy theory, and it said I'd believe it more if I was led to think I got there on my own, with a bunch of context, rather than being told something fairly outlandish from the start. The fact that AI does this to better reinforce belief in conspiracy theories is not good.
1bpp 12 hours ago
An LLM cannot explain itself; its explanations bear no relation to what actually caused the text to be generated.