wat10000 2 hours ago
The only rational capacity LLMs have is whatever has been trained into them. They've also been trained on mountains of gut reactions, tribalism, and propaganda. These things aren't Data from Star Trek. They're not coldly logical. In fact, it's a struggle to get them to be logical at all.
sys32768 2 hours ago | parent
You must be using an LLM that cannot navigate formal logic puzzles or hasn't undergone chain-of-thought optimization.