simianwords 10 hours ago
challenge: provide a single example where the LLM can only provide the output and not the steps? (in a text-only scenario)
latexr 9 hours ago | parent
An LLM can always output steps, but that doesn't mean they are true; LLMs are great at making up bullshit. When the "how many 'r's in 'strawberry'" question was all the rage, you could definitely get LLMs to explain the steps of counting, too. The count was still wrong.
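For contrast, here is a minimal sketch of the counting the LLMs famously got wrong; unlike an LLM's narrated "steps", this code actually performs the procedure it describes (the function name is illustrative, not from the thread):

```python
def count_letter(word: str, letter: str) -> int:
    # Deterministic count: every occurrence is checked, none hallucinated.
    return word.lower().count(letter.lower())

print(count_letter("strawberry", "r"))  # prints 3
```

The point of the comment stands: an LLM can emit a plausible-looking trace of this procedure without actually executing it, so the trace is not evidence the answer is correct.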