andrepd 13 hours ago
I have no idea what this means, can someone give the eli5?
a_bonobo 10 hours ago | parent
Anthropic has a nice press release that summarises it in simpler terms: https://www.anthropic.com/research/reasoning-models-dont-say...
meesles 12 hours ago | parent
Ask an LLM!
otabdeveloper4 8 hours ago | parent
I don't either, but chain of thought is obviously bullshit and just more LLM hallucination. LLMs will routinely "reason" through a solution and then give a final answer that is completely unrelated to the preceding "reasoning".