reaperman a day ago:
Huge thank you for correcting me. Do you have any good resources I could look at to learn how the previous CoT is included in the input tokens and treated differently?
wahnfrieden a day ago (parent):
I've only read the marketing materials of the closed models, so they could be lying, too. But I don't think CoT is something you can get out of pre-CoT models via prompting and context manipulation alone. You can do something that looks a little like CoT, but the model won't have been trained specifically on how to make good use of it, so it will treat the reasoning text like ordinary Q&A context.
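A minimal sketch of the distinction being made here, with entirely hypothetical delimiter tokens (closed vendors don't document their exact formats): a pre-CoT model sees prior reasoning as ordinary text, while a CoT-trained model is trained with reasoning wrapped in special tokens it has learned to treat differently.

```python
# Hypothetical illustration only. The <|cot_start|>/<|cot_end|> tokens
# are invented for this sketch; real closed models do not publish the
# special tokens or context format they use for chain-of-thought.

def build_plain_context(question: str, cot: str, answer: str) -> str:
    # A pre-CoT model has no special treatment for reasoning text:
    # it just reads it as more Q&A context.
    return f"Q: {question}\nNotes: {cot}\nA: {answer}\n"

def build_cot_context(question: str, cot: str, answer: str) -> str:
    # A CoT-trained model would have seen reasoning wrapped in
    # dedicated delimiters during training, so the span between
    # them is (by training, not by prompting) handled differently.
    return (
        f"Q: {question}\n"
        f"<|cot_start|>{cot}<|cot_end|>\n"
        f"A: {answer}\n"
    )

plain = build_plain_context("2+2?", "Add 2 and 2.", "4")
cot = build_cot_context("2+2?", "Add 2 and 2.", "4")
```

The point of the comment stands either way: you can assemble the second string for any model, but only a model trained on that format will actually use the delimited span as reasoning rather than as generic context.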