0xbadcafebee 2 days ago
My dude, when people say LLMs are non-deterministic, this is what they mean. You cannot expect an LLM to always follow your prompts. When this happens, end your session and try again. If it keeps happening, change your model settings to lower the temperature, top_k, and top_p. (https://www.geeksforgeeks.org/artificial-intelligence/graph-...)
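For anyone unclear on what those three knobs actually do, here's a toy, self-contained sketch of the standard sampling pipeline (temperature scaling, then top_k truncation, then top_p/nucleus truncation). This is an illustration of the general technique, not any particular vendor's API; the function name and toy logits are made up for the example.

```python
import math
import random

def sample(logits, temperature=1.0, top_k=None, top_p=None):
    """Toy next-token sampler illustrating temperature, top_k, and top_p."""
    # Temperature: divide logits before softmax.
    # Lower temperature -> sharper distribution -> more deterministic output.
    scaled = [l / temperature for l in logits]

    # Softmax (shifted by the max for numerical stability).
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [(i, e / total) for i, e in enumerate(exps)]

    # Sort by probability, highest first.
    probs.sort(key=lambda p: p[1], reverse=True)

    # top_k: keep only the k most likely tokens.
    if top_k is not None:
        probs = probs[:top_k]

    # top_p (nucleus): keep the smallest prefix whose cumulative mass >= top_p.
    if top_p is not None:
        kept, mass = [], 0.0
        for i, p in probs:
            kept.append((i, p))
            mass += p
            if mass >= top_p:
                break
        probs = kept

    # Renormalize over the surviving tokens and draw one.
    z = sum(p for _, p in probs)
    r = random.random() * z
    for i, p in probs:
        r -= p
        if r <= 0:
            return i
    return probs[-1][0]
```

With `top_k=1` (or a very low temperature) the sampler always returns the argmax token, which is why lowering these settings makes output more repeatable.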
SparkyMcUnicorn 2 days ago | parent
temperature, top_k, and top_p don't exist on Opus 4.7 (or 4.6?).

Related: https://xcancel.com/bcherny/status/2044831910388695325#m https://platform.claude.com/docs/en/api/messages/create#crea...