softwaredoug 10 hours ago
Honestly, defining what to teach is hard. People need to learn to do research with LLMs, to code with LLMs, and to evaluate artifacts created by AI. They need to learn how agents work at a high level, the limits of context, and that models hallucinate and become sycophantic. They need to understand why agents need guardrails and strict feedback mechanisms if let loose (rough sketch below), AI safety when connecting to external systems, and so on. You're right that few high school educators would have any sense of all that.
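At a high level, something like this toy Python loop is what I mean by guardrails and strict feedback; call_llm and the tool table are just canned placeholders standing in for a real model and real tools, not any particular framework's API:

    # Toy agent loop with a guardrail and strict feedback; everything here is a
    # hypothetical stand-in, not any particular framework's API.
    ALLOWED_TOOLS = {"search_docs": lambda q: f"(docs matching {q!r})"}  # explicit allowlist
    MAX_STEPS = 5  # hard stop so a confused agent can't loop forever

    def call_llm(prompt: str) -> str:
        # Placeholder for a real model call; returns a canned "tool request",
        # then a canned answer once it sees a tool result in the context.
        if "returned" not in prompt:
            return "TOOL search_docs unit circle"
        return "Answer: see the docs above."

    def run_agent(task: str) -> str:
        history = [f"Task: {task}"]
        for _ in range(MAX_STEPS):
            reply = call_llm("\n".join(history))   # model only ever sees this bounded context
            if not reply.startswith("TOOL "):
                return reply                       # model answered directly
            _, tool, arg = reply.split(" ", 2)
            if tool not in ALLOWED_TOOLS:          # guardrail: refuse anything off the allowlist
                history.append(f"Refused: {tool} is not permitted.")
                continue
            history.append(f"{tool} returned: {ALLOWED_TOOLS[tool](arg)}")  # structured feedback
        return "Stopped: step limit reached."

    print(run_agent("What is the unit circle?"))

Even a toy like this makes the teaching points concrete: the context window is finite, the tool set is an allowlist, and every step feeds an explicit result or refusal back to the model.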
WalterBright 9 hours ago
I don't know anyone who learned arithmetic from a calculator. I do know people who would get egregiously wrong answers from misusing a calculator and insist it couldn't be wrong.
ndriscoll 3 hours ago
The sycophancy is an artifact of how the popular chat models are RLHF-trained to appeal to normies, not something fundamental to the tool. I can't remember encountering it at all since I started using codex; in fact it regularly fills gaps in my knowledge and corrects things I misunderstand. The professional tool has a vastly more professional demeanor. None of the "that's the key insight!" crap.