umbra07 4 days ago

You can literally just system-prompt-fix that kind of problem though.
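
For example, something like this (a minimal sketch, assuming the OpenAI Python client; the model name and the exact prompt wording are only illustrative):

    # Minimal sketch of "system-prompt-fixing" a tutor bot.
    # Assumes the OpenAI Python client; prompt wording is illustrative.
    from openai import OpenAI

    client = OpenAI()

    SYSTEM_PROMPT = (
        "You are a math tutor for grades K-12. "
        "If you are not certain of a fact, say so instead of guessing. "
        "Do not invent sources, theorems, or formulas."
    )

    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": "Why does a negative times a negative equal a positive?"},
        ],
    )
    print(resp.choices[0].message.content)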

xphos 3 days ago | parent [-]

Then why do chatbots still make stuff up, if it's so easily fixable? I think knowing what is actually true is a genuinely hard problem, and chatbots are still struggling with it.

umbra07 3 days ago | parent | next [-]

I mean, if you use SOTA models, they just about never hallucinate on these sorts of topics (elementary through lower-division college math, school-level history, science, etc.). I only ever see hallucinations when using non-SOTA models, when using an intermediary product, when asking about obscure information, or occasionally when coding. I don't think any of that applies to being an elementary-through-high-school tutor.

Writing is probably the most difficult subject, because even now it's hard to prompt an LLM to write with a human intonation, but they do a perfectly good job of explaining how to write (here's a basic essay structure, here are words we avoid in formal writing, etc.). You can even augment a writing-tutor chatbot by making it use specific, human-written excerpts from English textbooks instead of letting it generate example paragraphs and essays, as in the sketch below.
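
Roughly like this (a sketch, assuming the OpenAI Python client; the EXCERPTS dictionary and pick_excerpt() are hypothetical stand-ins for a real textbook corpus and retrieval step):

    # Rough sketch of a writing-tutor prompt that quotes a human-written
    # textbook excerpt instead of letting the model generate its own examples.
    from openai import OpenAI

    client = OpenAI()

    # Hypothetical stand-in for a licensed, human-written textbook corpus.
    EXCERPTS = {
        "essay structure": (
            "A basic essay has an introduction, body paragraphs, and a "
            "conclusion. ..."  # real excerpt text goes here
        ),
    }

    def pick_excerpt(topic: str) -> str:
        # Stand-in for a real retrieval step over the corpus.
        return EXCERPTS.get(topic, "")

    def tutor_reply(topic: str, question: str) -> str:
        excerpt = pick_excerpt(topic)
        messages = [
            {"role": "system", "content": (
                "You are a writing tutor. Explain concepts and critique the "
                "student's work, but only quote example passages from the "
                "provided excerpt; never write example essays yourself."
            )},
            {"role": "user", "content": f"Excerpt:\n{excerpt}\n\nQuestion: {question}"},
        ]
        resp = client.chat.completions.create(model="gpt-4o", messages=messages)
        return resp.choices[0].message.content

    print(tutor_reply("essay structure", "How do I organize a five-paragraph essay?"))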

umbra07 3 days ago | parent | prev [-]

Cuato