siliconc0w 4 days ago

The other side is that with AI tutoring, you don't even really need exams. You have a constant, real-time map of where the student is strong and where they're weak.
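As a rough sketch (purely illustrative, not how any particular product does it), that map could be as simple as an exponential moving average of correctness per skill, updated after every answer. The SkillMap class and its parameters here are made up:

    from collections import defaultdict

    # Hypothetical sketch: each skill keeps an exponential moving average of
    # recent correctness, so strengths and weaknesses update after every answer.
    class SkillMap:
        def __init__(self, alpha: float = 0.3):
            self.alpha = alpha                       # weight of the newest answer
            self.mastery = defaultdict(lambda: 0.5)  # start every skill at "unknown"

        def record(self, skill: str, correct: bool) -> None:
            old = self.mastery[skill]
            self.mastery[skill] = (1 - self.alpha) * old + self.alpha * (1.0 if correct else 0.0)

        def weakest(self, n: int = 3):
            return sorted(self.mastery.items(), key=lambda kv: kv[1])[:n]

    tracker = SkillMap()
    tracker.record("fractions", True)
    tracker.record("long_division", False)
    print(tracker.weakest())  # topics the tutor should revisit first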

xphos 4 days ago | parent [-]

But you don't really have that, because AI will lie to you, but kindly. People who are learning don't have the tools to catch those kinds of mistakes yet, so they will learn things the wrong way. Not to say teachers don't ever teach the wrong thing; much of math is learning something the wrong way only to later distill the right lesson. But I think automating that is much worse.

umbra07 4 days ago | parent [-]

You can literally just system-prompt-fix that kind of problem though.
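Something like the following, for instance. This is only a sketch assuming an OpenAI-style chat API; the prompt wording is made up, and in practice it mitigates rather than eliminates confidently wrong answers:

    from openai import OpenAI

    client = OpenAI()  # assumes an OpenAI-style chat API and an API key in the environment

    # Illustrative system prompt only; the exact wording is invented.
    TUTOR_SYSTEM_PROMPT = (
        "You are a math tutor for a middle-school student. "
        "If you are not sure an answer is correct, say so explicitly. "
        "Never invent facts, formulas, or citations. "
        "When the student makes an error, point it out directly but kindly, "
        "and ask a follow-up question to check their understanding."
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": TUTOR_SYSTEM_PROMPT},
            {"role": "user", "content": "Is 3/4 bigger than 5/6?"},
        ],
    )
    print(response.choices[0].message.content)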

xphos 3 days ago | parent [-]

Then why do chatbots still make stuff up, if it's so easily fixable? I think knowing what is actually true is very difficult, and chatbots are struggling against a hard problem.

umbra07 3 days ago | parent | next [-]

I mean, if you use SOTA models, they just about never hallucinate on these sorts of topics: elementary through lower-division college math, school-level history, science, etc. I only ever see hallucinations when using non-SOTA models, when using an intermediary product, when asking about obscure information, or occasionally when coding. I don't think any of that applies to being an elementary-through-high-school tutor.

Writing is probably the most difficult subject, because even now it's difficult to prompt an LLM to write with a human intonation. But they will do a perfectly good job of explaining how to write (here's a basic essay structure, here are words we avoid in formal writing, etc.). You can even augment a writing tutor chatbot by making it use specific, human-written excerpts from English textbooks, instead of allowing it to generate example paragraphs and essays.
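A rough sketch of that augmentation idea (the excerpts, prompt wording, and keyword matching here are invented for illustration; a real system would use proper retrieval):

    # Instead of letting the model generate example paragraphs, pick a
    # human-written textbook excerpt that matches the student's question and
    # instruct the model to explain using only that excerpt.
    EXCERPTS = {
        "essay structure": "A basic essay has an introduction, body paragraphs, and a conclusion...",
        "formal word choice": "In formal writing, avoid contractions and vague intensifiers such as 'really'...",
    }

    def build_tutor_prompt(question: str) -> str:
        # naive keyword match; placeholder for a real retrieval step
        excerpt = next(
            (text for topic, text in EXCERPTS.items()
             if any(word in question.lower() for word in topic.split())),
            "",
        )
        return (
            "Explain the concept below to a high-school student.\n"
            "Use ONLY the provided textbook excerpt as your example text; "
            "do not write example paragraphs of your own.\n\n"
            f"Textbook excerpt: {excerpt}\n\n"
            f"Student question: {question}"
        )

    print(build_tutor_prompt("How do I structure my essay?"))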

umbra07 3 days ago | parent | prev [-]

Cuato