hehehheh 4 days ago

It has to be the same as with all AI output: you need someone thorough to check what it did.

LLM-generated code needs to be read line by line. Even so, it's still a net win, because reading code is faster than googling for it and then typing it out yourself.

You can't detect hallucinations in general.

bambax 4 days ago | parent

A (costly) way is to compare responses from different models, as they don't hallucinate in exactly the same way.
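A rough sketch of that idea in Python. The model wrappers (ask_gpt, ask_claude, ask_gemini) are hypothetical placeholders for real SDK calls; the cross-checking logic is the point:

    from collections import Counter

    def normalize(answer):
        # Crude normalization so trivially different phrasings still match.
        return " ".join(answer.lower().split())

    def cross_check(question, models, min_agreement=0.5):
        # Ask every model the same question and majority-vote the answers.
        # Returns (consensus_or_None, raw_answers); None means no answer hit
        # the threshold, so flag the question for human review.
        raw = {name: ask(question) for name, ask in models.items()}
        votes = Counter(normalize(a) for a in raw.values())
        answer, count = votes.most_common(1)[0]
        if count / len(models) >= min_agreement:
            return answer, raw
        return None, raw

    # Toy stand-ins; in practice these would wrap real model SDKs.
    def ask_gpt(q):    return "2008"
    def ask_claude(q): return "2008"
    def ask_gemini(q): return "December 2008"

    consensus, answers = cross_check(
        "What year was Python 3.0 released?",
        {"gpt": ask_gpt, "claude": ask_claude, "gemini": ask_gemini},
    )
    if consensus is None:
        print("Models disagree -- possible hallucination:", answers)
    else:
        print("Consensus:", consensus)

Note the cost this implies: every question takes N model calls instead of one, and agreement only lowers the odds of a hallucination rather than eliminating it, since models trained on similar data can fail in correlated ways.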