more_corn 7 months ago

AI is really bad at law right now. Hallucinated legal citations build legal arguments on a foundation of mud. No lawyer should be using ChatGPT or any of the common public models. There might be a fine-tuned legal AI out there, but I haven't heard of one, and I wouldn't trust it until it has been thoroughly vetted.

The college teacher is saying "this is ridiculous, kids can't think for themselves these days, I'm so sick of grading obviously AI-generated slop." (That's a quote from a college teacher friend of mine.)

bdangubic 7 months ago | parent [-]

with all due respect - this is 100% all wrong.

AI hallucinates at everything, coding included. By your rationale no developer should be using AI either, yet you will soon be unable to keep your job without it (in many places we are already there).

I have already used AI for legal matters that would otherwise have cost me thousands and thousands of dollars, and lawyers are using LLMs daily for all kinds of sh*t...

ben_w 7 months ago | parent [-]

I've also tried — out of curiosity rather than necessity — putting legal questions to ChatGPT. Not only did literally every single case-law citation I had the means to check turn out not to exist, but the statute laws it quoted me didn't even apply in my case.

I might have been particularly unlucky, but on the other hand, my understanding is that passing the bar exam (the standard ChatGPT reached) is, for humans, just the metaphorical foot in the door that allows on-the-job training to begin.

Of course, if you are a lawyer with the skill and means to check the output, then it will likely still help you, for the same reason someone fresh out of law school would still help you — and for the same reason that I would describe ChatGPT's code as "intern to fresh graduate" and yet still use it.