OptionOfT 2 days ago

> He thinks it is unrealistic to expect lawyers to stop using AI.

I disagree. The profession worked without AI until now, and using AI is clearly doing more harm than good, especially in situations where you hire an expert to help you.

Remember, a lawyer is someone who has actually passed a bar exam, and with that comes the understanding that whatever they sign, they have validated as correct. The fact that they used AI here isn't actually the worst part. The fact that they blindly signed the output afterwards is a sign that they are unfit to be a lawyer.

We can make the argument that this might be pushed from upper management, but remember, the license is personal, so they can't hide behind such a mandate.

It's the same discussion I'm having with colleagues about using AI to generate code, or to review code. At a certain point there is pressure to go faster, and stuff gets committed without a human ever touching it.

Until that software ends up in your glucose pump, or in the system used to irradiate your brain tumor.

sonofhans 2 days ago | parent | next [-]

> The fact that they blindly signed it afterwards is a sign that they are unfit to be a lawyer.

Yes, this is the crux of it. More than anything else, you pay a lawyer to get the details right.

beambot 2 days ago | parent | prev | next [-]

I disagree with your disagreement. The legal profession has not been "working until now" unless you're quite wealthy and can afford good representation. AI legal assistants will be incredibly valuable for a large swath of the population, even if the outputs shouldn't be used to directly write briefs. The "right" answer is to build systems that properly validate citations and arguments.

hattmall 2 days ago | parent | next [-]

Incorrect legal information is generally less beneficial than no information at all. A dice roll of correct or incorrect information is potentially even worse.

kulahan 2 days ago | parent [-]

Lawyers are famously never wrong

dcrazy 2 days ago | parent | next [-]

Lawyers are famously subject to sanctions and malpractice lawsuits if they, say, don’t read the motions they file on your behalf.

kulahan a day ago | parent [-]

Yes yes, there are many annoying rules in the legal system meant to keep regular people from having any legal power directly.

DannyBee 2 days ago | parent | prev [-]

Lawyers are required to cite properly and check that their citations are correct, as well as verify that they are citing precedent that is still good law (i.e., it has not been overturned).

This is known as shepardizing.

This is done automatically without AI and has been for decades.

DannyBee 2 days ago | parent | prev | next [-]

Lawyer here. I'm not sure why you think AI will fix the first part. What AI does is not a significant part of the cost or labor in the vast majority of kinds of cases. If you have a specific area in mind, happy to go into it with you. The area where this kind of AI seems most likely to reduce cost is probably personal injury.

As for the last sentence, those systems already exist and roughly all sane lawyers use them. They are required to. You aren't allowed to cite overturned cases or bad law to courts, and haven't been allowed for eons. This was true even before the process was automated completely. But completely automated systems have now been around for decades, and one is so popular it gave rise to the verb "shepardize" for the task. So this is a double fault on the lawyer's part.

These systems are well integrated, too. Even back in 2006, when I was in law school, the system I used published an extension for Microsoft Word that would automatically verify every quote and citation, make sure they were good law, and also reformat them into the proper style (there were two major citation styles back then). It has only improved since then.

The last sentence is simply a solved problem. The lawyer just didn't do it because they were lazy, and in doing so committed malpractice.

yibg 2 days ago | parent | prev | next [-]

I don't really see how this is any different from checking work from another human. If a lawyer tasks a staff member with researching citations, and the staff member makes a bunch of them up and the lawyer doesn't check, that lawyer would be responsible as well. Just because it's AI and not a person doesn't make it any less of an issue.

JumpCrisscross 2 days ago | parent | prev | next [-]

> AI legal assistants will be incredibly valuable for a large swath of the population

In my experience they're a boon to the other side.

Using AI to help prepare your case for presentation to a lawyer is smart. Using it to actually interact with an adversary's lawyer is very, very dumb. I've seen folks take what should have been a slam-dunk case and turn it into something I recommended a company fight, because they were clearly using an AI to write letters, the letters contained categorically false representations, and those lies essentially tanked their credibility in the eyes of the arbitrator in one case and the court in another. (In the first case, they'd have been better off representing themselves.)

freejazz 2 days ago | parent | prev | next [-]

You could give an example to support your argument, as opposed to just telling everyone that you are right.

pempem 2 days ago | parent | prev [-]

There are, we have to accept, alternate solutions as well.

alach11 2 days ago | parent | prev [-]

> using AI is clearly doing more harm than good

How do you know this? Wouldn't we expect the benefits of AI in the legal industry to be way less likely to make the front page of HN?