another-dave 8 hours ago

> Prosecutors argued that they had a right to demand material that Heppner created with Claude because his defense lawyers were not directly involved, and because attorney-client privilege does not apply to chatbots.

> Voluntarily revealing information from a lawyer to any third party can jeopardize the customary legal protections for those attorney communications.

> Manhattan-based U.S. District Judge Jed Rakoff ruled in February that Heppner must hand over 31 documents generated by Anthropic's chatbot Claude related to the case.

> No attorney-client relationship exists "or could exist, between an AI user and a platform such as Claude," Rakoff wrote.

If I hand wrote some notes in a notebook or diary, I wouldn't have to hand them over, as I understand it, even with no lawyer in the mix. Same if I wrote some notes in a text file on my computer.

Leaving AI aside, what in particular makes this different from using any other cloud-based software? Does writing a Google Doc to gather my thoughts or a draft email in Gmail constitute "revealing information from a lawyer to a third party"?

What if Google has enabled AI features on these? Feels like this area really needs clarity for users rather than waiting for courts to rule on it.

jubilanti 7 hours ago | parent | next [-]

> If I hand wrote some notes in a notebook or diary, I wouldn't have to hand them over, as I understand it, even with no lawyer in the mix. Same if I wrote some notes in a text file on my computer.

Absolutely wrong in the U.S. The police can't just break into your home and demand it, but a judge can 100% compel discovery or issue a subpoena if there is reason to believe that evidence exists which is relevant to the case.

The 4th amendment prohibits UNREASONABLE search and seizure, and we let judges make that determination. You never have absolute privacy rights.

nostrademons 6 hours ago | parent | next [-]

Note that the judge is bound by precedent and law as to what "unreasonable" means; they can't just make it up as they go along unless there is no precedent. Otherwise the ruling can be reversed on appeal.

I was on a jury recently where we had to swap out judges in the last couple days of the trial. The reason was that the judge had been assigned another case where the defendant had not waived his right to a speedy trial. The judge wanted to finish his existing case first, the defense lawyers said "You can't do that," the judge looked it up and found out that indeed they were right, so off he went to start the new case and handed off the existing one to a colleague. In my experience judges really do take the law seriously; that's how they get to be judges.

Chaosvex 5 hours ago | parent [-]

Why didn't his colleague just handle the new case? What am I missing?

nostrademons 5 hours ago | parent [-]

My understanding is that judges have certain specialties: one judge might be well versed in a particular area of law but not others. The case I was on was in an area where nobody in the district had expertise, and everybody (judge, prosecutor, defense, jury) was learning as we went. The new incoming case was in an area that our previous judge would normally handle, so it was assigned to him because it came in through his department, while the case I was on was a free-floating orphan where not much was lost by having another judge take it over (and it was also already in the jury instructions phase, with testimony complete).

reactordev 7 hours ago | parent | prev [-]

This. All of your rights are up for debate before a judge. There are only a few you can still exercise if a judge wants something from you, but ultimately, if a judge decides it's relevant to the case, it's relevant to the case and you must comply. Or be held in contempt. Or praised? With a Senate hearing to boot. I'm confused about how our legal system actually functions now, but that is how it's supposed to work: if a judge decides to include it, it's in. Go get it.

chasil 4 hours ago | parent [-]

One of my friends recently spent some time getting an OpenClaw instance running in Ubuntu so he could have a truly private conversation with it, complete with an air gap.

The value of that configuration has just been greatly magnified.

nozzlegear 4 hours ago | parent [-]

Has it? There's value in privacy vis-a-vis snooping corporations, but those conversations could still be surrendered to the court if the judge ruled them potentially relevant, and if your friend refused to do so, he'd be held in contempt of court.

chasil 3 hours ago | parent [-]

The judge would have to know about them.

Perhaps this could be gleaned from your ISP's records, but it would be far more difficult than determining the existence of an account at Anthropic.

nozzlegear an hour ago | parent [-]

I agree, but it's not like Anthropic was running to tell the lawyers and the judge in this story. The most likely scenario is that your friend would let slip that he's using AI, or people who know him would let it slip, and the lawyers or judge would demand the conversations in discovery.

phire 8 hours ago | parent | prev | next [-]

> If I hand wrote some notes in a notebook or diary, I wouldn't have to hand them over, as I understand it, even with no lawyer in the mix. Same if I wrote some notes in a text file on my computer.

There is some protection of personal private documents for civil cases. But for a criminal case, there is no 4th or 5th amendment protection for stuff you wrote in your diary.

js8 5 hours ago | parent [-]

Should it be relevant, though? It seems to me like criminalization of thoughts, even if they are externalized into a diary.

autoexec 2 hours ago | parent | next [-]

If you were caught with notebooks detailing your plans to kill a list of people, showing that you've meticulously tracked their movements, and listing locations for dumping the bodies, that would be extremely relevant. I don't see how it'd be a good idea to exclude that kind of evidence.

sumeno 5 hours ago | parent | prev | next [-]

If you write in your diary "I'm gonna kill her" and then she gets killed, it's relevant.

program_whiz 5 hours ago | parent | prev [-]

Depends. If you wrote a detailed confession with material non-public facts, a jury can hear it and weigh the evidence.

jcranmer 8 hours ago | parent | prev | next [-]

Reading the ruling in more detail, this is definitely a "not even close" case.

First off, the Fifth Amendment right to not self-incriminate is rather narrower than you might expect. With regard to document production, it only privileges you from having to produce documents if the act of producing those documents would in effect incriminate you. So if you tell people "I've got a diary where I've been keeping track of all the crimes I've committed..." the government can force you to turn over that diary.

Second, the default assumption whenever you send something to another person is that it's unprivileged communication. IANAL, but even using cloud storage for things I'd want to remain privileged is something I'd want to ask a lawyer about before relying on. Although that's also as much because the default privacy policy of most services is "fuck you."

Which is what happened here. Claude's privacy policy says that Anthropic reserves the right to share your chats with third parties for various reasons, which means you have no reasonable expectation of privacy in those communications in the first place and automatically defeats any other confidential privileges. What happened is therefore little different from the defendant texting his attorney's responses to his friends, which is a fairly time-worn way of defeating attorney-client privilege.

Seems an opportune time to remember that every day is STFU Friday. And, to quote The Wire, is you taking notes on a criminal fucking conspiracy?

SoftTalker 6 hours ago | parent | next [-]

You cannot be compelled to provide testimonial evidence that might incriminate you. Physical evidence, documents, computer files, anything not under attorney-client privilege is fair game for a subpoena or warrant.

ludicrousdispla 6 hours ago | parent | prev [-]

What if I hire a lawyer to use Claude for me instead? Seems like that is space for a disruptive startup.

ndr 7 hours ago | parent | prev | next [-]

Consider AI prompts no different from Google searches: they can be subpoenaed.

And consider local LLM logs no different from your txt file or command history on your computer. Could still be requested for discovery.
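To make the comparison concrete, here's a minimal sketch (the path and log format are hypothetical, not any specific tool's) of how a local chat tool typically persists its log: lines appended to an ordinary file on disk, as discoverable as a .txt file or shell history would be.

```python
import json
import tempfile
from pathlib import Path

# Hypothetical history file -- real tools use paths like ~/.config/<tool>/history
log_path = Path(tempfile.gettempdir()) / "llm_history.jsonl"

def log_exchange(prompt: str, reply: str) -> None:
    """Append one exchange to the on-disk log, much like shell history."""
    with log_path.open("a") as f:
        f.write(json.dumps({"prompt": prompt, "reply": reply}) + "\n")

log_exchange("draft a letter to my landlord", "Dear landlord, ...")

# The exchange now exists as an ordinary file on disk: subject to subpoena
# or discovery exactly like any other document the user wrote.
entries = [json.loads(line) for line in log_path.read_text().splitlines()]
```

The point is simply that there is no special category for "AI logs"; once it's a file, it's a document.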

ronsor 2 hours ago | parent [-]

Yes, but when you delete them, they're actually gone. So you can have truly ephemeral conversations if you don't want history to stick around.

Nothing saved, nothing to discover.
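As an illustration of the "nothing saved" approach (a sketch, not any particular tool's behavior): keep the transcript purely in memory and never write it to disk, so the record ends when the process does.

```python
# Ephemeral chat transcript: held only in RAM, never written to disk.
class EphemeralChat:
    def __init__(self) -> None:
        self._turns: list[tuple[str, str]] = []  # (prompt, reply) pairs

    def exchange(self, prompt: str) -> str:
        reply = f"(model reply to: {prompt})"  # stand-in for a local model call
        self._turns.append((prompt, reply))
        return reply

    def wipe(self) -> None:
        self._turns.clear()  # after this, no record remains anywhere

chat = EphemeralChat()
chat.exchange("hello")
chat.wipe()
print(len(chat._turns))  # 0 -- nothing saved, nothing to discover
```

Of course this only holds if the model itself runs locally and doesn't do its own logging.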

ndr 2 hours ago | parent [-]

In theory you can have the same with incognito sessions (never stored; that's part of what the Italian privacy regulator forced OpenAI to do), and the same via the right to deletion under the GDPR.

How compliant they are, I have no idea.

ronsor 2 hours ago | parent [-]

Incognito mode is what I use if I don't need to keep history around.

I'd never trust it to actually remove data.

ndr an hour ago | parent [-]

As I said, I don't know how GDPR-compliant they are.

I'd expect them to get rid of that data in a reasonable amount of time. Similar to what would happen if you actively deleted a single chat.

rcxdude 8 hours ago | parent | prev | next [-]

>If I hand wrote some notes in a notebook or diary, I wouldn't have to hand them over, as I understand it, even with no lawyer in the mix. Same if I wrote some notes in a text file on my computer.

Is that true? I would expect that any notes I have in any form could be requested during discovery (attorney-client privilege being one of the few exceptions, and narrower than people assume).

wat10000 5 hours ago | parent | prev [-]

I don't think this is any different from other cloud-based software. Cloud providers can be compelled to turn over your data, as long as they're actually capable of doing so. If you don't want your data being snarfed up from a cloud provider and used in court, then only use cloud providers with end-to-end encryption, or better yet don't put your data in cloud providers at all.

The only reason this ruling is even remotely interesting is because people don't understand computer systems, and chatbots feel different. For the technologically minded, it should be pretty obvious that typing into a chatbot is no different from typing into a Google Doc, and that the data in both can be available to the legal system without the user's involvement or consent. But most people aren't technologically minded and may not have realized that all of their data is being saved and made available like that.