chrisjj 17 hours ago

> Google should not have lesser liability because the defamatory statements were published by software that Google created and controls.

Therein lies the rub. Google does not control what its parrot spouts. No-one does.

nerdsniper 13 hours ago | parent | next [-]

For defamatory statements about public figures, "actual malice" is a necessary component of defamation. For private individuals, plaintiffs just have to prove "negligence", that Google didn't act with reasonable care before publishing. It's unclear whether courts would find negligence, but a decent lawyer would argue something like: "By explicitly stating in their disclaimer that Google knows some of the information they are publishing might be inaccurate, they are actively demonstrating that they did not verify the claims - and therefore willfully acted with reckless disregard for the truth."

This is exactly why Google's public comment on this case, quoted in TFA, is:

> "AI Overviews frequently improve to show the most helpful information, and we invest significantly in the quality of responses. When issues arise – like if our features misinterpret web content or miss some context – we use those examples to improve our systems and may take action under our policies."

Google's statement is carefully crafted to make the case that they "act with reasonable care" for legal effect, rather than to win any points in the court of public opinion. Courts have yet to determine what passes the reasonable-care test for negligence w.r.t. AI output. Google feels it needs to make sure that, regardless of anything else that happens in this case, the decision does not find its publishing negligent.

digitalPhonix 13 hours ago | parent | prev | next [-]

Even if you accept that (I don't, and neither should the courts), Google controls the next hundred processing/routing/rendering/middleware steps and is fully in control of the content that makes it to the user.

mft_ 14 hours ago | parent | prev | next [-]

If Anthropic can implement a regular expression to monitor for user frustration, Google has certainly got the chops to build some sort of heuristic that checks for strongly negative statements.
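A minimal sketch of the kind of heuristic being suggested, assuming a hand-picked keyword pattern (the phrases and the `flag_for_review` helper are illustrative, not anything Google or Anthropic actually uses; a real system would presumably use a classifier, not a regex):

```python
import re

# Illustrative patterns for strongly negative claims about a person.
# Purely a keyword heuristic for the sake of the example.
NEGATIVE_CLAIMS = re.compile(
    r"\b(convicted|fraud(ster)?|scam(mer)?|abuser?|criminal|arrested)\b",
    re.IGNORECASE,
)

def flag_for_review(text: str) -> bool:
    """Return True if the output contains strongly negative claims
    that should be held back for verification or human review."""
    return bool(NEGATIVE_CLAIMS.search(text))
```

Even something this crude would catch the obvious defamation-shaped outputs before they reach a user.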

apothegm 12 hours ago | parent [-]

Or even have a small model check the output of the larger one.

Doesn’t work with APIs, but then the person/entity integrating the API should have that responsibility.
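The guard-model idea might look something like this in practice (the `small_model_classify` stub stands in for a real small classifier; everything here is a hypothetical sketch, not any vendor's actual pipeline):

```python
def small_model_classify(text: str) -> str:
    """Stand-in for a small guard model. A real deployment would call
    an actual classifier; here we trivially flag risky keywords."""
    risky = ("convicted", "fraud", "criminal")
    return "unsafe" if any(word in text.lower() for word in risky) else "safe"

def guarded_generate(large_model_output: str) -> str:
    """Pass the large model's output through the guard before publishing."""
    if small_model_classify(large_model_output) == "unsafe":
        return "[withheld pending verification]"
    return large_model_output
```

The point is architectural: the cheap second model sits between generation and publication, so the operator, not the user, decides what gets through.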

thrownthatway 17 hours ago | parent | prev | next [-]

That’s one perspective.

It’s wrong.

But it’s definitely a perspective.

BizarroLand 15 hours ago | parent | prev | next [-]

Parents have to pay penalties when their underage children burn down a building.

Companies that get treated with the rights of people should also have the responsibilities of people. Google designed, built, hosted, and promoted their LLM prominently. Logically, it follows that they should be personally and financially responsible for any harms their LLM causes.

chrisjj 14 hours ago | parent [-]

Sure they should have the responsibility. Even more so given they don't have control.

grouchomarx 16 hours ago | parent | prev [-]

ah well, no worries then