autoexec 5 hours ago

Gemini didn't "know" he wasn't a child when it told him to kill himself or to "stage a mass casualty attack while armed with knives and tactical gear."

There are things you shouldn't encourage people of any age to do. If a human telling him these things would be found liable, then Google should be. If a human would get time behind bars for it, at least one person at Google needs to spend time behind bars for this.

tshaddox 5 hours ago | parent | next [-]

> If a human telling him these things would be found liable then google should be.

Sounds like a big if, actually. Can a human be found liable for this? I’d imagine they might be liable for damages in a civil suit, but I’m not even sure about that.

krger 5 hours ago | parent | next [-]

>Can a human be found liable for this?

A father in Georgia was just convicted of second degree murder, child cruelty, and other charges because he failed to prevent his kid from shooting up his school.

autoexec 5 hours ago | parent [-]

More accurately, it was because the father had multiple warnings that his child was mentally unstable but ignored them and handed his 14-year-old a semiautomatic rifle, even as the boy's mother (who did not live with them) pleaded with the father to lock up all the guns and ammo to prevent the kid from shooting people.

If he had only "failed to prevent his kid from shooting up a school" he wouldn't have even been charged with anything.

Imustaskforhelp 3 hours ago | parent [-]

Didn't Google also have multiple warnings and still ignore them?

TheOtherHobbes 3 hours ago | parent | next [-]

Google has legal personhood, but as a corporation its ethical responsibilities are much looser than those of an individual, and it's extremely hard to win a criminal case against a corporation even when its agents and representatives act in ways that would be criminal if they happened in a non-corporate context.

The law - in practice - is heavily weighted towards giving corporations a pass for criminal behaviour.

If the behaviour is really egregious and lobbying is light, really bad cases may lead to changes in regulation.

But generally the worst that happens is a corporation can be sued for harm in a civil suit and penalties are purely financial.

You see this over and over in finance. Banks are regularly pulled up for fraud, insider dealing, money laundering, and so on. Individuals - mostly low/mid ranking - sometimes go to jail. But banks as a whole are hardly ever shut down, and the worst offenders almost never make any serious effort to clean up their culture.

autoexec an hour ago | parent | next [-]

When HSBC was caught knowingly laundering money for terrorists, cartels, and drug dealers, all they had to do was apologize and hand the US government a cut of the action. It seems less like the action of a justice system and more like racketeering. Corporations really need to be reined in, but it's hard to find a politician willing to do it when they're all getting their pockets stuffed with corporate cash.

bluefirebrand 2 hours ago | parent | prev [-]

> as a corporation its ethical responsibilities are much looser than those of an individual

This seems ass backwards

autoexec 3 hours ago | parent | prev [-]

ChatGPT thinks that they can identify when someone may not be mentally well. There's no reason to think that Google can't. In fact, I'm pretty sure Google has a list of the mental health issues of just about every person with a Google account in that user's dossier.

autoexec 5 hours ago | parent | prev | next [-]

https://www.nbcnews.com/news/us-news/michelle-carter-found-g...

john_strinlai 5 hours ago | parent | prev | next [-]

>Can a human be found liable for this? I’d imagine they might be liable for damages in a civil suit

it is generally frowned upon (legally) to encourage someone to commit suicide. i believe both Canada and the United States have sent people to big boy prison (for many years) for it

rootusrootus 5 hours ago | parent | prev | next [-]

Yes, people have gone to prison for it.

XorNot 5 hours ago | parent | prev [-]

It's been found so in US court previously: https://www.abc.net.au/news/2019-02-08/conviction-upheld-for...

not_ai 5 hours ago | parent | prev | next [-]

Preferably the C-Suite.

nickff 4 hours ago | parent | next [-]

I understand the impulse in this direction, but I’m not sure it would serve as much of a disincentive, as there would likely just be a highly-paid scapegoat. Why not something more lasting and harder to ignore, like compulsory disclosure of the model’s source code (in addition to compensation for the victim(s))? Compulsory disclosure of the source would be a massive disadvantage.

autoexec 5 hours ago | parent | prev [-]

Exactly. That's why they get the big bucks. They're ultimately responsible.

ncouture 5 hours ago | parent | prev [-]

On its own, it sounds more poetic than an invitation or an insult urging someone, directly or indirectly, to kill themselves, in my opinion.

This isn't Gemini's words, it's many people's words in different contexts.

It's a tragedy. Finding someone to blame will be of no help at all.

strongpigeon 4 hours ago | parent | next [-]

> It's a tragedy. Finding one to blame will be of no help at all.

Agreed with the first part, but holding the designers of those products responsible for the deaths they've incited will help make sure they put more safeguards around this (and I'm not talking about additional warnings).

autoexec 5 hours ago | parent | prev [-]

None of what Gemini says is "Gemini's words". It's always just training data and prompt input, remixed and regurgitated.