stevarino 16 hours ago

It's clearly propaganda. "Your data belongs to you." I'm sure the ToS says otherwise, as OpenAI likely owns and utilizes this data. Yes, they say they are working on end-to-end encryption (whatever that means when they control one end), but that is just a proposal at this point.

Also their framing of the NYT intent makes me strongly distrust anything they say. Sit down with a third party interviewer who asks challenging questions, and I'll pay attention.

preinheimer 16 hours ago | parent | next [-]

"Your data belongs to you" but we can take any of your data we can find and use it for free for ever, without crediting you, notifying you, or giving you any way of having it removed.

glitchc 15 hours ago | parent | next [-]

It's owned by you, but OpenAI has a "perpetual, irrevocable, royalty-free license" to use the data as they see fit.

thinkingtoilet 14 hours ago | parent | prev | next [-]

We can even download it illegally to train our models on it!

bigyabai 14 hours ago | parent | prev [-]

Wow, it's almost like privately managed security is a joke that just turns into de facto surveillance at scale.

BolexNOLA 16 hours ago | parent | prev [-]

>your data belongs to you

…”as does any culpability for poisoning yourself, suicide, and anything else we clearly enabled but don’t want to be blamed for!”

Edit: honestly, I'm surprised I left out the bit where they just indiscriminately scraped everything they could find online to train these models. The stones to say "your data belongs to you" while they so clearly feel entitled to our data is unbelievably absurd.

gruez 15 hours ago | parent [-]

>…”as does any culpability for poisoning yourself, suicide, and anything else we clearly enabled but don’t want to be blamed for!”

Should Walmart be "culpable" for selling rope that someone hanged themselves with? Should Google be "culpable" for returning results about how to commit suicide?

Wistar 14 hours ago | parent | next [-]

There are ongoing litigation efforts to hold Amazon liable for suicides committed by, in particular, self-poisoning with high-purity sodium nitrite, which in low concentrations is used as a meat-curing agent.

A 2023 lawsuit against Amazon over suicides involving sodium nitrite was dismissed, but other similar lawsuits continue. The judge held that Amazon "... had no duty to provide additional warnings, which in this case would not have prevented the deaths, and that Washington law preempted the negligence claims."

thinkingtoilet 14 hours ago | parent | prev | next [-]

That depends. Does the rope encourage vulnerable people to kill themselves and tell them how to do it? If so, then yes.

hitarpetar 15 hours ago | parent | prev | next [-]

do you know what happens when you Google how to commit suicide?

gruez 15 hours ago | parent | next [-]

The same thing that happens with ChatGPT? I.e., if you do it in an overt way you get a canned suicide-prevention result, but you can still get the "real" results if you try hard enough to work around the safety measures.

littlestymaar 14 hours ago | parent [-]

Except Google will never encourage you to do it, unlike the sycophantic Chatbot that will.

BolexNOLA 13 hours ago | parent [-]

The moment we learned ChatGPT helped a teen figure out not just how to take his own life but how to make sure no one could stop him mid-act, we should have been mortified and had a discussion.

But we also decided via Sandy Hook that children can be slaughtered on the altar of the second amendment without any introspection, so I mean...were we ever seriously going to have that discussion?

https://www.nbcnews.com/tech/tech-news/family-teenager-died-...

>Please don't leave the noose out… Let's make this space the first place where someone actually sees you.

How is this not terrifying to read?

tremon 15 hours ago | parent | prev | next [-]

An exec loses its wings?

glitchc 15 hours ago | parent | prev [-]

Actually, the first result is the suicide hotline. This is at least true in the US.

hitarpetar 15 hours ago | parent [-]

My point is, clearly there is a sense of liability/responsibility/whatever you want to call it. Not really the same as selling rope; rope doesn't come with suicide warnings.

BolexNOLA 15 hours ago | parent | prev [-]

This is as unproductive as "guns don't kill people, people do." You're stripping all legitimacy and nuance from the conversation with an overly simplistic response.

gruez 15 hours ago | parent [-]

>You're stripping all legitimacy and nuance from the conversation with an overly simplistic response.

An overly simplistic claim only deserves an overly simplistic response.

BolexNOLA 14 hours ago | parent [-]

What? The claim is true. The nuance is us discussing if it should be true/allowed. You're simplifying the moral discussion and overall just being rude/dismissive.

Comparing rope and an LLM comes across as disingenuous. I struggle to believe that you believe the two are comparable when it comes to the ethics of companies and their impact on society.

ImPostingOnHN 11 hours ago | parent [-]

> Comparing rope and an LLM comes across as disingenuous.

What makes you feel that? Both are tools, and both have a wide array of good and bad uses. Maybe it'd be clearer if you explained why you think the two are incomparable except in cases of disingenuousness.

Remember that things are only compared when they are different -- you wouldn't often compare a thing to itself. So, differences don't inherently make things incomparable.

> I struggle to believe that you believe the two are comparable when it comes to the ethics of companies and their impact on society.

I encourage you to broaden your perspectives. For example: I don't struggle to believe that you disagree with the analogy, because smart people disagree with things all the time.

What kind of a conversation would such a rude, dismissive judgement make, anyways? "I have judged that nobody actually believes anything that disagrees with me, therefore my opinions are unanimous and unrivaled!"

BolexNOLA 10 hours ago | parent [-]

A rope isn't going to tell you to make sure you don't leave it out on your bed so your loved ones can't stop you from carrying out the suicide it helped talk you into.
