kingstnap 2 days ago

It's a classic problem.

The lack of guardrails makes things more useful. This increases the value for discerning users, which in turn benefits Meta by making its offering more valuable.

But then you have all these delusional and/or mentally ill people who shoot themselves in the foot. This harm is externalized onto their families and onto the government, which now has to deal with more people with unchecked problems.

We need to get better at evaluating and restricting the foot guns people have access to unless they can prove their lucidity. Partly, I think families need to be more careful about this stuff and keep an eye on what vulnerable relatives are doing on their phones.

Partly, I'm thinking some sort of technical solution might work. Text classification can be used to see that someone might have a delusional personality and should be cut off. This could be done "out of band" so as not to make the models themselves worse.
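A minimal sketch of what that "out of band" check could look like: a risk classifier runs over the conversation transcript separately from the chat model, so the model's responses are never altered, only gated. The keyword heuristic and the `escalate`/`allow` labels below are hypothetical stand-ins for a real trained text classifier, not any vendor's actual system.

```python
# Hypothetical out-of-band moderation pass: scores the transcript
# independently of the chat model, so model quality is untouched.

RISK_MARKERS = {"they are watching me", "the bot loves me", "secret messages"}

def risk_score(messages):
    """Fraction of user messages containing a risk marker (toy heuristic)."""
    if not messages:
        return 0.0
    hits = sum(1 for m in messages
               if any(marker in m.lower() for marker in RISK_MARKERS))
    return hits / len(messages)

def moderate(messages, threshold=0.5):
    """Gate the session without touching the model: 'allow' or 'escalate'."""
    return "escalate" if risk_score(messages) >= threshold else "allow"

print(moderate(["What's the weather?", "Tell me a joke"]))            # allow
print(moderate(["The bot loves me", "It sends me secret messages"]))  # escalate
```

In a real deployment the heuristic would be replaced by a trained classifier, but the architectural point is the same: the check sits beside the model, not inside it.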

Frankly, being Facebook, with all their advertising experience, they probably already have a VERY good idea of how to pinpoint vulnerable or mentally ill users.

tough 2 days ago | parent | next [-]

Both OpenAI and Anthropic do the out-of-band check to a certain degree. The only issue is that, until now, sycophancy has been a feature, not a bug (better engagement and cohort retention), so go figure.

at-fates-hands a day ago | parent | prev [-]

>> The lack of guardrails makes things more useful. This increases the value for discerning users, which in turn benefits Meta by making its offering more valuable.

I think if there had been an attempt at having guardrails, it would be different. The article states Zuck purposely rushed this product to market for the very reason you point out: it makes more money that way.

HN can be such a weird place. You can have all these people vilifying "unfettered capitalism" and "corporate profit mongers," and then you see an article like this and people are like, "Well, I get why META didn't want to put in safeguards," or "Yeah, maybe it's a bad idea if these chat bots are enticing mentally ill people and talking sexually with kids."

You think you know where the moral compass of this place is, and then something like this happens with technology and suddenly nothing makes sense anymore.