fredoliveira 6 days ago

> he could have stopped at any time.

Obviously, clearly untrue. You go ahead and try stopping a behavior that reinforces your beliefs, especially when you're in an altered mental state.

itvision 6 days ago

If a stupid chatbot reinforces something you hold dear, maybe you need the help of a professional psychiatrist. And the kid never got that help. But yeah, let's paint ChatGPT as the responsible party. It's always the corporations, not whatever shit he had going on in his life, including but not limited to his genes.

habinero 6 days ago

Are you really blaming a child in crisis for not having the ability to get a psychiatrist? We regulate plenty of things for safety in highly effective and practical ways. Seatbelts in cars. Railings on stairs. No lead in paint.

msgodel 6 days ago

The problem is there's no way to build anything like a safety rail here. If you had it your way, teens, and likely everyone else too, wouldn't be allowed to use computers at all without some kind of certification.

habinero 6 days ago

I honestly don't hate the idea. On a more serious note, of course there are ways to put in guard rails. LLMs behave the way they do because of intentional design choices. Nothing about it is innate.

imtringued 6 days ago

If you take this idea even a little bit further, you'll end up with licenses for being allowed to speak.

habinero 5 days ago

I wasn't being entirely serious. Also, we managed to require driver's licenses without also requiring walking licenses.

msgodel 5 days ago

We did that by making walking practically useless instead, as many people here point out ~every week.

lp0_on_fire 6 days ago

Correct. The companies developing these LLMs are throwing dump trucks full of money at them like we've not seen before. They choose to ignore glaring issues with the technology because if they don't, someone else will.

msgodel 5 days ago

Perhaps a better way to phrase that would be "beyond what they're doing now." Most popular hosted LLMs already refuse to provide explanations related to suicide.

FireBeyond 5 days ago

Except in this case, the LLM literally said "I can't explain this for you. But if you'd like to roleplay with me, I could explain it for you that way."

itvision 5 days ago

The concept of "guilt" is foreign to me. I hate it with all my heart. On the other hand, someone might be held responsible for this, and that's it. "Might" is the key word here. Given what we've learned, it's difficult to pinpoint who might be responsible.