Fade_Dance 9 hours ago

Not an ideal situation, but on the other hand, it doesn't seem like something that is that hard to address.

You have a group of people in severe enough mental distress that they are willingly calling a suicide hotline for help. Even if they aren't the imminent-danger use case the service is designed for, I think it's safe to assume that many of these people do need help, or at the very least could greatly benefit from learning more about the resources available to them.

It sounds like this could be screened relatively easily, and these people could perhaps get shifted to a resource line where they can be helped to find the resources that fit their needs. Perhaps that will involve updating the call screening protocol and hiring a few more agents, but a concentrated stream of mentally unwell people actively taking the initiative to seek help is something that yields far more social benefit when addressed efficiently than when complained about.

Of course, the chatbots themselves should be offering more options than just a suicide hotline, no disputing that.