AndrewMPT | 4 days ago
We’re building AI PSY HELP – an AI-powered mental health assistant offering 24/7 anonymous support via voice and text, without appointments or waiting. It’s used by 100,000+ people in Ukraine, including veterans, teens, and first responders. The AI is trained on 40,000+ hours of real psychotherapy sessions and provides individualized emotional guidance to help users manage stress, anxiety, and trauma. We partner with public institutions to deliver large-scale support and just launched a B2B program for employers. Now preparing for EU expansion (starting with Germany), mobile app rollout, and voice interaction in Ukrainian. This is not just a chatbot – it’s scalable mental health infrastructure. → https://ai.psyhelp.info → https://chat.psyhelp.info → https://chat.dev.psyhelp.info (+voice)
flir | 4 days ago | parent
How did you get people to agree to training a chatbot on their sessions? That strikes me as extremely intimate text. Is it a "it's in the T&Cs" deal, or did you seek a separate opt-in? I'm asking because the answer will shed light on the level of privacy "the average consumer" is comfortable with.