zaptheimpaler 4 days ago
This is the key question IMO, and one good answer is in this recent video about a case of ChatGPT helping someone poison themselves [1]. A trained therapist will probably not tell a patient to take “a small hit of meth to get through this week”. A doctor may be unhelpful or wrong, but they will not instruct you to replace table salt with NaBr (sodium bromide) and poison yourself. “A third as well as a therapist” might be true on average, but the suitability of this thing cannot be reduced to averages. Trained humans don't make insane mistakes like these, and they know when they are out of their depth and need to consult someone else.