PeterCorless | 4 days ago
Correct. It is more a provider-oriented proscription ("You can't say your chatbot is a therapist"); it is not a limitation on usage. You can still, for now, slavishly fall in love with your AI and treat it as your best friend and therapist.

There is a specific section that addresses how a licensed professional can use AI:

Section 15. Permitted use of artificial intelligence.

(a) As used in this Section, "permitted use of artificial intelligence" means the use of artificial intelligence tools or systems by a licensed professional to assist in providing administrative support or supplementary support in therapy or psychotherapy services where the licensed professional maintains full responsibility for all interactions, outputs, and data use associated with the system and satisfies the requirements of subsection (b).

(b) No licensed professional shall be permitted to use artificial intelligence to assist in providing supplementary support in therapy or psychotherapy where the client's therapeutic session is recorded or transcribed unless:

  (1) the patient or the patient's legally authorized representative is informed in writing of the following:

    (A) that artificial intelligence will be used; and

    (B) the specific purpose of the artificial intelligence tool or system that will be used; and

  (2) the patient or the patient's legally authorized representative provides consent to the use of artificial intelligence.

Source: Illinois HB1806 https://www.ilga.gov/Legislation/BillStatus/FullText?GAID=18...
janalsncm | 4 days ago | parent
I went to the doctor and they used some kind of automatic transcription system. That doesn't seem to be an issue as long as my personal data isn't shared elsewhere, which I confirmed it isn't. Whisper is good enough these days that it can run on-device with reasonable accuracy, so I don't see a problem.
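For what it's worth, "on-device" can be as simple as the open-source openai-whisper package. A minimal sketch (the model size and audio file name here are illustrative, not from the parent comment):

    # pip install openai-whisper
    import whisper

    model = whisper.load_model("base")            # downloads once, then runs entirely locally
    result = model.transcribe("visit_audio.wav")  # hypothetical recording of the session
    print(result["text"])                         # plain-text transcript; audio never leaves the machine

Larger checkpoints ("small", "medium", "large") improve accuracy at the cost of speed and memory, which is the usual trade-off for running this locally instead of calling a hosted API.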
romanows | 4 days ago | parent
Yes, but also: "An... entity may not provide... therapy... to the public unless the therapy... services are conducted by... a licensed professional". As a non-lawyer, it's not obvious to me whether a chat history could be ruled to be "therapy" in a courtroom; if so, that could count as a violation. There is probably already plenty of law around lawyers and doctors cornered into giving advice at parties that might apply here (e.g., maybe a disclaimer is enough to work around the prohibition)?