jubilanti 9 hours ago

I still don't want a fucking audio recorder in my doctor's office or a fucking AI that sits in between me and my doctor.

I am intentionally cursing to express my anger at this casual betrayal of medical trust.

EvanAnderson 6 hours ago | parent | next [-]

> I still don't want a fucking audio recorder in my doctor's office ...

If I got a copy of the raw recording I might consider it. Maybe. Having that audio recording would be valuable to me.

It's very irksome that the medical providers I visit have signs posted prohibiting audio and video recording by patients. My medical appointments aren't exceedingly complex, but a reference audio recording would be handy.

I suppose I could exercise civil disobedience and just record anyway since it's not illegal in my state. Still, it irks me.

burnte 4 hours ago | parent [-]

> If I got a copy of the raw recording I might consider it. Maybe. Having that audio recording would be valuable to me.

We wouldn't be able to provide it because it's never kept. It's transcribed directly, and then only the note summary is kept. This is to ensure the recording and transcript can't leak (because they don't exist). This was one of my first questions for all of these tools: where does the data go, how is it processed, what happens to it? One company refused to talk about it, so I refused to talk to them.

OptionOfT 4 hours ago | parent [-]

So how can you verify correctness of transcription and summary in a way that is repeatable over time?

EvanAnderson 3 hours ago | parent [-]

Agreed. That sounds like a recipe for "we don't know how 'the algorithm' came up with what it did" kinds of excuses when, inevitably, inaccuracies are found. It also seems, conveniently, to make the processing system practically unimpeachable.

defrost 3 hours ago | parent [-]

Any thoughts on https://news.ycombinator.com/item?id=47896064 ?

tclancy 7 hours ago | parent | prev | next [-]

This feels wild to me. I think I am pretty well privacy-obsessed, but I don't see it here (fwiw, my wonderful doctor has been using these services for years; originally with overseas human labor, now with AI). First off, it presupposes a level of privacy with one's GP that I would only want from a therapist. I don't want health information going beyond my doctor? What about him talking to specialists or getting another opinion in the break room?

The ship's sailed on that level of privacy anyway the second you bill an insurance carrier in the US. I am willing to take this particular risk if something I said two years ago pops up to help explain what I am currently experiencing. I understand not everyone is me, and I am lucky to be in relatively good health and not have anything going on that might put employment, etc. at risk, so I can understand why some people may want to refuse. But the knee-jerk "FUCK NO BECAUSE PRIVACY" is almost as bad as writing a post based on a side plot in The Pitt, when said side plot was 110% about heightening the stress between Dr. Robby and Dr. Al Hashimi, not a goddamn double-blind study of the effectiveness of AI transcripto-bots.

And if you're going to take lessons from The Pitt about medical record transcription, why isn't it Dr. Santos repeatedly falling asleep while transcribing records?

th0ma5 7 hours ago | parent [-]

[dead]

kube-system 9 hours ago | parent | prev | next [-]

It is standard practice to ask patients whether or not they want the scribe used, and in many cases required by law.

jubilanti 8 hours ago | parent | next [-]

For now. It always begins as voluntary. But then doctors will start to treat people who opt out the way TSA treats me when I opt out: a hostile adversary.

I already get glares and sighs when I dare to actually read every word of a multipage form I am expected to sign without reading. Was told once I would lose my appointment if I took longer than a few minutes to read more than 10 pages because I could not be checked in until I signed. Other patients are waiting, your exercise of your human rights is inefficient.

Then soon I'll have to pay a higher copay to opt out. Then I won't be able to opt out at all.

All in the name of optimizing patient NPS scores and patient throughput.

kube-system 8 hours ago | parent | next [-]

I've never had this problem. IME every doctor's office recommends showing up 15-20 minutes early to a new-patient appointment for the explicit reason of filling out paperwork.

jeffbee 6 hours ago | parent [-]

Right, doctors and CIOs get to use AI transcripts but you, a lowly patient, will write your name, address, and insurance policy number fifteen times with an exhausted Bic pen.

tclancy 7 hours ago | parent | prev | next [-]

>For now. It always begins as voluntary. But then doctors will start to treat people who opt out the way TSA treats me when I opt out: a hostile adversary.

You sure this is a privacy issue?

ryandrake 8 hours ago | parent | prev [-]

> Was told once I would lose my appointment if I took longer than a few minutes to read more than 10 pages.

I'd be finding a new doctor at that point. Ridiculous. I love how doctors can be 30 minutes late because all their appointment delays are cascading, but if the patient reads a document for 5 minutes, they're the problem!

burnte 4 hours ago | parent | prev [-]

There is no legal requirement to inform patients about the use of scribes, human or AI. If a telehealth session is recorded, many states are two-party and require telling the patient, but AI scribes are treated the same way as other electronic tools and are covered by your general informed-consent policy. We inform patients in writing, their providers make the patient aware, and they are given the opportunity to opt out. No recordings are kept; the session goes directly to transcription, and that transcript is deleted after the note is saved.

kube-system 4 hours ago | parent [-]

I'm referring to recording laws, as you allude to.

burnte 4 hours ago | parent | prev | next [-]

> I still don't want a fucking audio recorder in my doctor's office

Which would you prefer: your doctor remembering everything, or making verbal notes into a microcassette recorder that is transcribed by a human later (sometimes the doctor, sometimes someone else)? What if your doctor had a medical assistant in the room, spoke out loud, and that medical assistant wrote down everything? Is that OK?

> or a fucking AI that sits in between me and my doctor.

It sits next to the doctor, helping them focus on you by transcribing the session. It doesn't do anything the doctor can't do, and it definitely doesn't take over anything the doctor SHOULD do. No decision-making is done, only transcription and summarization, which is then checked by the doctor. We do not let AI make decisions.

defrost 4 hours ago | parent [-]

I'd prefer a doctor's brain being actively engaged in the second-pass summary-checking phase that follows the first-pass information-gathering phase.

You know, keeping a skilled human actively in the oversight loop and not being encouraged by time pressures or apparent conveniences to slide further and further out of the active loop.

i.e., always catching that a passing joke about Coke doesn't end up as a cocaine-usage notation, etc.

---

I'd seriously suggest trialling (with the doctors' knowledge) deliberately injecting some N +/- 2 significant (meaning-reversing) transcription errors into each transcript, or into the run of transcripts for a shift.

Now it's a game for the doctor to pick out the {N} known errors as they check the transcription, with penalties for missing known errors and a bonus for finding unknown, not-deliberately-made errors.

Don't let the doctors fall into the trap of trusting the transcription, and don't fall into the trap of making easy-to-spot obvious errors that the hindbrain can tick off on autopilot.
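A rough sketch of what that seeded-error audit could look like, assuming plain-text notes. Everything here is invented for illustration: the substitution table, the function names, and the scoring rules are all hypothetical, not any vendor's actual mechanism.

```python
import random

# Hypothetical table of meaning-reversing substitutions (illustrative only).
REVERSALS = {
    "denies chest pain": "reports chest pain",
    "no known allergies": "allergic to penicillin",
    "non-smoker": "smoker",
}

def seed_errors(transcript, n, rng=random):
    """Inject up to n known meaning-reversing errors into the transcript.

    Returns the altered text plus the list of (original, injected)
    pairs so the review can be scored later."""
    injected = []
    candidates = [phrase for phrase in REVERSALS if phrase in transcript]
    for phrase in rng.sample(candidates, min(n, len(candidates))):
        transcript = transcript.replace(phrase, REVERSALS[phrase], 1)
        injected.append((phrase, REVERSALS[phrase]))
    return transcript, injected

def score_review(injected, flagged_phrases):
    """Penalize seeded errors the reviewing doctor missed; credit any
    genuine (non-seeded) errors they flagged on top."""
    seeded = {bad for _, bad in injected}
    return {
        "missed": len(seeded - set(flagged_phrases)),
        "bonus_finds": len(set(flagged_phrases) - seeded),
    }
```

The point of returning the injected list is that the audit stays repeatable: the known-error set is recorded at injection time, so a doctor's review of each note can be scored mechanically afterwards.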

kstrauser 6 hours ago | parent | prev | next [-]

> I still don't want a fucking audio recorder in my doctor's office

Why? Doctors have the strictest privacy regulations I know of. It's the one place where I'd be least uncomfortable with a recording, because there's nothing they can do with it other than use it to provide healthcare to me.

> or a fucking AI that sits in between me and my doctor.

The expected arrangement is that the AI would be alongside you and your doctor, so that your doctor can spend time interacting with you instead of playing transcriptionist and dictating your statement into your chart.

oliwarner 7 hours ago | parent | prev [-]

Notes need writing though.

You can do that by recording and transcribing (many methods) or your doctor has to write on the fly, or worse, has their head in their computer while you talk in their general direction.

Letting doctors talk and examine and not write is a wholly better experience.

Offsite third parties are the problem here. If this was done automatically without data leaving the room, is there a problem? Do you have the same objections to how your digital notes are stored?

slumberlust 7 hours ago | parent | next [-]

We agree on the desired outcome, but couldn't we also give doctors more time to do that job without AI? Feels like the blame is in the wrong place.

alistairSH 6 hours ago | parent | prev [-]

Maybe it's a regional thing, but in my last 3 appointments, 2 had an assistant doing the note-taking (as prompted by the treating physician or PA). The third was a virtual appointment, so no idea what notes were taken, if any.

oliwarner 4 hours ago | parent [-]

Sounds cushy, but not everywhere can afford 2:1 healthcare for every primary contact. It's not a thing here until you get to a ward or hospital-based clinic and you're seeing a team.

I don't like off-site data vacuums. Palantir can get fucked. But good ML transcription tools don't have to be run off-site. Even to get you 90%, or serve as a backup. And as I've said in other threads here, it's hard to be angry about consented audio recording and AI transcription when my entire medical history is floating around in a database that could be hacked, or its data deliberately passed through (eg) a Palantir tool. I think audio of me complaining about lower back pain is the very least of our worries.

Personally, I'd prefer AI and better doctor availability: to have that admin time back as consultation time, or more appointments, or just a less overworked doctor.

But also, there have to be weapons-grade consequences for people who leak patient data: loss of registration, never being allowed to work with sensitive data again, and jail.