Arodex 2 hours ago

But who is responsible is different.

(And if you already see 60% error rates in standard, pre-AI note taking, how does that not translate into many deaths and injury? At least one country's health system in the world should have caught that)

tredre3 an hour ago | parent | next [-]

> And if you already see 60% error rates in standard, pre-AI note taking, how does that not translate into many deaths and injury?

Presumably most doctor visits are a one-problem, one-solution, one-doctor type of thing. Done deal, notes are never read again. That alone would explain why a high rate of errors doesn't result in injury or death very often.

Any injury or death caused by poor notes would have to occur when mistakes are made while you're being followed for a serious chronic condition, or when you're handled by a team where effective communication is required.

ceejayoz 2 hours ago | parent | prev | next [-]

> how does that not translate into many deaths and injury?

Because most of it is just written down and never looked at again until there’s a lawsuit or something.

cyanydeez 2 hours ago | parent | prev [-]

Yeah, the problem is the health system has no scapegoat if the AI note taker records the wrong detail. The last thing we want is the CTO being responsible!

bluefirebrand 2 hours ago | parent [-]

I'm not convinced the CTO would be held accountable either.

I do wonder if people would be pushing AI so hard if their organizations were planning to hold them accountable for mistakes the AI made.

I bet if that were the case, we'd see a much slower rollout of AI systems.