hansmayer 5 days ago

...Except that a surgeon can reason in real time even if he wasn't "trained" on a specific edge case. It's called intelligence. And unless he has been taking heavy drugs ahead of the procedure, or is sleep-deprived, it's very unlikely a surgeon will have a hallucination of the kind that is practically a feature of GenAI.

dragonwriter 4 days ago | parent [-]

AI “hallucination” is more like confabulation than hallucination in humans (the name chosen for the AI phenomenon was poor because the people choosing it didn't understand the domain it was borrowed from, which is somewhat amusing given the nominal goal of their field). The risk factors for confabulation aren't so much heavy drugs and sleep deprivation as immediate pressure to speak or act, absence of the needed knowledge, and absence of the opportunity or social permission to seek third-party input. In principle, though, yes, the preparation of the people in the room should make that less likely, and less likely to go uncorrected, in a human-conducted surgery.

hansmayer 4 days ago | parent [-]

I guess my point was less about the nuances of how we define 'hallucination' for a GenAI system, and more about the important part: not having my liver accidentally removed because the Surgery-ChatGPT had a hiccup, or the rate limit was reached, or whatever.