themafia 4 hours ago

> and has the potential to debase shared reality.

If only.

What it actually has is the potential to debase the value of "AI." People will just eventually figure out that these tools are garbage and stop relying on them.

I consider that a positive outcome.

gretch 4 hours ago | parent | next [-]

Every other source for information, including (or maybe especially) human experts can also make mistakes or hallucinate.

The reason people go to LLMs for medical advice is because real doctors actually fuck up each and every day.

For clear, objective examples, look up stories where surgeons leave things inside patients' bodies post-op.

Here’s one, and there are many like it.

https://abc13.com/amp/post/hospital-fined-after-surgeon-leav...

nathan_compton 2 hours ago | parent [-]

"A few extreme examples of bad fuck ups justify totally disregarding the medical profession."

themafia 2 hours ago | parent [-]

"Doing your own research" is back on the menu boys!

phatfish 30 minutes ago | parent [-]

I'll insist the surgeon follow ChatGPT's plan for my operation next time I'm in theatre.

By the end of the year AI will actually be doing the surgery, given the recent advancements in robotic hands, right bros?

WheatMillington 4 hours ago | parent | prev [-]

People used to tell me the same about Wikipedia.

themafia 2 hours ago | parent [-]

That it could "debase shared reality?"

Or that using it as a single source of truth was fraught with difficulties?

Has the latter condition actually changed?

WheatMillington an hour ago | parent [-]

That it's a garbage data source that could not be relied upon.