reenorap 4 hours ago

I and many of my friends have used ChatGPT extremely effectively to diagnose medical issues. In fact, I would say that ChatGPT is better than most doctors because most doctors don't actually listen to you. ChatGPT took the time to ask me questions and based on my answers, narrowed down a particularly scary diagnosis and gave excellent instructions on how to get to a local hospital in a foreign country, what to ask for, and that I didn't have to worry very much because it sounded very typical for what I had. The level of reassurance that I was doing everything right actually made me feel less scared, because it was a pretty serious problem. Everything it told me was 100% correct and it guided me perfectly.

I was taking one high blood pressure medication but then noticed my blood sugar jumped. I did some research with ChatGPT and it found a paper indicating that the medication could raise blood sugar levels, and it recommended an alternative. I asked my doctor about it and she said I was wrong, but I gently pushed her to switch and told her the medication ChatGPT recommended. She obliged, which is why I have kept her for almost 30 years now, and lo and behold, my blood sugar did drop.

Most people have a hard time pushing back against doctors and doctors mostly work with blinders on and don't listen. ChatGPT gives you the ability to keep asking questions without thinking you are bothering them.

I think ChatGPT is a great advance in terms of medical help, and I recommend it to everyone. Yes, it might make mistakes, and I caution everyone to be careful and not trust it 100%, but I say that about human doctors as well.

3rodents 4 hours ago | parent | next [-]

I agree that absolute deference to doctors is a mistake and that individuals should be encouraged to advocate for themselves (and doctors should be receptive to it) but I'm not so convinced in this specific case. Why do high blood sugar levels matter? Are there side effects associated with the alternative treatment? Has ChatGPT actually helped you in a meaningful way, or has the doctor's eventual relenting made you feel like progress has been made, even if that change is not meaningful?

In this context, I think of ChatGPT as a many-headed Redditor (after all, reddit is what ChatGPT is trained on) and think about the information as if it was a well upvoted comment on Reddit. If you had come across a thread on Reddit with the same information, would you have made the same push for a change?

There are quite a few subreddits for specific medical conditions that provide really good advice, and there are others where the users are losing their minds egging each other on in weird and wacky beliefs. Doctors are far from perfect, doctors are often wrong, but ChatGPT's sycophancy and a desperate patient's willingness to treat cancer with fruit feel like a bad mix. How do we avoid being egged on by ChatGPT into forcing doctors to provide bad care? That's not a rhetorical question; I'm curious about your thoughts as an advocate for ChatGPT.

emodendroket 3 hours ago | parent | next [-]

I know what you mean and I would certainly not want to blindly "trust" AI chatbots with any kind of medical plan. But they are very helpful at giving you some threads to pull on for researching. I do think they tend a little toward giving you potentially catastrophic, worst-case possibilities, but that's a known effect from when people were using Google and WebMD as well.

yunohn 3 hours ago | parent | prev [-]

> Why do high blood sugar levels matter?

Are you asking why a side effect that is actually an entire health problem on its own, is a problem? Especially when there is a replacement that doesn’t cause it?

3rodents 3 hours ago | parent [-]

Side effects do not exist in isolation. High blood sugar is not a problem if it is solving a much bigger health issue, or is a lesser side effect than something more serious. If medication A causes high blood sugar but medication B has a chance of causing blood clots, medication A is an obvious choice. If a patient gets it in their head that their high blood sugar is a problem to solve, ChatGPT is going to reinforce that, whereas a doctor will have a much better understanding of the tradeoffs for that patient. The doctor version of the x/y problem.

yunohn 3 hours ago | parent [-]

Look, anyone can argue hypotheticals. But if one reads the comment being discussed, it can be deduced that your proposed hypotheses are not applicable, and that the doctor actually acknowledged the side effect and changed medications leading to relief. Now, if the new medication has a more serious side effect, the doctor (or ChatGPT) should mention and/or monitor for it, but the parent has not stated that is the case (yet). As such, we do not need to invent any scenarios.

3rodents 3 hours ago | parent | next [-]

The comment being discussed advocates for people to use ChatGPT and push their doctor to follow its recommendations. Even if we assume the OP is an average representation of people in their life, that means half of the people they are recommending ChatGPT to for medical advice are not going to be interrogating the information it provides.

A lazy doctor combined with a patient that lacks a clear understanding of how ChatGPT works and how to use it effectively could have disastrous results. A lazy doctor following the established advice for a condition by prescribing a medication that causes high blood sugar is orders of magnitude less dangerous than a lazy doctor who gives in to a crackpot medical plan that the patient has come up with using ChatGPT without the rigour described by the comment we are discussing.

Spend any amount of time around people with chronic health conditions (online or offline) and you'll realise just how much damage could be done by encouraging them to use ChatGPT. Not because they are idiots but because they are desperate.

Calavar 2 hours ago | parent | prev [-]

As a physician, I can give further insight. The blood pressure medication the commenter is referring to is almost certainly a beta blocker. The effect on blood sugar levels is generally modest [1]. (It is rare to advise someone with diabetes to stop taking beta blockers, as opposed to, say, emphysema, where it is common.)

They can be used for isolated treatment of high blood pressure, but they are also used for dual treatment of blood pressure and various heart issues (heart failure, stable angina, arrhythmias). If you have heart failure, beta blockers can reduce your relative annual mortality risk by about 25%.

I would not trust an LLM to weigh the pros and cons appropriately, knowing their sycophantic tendencies. I suspect they are going to be biased toward agreeing with whatever concerns the user initially expresses to them.

[1]

alexjplant 4 hours ago | parent | prev | next [-]

> most doctors don't actually listen to you.

> doctors mostly work with blinders on and don't listen

This has unfortunately been my experience as well. My childhood PCP was great, but every interaction I've had with the healthcare system since has been some variation of this. Reading blood work incorrectly, ignoring explanations of symptoms, misremembering medications you've been taking, prescribing inappropriate medications, etc. The worst part is that there are a lot of people who reflexively dismiss you as a contrarian asshole or, even worse, a member of a reviled political group that you have nothing to do with, just because you dare to point out that The Person With A Degree In Medicine makes glaring objective mistakes.

Doctors aren't immune to doing a bad job. I don't think it's a secret that the system overworks them and causes many of them to treat patients like JIRA tickets - I'd just like to know what it would take for people to realize that saying such doesn't make you a crackpot.

As an aside, I use Claude primarily for research when investigating medical issues, not to diagnose. It is just as likely to hallucinate or mischaracterize in the medical domain as it is in any other.

gaoshan 3 hours ago | parent | prev | next [-]

ChatGPT helped me understand a problem with my stomach that multiple doctors and numerous tests have not been able to shed any effective light on. Essentially I plugged in all of the data I could from my own observations of my issue over a 35 year period. It settled on these 3 possibilities: "Functional Dyspepsia with slow gastric accommodation, Mild delayed gastric emptying (even subclinical), Visceral hypersensitivity (your nerves fire pain signals when stretched)" and suggested a number of strategies to help with this. I implemented many of them and my stomach pain has been non-existent for months now... longer than I have ever been pain free.

I feel like the difference is that doctors took what I told them and only partially listened. They never took it especially seriously and just went straight to standard tests and treatments (scopes, biopsies and various stomach acid impacting medications). ChatGPT took some of what I said and actually considered it, discounting some things and digging into others (I said that bitter beer helped... doctor laughed at that, ChatGPT said that the alcohol probably did not help but that the bittering agent might and it was correct). ChatGPT got me somewhere better than where I was previously... something no doctor was able to do.

ZhadruOmjar 4 hours ago | parent | prev | next [-]

ChatGPT for health questions is the best use case I have found (Claude wins for code). Having a scratch pad where I can ask about any symptom I might feel, using project memory to cross-reference things, and having someone actually listen is very helpful. I asked about Crohn's disease since my grandfather suffered from it, and I got a few tests I could do, stats on likelihood based on genetics, diet ideas to try, and questions to ask the doctor. Much better than the current doctor experience, which is the quickest possible review of my bloods, being told to exercise and eat healthy, and a see-you-in-six-months.

Spooky23 24 minutes ago | parent | prev | next [-]

You may have done so inadvertently. With the transition to doc-in-the-box urgent care staffed by NPs with inconsistent training and massive caseloads... your provider is often using ChatGPT.

I caught one when I asked ChatGPT something and then went to urgent care. I told my story, they left, came back, and essentially read back exactly what ChatGPT told me.

cal_dent 4 hours ago | parent | prev | next [-]

I’ve heard many people say the same (specifically about ChatGPT being better than doctors because it listens) and I find it odd, and I wonder if this is a country-specific thing?

I’ve been lucky enough to not need much beyond relatively minor medical help, but in the places I’ve lived I’ve always found that when I do see a GP they’re generally helpful.

There’s also something here about medical stuff making people feel vulnerable by default, so feeling heard can carry outsized weight in the relationship? Not sure I’m articulating this last point well, but it comes up so frequently (it listened, it guided me through it step by step, etc.) that I wonder if that has an effect. Feeling more in control than with a doctor who has other patients and time constraints and just says it's x or do this.

Projectiboga 2 hours ago | parent [-]

In America, a side effect of our lack of universal care is that every physician has to carry their own malpractice insurance, whereas in most countries you can just get retreated if the first time doesn't work. The doctor might still face consequences if there was actual malpractice, but there aren't the shackles of having to do everything by the book so strictly.

IncreasePosts 4 hours ago | parent | prev | next [-]

I don't know if I could trust AI for big things, but I had nagging wrist pain for like a year, any time I extended my wrist (like while doing a pushup). It wasn't excruciating but it certainly wasn't pleasant, and it stopped me from doing certain activities (like pushups)

I visited my GP, two wrist specialists, and a physical therapist to help deal with it. I had multiple X-rays and an MRI done, plus a steroid injection. All without relief. My last wrist specialist even recommended I just learn to accept it and not try to extend my wrist too much.

I decided to ask Gemini, and literally the first thing it suggested was maybe the way I was using the mouse was inflaming an extensor muscle, and it suggested changing my mouse and a stretch/massage.

And you know what, the next day I had no wrist pain for the first day in a year. And it's been that way for about 3 weeks now, so I'm pretty hopeful it isn't short term

Marsymars 3 hours ago | parent | next [-]

I guess it’s not nothing that Gemini caught that, but that seems like a pretty obvious oversight from the healthcare practitioners - asking about RSI should be one of the first things they do.

I pre-emptively switched to trackballs and to alternating left/right hands for mousing near the start of my professional career based on the reading I did after some mild wrist strain.

mapt 3 hours ago | parent | next [-]

I think that 90% of what we get from doctors on musculoskeletal injuries that aren't visible on a simple X-ray is either oversight, or a bias towards doing nothing specifically because treatments have health, administrative, and financial costs and they might not help. There is no time authorized to do deep diagnostic work unless something is clearly killing you.

Marsymars 3 hours ago | parent [-]

I’ve had good results with doctors for soft tissue injuries, but it doesn’t feel like something that GPs are generally equipped/motivated for. The good results I’ve had have been from doctors at high performance sports clinics, or with doctors I’ve been referred to by my (awesome) sports physiotherapist.

code_biologist 3 hours ago | parent | prev [-]

I had progressively worsening pelvic floor pain issues that AI helped me with and are now in remission/repair. My decade of interaction with multiple urologists and clinicians could be characterized as repeated and consistent "pretty obvious oversight from the healthcare practitioners".

Izikiel43 3 hours ago | parent | prev [-]

I was having wrist pain from using the mouse and switched to a trackball; issue solved. If the wrist doesn't move or flex, it doesn't get strained, and therefore there's no pain.

The only thing that moves is my thumb, which is much better suited to flexing than the wrist and has a tiny load to manage compared to the wrist.

Marsymars 3 hours ago | parent [-]

I did the same, though with a non-thumb trackball (CST2545), which I find also virtually eliminates wrist stress.

cm2012 4 hours ago | parent | prev | next [-]

+100 to this from my personal experience

avree 4 hours ago | parent | prev | next [-]

No, no, no. You can change your doctor and get one that listens to you - you can't change the fact that ChatGPT has no skin in the game - no reputation, no Hippocratic oath, no fiscal/legal responsibility. Some people have had miracles with Facebook groups or WebMD, but that doesn't change what the role of a doctor is, or mean that you should be using those things for medical advice rather than as something that lets you have an informed conversation with a doctor.

noosphr 4 hours ago | parent | next [-]

Neither do most doctors. No GP will lose their license for giving the wrong diagnosis on a first consult.

They have 15 minutes and you have very finite money.

Medical agents should be a pre-consult tool that the patient talks to in the lobby while waiting for the doctor, so the doctor doesn't waste an hour getting to the most important data point and the patient doesn't sit for an hour in the lobby doing nothing.

raincole 4 hours ago | parent | prev | next [-]

Doctors have no skin in the game either. Our society is built on the illusion of 'skin in the game' for professionals like doctors and lawyers (and, to a lesser extent, engineers), but it's still an illusion.

derefr 3 hours ago | parent | prev | next [-]

In countries with public healthcare + doctor shortages (e.g. Canada), good luck even getting a family doctor, let alone having a request to switch your family doctor "when you already have one!" get taken seriously.

Everyone I know just goes to walk-in clinics / urgent-care centres. And neither of those options give doctors any "skin in the game." Or any opportunities for follow-up. Or any ongoing context for evaluating treatment outcomes of chronic conditions, with metrics measured across yearly checkups. Or the "treatment workflow state" required to ever prescribe anything that's not a first-line treatment for a disease. Or, for that matter, the willingness to believe you when you say that your throat infection is not in fact viral, because you've had symptoms continuously for four months already, and this was just the first time you had enough time and energy to wake up at 6AM so you could wait out in front of the clinic at 7:30AM before the "first-come-first-served" clinic fills up its entire patient queue for the day.

Spooky23 17 minutes ago | parent | next [-]

The US has the same issue.

Because the Republican party turned out to be a bunch of fascist fucks, there's no real critique of Obamacare. One of the big changes with the ACA is that it allowed medical networks to turn into regional cartels. Most regions have 2-3 medical networks, which have gobbled up all of the medical practices and closed many.

Most of the private general practices have been bought up and consolidated into giant practices, with doctors paid to quit and replaced by other providers at half the cost. Specialty practices are being swept up by PE.

thewebguyd 3 hours ago | parent | prev [-]

> no reputation, no hippocratic oath, no fiscal/legal responsibility.

To say nothing of handing your personal health information over to a private company with no requirement to comply with HIPAA, one that just recently got subpoenaed for all chat records. Not to mention potential future government requests, NSA letters, during an administration whose health secretary openly talks about rounding up mentally ill people and putting them in work camps.

Maybe LLMs have use here, but we absolutely should not be encouraging folks to plug information into public chatbots that they do not control and do not run locally.

It is a recipe for disaster.

jamespo 4 hours ago | parent | prev [-]

[flagged]

dang 2 hours ago | parent [-]

If you wouldn't mind reviewing https://news.ycombinator.com/newsguidelines.html and taking the intended spirit of the site more to heart, we'd be grateful. They include: "Have curious conversation; don't cross-examine." and "Assume good faith."

HN is just a big internet watercooler type place where people exchange experiences, views, and whatnot. Such a context can't really work unless people assume good faith with each other.