SilverElfin 9 hours ago
I’ve had much better luck diagnosing my own family’s issues than with doctors. Usually now, I feed them more information to begin with, so that their 30-minute office visits aren’t wasted and don’t require another expensive follow-up appointment. While I’m sure there are ways such studies can be wrong, it’s very obvious that AI can accelerate work in many of the areas where we seek out professional help: doctors, lawyers, etc.
kakacik 9 hours ago | parent
It can speed up some aspects of the work, but please don't trust an LLM with variable output quality more than a professional. If you don't like your current doctor, try another; most are in the business of helping people. If you've had a string of issues with your last 10 doctors, though, then the issue is most probably you...

My wife is a GP, and easily a third of her patients also have some minor-but-visible mental issue, a 1-2 on a 10-point scale. It keeps them functional in society, but often very hard to be around.

That doesn't mean I don't trust your words. There are tons of people with either rare issues, or fairly common ones that manifest in a non-standard way (or mixed with some other issue). These folks suffer a lot trying to find a doctor who doesn't lump them into some general category with a generic treatment. Such doctors exist, but they're not that common.

It helps both sides tremendously if the patient isn't an arrogant know-it-all waving ChatGPT in the doctor's face, basically just coming for a prescription after self-diagnosis. The help is sometimes proportional to the situation and to legal obligations.
| ||||||||||||||||||||