| ▲ | pydry a day ago |
| >The typical person is using LLMs not at all as it pertains to their daily life tasks. This doesn't track at all with my experience. Everybody is using it everywhere. Moreover, people are using them for daily life tasks even when it is not an appropriate use of LLMs - e.g. getting medical advice, as you referred to, or writing emails which are clearly pissing off their coworkers. In this respect I see it as akin to radium - a new technology that got a little too fashionable for its own good when it first emerged, and which will likely have many use cases scaled back. |
|
| ▲ | TheScaryOne a day ago | parent | next [-] |
| >Everybody is using it everywhere. No one in our auto shop is using AI. One of the new diagnostic tools was demo'd with AI, and none of us were having it. It's about as accurate as Googling your symptoms. My mother had an AI-powered lung scan that came back with stage 4 cancer. The oncologist got called in (for a fee!) to tell us it was just early-stage COPD. |
|
| ▲ | user34283 a day ago | parent | prev | next [-] |
| In my experience, people vastly overestimate the competence of doctors. Getting medical advice from LLMs could be life-saving. Personally, I experienced this when a specialist believed a drug interaction worked in the opposite direction: he thought drug A hindered the absorption of drug B, when it actually hinders B's clearance, tripling the concentration of B. Without AI I would have been clueless about this and could not have spotted the mistake. I don't know if it would truly have been critical, but it did shake my confidence in doctors. |
| |
| ▲ | PAndreew 15 hours ago | parent [-] | | This^^ Use both; they each have their own strengths and weaknesses. | | |
| ▲ | eru 14 hours ago | parent [-] | | And the AIs are still getting better at a good clip. I'm not so sure about (unassisted) doctors. |
|
|
|
| ▲ | HDThoreaun a day ago | parent | prev [-] |
| > getting medical advice I'd be careful stating this is an inappropriate use of LLMs. I'm semi tapped in to the medical literature community, and there is a lot of serious discussion and research going into the use of LLMs for medical advice, and most of it is showing that LLMs are barely worse than doctors, and much, much cheaper and more convenient. They definitely aren't ready to completely replace doctors, but it seems they can provide competent medical advice in a pinch. Look out for the literature on this in the coming year; it's only in the last few months that researchers seem to have started taking LLMs seriously. |
| |
| ▲ | Delphiza 4 hours ago | parent | next [-] | | I am surprised that people are surprised by this finding, and support your position. Anecdotally, doctors get things wrong quite frequently; almost everybody has a bad-medical-diagnosis story. The amount of reference material that a doctor needs to know off-hand, and the limited data they are given to make a diagnosis, make it a really difficult job. They also seldom get to find out whether their diagnosis or treatment worked, so they have a limited ability to 'learn' from outcomes. (I did some work for cancer research, and one of the most difficult problems was getting 'end of treatment' data, because the end of treatment was often a death the researchers never learned of.) The ability to use a 'prompt' that includes lab data is likely to beat the opinion of a doctor who has only one person's professional experience, a limited ability to interpret 'prompts', and the need to map them to an in-memory database of conditions. | |
| ▲ | checkyoursudo a day ago | parent | prev | next [-] | | This seems ripe for a joke akin to "how was the food?" "bad, but at least the portions were big!" Like, "how was the medical advice" "worse than a doc's, but at least it was cheaper!" | | |
| ▲ | HDThoreaun a day ago | parent [-] | | Well, the thing is that it often isn't worse than a doctor's; that's the point of the research here. I get that it sounds crazy - just watch out for the coming literature, I guess. A significant portion of Americans detest the medical industry and deeply dislike going to the doctor, so I don't even think the product needs to be very good to disrupt the way the system works; just different and accessible is likely enough. Funnily enough, restaurants where the food is bad but the portions are big are actually decently popular. Priorities vary so widely that many people are unable to even comprehend the priorities a significant number of other people truly hold. | | |
| ▲ | d2ssa 21 hours ago | parent [-] | | "deeply dislike going to the doctor" No, you are not capturing the trade-off at all. And frankly, you clearly have an agenda implicit in your posts; that's clear to see. |
|
| |
| ▲ | jrflowers a day ago | parent | prev [-] | | > barely worse than doctors I like that this comment sits below, and was posted after, an example where somebody had to pay extra money to clear up a misdiagnosis of stage 4 cancer by the “barely worse” software | | |
| ▲ | HDThoreaun a day ago | parent [-] | | There are many examples of doctors misdiagnosing a wide variety of things, which is largely the point here. People think of doctors as infallible when that is not even close to true. I'm certainly not saying fire all the radiologists, just advising an open mind when the actual literature starts saying that LLMs are as good as doctors in some areas. | | |
| ▲ | pydry a day ago | parent [-] | | There are many examples of people into homeopathy, Chinese medicine, and even witchcraft using an identical (not similar, identical) argument to the one you just used to push them. | | |
| ▲ | d2ssa 21 hours ago | parent | next [-] | | Legit that dude seems like a nutter. lol'd hard at "Im semi tapped in to the medical literature community." | |
| ▲ | jrflowers 16 hours ago | parent | prev [-] | | Yeah that’s the pitch for Dianetics |
|
|
|
|