bamboozled 14 hours ago
I've been using Claude for building and construction information (currently building a small house mostly on my own, with pros for plumbing and electrical). Seriously, the amount of misinformation it has given me is quite staggering. It tells me things like, "you need to fill your drainage pipes with sand before pouring concrete over them…". The danger with these AI products is that you have to really know a subject before they're properly useful.

I find this with programming too. Yes, it can generate code, but I've introduced some decent bugs when over-relying on AI. The plumber I used laughed at me when I told him about the sand thing. He has 40 years of experience…
simianwords 28 minutes ago
Give a single reproducible example using ChatGPT thinking.
FaradayRotation 14 hours ago
I nearly spit my drink out. This is my kind of humor, thanks for sharing. I've had a decent (though not perfect) experience identifying and understanding building codes using both Claude and GPT, but I had to be reasonably skeptical and very specific to get where I needed to go. I'd say it helped me figure out the right questions and which parts of the code applied to my scenario, more than it gave the "right" answer on the first go-round.
justapassenger 14 hours ago
I'm a hobby woodworker, and I recently tried using Gemini for advice on how to make some tricky cuts. If I'd followed any of the suggestions I'd probably be in the ER. Even after I pointed out the issues and asked it to improve, it came up with more and more sophisticated ways of doing the same fundamentally dangerous things.

LLMs are AMAZING tools, but they are just that: tools. There's no actual intelligence there, and the confidence with which they spew dangerous BS is stunning.
| ||||||||||||||||||||||||||||||||||||||||||||
trollbridge 14 hours ago
I've observed some horrendous electrical advice, such as "You should add a second bus bar to your breaker box." (This is not something you ever need to do.)
fny 14 hours ago
I mean... you do have to backfill around your drainage pipe, so it's not too far off. Frankly, if you Google the subject, people misspeak about "backfilling pipes" too, as if the target of the backfill were the pipe itself rather than the trench. Garbage in, garbage out.

All the circumstances where ChatGPT has given me shoddy advice fall into three buckets:

1. The internet lacks information, so LLMs invent answers.

2. The internet disagrees, so LLMs sometimes pick one answer without being aware of the others.

3. The internet is wrong, so LLMs spew the same nonsense.

Knowledge from the blue-collar trades often seems to fall into those buckets. For subjects in healthcare, on the other hand, there are rooms' worth of peer-reviewed research, textbooks, meta-studies, and official sources.
dingnuts 14 hours ago
Honestly, I think these things cause a form of Gell-Mann Amnesia: when you use them for something you already know, the errors are obvious, but when you use them for something you don't understand, the output is sufficiently plausible that you can't tell you're being misled. That makes the tool only useful for things you already know!

I mean, just in this thread there's an anecdote from a guy who used it to check a diagnosis, but did he press through other possibilities or ask different questions because the answer was already known?
| ||||||||||||||||||||||||||||||||||||||||||||