agentultra a day ago

If I gave you a gun without a safety, could you be the one to blame when it goes off because you weren't careful enough?

The problem with this analogy is that it makes no sense. LLMs aren't guns. The problem with using them is that humans have to review the content for accuracy. And that gets tiresome, because the whole point is that the LLM saves you time and effort doing it yourself. So naturally people will tend to stop checking and assume the output is correct, "because the LLM is so good." Then you get false citations and bogus claims everywhere.
sigbottle a day ago | parent

Sorry, I'm not following the gun analogies at all. But regardless, I thought the point was that...

> The problem with using them is that humans have to review the content for accuracy.

There are (at least) two humans in this equation: the publisher and the reader. The publisher at least should do their due diligence, regardless of how "hard" it is (in this case, we literally just ask that you review your OWN CITATIONS that you insert into your paper). This is why we have accountability as a concept.
zdragnar a day ago | parent

> If I gave you a gun without a safety could you be the one to blame when it goes off because you weren't careful enough?

Absolutely. Many guns don't have safeties. You don't load a round in the chamber unless you intend to use it. A gun going off when you don't intend it to is a negligent discharge. No ifs, ands, or buts. The person in possession of the gun is always responsible for it.
oceansweep a day ago | parent

Yes. That is absolutely the case. One of the most popular handguns, the Glock series, does not have a safety switch that must be toggled before firing. If someone has a negligent discharge, they are responsible, not Glock. (Glocks do have other safety mechanisms to prevent accidental firing not resulting from a trigger pull.)
baxtr a day ago | parent

> "because the LLM is so good."

That's the issue here. Of course you should be aware that these things need to be checked, especially if you're a scientist. This is no secret known only to people on HN. LLMs are tools, and people using those tools need to be diligent.
imiric a day ago | parent

> LLMs aren't guns.

Right. A gun doesn't misfire 20% of the time.

> The problem with using them is that humans have to review the content for accuracy.

How long are we going to push this same narrative we've been hearing since the introduction of these tools? When can we trust these tools to be accurate? For technology that is marketed as having superhuman intelligence, it sure seems dumb that it has to be fact-checked by less-intelligent humans.