agentultra a day ago

If I gave you a gun without a safety could you be the one to blame when it goes off because you weren’t careful enough?

The problem with this analogy is that it makes no sense.

LLMs aren’t guns.

The problem with using them is that humans have to review the content for accuracy. And that gets tiresome because the whole point is that the LLM saves you time and effort doing it yourself. So naturally people will tend to stop checking and assume the output is correct, “because the LLM is so good.”

Then you get false citations and bogus claims everywhere.

sigbottle a day ago | parent | next [-]

Sorry, I'm not following the gun analogies at all.

But regardless, I thought the point was that...

> The problem with using them is that humans have to review the content for accuracy.

There are (at least) two humans in this equation. The publisher, and the reader. The publisher at least should do their due diligence, regardless of how "hard" it is (in this case, we literally just ask that you review your OWN CITATIONS that you insert into your paper). This is why we have accountability as a concept.
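And the "review your own citations" part of due diligence is the easy part to automate. As a minimal sketch (assuming Python with the requests library and the public Crossref REST API; the helper name and the second DOI are made up for illustration):

    import requests  # third-party; pip install requests

    def doi_resolves(doi: str) -> bool:
        """Check whether a DOI is registered with Crossref.

        A 200 from the Crossref works endpoint means the DOI exists;
        a 404 strongly suggests a fabricated citation.
        """
        resp = requests.get(
            f"https://api.crossref.org/works/{doi}",
            headers={"User-Agent": "citation-check/0.1 (mailto:you@example.org)"},
            timeout=10,
        )
        return resp.status_code == 200

    # Example reference list pulled from a draft paper.
    for doi in ["10.1038/nature14539", "10.1234/not-a-real-paper"]:
        print(doi, "OK" if doi_resolves(doi) else "NOT FOUND - verify by hand")

That only catches DOIs that don't exist at all; a citation that resolves but doesn't say what you claim still needs human eyes. But it's a lower bound on effort, and it's minutes of work.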

zdragnar a day ago | parent | prev | next [-]

> If I gave you a gun without a safety could you be the one to blame when it goes off because you weren’t careful enough?

Absolutely. Many guns don't have safeties. You don't load a round in the chamber unless you intend to use it.

A gun going off when you don't intend is a negligent discharge. No ifs, ands or buts. The person in possession of the gun is always responsible for it.

bluGill a day ago | parent [-]

> A gun going off when you don't intend is a negligent discharge

False. Guns go off when not intended too often to claim that. It has happened to me - I then took the gun to a qualified gunsmith for repairs.

A gun that fires and hits anything you didn't intend to hit is a negligent discharge, even if you intended to shoot. Gun safety is about assuming that a gun that could possibly fire will fire, and ensuring nothing bad can happen when it does. When looking at a gun in a store (one you might want to buy), you aim it at an upper corner, where even if it fires, the odds of something bad resulting are the lowest (it should be unloaded - and you may have checked - but you still aim there!)

Same with cat-toy lasers - they should be safe to shine in an eye, but you still point them in a safe direction.

oceansweep a day ago | parent | prev | next [-]

Yes, that is absolutely the case. One of the most popular handguns, the Glock series, does not have a safety switch that must be toggled before firing.

If someone performs a negligent discharge, they are responsible, not Glock. It does have other safety mechanisms to prevent accidental discharges that don't result from a trigger pull.

agentultra a day ago | parent [-]

You seem to be getting hung up on the details of guns and missing the point that it’s a bad analogy.

Another way LLMs are not guns: you don’t need a giant data centre owned by a mega corp to use your gun.

Can’t do science because GlockGPT is down? Too bad, I guess. Let’s go watch paint dry.

The reason I made the analogy is that this is inherent to how we designed LLMs: they will make bad citations, and people need to be careful.

baxtr a day ago | parent | prev | next [-]

>“because the LLM is so good.”

That's the issue here. Of course you should be aware that these things need to be checked - especially if you're a scientist.

This is no secret only known to people on HN. LLMs are tools. People using these tools need to be diligent.

imiric a day ago | parent | prev [-]

> LLMs aren’t guns.

Right. A gun doesn't misfire 20% of the time.

> The problem with using them is that humans have to review the content for accuracy.

How long are we going to push this same narrative we've been hearing since the introduction of these tools? When can we trust these tools to be accurate? For technology that is marketed as having superhuman intelligence, it sure seems dumb that it has to be fact-checked by less-intelligent humans.