imiric a day ago

> LLMs aren’t guns.

Right. A gun doesn't misfire 20% of the time.

> The problem with using them is that humans have to review the content for accuracy.

How long are we going to keep repeating this same narrative we've been hearing since these tools were introduced? When will we be able to trust them to be accurate? For technology marketed as having superhuman intelligence, it sure seems dumb that it has to be fact-checked by less-intelligent humans.