sram1337 2 days ago

Who are you writing for?

You can skip about half the article

> Language models are capable of producing and digesting substantial volumes of text. More text than any single person should ever be expected to handle in the course of a lifetime. Compared to the speed at which a human can read and write, these models are the linguistic equivalent of a chainsaw. It’s much the same with computer vision, and generative algorithms producing videos and images of events that never occurred and things that don’t exist.

> It’s my belief that, in our current artificial intelligence boom’s haste to grab as much business as possible, we are essentially handing out chainsaws to unqualified and inexperienced people who don’t appreciate the responsibility entrusted to them, and who probably don’t require such power in the first place. And that is not the consumers’ fault—this is all on the companies that are pushing it into their laps.

> Some would say that, compared to the tangible hazards of losing a bodily extremity or dropping a pine trunk through the bedroom ceiling, misuse of AI by irresponsible or malicious actors sounds downright genteel. But think about how quickly memes and misinformation flow through social media and the larger internet. Whoever first used the word “viral” to describe such spread, they hit that nail right on the head. Social media craves that stuff, and AI provides the almost effortless ability to produce unlimited quantities of exactly what it desires. And the reward for the creator, as much as the users of an AI product can be called the “creator” of that content, is a shower of likes, reposts, updoots, badges, and the tiny dribble of dopamine brought by those things. Thus the system perpetuates itself.

> Unlike the venerable chainsaw, AI doesn’t give any indication that it is being misused. It doesn’t growl, shake, kick, or protest. It doesn’t even give a useful indication that “hey this result might be completely useless hogwash, I dunno.” The user doesn’t get to see what happens inside, or know precisely where the information originally came from, or evaluate how the model may have compromised reality to produce an output that looked plausibly like something a human would accept. It just hums along quietly, churning out line after line of approximately whatever it believed was asked of it.