flir 4 hours ago

Now apply that thinking to computers. Or levers.

More than once I've seen the argument that computers let us prop up, and even scale, governmental systems that would have long since collapsed under their own weight if they'd remained manual. I'm not sure I buy it, but computation undoubtedly shapes society.

The author does seem quite keen on computers, but they've been "getting rid of the free-willed human in the loop" for decades. I think there might be some unexamined bias here.

I'm not even saying the core argument's wrong, exactly - clearly, tools build systems ("...and systems kill" - Crass). I guess I'm saying tools are value neutral. Guns don't kill people. So this argument against LLMs is an argument against all tools, unless you can explain how LLMs are a unique category of tool?

(Aside: calling out the lever sounds silly, but I think it's actually a great example. You can't do monumental architecture without levers, and the point in history where we start doing that is also the point where serious surplus extraction kicks in. I don't think that's coincidence).

prmph 4 hours ago | parent | next [-]

Tools are not value neutral in any way.

In my third world country, motorbikes, scooters, etc have exploded in popularity and use in the past decade. Many people riding these things have made the roads much more dangerous for everyone, but particularly for themselves. They keep dying by the hundreds per month, not just because they choose to ride them at all, but because of how they ride them: on busy high-speed highways, weaving between lanes all the time, swerving in front of speeding cars, with barely any protective equipment whatsoever. A car crash is frequently very survivable; a motorcycle crash, not so much. Even if you survive the initial collision, the probability of another vehicle running you over is very high on a busy highway.

One would think, given the clear evidence of how dangerous these things are: why do people (1) ride them on the highway at all, and (2) in such a dangerous manner? One might excuse (1) by recognizing that many are poor and can't buy a car, and the motorbikes represent economic possibility: for use in a courier business, for being able to work much further from home, etc.

But here is the thing about (2): a motorbike wants to be ridden that way. No matter how well the rider recognizes the danger, there is only so much time that can pass before the sheer expediency of riding that way overrides any sense of due caution. Where it would be safer to stop or keep to a fixed lane without any sudden movements, the rider thinks of the inconvenience of stopping, does a quick mental comparison against the (in their mind) minuscule additional risk, and carries on. Stopping or keeping to a proper lane in a car requires far less discipline than doing the same on a motorbike.

So this is what people mean when they say tech is not value neutral. The tech can theoretically be used in many ways. But some forms of use are so aligned with the form of the tool that, in practice, it shapes behavior.

flir 3 hours ago | parent [-]

> A motorbike wants to be ridden that way

That's a lovely example. But is the dangerous thing the bike, or the infrastructure, or the system that means you're late for work?

I completely get what you're saying. I was thinking of tools in the narrowest possible way - of the tool in isolation (I could use this gun as a doorstop). You're thinking of the tool's interface with its environment (in the real world nobody uses guns as doorstops). I can't deny that's the more useful way to think about tools ("computation undoubtedly shapes society").

idle_zealot 4 hours ago | parent | prev | next [-]

> The author does seem quite keen on computers, but they've been "getting rid of the free-willed human in the loop" for decades. I think there might be some unexamined bias here.

Certainly it's biased. I'm not the author, but to me there's a huge difference between computer/software as a tool, designed and planned, with known deterministic behavior/functionality, then put in the hands of humans, vs automating agency. The former I see as a pretty straightforward expansion of humanity's long-standing relationship with tools, from simple sticks to hand axes to chainsaws. The sort of automation AI-hype seems focused on doesn't have a great parallel in history. We're talking about building a statistical system to replace the human wielding the tool, mostly so that companies don't have to worry about hiring employees. Even if the machine does a terrible job and most of humanity, former workers and current users, all suffer, the bet is that it will be worth the cost savings.

ML is very cool technology, and clearly one of the major frontiers of human progress. At this stage though, I wish the effort on the packaging side was being spent on wrapping the technology in the form of reliable capabilities for humans to call on. Stuff like OCR at the OS level or "separate tracks" buttons in audio editors. The market has decided instead that the majority of our collective effort should go towards automated liability-sinks and replacing jobs with automation that doesn't work reliably.

And the end state doesn't even make sense. If all this capital investment does achieve breakthroughs and create true AGI, do investors really think they'll see returns? They'll have destroyed the entire concept of an economy. The only way to leverage power at that point would be to try to exercise control over a robot army or something similarly sci-fi and ridiculous.

thwarted 15 minutes ago | parent [-]

"Automating agency" it's such a good way to describe what's happening. In the context of your last paragraph, if they succeed in creating AGI, they won't be able to exercise control over a robot army, because the robot army will have as much agency as humans do. So they will have created the very situation they currently find themselves in. Sans an economy.

amrocha 4 hours ago | parent | prev [-]

It’s a good thing that there’s centuries of philosophy on that subject and the general consensus is that no, tools are not “neutral” and do shape the systems they interact with, sometimes against the will of those wielding these tools.

See the nuclear bomb for an example.

flir 3 hours ago | parent [-]

I'm actually thinking of Marshall McLuhan. Maybe you're right, and tools aren't neutral. Does this mean that computation necessitates inequality? That's an uncomfortable conclusion for people who identify as hackers.