| ▲ | shlip 5 hours ago |
| > AI systems exist to reinforce and strengthen existing structures of power and violence. Exactly. You can see that with the proliferation of chickenized reverse centaurs[1] in all kinds of jobs. Getting rid of the free-willed human in the loop is the aim now that bosses/stakeholders have seen the light. [1] https://pluralistic.net/2022/04/17/revenge-of-the-chickenize... |
|
| ▲ | Glemkloksdjf 3 hours ago | parent | next [-] |
| If you are a software engineer, you can leverage AI to write code better than anyone else. Good code is still genuinely complex, which means: 1. if software development is ever really solved, everyone else has a huge problem too (CEOs, CTOs, accountants, designers, etc.), so we are at the back of the AI doomsday line. And 2. it allows YOU to leverage AI far more effectively, which can enable you to create your own product. In my startup, we leverage AI and we are not worried that another company will simply do the same thing, because even if they do, we know how to write good code and architecture and we are also using AI. So we will always be ahead. |
|
| ▲ | countWSS 4 hours ago | parent | prev | next [-] |
| Sounds like the Manna control system: https://marshallbrain.com/manna |
|
| ▲ | flir 4 hours ago | parent | prev | next [-] |
| Now apply that thinking to computers. Or levers. More than once I've seen the argument that computers let us prop up, and even scale, governmental systems that would have long since collapsed under their own weight had they remained manual. I'm not sure I buy it, but computation undoubtedly shapes society. The author does seem quite keen on computers, yet they've been "getting rid of the free-willed human in the loop" for decades. I think there might be some unexamined bias here. I'm not even saying the core argument's wrong, exactly - clearly, tools build systems ("...and systems kill" - Crass). I guess I'm saying tools are value neutral. Guns don't kill people. So this argument against LLMs is an argument against all tools, unless you can explain how LLMs are a unique category of tool? (Aside: calling out the lever sounds silly, but I think it's actually a great example. You can't do monumental architecture without levers, and the point in history where we start doing that is also the point where serious surplus extraction kicks in. I don't think that's a coincidence.) |
| |
| ▲ | prmph 4 hours ago | parent | next [-] | | Tools are not value neutral in any way. In my third world country, motorbikes, scooters, etc. have exploded in popularity over the past decade. Many people riding them have made the roads much more dangerous for everyone, but particularly for themselves. They keep dying by the hundreds per month, not just because they choose to ride them at all, but because of how they ride them: on busy high-speed highways, weaving between lanes, swerving in front of speeding cars, with barely any protective equipment whatsoever. A car crash is frequently very survivable; a motorcycle crash, not so much. Even if you survive the initial collision, the probability of another vehicle running you over is very high on a busy highway. One would think, given the clear evidence for how dangerous these things are: why do people (1) ride them on the highway at all, and (2) in such a dangerous manner? One might excuse (1) by recognizing that many are poor and can't buy a car, and the motorbikes represent economic possibility: use in a courier business, being able to work much further from home, etc. But here is the thing about (2): a motorbike wants to be ridden that way. No matter how well the rider recognizes the danger, only so much time can pass before the sheer expediency of riding that way overrides any sense of due caution. Where it would be safer to stop or keep to a fixed lane without any sudden movements, the rider thinks of the inconvenience of stopping, does a quick mental comparison against what they see as the minuscule additional risk, and carries on. Stopping or keeping to a proper lane in a car requires far less discipline than doing the same on a motorbike. So this is what people mean when they say tech is not value neutral. The tech can theoretically be used in many ways. But some forms of use are so aligned with the form of the tool that in practice it shapes behavior. | |
| ▲ | flir 3 hours ago | parent [-] | | > A motorbike wants to be ridden that way That's a lovely example. But is the dangerous thing the bike, or the infrastructure, or the system that means you're late for work? I completely get what you're saying. I was thinking of tools in the narrowest possible way - of the tool in isolation (I could use this gun as a doorstop). You're thinking of the tool's interface with its environment (in the real world nobody uses guns as doorstops). I can't deny that's the more useful way to think about tools ("computation undoubtedly shapes society"). |
| |
| ▲ | idle_zealot 4 hours ago | parent | prev | next [-] | | > The author does seem quite keen on computers, but they've been "getting rid of the free-willed human in the loop" for decades. I think there might be some unexamined bias here. Certainly it's biased. I'm not the author, but to me there's a huge difference between computer/software as a tool, designed and planned, with known deterministic behavior/functionality, then put in the hands of humans, vs. automating agency. The former I see as a pretty straightforward expansion of humanity's long-standing relationship with tools, from simple sticks to hand axes to chainsaws. The sort of automation the AI hype seems focused on doesn't have a great parallel in history. We're talking about building a statistical system to replace the human wielding the tool, mostly so that companies don't have to worry about hiring employees. Even if the machine does a terrible job and most of humanity, former workers and current users alike, suffers, the bet is that it will be worth the cost savings. ML is very cool technology, and clearly one of the major frontiers of human progress. At this stage, though, I wish the effort on the packaging side were being spent on wrapping the technology in the form of reliable capabilities for humans to call on. Stuff like OCR at the OS level or "separate tracks" buttons in audio editors. The market has decided instead that the majority of our collective effort should go towards automated liability-sinks and replacing jobs with automation that doesn't work reliably. And the end state doesn't even make sense. If all this capital investment does achieve breakthroughs and create true AGI, do investors really think they'll see returns? They'll have destroyed the entire concept of an economy. The only way to leverage power at that point would be to try to exercise control over a robot army or something similarly sci-fi and ridiculous. | |
| ▲ | thwarted 14 minutes ago | parent [-] | | "Automating agency" is such a good way to describe what's happening. In the context of your last paragraph, if they succeed in creating AGI, they won't be able to exercise control over a robot army, because the robot army will have as much agency as humans do. So they will have created the very situation they currently find themselves in. Sans an economy. |
| |
| ▲ | amrocha 4 hours ago | parent | prev [-] | | It's a good thing that there are centuries of philosophy on that subject, and the general consensus is that no, tools are not "neutral": they do shape the systems they interact with, sometimes against the will of those wielding them. See the nuclear bomb for an example. | |
| ▲ | flir 3 hours ago | parent [-] | | I'm actually thinking of Marshall McLuhan. Maybe you're right, and tools aren't neutral. Does this mean that computation necessitates inequality? That's an uncomfortable conclusion for people who identify as hackers. |
|
|
|
| ▲ | Aeolun 4 hours ago | parent | prev | next [-] |
| How is that different from making manual computation obsolete with the help of Excel? |
|
| ▲ | fennecfoxy 3 hours ago | parent | prev | next [-] |
| Lmao Cory Doctorow. Desperately trying to coin another catchphrase again. |
|
| ▲ | lynx97 4 hours ago | parent | prev [-] |
| I am surprised (and also kind of not) to see this kind of tech hate on HN of all places. Would you prefer we heat our homes by burning wood, carry water from the nearby spring, and ride horses to visit relatives? Progress is progress, and has always changed things. It's funny that apparently, "progressive" left-leaning people are actually so conservative at their core. So far, in my book, the advancements of the last 100 or more years have mostly brought us things I wouldn't want to miss these days. But maybe some people would be happier to go back to the dark ages... |
| |
| ▲ | seu 4 hours ago | parent | next [-] | | > Progress is progress, and has always changed things. It's funny that apparently, "progressive" left-leaning people are actually so conservative at their core. I am surprised (and also kind of not) to see this lack of critical reflection on HN of all places. Saying "progress is progress" serves nobody, except those who drive "progress" in directions that benefit them. All you do by saying it "has always changed things" is take "change" at face value, assuming it's something completely out of your control, to be accepted without questioning its source, its ways, or its effects. > So far, in my book, the advancements of the last 100 or more years have mostly brought us things I wouldn't want to miss these days. But maybe some people would be happier to go back to the dark ages... Amazing depiction of extremes as the only possible outcomes: either take everything that is thrown at us, or go back into a supposed "dark age" (which, BTW, is nowadays understood to not have been that "dark" at all). This, again, doesn't help us have a proper discussion about the effects of technology and how it comes to be the way it is. | |
| ▲ | Glemkloksdjf 3 hours ago | parent [-] | | The dark age was dark: no human rights, no women's rights, hunger, thirst, barely any progress, hard lives. So are you realistically able to stop progress across a whole planet? Tbh, getting alignment across the planet to slow down or stop AI would be the equivalent of stopping capitalism and actually building a holistic planet for us. I think AI will force the hand of capitalism, but I don't think we will be able to create a Star Trek universe without being forced into it. | |
| ▲ | trashb an hour ago | parent [-] | | > The dark age was dark: no human rights, no women's rights, hunger, thirst, barely any progress, hard lives. There was progress in the Middle Ages, hence the difference between the early and late Middle Ages. Most information was passed on by word of mouth instead of being written down. "The term employs traditional light-versus-darkness imagery to contrast the era's supposed darkness (ignorance and error) with earlier and later periods of light (knowledge and understanding)." "Others, however, have used the term to denote the relative scarcity of written records regarding at least the early part of the Middle Ages" https://en.wikipedia.org/wiki/Dark_Ages_(historiography) |
|
| |
| ▲ | lm28469 3 hours ago | parent | prev | next [-] | | > Would you prefer we heat our homes by burning wood, carry water from the nearby spring, and ride horses to visit relatives? I'm more surprised that seemingly educated people hold views as simplistic as "technology = progress, progress = good, hence technology = good". Vaccines and running water are tech; megacorp-owned "AI" being weaponised by surveillance-obsessed governments is also tech. If you don't push back on "tech" you're just blindly accepting whatever someone else decided for you. Keep in mind that the benefits of tech since the 80s have mostly been pocketed by the top 10%; the plebs still work just as much and retire just as old, &c., despite what politicians and technophiles have been saying. | |
| ▲ | andrepd 4 hours ago | parent | prev [-] | | "You don't like $instance_of_X? You must want to get rid of all $X" has got to be one of the most intellectually lazy things you could say. You don't like leaded gasoline? You must want us to walk everywhere. Come on... | | |
| ▲ | lynx97 4 hours ago | parent [-] | | A tool is a tool. These AI critics sound to me like people who have hit their finger with a hammer, and now advocate against using them altogether. Yes, tech has always had two sides. Our "job" as humans is to pick the good parts, and avoid the bad. Nothing new, nothing exceptional. | | |
| ▲ | lm28469 3 hours ago | parent | next [-] | | > A tool is a tool. These AI critics sound to me like people who have hit their finger with a hammer, and now advocate against using them altogether. Speaking of wonky analogies, have you considered that other people have access to these hammers and are aiming for your head? And that some people might not want to be hit on the head by a hammer? | |
| ▲ | andrepd 2 hours ago | parent | prev [-] | | More lazy analogies... Yes, a hammer is a tool; so is a machine gun, a nuke, or the guy with his killdozer. So what are you gonna do? Nothing to see here, discussion closed. This is not an interesting conversation. |
|
|
|