ben8bit 11 hours ago

A lot of the magic of LLMs, I think, has been tarnished by these CEOs and other FAANG companies. It might have been a far more interesting world if they didn't bring "AI" or "AGI" into the conversation in such a politicized way.

Thanemate 8 hours ago | parent | next [-]

The power of the tool itself will be overshadowed by the motivations of its real owner. I can be both impressed by its ability to empower me, and be scared of the fact that the tools will change hands sooner or later and be deployed at scale to serve a goal I cannot, at minimum, support.

When most engineers and Marvel fans watched Tony Stark collaborating with Jarvis in the Avengers films, they thought of Jarvis as "an AI with Google's knowledge that I can interact with". It's true that we're close to that level of interaction. However, the ultimate goal is to automate as much as possible onto Jarvis, to the point where Tony Stark is not needed, or can be replaced by anyone with a mouth.

In this example, Jarvis isn't the goal but a checkpoint. The goal is a genie, providing software and research to anyone who is loaded with money, and knows how to rub the metaphorical lamp the right way.

ryandrake 5 hours ago | parent | next [-]

> The power of the tool itself will be overshadowed by the motivations of its real owner.

Not only that, but by how blatantly and openly these owners are discussing the tool's power. They are publicly crooning about their product's ability to replace workers. It's the first line of their sales pitch. And also, their customers (business CEOs) are publicly crooning about how awesome it is that they can reduce their headcount! Both the AI producers and their customers are absolutely bragging about worker displacement, and not a single guillotine has been constructed in the streets yet.

bluefirebrand 7 hours ago | parent | prev [-]

> the tools will change hands sooner or later and be deployed at scale to serve a goal I cannot, at minimum, support

Personally, the tools don't need to change hands at all. They are already in the hands of people who are deploying them at a scale to serve goals I cannot and do not support

The people running AI companies right now are some of the most evil motherfuckers on the planet

bluegatty 10 hours ago | parent | prev | next [-]

It'd be nice if they didn't use the terms at all, because I don't think they're useful, relevant, or real.

If we thought of all of this as 'stochastic data systems', then our heads would be in the right place: we'd treat it as 'powerful software' that can be used for good or bad purposes, and the negative externalities would be derived from our use of it, not from some inherent property.

dr_dshiv 10 hours ago | parent | next [-]

On the other hand, "magical new systems that provide almost unlimited capacity for intelligent work" is probably a more functional mental model. Genie can give you 1000 wishes till you reach your session limit.

mindok 10 hours ago | parent [-]

Not quite 1000 on Codex as of last day or two!

jacquesm 10 hours ago | parent | prev [-]

It would have been better if they didn't bootstrap it off the outright theft of a very large amount of IP only to lock it behind a paywall.

nekusar 6 hours ago | parent [-]

"Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them." - Reverend Mother Gaius Helen Mohiam, Dune

keiferski 11 hours ago | parent | prev | next [-]

It’s the inevitable result of valuations based on hype and future potential, not business fundamentals. It incentivizes companies to be as hyperbolic as possible with their pitches and marketing.

Cryptocurrency is an interesting technology with some niche use cases, but it was pitched as replacing the entire money system. LLMs are extremely useful for certain types of work, but are pitched as AGI ending all work. Etc.

djtango 10 hours ago | parent | prev | next [-]

Magic or no, ultimately "AI" leads to labour displacement and it's just a continuation of the much broader trend of automation driven by computers.

Labour displacement leads to an erosion of standards of living, and in a world that ties purpose to work, it is an existential threat on a very practical level.

It was always going to be met with violence once it became more than a curiosity for tinkerers.

andai 8 hours ago | parent | next [-]

We have, as a civilization, two paths before us:

a) Decouple the value of human life from labour.

b) Watch as the value of human life rapidly approaches zero.

---

Though I'd expand this by adding "technically alive" is not a very good standard to aim for. Ostensibly we're already heading for something like poverty level UBI + living in pod + eating the proverbial bugs. We need a level above that!

A great exploration of the pitfalls of "preserve humanity" as a reward function is the video game SOMA. I think you also need "preserve dignity" to make the life actually worth living.

(Path `a` is not without its pitfalls: what lack of survival pressure might do to the human culture and genome, I leave as an exercise for the reader! But path `b` I think we already have enough examples of, to know better...)

bluefirebrand 7 hours ago | parent | next [-]

> We have, as a civilization, two paths before us

You forgot c) Butlerian Jihad: mass-outlaw AI research, AI usage, AI building, and AI infrastructure, on penalty of death.

It may not be a good option but it's there

NeutralCrane 6 hours ago | parent | next [-]

This will literally never happen so it is not worth considering

hollerith 6 hours ago | parent [-]

Just keep telling everyone that and hope they keep believing you.

patrick451 5 hours ago | parent | prev [-]

Exactly. At the very least, we should be treating AI like nuclear weapons. It can exist but it should be locked away and never used.

Throaway199999 7 hours ago | parent | prev [-]

When the value of human labour reaches zero the economy will collapse so that will be interesting.

mitthrowaway2 5 hours ago | parent [-]

I don't see that as a guaranteed outcome if there's something like UBI to sustain demand, and automation to sustain supply.

Throaway199999 2 hours ago | parent [-]

UBI is only valuable if money is valuable though...what are you going to trade it for if no one has a job and everyone has access to super powerful production tools like advanced LLMs (which are at the low end of automated tooling overall)?

MontyCarloHall 8 hours ago | parent | prev | next [-]

>Labour displacement leads to an erosion of standards of living

The two biggest labor displacements in human history were the agricultural and industrial revolutions, both of which resulted in enormous gains in human living standards. Can you think of a mass labor displacement that resulted in an overall erosion of living standards? I cannot.

PontifexMinimus 7 hours ago | parent | next [-]

AI is different. It promises to be able to do everything humans can, but better and more cheaply. When AIs can do every human job cheaper than the subsistence cost of employing a human, humans will be economically obsolete and worthless.

Then there's the minor issue of AI deciding to just wipe us out because we're in the way.

Taking everything together, AI more powerful than that which currently exists must not be created. This needs to be enforced with an international treaty, nuking data centers in non-compliant states if need be.

MontyCarloHall 7 hours ago | parent | next [-]

Before the industrial revolution, approximately 90% of people worked in agriculture. In fully industrialized countries, that figure is now <2%. That decrease constituted a nearly full replacement of everything humans were doing, better and more cheaply. While this time might be different, I don't think this is a given.

ccortes 7 hours ago | parent [-]

Maybe it’s not a given, but it is part of the sales pitch to CEOs, and a few of them have already announced layoffs due to AI being better and more efficient than humans.

How much truth there is to it we don’t know for sure. But it’s not something to be ignored.

MontyCarloHall 6 hours ago | parent [-]

CEOs have been saying the exact same thing for the entire history of automation. Take computing, for example, an industry that's always been unusually amenable to automation:

— in the 1960s/1970s, when compilers came out. "We don't need so many programmers hand-writing assembly anymore." Remember, COBOL (COmmon Business-Oriented Language) and FORTRAN (FORmula TRANslator) were marketed as human-readable languages that would let business professionals/scientists no longer be reliant on dedicated specialist programmers.

— in the 1980s/1990s, when higher-level languages came out. "C++ and Java mean we don't need an army of low-level C developers spending most of their effort manually managing memory, and rich standard libraries mean they don't have to continuously reimplement common data structures from scratch."

— in the 1990s/2000s, when frameworks came out. "These things are basically plug-and-play, now one full-stack developer can replace a dedicated sysadmin, backend engineer, database engineer, and frontend engineer."

While all of these statements are superficially true, the result was that the world produced more software (and more developer jobs) than ever, as each level of abstraction freed developers from lower-level problems and let them focus on higher-level solutions. Mel's intellect was freed from optimizing the position of the memory drum [0] so he could focus on the higher-level logic/algorithms of the problem he was solving. As a result, software has become both more complex and much more capable, and thus much more common.

While this time with AI may truly be different, I'm not holding my breath.

[0] http://catb.org/jargon/html/story-of-mel.html

Ray20 5 hours ago | parent | prev [-]

> AI is different

Literally the same thing.

> humans will be economically obsolete and worthless

Only if we are talking about a socialist system (and they are making pretty small progress in the field of AI). A human's value under a capitalist system is equal to their ability to create goods and services. And AI cannot make this ability smaller in any way.

A people's well-being is literally the goods and services created by that people. How can it decrease if the people's ability to produce those goods and services is not hindered in any way?

So, when it comes to the entire nation benefiting from AI, the most important thing is to preserve capitalism, and then the free market will distribute all the benefits. The main danger is a descent into socialism, with all these basic incomes, taxation out of production, and other practices that would lead to people being declared economically obsolete and mass executed to optimize their carbon footprint or something.

PontifexMinimus 5 hours ago | parent [-]

> A human's value under a capitalist system is equal to their ability to create goods and services. And AI cannot make this ability smaller in any way.

Yes they can. Your ability to produce goods and services depends on the infrastructure around you. When that's all run by AIs for AIs, humans won't be able to compete.

See that land over there producing food you need to eat? It turns out it's more economically efficient to pave it over with data centers etc.

Under a US-style capitalist system the rich (i.e. the AIs and AI-run businesses) control politics, the courts, etc, so the decisions the system makes will favour AIs over humans.

> So, when it comes to the entire nation benefiting from AI, the most important thing is to preserve capitalism, and then the free market will distribute all the benefits

...to the AI-run companies!

> The main danger is a descent into socialism, with all these basic incomes

Without UBI most people (or maybe everyone) would starve.

Ray20 4 hours ago | parent [-]

> depends on the infrastructure around you

Yeah, and who is creating those infrastructure? Jesus? This is the same part of goods and services.

> When that's all run by AIs for AIs, humans won't be able to compete.

So what? The ability to produce goods and services (and therefore general well-being) will not decrease because of that.

> It turns out it's more economically efficient to pave it over with data centers etc

By the way, a good argument against your position. Agricultural land is very cheap, but the vast majority of people who believe AI will put people out of work and worsen overall well-being are for some reason reluctant to buy this asset, which would see a catastrophic increase in value under such a scenario. So these people are either incapable of analyzing the economic processes, and their predictions are worthless, or they don’t really believe in such a scenario.

> will favour AIs over humans

Let me repeat: it does not reduce the ability to create goods and services. Under capitalism, this is the only characteristic that determines people's well-being.

> ...to the AI-run companies!

I think this is a fairly unlikely scenario. But even in this very unlikely case, people's well-being will not be reduced. Simply because of the mechanisms of creating well-being.

> Without UBI most people (or maybe everyone) would starve.

Economic theory (and 20th-century economic practice) demonstrates the exact opposite. In every country that attempted to effectively implement UBI, it led to a sharp decline in production and mass starvation. Literally every single time.

subw00f 4 hours ago | parent | prev | next [-]

The agricultural and industrial revolutions weren't "labor displacement"; they were technological and social changes that happened unevenly and gradually across time and space and which resulted in labor displacement. But they were not its only cause, and they didn't happen BECAUSE of labor displacement. I would argue the subsequent labor displacement caused a minor part of the social gains to be later distributed and realized through class struggle, but that's beside the point.

Most wars cause mass labor displacement and military technological advancements that later translate into society as a whole. Are you prepared to argue for wars? If you are American, you are experiencing firsthand the effects of what once was a major part of your industrial labor being absorbed by China. It has led to massive inequality and erosion of standards of living in the US, though not so much for the Chinese working class, which has steadily improved its standard of living. Are you going to argue for that?

I think if we only look at things from a limited perspective, in this instance a technocratic and teleological view of history, as if history has a designed finality that will be achieved through the unrestrained development of productive forces, we are bound to quietly take part in the destruction of society and nature, now viewed as externalities, and to accept the worst of atrocities in the name of "advancement", while most of any gains are captured in the short term by a minority.

throwaway28469 7 hours ago | parent | prev [-]

[dead]

Razengan 8 hours ago | parent | prev | next [-]

https://en.wikipedia.org/wiki/Lamplighter

georgemcbay 10 hours ago | parent | prev | next [-]

> in a world that ties purpose to work is an existential threat on a very practical level.

I don't disagree that we tie purpose to work and severing that tie will have negative societal consequences, but it is far more impactful that we tie the ability to continue to exist to work (for anyone not lucky enough to already be wealthy).

If I suddenly became unemployable tomorrow I'm positive I could find alternate purpose in my life to fill that gap, I already volunteer for various causes and could happily do more of the same to fill in the gaps left by lack of work. What I couldn't do is feed myself, keep myself housed, and get medical care (especially in the US, where this is very directly tied to work).

The really big fuckup we are committing as a society in the US (may or may not apply to each person's country individually) isn't just this looming threat of massive labor displacement due to AI, it is that instead of planning for any sort of soft landing we are continually slashing what few social safety nets already exist. We are creating the conditions for desperation that likely will result in increasing violence as outlined in the linked post.

ryandrake 5 hours ago | parent | next [-]

> The really big fuckup we are committing as a society in the US (may or may not apply to each person's country individually) isn't just this looming threat of massive labor displacement due to AI, it is that instead of planning for any sort of soft landing we are continually slashing what few social safety nets already exist.

Think of the alternative, though: If we planned for a soft landing and implemented safety nets and started transitioning ourselves to a society where people didn't have to work to survive, then a few trillion dollar companies would make slightly less profit every year. We simply cannot allow that. Won't someone think of those trillion dollar companies for a minute?

Throaway199999 7 hours ago | parent | prev [-]

^^^^

yfw 10 hours ago | parent | prev [-]

If AI benefitted everyone and not just the billionaires, we would be viewing it differently.

quantummagic 9 hours ago | parent [-]

That's a truism. But it ignores The Iron Law of Oligarchy, Pareto Principle, and dozens more that remind us that power tends towards centralization. It's currently fashionable to call out the billionaires, but if you removed them, they'd just be replaced by corrupt government officials, or something else.

That's not to say we should just throw up our hands and accept every social injustice. But IMHO we shouldn't go around simplistically implying that all social ills will be solved by neutering the billionaire class.

singpolyma3 8 hours ago | parent | next [-]

More importantly we shouldn't deny the rest of humanity benefits on the basis that the majority of the benefit accrues to the powerful. We should strive to change the distribution pattern, not remove the benefit.

theseanz 7 hours ago | parent | prev | next [-]

“But IMHO we shouldn't go around simplistically implying that all social ills will be solved by neutering the billionaire class.”

You’re right. Instead of implying, we should be taking active steps to do it.

Rury 5 hours ago | parent [-]

Right, giving up is actually how these things end up becoming principles/laws. Power centralizes because people become complacent and ignorant about matters of power, so a power vacuum forms and others seize the opportunity to fill it. But absolute centralization of power almost never occurs, due to the delegation necessary to wield that power in practice, and so these two forces end up balancing each other. The equilibrium point (or point of maximum entropy) ends up being some type of oligarchy. But anyone can take steps to address this and shift the equilibrium point; it just takes active work.

pydry 8 hours ago | parent | prev | next [-]

>we shouldn't go around simplistically implying that all social ills will be solved by neutering the billionaire class.

Not to put too fine a point on it but this was basically how the Japanese post war economic miracle was achieved.

In this case it was America which ordered the Japanese oligarchy to be stripped of its wealth.

We've had decades of propaganda telling us that this is the worst thing we could do for economic growth though so it's natural to doubt.

ndsipa_pomu 8 hours ago | parent | prev | next [-]

The problem with billionaires is that they are able to hoard so much money by exploiting others. We would be much better off if billionaires weren't given so much advantage by Capitalism as those resources would be much more useful if distributed.

The biggest problem we currently have with billionaires is that they are now so rich that the world becomes like a game to them and some of them are deliberately pushing us to a dystopia where non-billionaires become functional slaves (c.f. Amazon workers).

throwaway613746 7 hours ago | parent | prev [-]

[dead]

sigmoid10 11 hours ago | parent | prev | next [-]

Unfortunately, this is the only way to get enough venture capital to support the compute needs for this kind of technology. Who is going to spend hundreds of billions on a vague idea without regular claims that this will upend the existing economy in six to twelve months and whoever owns it will become unfathomably rich? And despite all the actual developments we have seen going against that idea, investors keep falling for it. This will continue until it crashes, one way or another. The question is how long it can build up and how deep the fall will be. LLMs will certainly change the economy in the end, but so did mortgage-backed securities.

pydry 10 hours ago | parent [-]

It's a sad indictment of our society that there is always a shortage of money for medical care, infrastructure, housing, food stamps and space exploration but always a surplus of cash for war and tools that purport to replace the workforce.

chongli 7 hours ago | parent | next [-]

There will always be a shortage of money for medical care. The dirty secret of socialized medicine is that a small percentage of the population are essentially unhappy utility monsters [1] who gain little or no benefit no matter how many resources are poured into treating them.

[1] https://en.wikipedia.org/wiki/Utility_monster

gmerc 10 hours ago | parent | prev | next [-]

The opportunity cost to society of performative model training is stunning - 400M for a grok training run to dominate the charts for 2 weeks

roenxi 10 hours ago | parent | prev | next [-]

> It's a sad indictment of our society that there is always a shortage of money for medical care...

It has nothing to do with society; there is infinite demand for medical care. The upper limit is whatever it takes to live until the universe's heat death in good health. That takes a lot of resources.

However much society spends on medical care, there is always more that could be spent. The modern era has the best, most affordable medical care in history and people are showing no signs of being satisfied at all.

While war spending generally just causes pain for no gain it doesn't change the fact that there will never be enough available to satisfy people's demand for medical care. Every single time people get what they want they just come up with a new aspirational minimum standard.

philwelch 10 hours ago | parent | prev | next [-]

There isn’t really a shortage of money for those things, just rampant levels of fraud, corruption, and incompetence in the government to make those things artificially expensive. California spends so much money on high speed rail and gets 0 feet of track because they’re not paying for track; the whole thing is a scam where the politicians give taxpayer money to their political supporters in exchange for political support. Defense isn’t immune to this either; Boeing, which builds a shitty heavy lift rocket out of Space Shuttle spare parts and delivers it late and over budget, pulls the exact same bullshit with their defense contracts, and there’s always some shitty Senator siding with them against the American people whenever anyone gets upset.

vixen99 7 hours ago | parent | prev | next [-]

The current British government should be a shining beacon for you! Its welfare bill actually outstrips national income by far. Britain's pathetic defense capabilities cannot even see off Russian warships that intimidate by deliberately hanging around British waters assessing our vital undersea cabling. The UK government has now asked France if it can help deter these ships. Tangentially - I should add that even with their massive expenditure on the National Health System (NHS) it's not enough and too many people feel that they have to go abroad to get life-saving operations and procedures. If they can afford it of course. But sure, that is another matter. As far as I can tell, there seems to be pretty much an apolitical consensus on both areas.

twsahjklf 6 hours ago | parent [-]

Curious how France manages to have enough resources to protect its own waters, help the UK protect theirs, AND have free universal healthcare...

block_dagger 10 hours ago | parent | prev [-]

War accelerated evolution; it’s why it exists.

bregma 10 hours ago | parent | next [-]

So did compassion, probably in a greater amount. And yet the greater amount of resources goes into war at the expense of compassion.

Humanity has taken control of its own evolution and no longer relies on natural selection as the driving force for change. Using evolution as an excuse to make bad and immoral choices is a poor argument and should be left back in the stone age.

pheaded_while9 6 hours ago | parent | next [-]

Yes, the social Darwinist approach inevitably leads to eugenical thinking and the human meat grinder that follows. We, as beings with the capacity to distinguish harmful from non-harmful behaviour, bear the consequences of harmful behaviour collectively: human suffering and the suppression of freedom.

djeastm 6 hours ago | parent | prev [-]

>Humanity has taken control of its own evolution

Has it taken full control of it or just partial control?

jacquesm 10 hours ago | parent | prev [-]

You have cause and effect mixed up.

bethekidyouwant 6 hours ago | parent | prev | next [-]

Were you around for the first release of GPT? It was not the CEOs that were kvetching about being paperclipped by AGI

hackrmn 10 hours ago | parent | prev | next [-]

I don't want to stir up the hornet's nest here, but in my humble opinion the entire problem rests on the unabated and unchecked modern, "late-stage" capitalism model, championed by the U.S. and since exported everywhere else, where it has taken firm root -- even in Europe, where it as of yet has a few more checks and balances (which unsurprisingly draws a lot of ire from its acolytes and priests across the Atlantic).

The Soviet Union lost due to an inferior societal model, but today's model, too, has strayed far from what once was a relatively sustainable path. The American dream is now a parody of itself, as it takes ever more just to keep up with the rest. I could go on about the irony of wanting to escape the pit while refusing to acknowledge that the pit is the 99% of the U.S. -- not the Altmans, Bezoses, Musks, or Trumps, or their hordes of peripheral elites.

Point being, the model doesn't work _today_ with its cancerous appetite and correspondingly absurd neglect of the human, _any_ human. We can't have humanism and the kind of AI we're about to "enjoy".

The acceleration of wealth disparity may prove to be nearly geometric, as the common man is further stripped of any capacity to effect change on the "system". I hope I am wrong, but for all their crimes -- anarchy and, in a twist of irony, inhumane treatment of opponents -- the October revolutionaries in Russia, yes, the Bolsheviks, were merely a natural response to a similar atmosphere in Russia at the turn of the previous century. It's just that they didn't have mass surveillance used against them to the extent our gadgets allow the "governments" of today, nor were they up against AI, which is _also_ something that can be used against an entire slice of the populace (a perfect application of general principles put into action). So although the situation may become similar, we're increasingly in no position to change it. The difference may be counted in _generations_: it will take multiple generations to dismantle the power structures we allow to be put in place now, with the Altmans etc. These people may not be evil, but history proves they only have to be short-sighted enough for evil to take root and thrive.

Sorry for the wall of text, but I do agree with the point of the blog post in a way -- demanding that people stay civilised and refrain from throwing eggs (or Molotovs) at celebrities who are about to swing _entire governments_ is not seeing the forest for the trees.

There's also no precedent, in a way -- the historical cataclysms we have created ourselves have been on a smaller scale, so we're spiraling outwards, and not all of the tools we think we have are going to produce the effect required to enact the change we want. In the worst case, of course.

fastforwardius 8 hours ago | parent [-]

Which part of the societal model did you find inferior? I thought it was mostly the economics and bureaucracy.

redsocksfan45 10 hours ago | parent | prev | next [-]

[dead]

threethirtytwo 10 hours ago | parent | prev | next [-]

No, it’s tarnished by becoming too popular. Just like how people hated Nickelback, if you remember.

7 hours ago | parent [-]
[deleted]
tsukurimashou 7 hours ago | parent | prev [-]

stealing and reusing the work of thousands of people as your own is magic now?

joquarky 5 hours ago | parent [-]

Do they not know how to make backups?