627467 3 days ago

> I was made redundant recently "due to AI" (questionable) and it feels like my works in some way contributed to my redundancy where my works contributed to the profits made by these AI megacorps while I am left a victim.

I think anyone here can understand and even share that feeling. And I agree with your "questionable" - it's just the lame HR excuse du jour.

My 2c:

- AI megacorps aren't the only ones gaining; we all are. The leverage you have to build and ship today is higher than it was five years ago.

- It feels like megacorps own the keys right now, but that's temporary. In a world of autonomous agents and open-weight models, control is decentralized. Inference costs continue to drop; you don't need to be running on megacorp stacks. Millions (billions?) of agents finding and sharing among themselves. How will megacorps stop that?

- I see the advent of LLMs like the spread of literacy. Scribes once held a monopoly on the written word, which felt like a "loss" to them when reading/writing became universal. But today, language belongs to everyone. We aren't losing code; we are making the ability to code a universal human "literacy."

latexr 2 days ago | parent | next [-]

> AI megacorps aren't the only ones gaining, we all are.

No, no we are not.

> the leverage you have to build and ship today is higher than it was five years ago.

I don’t want more “leverage to build and ship”, I want to live in a world where people aren’t so disconnected from reality and so lonely they have romantic relationships with a chat window; where they don’t turn off their brains and accept any wrong information because it comes from a machine; where propaganda, mass manipulation, and surveillance aren’t at the ready hands of any two-bit despot; where people aren’t so myopic that they only look at their own belly button and use case for a tool that they are incapable of recognising all the societal harms around them.

> We aren't losing code; we are making the ability to code a universal human "literacy."

No, no we are not. What we are doing, however, is making increasingly bad comparisons.

Literacy implies understanding. To be able to read and write, you need to understand how to do both. LLMs just spit out text which you don't need to understand at all, and increasingly people don't even care to try to understand it. LLM generated code in the hands of someone who doesn't read it is the opposite of literacy.

mulr00ney 2 days ago | parent | next [-]

>I don’t want more “leverage to build and ship”, I want to live in a world where people aren’t so disconnected from reality and so lonely they have romantic relationships with a chat window; where they don’t turn off their brains and accept any wrong information because it comes from a machine; where propaganda, mass manipulation, and surveillance aren’t at the ready hands of any two-bit despot; where people aren’t so myopic that they only look at their own belly button and use case for a tool that they are incapable of recognising all the societal harms around them.

Preach. Every time I read people doing this weird LARP on this website of "you have so much more leverage, great time to be a founder" I want to put my head through the drywall.

627467 2 days ago | parent | prev | next [-]

> literacy implies understanding

Agree. Do we not understand how LLMs work? Some of us understand better than others, just like literacy is also not guaranteed just because you learned the alphabet.

Accepting the output of an LLM is really materially not different from accepting books, newspapers, opinion makers, academics at face value. Maybe different only in speed of access?

> LLM generated code in the hands of someone who doesn’t read it is the opposite of literacy.

"A popsi article title or paper abstract/conclusion in the mind of someone who doesn't read is the opposite of literacy."

latexr 2 days ago | parent [-]

I’m not sure I understand your point. Mind clarifying? It seems you might be trying to contradict what I said but are in fact only adding to it.

> just like literacy is also not guaranteed just because you learned the alphabet.

I didn’t claim learning the alphabet equals literacy, you did. Your argument comes down to “you’re not literate if you’re not literate”. Which, yes, of course.

> Accepting the output of an LLM is really materially not different from (…)

Multiple things can be true at once. If someone says “angry stupid people with machine guns are dangerous”, responding “angry stupid people with explosives are dangerous” does nothing to the original point. The angry stupid people are part of the problem, sure, but so are the tools which enable them to be dangerous. If poison is being dumped in a river and slowly killing the ecosystem, and then someone else comes along wanting to dump even more of a different poison, the correct response is to stop both, not shrug it off and stop neither.

dzhiurgis 2 days ago | parent | prev | next [-]

> I am left a victim

> I want to live in a world where people aren’t so disconnected from reality

It looks like you are the problem, not the world. Hope you find happiness!

latexr a day ago | parent [-]

What the bloody heck are you on about? That first quote is completely fabricated. I’d also like to live in a world where people don’t argue in bad faith, but since I have no pretence that will happen, at least I’m thankful when bad faith actors do such a poor job of concealing it.

pegasus 2 days ago | parent | prev [-]

But LLMs can also explain code, in fact they're fantastic at that. They can also be used to build anti-censorship, surveillance-avoidance and fact-checking tools. We are all empowered by them, it's just up to us to employ them so as to nudge society towards where we'd like it to go. Instead of giving up prematurely.

simonask 2 days ago | parent [-]

[dead]

j_bum 3 days ago | parent | prev | next [-]

I’m not sure if the analogy is yours, but the scribe note really struck a chord with me.

I’m not a professionally trained SWE (I’m a scientist who does engineering work). LLMs have really accelerated my ability to build, ideate, and understand systems in a way that I could only loosely gain from sometimes grumpy but mostly kind senior engineers in overcrowded chat rooms.

The legality of all of this is dubious, though, per the parent. I GPL licensed my FOSS scientific software because I wanted it to help advance biomedical research. Not because I wanted it to help a big corp get rich.

But then again, maybe code like mine is what is holding these models back lol.

TeMPOraL 2 days ago | parent [-]

Sharing for advancing humanity / benefit of society, and megacorps getting rich off it, is not either-or. On the contrary, megacorps are in part how the benefit to society materializes. After all, it's megacorps that make and distribute the equipment and the software stacks I am using to write code on, that you are using to do your research on, etc.

I find the whole line of thinking, "I won't share my stuff because then a megacorp may use it without paying me the fractional picobuck I'm entitled to", to be a strong case of Dog in the Manger mindset. And I meant that even before LLMs exploded, back when people were wringing their hands about Elasticsearch being used by Amazon, back in 2021 or so.

Sharing is sharing. One can't say "oh I'm sharing this for anyone to benefit", and then upon seeing someone using it to make money, say "oh but not like that!!". Or rather, one can say, but then they're just lying about having shared the thing. "OSS but not for megacorps/aicorps" is just proprietary software. Which is perfectly fine thing to work on; what's not fine is lying about it being open.

lentil_soup 2 days ago | parent | next [-]

> "OSS but not for megacorps/aicorps" is just proprietary software

Why? It's not like it's binary. It could well be that it's open source but can't be used by a company over X size. I'm not a lawyer, but why couldn't a license have that clause? I would still class that as being open, for some definition of open.

3form 2 days ago | parent | prev | next [-]

LLMs are one thing, but when you bring ES in AWS example, as outlined in the article, the problem is not the software being used; it's being _made proprietary_. It's about free and open software remaining free and open. Especially to the end user.

graemep 2 days ago | parent | prev [-]

> On the contrary, megacorps are in part how the benefit to society materializes.

That would be true if they were the product of a genuine competitive market.

In fact their strength is in eliminating competition, erecting barriers to entry, manipulating regulation, and maintaining the status quo.

> "OSS but not for megacorps/aicorps"

Who is advocating that? People just want everyone to stick to the terms of the licences.

ori_b 3 days ago | parent | prev | next [-]

> We aren't losing code; we are making the ability to code a universal human "literacy."

The same way that doordash makes kitchen skills universal.

eru 3 days ago | parent [-]

You say it like it's a bad thing.

ori_b 2 days ago | parent | next [-]

I say that like it's a thing. LLMs have the goal of replacing intellectual work with passive consumption. People seem to like that.

ori_b 2 days ago | parent [-]

Basically, the selling point of LLMs is that you no longer need to think about problems, you can skip directly to results. Anything that you have to think about while using them today is somewhere on the product roadmap, or will be.

Many people think this is a form of utopia.

eru 2 days ago | parent [-]

Just like computer is no longer a job description, yes.

2 days ago | parent | next [-]
[deleted]
2 days ago | parent | prev [-]
[deleted]
latexr 2 days ago | parent | prev [-]

No, they are saying it like the comparison doesn’t hold. Which it doesn’t.

matheusmoreira 2 days ago | parent | prev | next [-]

> It feels like megacorps own the keys right now, but that’s a temporary.

Remains to be seen. Hardware prices are increasing. Manufacturers are abandoning the consumer sector to serve the all consuming AI demands. Not to mention the constant attempts to lock down the computers so that we don't own them.

What does the future hold for us? Unknown. It's not looking too good though. What good is hardware if we're priced out? What good are open models and free software if we're unable to run them?

627467 2 days ago | parent | next [-]

The trend I see is older hardware being able to run models that are increasingly miniaturized.

The real (but not new) danger is us giving in to the idea that we can't do it ourselves, or that we must use the megacorps' latest shiny toy in order to "succeed".

not_paid_by_yt 2 days ago | parent | prev [-]

welcome to late capitalism, please enjoy the ride while people try to tell you that LLMs are the only future (you have no future), even though SOTA models can barely do anything on their own consistently outside of carefully designed benchmarks, and have to be made available at a loss, otherwise no one would use them.

On your right you can see the CEOs justifying longer hours and lower pay because AI will replace your job one day anyway, then asking why you aren't 10x more productive with Claude. On your left you can see the AI companies deciding who will be in charge of the fascist regime once they no longer need workers, except for the coal mines. They reckon they can get 120 good years before the biosphere is uninhabitable, which worries them because what if the next LLM figures out immortality for them; maybe they will have to close the coal mines after all.

matheusmoreira 2 days ago | parent [-]

Can't say I disagree with you. I do recognize that we seem to be heading towards a technofeudalist cyberpunk dystopia. The only way out for humanity is to automate everything to the point we transcend capitalism into a post-scarcity society where the very concept of an economy has been abolished. If we can't do that, we'll become soylent.

psychoslave 2 days ago | parent | prev | next [-]

>But today, language belongs to everyone. We aren't losing code; we are making the ability to code a universal human "literacy."

Literacy requires training, though. It's not the same to be able to read a text aloud, to understand what the text is about, to have a critical-analysis toolbox for texts, and to have the habit of situating a text within a broader inferred context.

Just throwing LLMs into people's hands won't automatically make them able to use them in a relevant manner, as far as global social benefits are concerned.

The literacy issue is actually quite independent of whether the LLMs used are distributed or centralised.

wiseowise 3 days ago | parent | prev | next [-]

> We aren't losing code; we are making the ability to code a universal human "literacy."

LLMs making the ability to code a universal human “literacy” is like saying that Markov chains made the ability to write a universal human “literacy”.

TeMPOraL 3 days ago | parent [-]

Comparing LLMs to Markov chains was funny in 2023.

wiseowise 2 days ago | parent [-]

I’m not comparing LLMs to Markov chain, read again.

Coding through LLMs is like writing through Markov chains.

oblio 2 days ago | parent | prev | next [-]

Once creation is commoditized, controlling eyeballs is king. Look up aggregators: Apple, Facebook, Microsoft, Amazon, etc.

If anything, in Extremistan we're all useless. Platforms and whales are all that matters.

heyethan 2 days ago | parent | prev | next [-]

The literacy analogy makes sense in terms of access.

But the tools back then were cheap and local. Now most of the leverage sits behind large models and infra.

So more people can “write”, but not necessarily on their own terms.

627467 2 days ago | parent [-]

Cheap books took hundreds of years to become accessible. Already we have models that run on "legacy" hardware. Just like large-scale publishing never disappeared, large-scale models and infra also won't. But does that mean distributing simple pen and paper was pointless?

Underqualified 2 days ago | parent | prev | next [-]

This response sounds an awful lot like what ChatGPT would say ...

wolvesechoes 2 days ago | parent | prev | next [-]

> the leverage you have to build and ship today is higher than it was five years ago

Wake me up when you do.

2790276 3 days ago | parent | prev [-]

[dead]