▲ | hosh 19 hours ago |
While the Internet and LLMs are both huge turning points (the metaphor that comes to mind is a phase change: solid to gas, gas to solid), there is a crucial difference between them. The early internet connected personal computing together; it built on technology that was already democratizing. LLMs appear to be democratizing, but they are not. The enshittification is proceeding much more rapidly, and no one wants to be left behind in the land grab.

Many of us remember the rise of the world wide web, and perhaps even the personal computing that made the internet mainstream. I am excited to hear about the Swiss models being trained in the open, though that effort is a step behind. I remember people arguing that fine-tuning would accelerate advances out in the open, and that large companies such as Google couldn't keep up with that. Perhaps.

I've been diving into history. The Industrial Revolution was a time of rapid progress, when engines accelerated access to cheaper fuels, which in turn enabled more powerful engines. We were able to afford abundance for a middle class, but we had enshittification then too. So while there is a _propensity_ for enshittification, I for one don't see it as inevitable, and neither do I think an AI future is inevitable.
▲ | Karrot_Kream 18 hours ago | parent | next [-]
For the internet to be democratizing it needed PCs first. Before that, computing was where LLMs are now: the mainframe era. You either had access to an institution with a mainframe, or you were lucky enough to get a thin client to a mainframe (the early time-sharing systems). Even after PCs were invented, mainframes were inarguably better for decades. Mainframes and thin clients were even some of the earliest computer networks.

I am optimistic that local models will catch up and hit the same Pareto-optimal point. At some point your OS will ship with a local model, your system will have access to some Intelligence APIs, and that's that. Linux and the BSDs will probably ship with an open-weights model. I may be wrong, but this is my hope.

If you're interested in a taste of that future, try the Gemma3 class of models. While I haven't tried agentic coding with them yet, I find them more than good enough for day-to-day use.
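If you want to poke at that future today, here's a minimal sketch of what running a local model looks like. It assumes an Ollama install with a pulled Gemma 3 model and the `ollama` Python client; those specifics are my assumptions, not anything the thread prescribes:

    # Minimal local chat with an open-weights model.
    # Assumes: `pip install ollama` and `ollama pull gemma3` have been run,
    # and the Ollama daemon is serving on its default local port.
    from ollama import chat

    # Everything below runs on your own machine: no API key, no cloud call.
    response = chat(
        model="gemma3",
        messages=[{"role": "user", "content": "Why did PCs overtake mainframes?"}],
    )
    print(response["message"]["content"])

Swapping in any other open-weights tag (say, a larger Gemma 3 variant) changes nothing else in the code, which is roughly the point: the model becomes just another local resource.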
| ||||||||||||||||||||||||||
▲ | TeMPOraL 19 hours ago | parent | prev [-]
> Many of us remember the rise of the world wide web, and perhaps even personal computing that made the internet mainstream.

I do. The web was the largest and most widespread enshittification process to date, and it started with the first sale made online and the first ad shown on a web page. That quickly escalated into a full-blown land grab in the late 90s, and then came dotcoms, smartphones, social media, SaaS, and IoT, and here we are today.

The "propensity for enshittification" is just called business, or entrepreneurship. It is orthogonal to AI. I think comparing the rise of LLMs to the web taking off is quite accurate, both with the good and the bad sides.