| ▲ | saltysalt 3 days ago |
| Not sure the dial-up analogy fits; instead I tend to think we are in the mainframe period of AI, with large centralised computing models that are so big and expensive to host that only a few corporations can afford to do so. We rent a computing timeshare from them (tokens = punch cards). I look forward to the "personal computing" period, with small models distributed everywhere...
|
| ▲ | chemotaxis 3 days ago | parent | next [-] |
| > I look forward to the "personal computing" period, with small models distributed everywhere... One could argue that this period was just a brief fluke. Personal computers really took off only in the 1990s, web 2.0 happened in the mid-2000s. Now, for the average person, 95%+ of screen time boils down to using the computer as a dumb terminal to access centralized services "in the cloud". |
| |
▲ | wolpoli 3 days ago | parent | next [-] | | The personal computing era happened partly because, while there was demand for computing, users' connectivity to the internet was poor or limited, so they couldn't just connect to the mainframe. We now have high speed internet access everywhere - I don't know what would drive the equivalent of the era of personal computing this time. | | |
▲ | ruszki 2 days ago | parent | next [-] | | > We now have high speed internet access everywhere As I travel a ton, I can confidently tell you that this is still not true at all, and I'm kinda disappointed that the general rule of optimizing for bad reception died. | | |
| ▲ | bartread 2 days ago | parent | next [-] | | > the general rule of optimizing for bad reception died. Yep, and people will look at you like you have two heads when you suggest that perhaps we should take this into account, because it adds both cost and complexity. But I am sick to the gills of using software - be that on my laptop or my phone - that craps out constantly when I'm on the train, or in one of the many mobile reception black spots in the areas where I live and work, or because my rural broadband has decided to temporarily give up, because the software wasn't built with unreliable connections in mind. It's not that bleeding difficult to build an app that stores state locally and can sync with a remote service when connectivity is restored, but companies don't want to make the effort because it's perceived to be a niche issue that only affects a small number of people a small proportion of the time and therefore not worth the extra effort and complexity. Whereas I'd argue that it affects a decent proportion of people on at least a semi-regular basis so is probably worth the investment. | | |
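The "store state locally, sync when connectivity returns" approach mentioned above really is a modest amount of code. A minimal sketch in TypeScript for a browser app, assuming a placeholder /api/sync endpoint and mutation shape (nothing here is any particular product's real API):

```typescript
// Minimal outbox pattern: every change is written to local storage first,
// then flushed to a (placeholder) /api/sync endpoint whenever we're online.
type Mutation = { id: string; kind: string; payload: unknown; ts: number };

const OUTBOX_KEY = "outbox";

function readOutbox(): Mutation[] {
  return JSON.parse(localStorage.getItem(OUTBOX_KEY) ?? "[]");
}

function queueMutation(m: Mutation): void {
  const outbox = readOutbox();
  outbox.push(m);
  localStorage.setItem(OUTBOX_KEY, JSON.stringify(outbox)); // survives offline periods and restarts
}

async function flushOutbox(): Promise<void> {
  const outbox = readOutbox();
  if (outbox.length === 0) return;
  try {
    const res = await fetch("/api/sync", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(outbox),
    });
    if (res.ok) localStorage.setItem(OUTBOX_KEY, "[]"); // clear only once the server confirms
  } catch {
    // No connectivity (train, black spot, flaky rural broadband): keep the queue and retry later.
  }
}

// Flush whenever connectivity comes back, plus a periodic fallback.
window.addEventListener("online", () => void flushOutbox());
setInterval(() => void flushOutbox(), 30_000);
```

A production app would want conflict resolution and sturdier storage (IndexedDB rather than localStorage), but the core of "works on a train" is just this: queue locally, sync opportunistically.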
| ▲ | asa400 2 days ago | parent | next [-] | | We ignore the fallacies of distributed computing at our peril: https://en.wikipedia.org/wiki/Fallacies_of_distributed_compu... | |
▲ | visarga 2 days ago | parent | prev | next [-] | | It's always a small crisis figuring out which app/book to install on my phone to get 5-8 hours of reading on a plane. I found one - Newsify - combined with YT caching. | |
▲ | donkeybeer 2 days ago | parent | prev | next [-] | | Usually it reduces rather than adds complexity. Simpler pages without a hundred different JS frameworks are faster. | |
▲ | LogicFailsMe 2 days ago | parent | prev [-] | | Moving services to the cloud unfortunately relieves a lot of the complexity of software development with respect to the menagerie of possible hardware environments. It of course leads to a crappy user experience if they don't optimize for low bandwidth, but they don't seem to care about that. Have you ever checked out how useless your algorithmic Facebook feed is now? Tons of bandwidth, very little information. It seems like their measure is that time on their website equals money in their pocket, and baffling you with BS is a great way to achieve that - until you never visit again in disgust and frustration. | | |
| ▲ | wtallis 2 days ago | parent [-] | | I don't think the "menagerie of possible hardware environments" excuse holds much water these days. Even web apps still need to accommodate various screen sizes and resolutions and touch vs mouse input. Native apps need to deal with the variety in software environments (not to say that web apps are entirely insulated from this), across several mobile and desktop operating systems. In the face of that complexity, having to compile for both x86-64 and arm64 is at most a minor nuisance. | | |
▲ | bartread 4 hours ago | parent | next [-] | | I don't know that it ever held that much water. I used to work for a company building desktop tools that were distributed to anywhere from tens of thousands of users on the low end to hundreds of thousands on the high end, depending on the tool. We had one tool that was nominally used by about a million people but, in actuality, the real number of active users each month was more like 300k. I was at the company for 10 years and, on the tools I worked on, I can only remember one issue that we could not reproduce or figure out. There may have been others for other tools/teams, but the number would have been tiny because these things always got talked about. In my case the guy with the issue - who'd been super-frustrated by it for a year or more - came up to our stand when we were at a conference in the US, introduced himself, and showed me the problem he was having. He then lent me his laptop overnight[0], and I ended up installing Wireshark to see why he was experiencing massive latency on every keystroke, and what might be going on with his network shares. In the end we managed to apply a fix to our code that sidestepped the issue for users with his situation (to this day, he's been the only person - as far as I'm aware - to report this specific problem). Our tools all ran on Windows, but obviously there were multiple extant versions of both the desktop and server OS that they were run on, different versions of the .NET runtime, at the time everyone had different AV, plus whatever other applications, services, and drivers they might have running. I won't say it was a picnic - we had a support/customer success team, after all - but the vast majority of problems weren't a function of software/OS configuration. These kinds of issues did come up, and they were a pain in the ass, but except in very rare cases - as I've described here - we were always able to find a fix or workaround. Nowadays, with much better screensharing and remote control options, it would be way easier to deal with these sorts of problems than it was 15 - 20 years ago. [0] Can't imagine too many organisations being happy with that in 2025. | |
| ▲ | LogicFailsMe 2 days ago | parent | prev [-] | | Have you ever distributed an app on the PC to more than a million people? It might change your view. Browser issues are a different argument and I agree with you 100% there. I really wish people would pull back and hold everyone to consistent standards but they won't. |
|
|
| |
▲ | ChadNauseam 2 days ago | parent | prev | next [-] | | I work on a local-first app for fun, and someone told me I was simply creating problems for myself and could just be using a server. But I'm in the same boat as you. I regularly don't have good internet, and I'm always surprised when people act like an internet connection is a safe assumption. Every day I go up and down in an elevator where I have no internet, I travel regularly, I go to concerts and music festivals, and so on. | |
| ▲ | sampullman 2 days ago | parent | prev | next [-] | | I don't even travel that much, and still have trouble. Tethering at the local library or coffee shops is hit or miss, everything slows down during storms, etc. | | |
| ▲ | BoxOfRain 2 days ago | parent [-] | | > everything slows down during storms One problem I've found in my current house is that the connection becomes flakier in heavy rain, presumably due to poor connections between the cabinet and houses. I live in Cardiff which for those unaware is one of Britain's rainiest cities. Fun times. |
| |
| ▲ | BoxOfRain 2 days ago | parent | prev | next [-] | | Yeah British trains are often absolutely awful for this, I started putting music on my phone locally to deal with the abysmal coverage. | |
| ▲ | mlrtime 2 days ago | parent | prev [-] | | Not true because of cost or access? If you consider starlink high speed, it truly is available everywhere. | | |
▲ | ruszki 2 days ago | parent | next [-] | | Access. You cannot use Starlink on a train, on a flight, inside buildings, etc. Starlink is also not available everywhere: https://starlink.com/map. Also, it's not feasible to bring that with me a lot of the time, for example on my backpack trips; it's simply too large. | |
▲ | virgilp 2 days ago | parent | prev [-] | | Because of many reasons. It's not practical to have a Starlink antenna with you everywhere. And then yes, cost is a significant factor too - even in the dial-up era, satellite internet was a thing that existed "everywhere", in theory.... |
|
| |
▲ | threetonesun 2 days ago | parent | prev | next [-] | | Privacy. I absolutely will not ever open my personal files to an LLM over the web, and even with my mid-tier M4 MacBook I'm close to a point where I don't have to. I wonder how much the cat is out of the bag for private companies in this regard. I don't believe the AI companies founded on stealing IP have stopped. | | |
| ▲ | AlecSchueler 2 days ago | parent [-] | | Privacy is a niche concern sadly. | | |
| ▲ | jimbokun 2 days ago | parent [-] | | I believe Apple has made a significant number of iPhone sales due to a perception of better privacy than Android. | | |
| ▲ | AlecSchueler a day ago | parent | next [-] | | I believe you could be in a bubble. | |
▲ | kakacik a day ago | parent | prev [-] | | Not a single person I know with any Apple device would claim that; nobody cares or even knows in detail about the stuff we discuss here. It's the HN bubble at its best. Another point is that, subjectively, the added privacy compared to, say, South Korean products is mostly a myth. It 100% doesn't apply if you are not a US citizen, and even then, crossing your fingers and hoping that the 3-letter agencies and the device maker aren't over-analyzing every single data point about you continuously is naive. What may be better is that the devices are harder to steal and take ownership of, but for that I would need to see some serious independent comparison, not paid PR, to which HN is not completely immune. |
|
|
| |
| ▲ | Razengan 2 days ago | parent | prev | next [-] | | > I don't know what would drive the equivalent of the era of personal computing this time. Space. You don't want to wait 3-22 minutes for a ping from Mars. | | |
| ▲ | AlecSchueler 2 days ago | parent [-] | | I'm not sure if the handful of people in space stations are a big enough market to drive such changes. |
| |
▲ | almostnormal 2 days ago | parent | prev | next [-] | | Centralized only became mainstream when everything started to be offered "for free". When the choice was between buying outright or paying recurrently, more often the choice was to buy. | | |
| ▲ | troupo 2 days ago | parent | next [-] | | There are no longer options to buy. Everything is a subscription | | |
▲ | rightbyte 2 days ago | parent [-] | | Between mobile phone service (including SMS) and an ISP service (which usually includes mail) I don't see the need for any hosted service. There are FOSS alternatives for just about everything for hobbyist and consumer use. | |
| ▲ | api 2 days ago | parent [-] | | There are no FOSS alternatives for consumer use unless the consumer is an IT pro or a developer. Regular people can’t use most open source software without help. Some of it, like Linux desktop stuff, has a nice enough UI that they can use it casually but they can’t install or configure or fix it. Making software that is polished and reliable and automatic enough that non computer people can use it is a lot harder than just making software. I’d say it’s usually many times harder. | | |
▲ | rightbyte 2 days ago | parent [-] | | I don't think that is a software issue but a social issue nowadays. FOSS alternatives have become quite OK in my opinion. If computers came with Debian, Firefox and LibreOffice preinstalled instead of only W11, Edge and some Office 365 trial, the relative difficulty would be gone I think. Same thing with most IT departments only dealing with Windows in professional settings. If you are even allowed to use something different, you are on your own. |
|
|
| |
▲ | torginus 2 days ago | parent | prev [-] | | I think people have seen enough of this 'free' business model to know that the things being sold as free are, in fact, not. | |
▲ | mlrtime 2 days ago | parent [-] | | Some people, but the majority see it as free. Go to your local town center and randomly poll people on how much they pay for email or Google search; 99% will say it is free and stop there. |
|
| |
▲ | unethical_ban 2 days ago | parent | prev | next [-] | | Privacy, reliable access when not connected to the web, the principle of decentralization for some. Less supply chain risk for private enterprise. | |
▲ | netdevphoenix 2 days ago | parent | prev [-] | | > We now have high speed internet access everywhere This is such an HN comment, illustrating how little your average HN user knows of the world beyond their tech bubble. Internet everywhere - you might have something of a point. But "high speed internet access everywhere" sounds like "I haven't travelled much in my life". |
| |
| ▲ | jayd16 3 days ago | parent | prev | next [-] | | I don't know, I think you're conflating content streaming with central compute. Also, is percentage of screentime the relevant metric? We moved TV consumption to the PC, does that take away from PCs? Many apps moved to the web but that's basically just streamed code to be run in a local VM. Is that a dumb terminal? It's not exactly local compute independent... | | |
▲ | kamaal 3 days ago | parent | next [-] | | Nah, your parent comment has a valid point. Nearly the entirety of the use cases of computers today don't involve running things on a 'personal computer' in any way. In fact these days, everyone kind of agrees that even something as small as hosting a spreadsheet on your own computer is a bad idea. The cloud, where everything is backed up, is the way to go. | |
| ▲ | jayd16 3 days ago | parent [-] | | But again, that's conflating web connected or even web required with mainframe compute and it's just not the same. PC was never 'no web'. No one actually 'counted every screw in their garage' as the PC killer app. It was always the web. | | |
| ▲ | morsch 2 days ago | parent | next [-] | | One of the actual killer apps was gaming. Which still "happens" mostly on the client, today, even for networked games. | | |
| ▲ | jhanschoo 2 days ago | parent [-] | | Yet the most popular games are online-only and even more have their installation base's copies of the game managed by an online-first DRM. | | |
▲ | morsch 2 days ago | parent | next [-] | | That's true, but beside the point: even online-only games or those gated by online DRM are not streamed, nor do they resemble a thin-client architecture. That exists, too, with GeForce Now etc., which is why I said mostly. | |
| ▲ | jayd16 2 days ago | parent | prev [-] | | This is just factually inaccurate. | | |
| ▲ | jhanschoo a day ago | parent [-] | | Please provide a more comprehensive response. I suppose I should be more specific as well. Some of the online only games I am thinking of are CoD, Fortnite, LoL and Minecraft. The online-first DRM I am thinking of is Steam. |
|
|
| |
| ▲ | eru 3 days ago | parent | prev | next [-] | | You know that the personal computer predates the web by quite a few years? | | |
▲ | jayd16 2 days ago | parent | next [-] | | Sure, I was too hyperbolic. I simply meant connecting to the web didn't make it not a PC. The web really pushed adoption, much more than the personal computation machine itself did. It was the main use case for most folks. | |
▲ | rambambram 2 days ago | parent | prev [-] | | This. Although brief, there were at least a couple of years of using PCs without an internet connection. It's unthinkable now. And even back then, if you blinked, this time period was over. | |
| ▲ | eru 2 days ago | parent [-] | | That was a pretty long blink? The personal computer arguably begins with VisiCalc in 1979. > Through the 1970s, personal computers had proven popular with electronics enthusiasts and hobbyists, however it was unclear why the general public might want to own one. This perception changed in 1979 with the release of VisiCalc from VisiCorp (originally Personal Software), which was the first spreadsheet application. https://en.wikipedia.org/wiki/History_of_personal_computers#... Mainstream use of the web really took off in the second half of the 1990s. Arbitrarily, let's say with the release of Windows 95. That's a quarter of a century you'd be blinking for. |
|
| |
▲ | kamaal 3 days ago | parent | prev | next [-] | | In time, the mainframes of this age will make a comeback. This whole idea that you can connect lots of cheap, low-capacity boxes and drive down compute costs is already going away. In time people will go back to thinking of compute as a function of the time taken to finish processing. That's the paradigm in the cloud compute world - you are billed for the TIME you use the box. Eventually people will just want to use something bigger that gets things done faster, hence you don't have to rent it for long. | |
▲ | galaxyLogic 2 days ago | parent [-] | | It's also interesting that computing capacity is no longer discussed in instructions per second, but in gigawatts. |
| |
| ▲ | bandrami 2 days ago | parent | prev [-] | | Umm... I had a PC a decade before the web was invented, and I didn't even use the web for like another 5 years after it went public ("it's an interesting bit of tech but it will obviously never replace gopher...") The killer apps in the 80s were spreadsheets and desktop publishing. |
|
| |
| ▲ | eru 3 days ago | parent | prev [-] | | > I don't know, I think you're conflating content streaming with central compute. Would you classify eg gmail as 'content streaming'? | | |
| ▲ | mikepurvis 3 days ago | parent | next [-] | | But gmail is also a relatively complicated app, much of which runs locally on the client device. | | |
▲ | MobiusHorizons 2 days ago | parent | next [-] | | It is true that browsers do much more computation than "dumb" terminals, but there are still non-trivial parallels. Terminals do contain a processor and memory in order to handle settings menus, handle keyboard input and convert incoming sequences into a character array that is then displayed on the screen. A terminal is mostly useless without something attached to the other side, but not _completely_ useless. You can browse the menus, enable local echo, and use the device as something like a scratchpad. I once drew up a schematic as ASCII art this way. The contents are ephemeral and you have to take a photo of the screen or something in order to retain the data. Web browsers aren't quite that useless with no internet connection - some sites do offer offline capabilities (for example Gmail) - but even then, the vast majority of offline experiences exist to tide the user over until the network can be re-established, instead of truly offering something useful to do locally. Probably the only mainstream counter-examples would be games. | |
| ▲ | WalterSear 2 days ago | parent | prev [-] | | It's still a SAAS, with components that couldn't be replicated client-side, such as AI. | | |
▲ | galaxyLogic 2 days ago | parent | next [-] | | Right. But does it matter whether computation happens on the client or the server? Probably on both in the end. But yes, I am looking forward to having my own LLM on my PC which only I have access to. | |
▲ | fragmede 2 days ago | parent | prev [-] | | Google's own Gemma models are runnable locally on a Pixel 9 Max, so some level of AI is replicable client-side. As far as Gmail running locally, it wouldn't be impossible for Gmail to be locally hosted and hit a local cache which syncs with a server only periodically over IMAP/JMAP/whatever, if Google actually wanted to do it. | |
| ▲ | eru a day ago | parent [-] | | Yes, but seems like a lot of hassle for not much gain (for Google). | | |
| ▲ | fragmede a day ago | parent [-] | | The gain, as far as local AI goes for Google, is that, at Google scale, the CPU/GPU time to run even a small model like Gemma will add up across Gmail's millions of users. If clients have the hardware for it (which Pixel 9's have) it means Gmail's servers aren't burning CPU/GPU time on it. As far as how Gmail's existing offline mode works, I don't know. |
|
|
|
| |
| ▲ | jayd16 2 days ago | parent | prev [-] | | Well, app code is streamed, content is streamed. The app code is run locally. Content is pulled periodically. The mail server is the mail server even for Outlook. Outlook gives you a way to look through email offline. Gmail apps and even Gmail in Chrome have an offline mode that let you look through email. It's not easy to call it fully offline, nor a dumb terminal. | | |
| ▲ | eru 2 days ago | parent [-] | | Oh, GMail is definitely a cloud offering---even if they have some offline functionality. I was just probing the 'content _streaming_' term. As you demonstrate, you'd have to squint really hard to describe GMail as content streaming. 'Offline' vs 'content streaming' is a false dichotomy. There's more different types of products and services. (Which reminds me a bit of crypto-folks calling everything software that's not in crypto "web2", as if working on stodgy backends in a bank or making Nintendo Switch games has anything to do with the web at all.) | | |
| ▲ | jayd16 a day ago | parent [-] | | Ok sure but there's plenty of simple video streaming in total screen time, which was the context I was replying to. I never claimed it was a dichotomy, simply a large part of screen time that clearly skews the analysis. | | |
| ▲ | eru a day ago | parent [-] | | Yes, it's a large part of screen time at the moment. |
|
|
|
|
| |
| ▲ | JumpCrisscross 3 days ago | parent | prev | next [-] | | > using the computer as a dumb terminal to access centralized services "in the cloud" Our personal devices are far from thin clients. | | |
| ▲ | freedomben 3 days ago | parent | next [-] | | Depends on the app, and the personal device. Mobile devices are increasingly thin clients. Of course hardware-wise they are fully capable personal computers, but ridiculous software-imposed limitations make that increasingly difficult. | |
| ▲ | immutology 3 days ago | parent | prev | next [-] | | "Thin" can be interpreted as relative, no? I think it depends on if you see the browser for content or as a runtime environment. Maybe it depends on the application architecture...? I.e., a compute-heavy WASM SPA at one end vs a server-rendered website. Or is it an objective measure? | |
| ▲ | Cheer2171 3 days ago | parent | prev | next [-] | | But that is what they are mostly used for. | | |
| ▲ | TheOtherHobbes 3 days ago | parent [-] | | On phones, most of the compute is used to render media files and games, and make pretty animated UIs. The text content of a weather app is trivial compared to the UI. Same with many web pages. Desktop apps use local compute, but that's more a limitation of latency and network bandwidth than any fundamental need to keep things local. Security and privacy also matter to some people. But not to most. |
| |
| ▲ | bigyabai 3 days ago | parent | prev | next [-] | | Speak for yourself. Many people don't daily-drive anything more advanced than an iPad. | | |
▲ | eru 3 days ago | parent | next [-] | | iPads are incredibly advanced. Though I guess you mean they don't use anything that requires more sophistication from the user (or something like that)? | |
| ▲ | boomlinde 2 days ago | parent | prev [-] | | The Ipad is not a thin client, is it? | | |
▲ | troupo 2 days ago | parent [-] | | It is, for the vast majority of users. Turn off the internet on the iPad and see how many of the apps people use still work. | |
| ▲ | boomlinde 2 days ago | parent | next [-] | | I'm not questioning whether the Ipad can be used as a client in some capacity, or whether people tend to use it as a client. I question whether the Ipad is a thin client. The answer to that question doesn't lie in how many applications require an internet connection, but in how many applications require local computational resources. The Ipad is a high performance computer, not just because Apple think that's fun, but out of necessity given its ambition: the applications people use on it require local storage and rather heavy local computation. The web browser standards if nothing else have pretty much guaranteed that the age of thin clients is over: a client needs to supply a significant amount of computational resources and storage to use the web generally. Not even Chromebooks will practically be anything less than rich clients. Going back to the original topic (and source of the analogy), IOS hosts an on-device large language model. | | |
| ▲ | troupo 2 days ago | parent [-] | | As with everything, the lines are a bit blurred these days. We may need a new term for these devices. But despite all the compute and storage and on-device models these supercomputers are barely a step above thin clients. |
| |
▲ | mlrtime 2 days ago | parent | prev [-] | | No, it's a poor analogy. I'm old enough to have used a Wyse terminal. That's what I think of when I hear dumb terminal. It was dumb. Maybe a PC without a hard drive (PXE the OS), but if it has storage and can install software, it's not dumb. | |
|
|
| |
| ▲ | bandrami 2 days ago | parent | prev [-] | | I mean, Chromebooks really aren't very far at all from thin clients. But even my monster ROG laptop when it's not gaming is mostly displaying the results of computation that happened elsewhere |
| |
▲ | api 2 days ago | parent | prev | next [-] | | There are more PCs and serious home computing setups today than there were back then. There are just way way way more casual computer users. The people who only use phones and tablets or only use laptops as dumb terminals are not the people who were buying PCs in the 1980s and 1990s, or if they were, they were not serious users. They were mostly non-computer-users. Non-computer-users have become casual, consumer-level computer users because the tech went mainstream, but there's still a massive serious computer user market. I know many people with home labs or even small cloud installations in their basements, but there are about as many of them as serious PC users with top-end PC setups in the late 1980s. | |
▲ | torginus 2 days ago | parent | prev | next [-] | | I dislike the view of individuals as passive sufferers of the preferences of big corporations. You can - and people do - self-host stuff that big tech wants pushed into the cloud. You can have a NAS, a private media player; Home Assistant has been making waves in the home automation sphere. Turns out people don't like buying overpriced devices only to have to pay a $20 subscription, find out their devices don't talk to each other, upload footage from inside their homes to the cloud, and then get bricked once the company selling them goes under and turns off the servers. | |
| ▲ | rambambram 2 days ago | parent | next [-] | | This. And the hordes of people reacting with some explanation for why this is. The 'why' is not the point, we already know the 'why'. The point is that you can if you want. Might not be easy, might not be convenient, but that's not the point. No one has to ask someone else for permission to use other tech than big tech. The explanation of 'why' is not an argument. Big tech is not making it easy != it's impossible. Passive sufferers indeed. Edit: got a website with an RSS feed somewhere maybe? I would like to follow more people with a point of view like yours. | |
▲ | __alexs 2 days ago | parent | prev | next [-] | | You can dislike it, but that doesn't make it any less true - and it's only getting truer. | |
| ▲ | jhanschoo 2 days ago | parent | prev | next [-] | | You can likewise host models if you so choose. Still the vast majority of people use online services both for personal computing or for LLMs. | |
| ▲ | api 2 days ago | parent | prev [-] | | Things are moving this way because it’s convenient and easy and most people today are time poor. | | |
▲ | torginus 2 days ago | parent [-] | | I think it has more to do with the 'common wisdom' dictating that this is the way to do it, as 'we've always done it like this'. Which might even be true, since cloud-based software might offer conveniences that local substitutes don't. However, this is not an inherent property of cloud software; it's just that some effort needs to go into a local alternative. That's why I mentioned Home Assistant - a couple years ago, smart home stuff was all the rage, and not only was it expensive, the backend ran in the cloud, and you usually paid a subscription for it. Nowadays, you can buy a local Home Assistant hub (or make one using a Pi) and have all your stuff only connect to a local server. The same is true for routers, NAS, media sharing and streaming to TV etc. You do need to get a bit technical, but you don't need to do anything you couldn't figure out by following a 20-minute YouTube video. |
|
| |
▲ | MSFT_Edging 2 days ago | parent | prev | next [-] | | I look forward to a possibility where the dumb terminal is less centralized in the cloud, and works more how it seems to work in The Expanse. They all have hand terminals that seem to automatically interact with the systems and networks of the ship/station/building they're in, linking up with local resources, and likely having default permissions set to restrict weird behavior. Not sure it could really work like that IRL, but I haven't put a ton of thought into it. It'd make our always-online devices make a little more sense. | |
| ▲ | npilk 2 days ago | parent | prev | next [-] | | But for a broader definition of "personal computer", the number of computers we have has only continued to skyrocket - phones, watches, cars, TVs, smart speakers, toaster ovens, kids' toys... I'm with GP - I imagine a future when capable AI models become small and cheap enough to run locally in all kinds of contexts. https://notes.npilk.com/ten-thousand-agents | | |
| ▲ | seniorThrowaway 2 days ago | parent [-] | | Depending on how you are defining AI models, they already do. Think of the $15 security camera that can detect people and objects. That is AI model driven. LLM's are another story, but smaller, less effective ones can and do already run at the edge. |
| |
| ▲ | seemaze 3 days ago | parent | prev | next [-] | | I think that speaks more to the fact that software ate the world, than locality of compute. It's a breadth first, depth last game. | |
| ▲ | positron26 3 days ago | parent | prev | next [-] | | Makes me want to unplug and go back to offline social media. That's a joke. The dominant effect was networked applications getting developed, enabling community, not a shift back to client terminals. | | |
▲ | grumbel 2 days ago | parent [-] | | Once upon a time social media was called Usenet and worked offline in a dedicated client with a standard protocol. You only went online to download and send messages, but could then go offline and read them in an app of your choice. Web2.0 discarded the protocol approach and turned your computer into a thin client that does little more than render webapps that require you to be permanently online. | |
| ▲ | cesarb 2 days ago | parent | next [-] | | > Once up on a time social media was called Usenet and worked offline in a dedicated client with a standard protocol. There was also FidoNet with offline message readers. | |
| ▲ | positron26 2 days ago | parent | prev [-] | | > called Usenet and worked offline People must have been pretty smart back then. They had to know to hang up the phone to check for new messages. |
|
| |
▲ | WhyOhWhyQ 2 days ago | parent | prev | next [-] | | I guess we're in the KIM-1 era of local models, or is that already done? | |
▲ | pksebben 3 days ago | parent | prev | next [-] | | That 'average' is doing a lot of work to obfuscate the landscape. Open source continues to grow (indicating a robust ecosystem of individuals who use their computers for local work) and, more importantly, the 'average' looks the way it does not necessarily due to a reduction in local use, but due to an explosion of users that did not previously exist (mobile-first, SaaS customers, etc.) The thing we do need to be careful about is regulatory capture. We could very well end up with nothing but monolithic centralized systems simply because it's made illegal to distribute, use, and share open models. They hinted quite strongly that they wanted to do this with DeepSeek. There may even be a case to be made that at some point in the future, small local models will outperform monoliths - if distributed training becomes cheap enough, or if we find an alternative to backprop that allows models to learn as they infer (like a more developed forward-forward or something like it), we may see models that do better simply because they aren't a large centralized organism behind a walled garden. I'll grant that this is a fairly Pollyanna take and represents the best possible outcome, but it's not outlandishly fantastic - and there is good reason to believe that any system based on a robust decentralized architecture would be more resilient to problems like platform enshittification and overdeveloped censorship. At the end of the day, it's not important what the 'average' user is doing, so long as there are enough non-average users pushing the ball forward on the important stuff. | |
| ▲ | TheOtherHobbes 3 days ago | parent | next [-] | | We already have monolithic centralised systems. Most open source development happens on GitHub. You'd think non-average developers would have noticed their code is now hosted by Microsoft, not the FSF. But perhaps not. The AI end game is likely some kind of post-Cambrian, post-capitalist soup of evolving distributed compute. But at the moment there's no conceivable way for local and/or distributed systems to have better performance and more intelligence. Local computing has latency, bandwidth, and speed/memory limits, and general distributed computing isn't even a thing. | |
| ▲ | idiotsecant 3 days ago | parent | prev [-] | | I can't imagine a universe where a small mind with limited computing resources has an advantage against a datacenter mind, no matter the architecture. | | |
| ▲ | bee_rider 3 days ago | parent | next [-] | | The small mind could have an advantage if it is closer or more trustworthy to users. It only has to be good enough to do what we want. In the extreme, maybe inference becomes cheap enough that we ask “why do I have to wake up the laptop’s antenna?” | | |
▲ | galaxyLogic 2 days ago | parent [-] | | I would like to have a personal AI agent which basically has a copy of my knowledge, a reflection of me, so it could help me multiply my mind. |
| |
| ▲ | heavyset_go 3 days ago | parent | prev | next [-] | | I don't want to send sensitive information to a data center, I don't want it to leave my machine/network/what have you. Local models can help in that department. You could say the same about all self-hosted software, teams with billions of dollars to produce and host SaaS will always have an advantage over smaller, local operations. | |
| ▲ | pksebben 2 days ago | parent | prev | next [-] | | The advantage it might have won't be in the form of "more power", it would be in the form of "not burdened by sponsored content / training or censorship of any kind, and focused on the use-cases most relevant to the individual end user." We're already very, very close to "smart enough for most stuff". We just need that to also be "tuned for our specific wants and needs". | |
| ▲ | hakfoo 2 days ago | parent | prev | next [-] | | Abundant resources could enable bad designs. I could in particular see a lot of commercial drive for huge models that can solve a bazillion different use cases, but aren't efficient for any of them. There might be also local/global bias strategies. A tiny local model trained on your specific code/document base may be better aligned to match your specific needs than a galaxy scale model. If it only knows about one "User" class, the one in your codebase, it might be less prone to borrowing irrelevant ideas from fifty other systems. | |
| ▲ | gizajob 3 days ago | parent | prev | next [-] | | The only difference is latency. | |
| ▲ | bigfatkitten 3 days ago | parent | prev [-] | | Universes like ours where the datacentre mind is completely untrustworthy. |
|
| |
| ▲ | btown 3 days ago | parent | prev [-] | | Even the most popular games (with few exceptions) present as relatively dumb terminals that need constant connectivity to sync every activity to a mainframe - not necessarily because it's an MMO or multiplayer game, but because it's the industry standard way to ensure fairness. And by fairness, of course, I mean the optimization of enforcing "grindiness" as a mechanism to sell lootboxes and premium subscriptions. And AI just further normalizes the need for connectivity; cloud models are likely to improve faster than local models, for both technical and business reasons. They've got the premium-subscriptions model down. I shudder to think what happens when OpenAI begins hiring/subsuming-the-knowledge-of "revenue optimization analysts" from the AAA gaming world as a way to boost revenue. But hey, at least you still need humans, at some level, if your paperclip optimizer is told to find ways to get humans to spend money on "a sense of pride and accomplishment." [0] We do not live in a utopia. [0] https://www.guinnessworldrecords.com/world-records/503152-mo... - https://www.reddit.com/r/StarWarsBattlefront/comments/7cff0b... | | |
| ▲ | throw23920 2 days ago | parent [-] | | I imagine there are plenty of indie single-player games that work just fine offline. You lose cloud saves and achievements, but everything else still works. |
|
|
|
| ▲ | 8ytecoder 3 days ago | parent | prev | next [-] |
Funny you would pick this analogy. I feel like we're back in the mainframe era. A lot of software can't operate without an internet connection. Even if in practice it executes some of the code on your device, a lot of the data and the heavyweight processing is already happening on the server. Even basic services designed from the ground up to be distributed and local-first, like email ("downloading"), are used in this fashion - like Gmail. Maps apps added offline support years after they launched and still cripple the search. Even git has GitHub sitting in the middle, and most people don't or can't use git any other way. SaaS, Electron, etc. have brought us back to the mainframe era.
| |
| ▲ | thewebguyd 3 days ago | parent | next [-] | | It's always struck me as living in some sort of bizaro world. We now have these super powerful personal computers, both handheld (phones) and laptops (My M4 Pro smokes even some desktop class processors) and yet I use all this powerful compute hardware to...be a dumb terminal to someone else's computer. I had always hoped we'd do more locally on-device (and with native apps, not running 100 instances of chromium for various electron apps). But, it's hard to extract rent that way I suppose. | | |
| ▲ | OccamsMirror 3 days ago | parent | next [-] | | What's truly wild when you think about it, is that the computer on the other end is often less powerful than your personal laptop. I access websites on a 64gb, 16 core device. I deploy them to a 16gb, 4 core server. | | |
| ▲ | eloisant 2 days ago | parent [-] | | Yes, but your computer relies on dozens (hundreds?) of servers at any given time. |
| |
| ▲ | 2 days ago | parent | prev | next [-] | | [deleted] | |
| ▲ | ryandrake 3 days ago | parent | prev | next [-] | | I don't even understand why computer and phone manufacturers even try to make their devices faster anymore, since for most computing tasks, the bottleneck is all the data that needs to be transferred to and from the modern version of the mainframe. | | |
| ▲ | tim333 3 days ago | parent | next [-] | | There are often activities that do require compute though. My last phone upgrade was so Pokemon Go would work again, my friend upgrades for the latest 4k video or similar. | |
| ▲ | charcircuit 3 days ago | parent | prev [-] | | Consumers care about battery life. | | |
| ▲ | fainpul 2 days ago | parent | next [-] | | Yet manufacturers give us thinner and thinner phones every year (instead of using that space for the battery), and make it difficult to swap out batteries which have degraded. | | |
▲ | thewebguyd 2 days ago | parent [-] | | > make it difficult to swap out batteries which have degraded. That's the part that pisses me off the most. They all claim it's for the IP68 rating, but that's bullshit. There are plenty of devices with removable backs & batteries that are IP68. My BlackBerry Bold 9xxx was 10mm thin; the iPhone 17 Pro Max is 8.75mm. You aren't going to notice the 1.3mm of difference, and my BlackBerry had a user-replaceable battery - no tools required, just pop off the back cover. The BlackBerry was also about 100 grams lighter. The non-user-removable batteries and unibody designs are purely for planned obsolescence, nothing else. | |
| |
▲ | eloisant 2 days ago | parent | prev | next [-] | | Also, when a remote service struggles I can switch to doing something else. When local software struggles it brings my whole device to its knees and I can't do anything. | |
▲ | galaxyLogic 2 days ago | parent | prev [-] | | And providers count their capacity in gigawatts. |
|
| |
| ▲ | closeparen 2 days ago | parent | prev | next [-] | | I think people have been finding more compelling use cases for the fact that information systems can be multi-player now than for marginal FLOPS. Client-server is just a very effective way of organizing multi-player information systems. | |
| ▲ | BeFlatXIII 2 days ago | parent | prev [-] | | yet I use all this powerful compute hardware to...animate liquid glass |
| |
| ▲ | tbrownaw 3 days ago | parent | prev [-] | | > A lot of software can’t operate without an internet connection Or even physical things like mattresses, according to discussions around the recent AWS issues. |
|
|
| ▲ | paxys 3 days ago | parent | prev | next [-] |
| Why would companies sell you the golden goose when they can instead sell you an egg every day? |
| |
| ▲ | JumpCrisscross 3 days ago | parent | next [-] | | > Why would companies sell you the golden goose when they can instead sell you an egg every day? Because someone else can sell the goose and take your market. Apple is best aligned to be the disruptor. But I wouldn’t underestimate the Chinese government dumping top-tier open-source models on the internet to take our tech companies down a notch or ten. | | |
| ▲ | eloisant 2 days ago | parent | next [-] | | Sure, the company that launched iTunes and killed physical media, then released a phone where you can't install apps ("the web is the apps") will be the disruptor to bring back local computing to users... | |
| ▲ | paxys 3 days ago | parent | prev | next [-] | | By that logic none of us should be paying monthly subscriptions for anything because obviously someone would disrupt that pricing model and take business away from all the tech companies who are charging it? Especially since personal computers and mobile devices get more and more powerful and capable with every passing year. Yet subscriptions also get more prevalent every year. If Apple does finally come up with a fully on-device AI model that is actually useful, what makes you think they won't gate it behind a $20/mo subscription like they do for everything else? | | |
▲ | JumpCrisscross 2 days ago | parent | next [-] | | > By that logic none of us should be paying monthly subscriptions for anything because obviously someone would disrupt that pricing model and take business away from all the tech companies who are charging it? Non sequitur. If a market is being ripped off by subscription, there is opportunity in selling the asset. Vice versa: if the asset sellers are ripping off the market, there is opportunity to turn it into a subscription. Business models tend to oscillate between these two for a variety of reasons. Nothing there suggests one mode is infinitely yielding. > If Apple does finally come up with a fully on-device AI model that is actually useful, what makes you think they won't gate it behind a $20/mo subscription like they do for everything else? If they can, someone else can, too. They can make plenty of money selling it straight. | |
▲ | Draiken 2 days ago | parent [-] | | > If a market is being ripped off by subscription, there is opportunity in selling the asset. Only in theory. Nothing beats getting paid forever. > Business models tend to oscillate between these two for a variety of reasons They do? AFAICT everything devolves into subscriptions/rent since it maximizes profit. It's the only logical outcome. > If they can, someone else can, too. And that's why companies love those monopolies. So, no... others can't straight up compete against a monopoly. |
| |
▲ | cloverich 2 days ago | parent | prev | next [-] | | Because they need to displace OpenAI users, or OpenAI will steer their trajectory towards Apple at some point. | |
| ▲ | phinnaeus 2 days ago | parent | prev [-] | | What on-device app does Apple charge a subscription for? |
| |
▲ | troupo 2 days ago | parent | prev | next [-] | | > Apple is best aligned to be the disruptor. Is this disruptor Apple in the room with us right now? Apple's second biggest money source is services. You know, subscriptions. And that source keeps growing: https://sixcolors.com/post/2025/10/charts-apple-caps-off-bes... It's also that same Apple that fights tooth and nail every single attempt to let people have the goose or even the promise of a goose. E.g. by saying that it's entitled to a cut even if a transaction didn't happen through Apple. |
| ▲ | likium 3 days ago | parent | prev | next [-] | | Unfortunately, most people just want eggs, not the burden of actually owning the goose. | |
| ▲ | gizajob 3 days ago | parent | prev [-] | | Putting a few boots in Taiwan would also make for a profitable short. Profitable to the tune of several trillion dollars. Xi must be getting tempted. | | |
▲ | CuriouslyC 3 days ago | parent [-] | | It's a lot more complicated than that. They need to be able to take the island very quickly with a decapitation strike, while also keeping TSMC from being sabotaged or destroyed, and then they need to be able to weather a long western economic embargo until they can "break the siege" with demand for what they control along with minor good faith concessions. It's a very risky play, and if it doesn't work it leaves China in a much worse place than before, so ideally you don't make the play unless you're already facing some big downside, sort of as a "hail Mary" move. At this point I'm sure they're assuming Trump is glad-handing them while preparing for military action; they might even view an invasion of Taiwan as defensive if they think military action could be imminent anyhow. | |
| ▲ | gizajob 2 days ago | parent | next [-] | | Destroying TSMC or knowing it would be sabotaged would pretty much be the point of the operation. Would take 48 hours and they could be out of there again and say “ooops sorry” at the UN. | | |
| ▲ | CuriouslyC 2 days ago | parent [-] | | Hard disagree. They need chips bad, and it's the US defense position that TSMC be destroyed if possible in the event of successful Chinese invasion. They also care about reunification on principle, and an attack like that without letting them force "One China" on the Taiwanese in the aftermath would just move them farther from that goal. |
| |
| ▲ | JumpCrisscross 2 days ago | parent | prev [-] | | > then they need to be able to weather a long western economic embargo until they can "break the siege" with demand for what they control along with minor good faith concessions And you know we'd be potting their transport ships, et cetera, from a distance the whole time, all to terrific fanfare. The Taiwan Strait would become the new training ground for naval drones, with the targets being almost exclusively Chinese. | | |
| ▲ | CuriouslyC 2 days ago | parent [-] | | I worked with the Taiwanese Military, that's their dream scenario but the reality is they're scared shitless that the Chinese will decapitate them with massive air superiority. Drones don't mean shit without C2. | | |
| ▲ | JumpCrisscross 2 days ago | parent [-] | | > they're scared shitless that the Chinese will decapitate them with massive air superiority Taiwan fields strong air defenses backed up by American long-range fortifications. The threat is covert decapitation. A series of terrorist attacks carried out to sow confusion while the attack launches. Nevertheless, unless China pulls off a Kabul, they’d still be subject to constant cross-Strait harassment. | | |
| ▲ | CuriouslyC 2 days ago | parent [-] | | China has between 5:1 and 10:1 advantage depending on asset class. If not already on standby, US interdiction is ~48 hours. For sure China is going to blast on all fronts, so cyber and grid interruptions combined with shock and awe are definitely gonna be a thing. It's not a great setup for Taiwan. |
|
|
|
|
|
| |
| ▲ | codegeek 3 days ago | parent | prev | next [-] | | You could say the same thing about Computers when they were mostly mainframe. I am sure someone will figure out how to make it commoditized just like personal computers and internet. | | |
| ▲ | fph 3 days ago | parent | next [-] | | An interesting remark: in the 1950s-1970s, mainframes were typically rented rather than sold. | |
▲ | vjvjvjvjghv 3 days ago | parent | prev [-] | | It looks to me like the personal computer era is over. Everything is in the cloud and accessed through terminals like phones and tablets. | |
| ▲ | freedomben 3 days ago | parent [-] | | And notably, those phones and tablets are intentionally hobbled by the device owners (Apple, Google) who do everything they can to ensure they can't be treated like personal computing devices. Short of regulatory intervention, I don't see this trend changing anytime soon. We're going full on in the direction of more locked down now that Google is tightening the screws on Android. |
|
| |
▲ | DevKoala 3 days ago | parent | prev | next [-] | | Because someone else will sell it to you if they don't. | |
| ▲ | kakapo5672 3 days ago | parent | prev | next [-] | | Because companies are not some monolith, all doing identical things forever. If someone sees a new angle to make money, they'll start doing it. Data General and Unisys did not create PCs - small disrupters did that. These startups were happy to sell eggs. | | |
| ▲ | otterley 3 days ago | parent [-] | | They didn't create them, but PC startups like Apple and Commodore only made inroads into the home -- a relatively narrow market compared to business. It took IBM to legitimize PCs as business tools. |
| |
| ▲ | worldsayshi 3 days ago | parent | prev | next [-] | | Well if there's at least one competitor selling golden geese to consumers the rest have to adapt. Assuming consumers even bother to set up a coop in their living room... | |
| ▲ | mulmen 3 days ago | parent | prev | next [-] | | Your margin is my opportunity. The more expensive centralized models get the easier it is for distributed models to compete. | |
| ▲ | saltysalt 3 days ago | parent | prev | next [-] | | Exactly! It's a rent-seeking model. | | |
▲ | echelon 3 days ago | parent [-] | | > I look forward to the "personal computing" period, with small models distributed everywhere... Like the web, which worked out great? Our Internet is largely centralized platforms. Built on technology controlled by trillion-dollar titans. Google somehow got the lion's share of browser usage and is now dictating the direction of web tech, including the removal of adblock. The URL bar defaults to Google search, where the top results are paid ads. Your typical everyday person uses their default, locked-down iPhone or Android to consume Google or Apple platform products. They then communicate with their friends over Meta platforms, Reddit, or Discord. The decentralized web could never outrun money. It's difficult to out-engineer hundreds of thousands of the most talented, most highly paid engineers that are working to create these silos. | |
▲ | NemoNobody 3 days ago | parent | next [-] | | Ok, so Brave Browser exists - if you download it, you will see 0 ads on the internet. I've never really seen ads on the internet - even in the before-Brave times. Fr tho, no ads - I'm not making money off them, I've got no invite code for you, I'm a human - I just don't get it. I've probably told 500 people about Brave; I don't know any that ever tried it. I don't ever know what to say. You're not wrong, as long as you never try to do something else. | |
| ▲ | acheron 2 days ago | parent | next [-] | | Brave is just a rebranded Chrome. By using it you’re still endorsing Google’s control of the web. | | |
▲ | makingstuffs 2 days ago | parent [-] | | I was gonna say this. If Google decides to stop developing Chromium then Brave is left with very few choices. As someone who has been using Brave since it was first announced (and was very tightly coupled to the BAT crypto token), I must say it is much less effective nowadays. I often still see a load of ads and also regularly have to turn off the shields for some sites. |
| |
| ▲ | echelon 3 days ago | parent | prev [-] | | If everyone used Brave, Google wouldn't be a multi-trillion dollar company pulling revenues that dwarf many countries. Or rather, they'd block Brave. |
| |
| ▲ | saltysalt 3 days ago | parent | prev | next [-] | | I agree man, it's depressing. | |
| ▲ | 3 days ago | parent | prev [-] | | [deleted] |
|
| |
| ▲ | positron26 3 days ago | parent | prev | next [-] | | When the consumer decides to discover my site and fund federated and P2P infrastructure, they can have a seat at the table. | |
▲ | anjel 3 days ago | parent | prev [-] | | Selling fertile geese was a winning and proven business model for a very long time. Selling eggs is better how? |
|
|
| ▲ | graeme 3 days ago | parent | prev | next [-] |
We have a ton of good, small models. The issues are: 1. Most people don't have machines that can run even midsized local models well 2. The local models aren't nearly as good as the frontier models for a lot of use cases 3. There are technical hurdles to running local models that will block 99% of people, even if the steps are just: download LM Studio and download a model. Maybe local models will get so good that they cover 99% of normal user use cases and it'll be like using your phone/computer to edit a photo. But you'll still need something to make it automatic enough that regular people use it by default. That said, anyone reading this is almost certainly technical enough to run a local model. I would highly recommend trying some. Very neat to know it's entirely run from your machine and seeing what it can do. LM Studio is the most brainless way to dip your toes in.
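To make the "technical hurdles" point concrete: once LM Studio (or Ollama) is installed, both expose an OpenAI-compatible HTTP server on localhost, so a script that talks to a local model is only a few lines. A rough sketch in TypeScript - the port (LM Studio's default is commonly 1234) and the model id are whatever your local install reports, not fixed values:

```typescript
// Query a locally served model through an OpenAI-compatible endpoint.
// Assumes a local server (e.g. LM Studio) is running; adjust baseUrl and model to your setup.
const baseUrl = "http://localhost:1234/v1";

async function askLocalModel(prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-model", // placeholder: use the model id your local server lists
      messages: [{ role: "user", content: prompt }],
      temperature: 0.7,
    }),
  });
  if (!res.ok) throw new Error(`Local server returned ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content; // OpenAI-style response shape
}

askLocalModel("In one sentence, why might someone prefer a local model?")
  .then(console.log)
  .catch(console.error);
```

Nothing in the prompt or the reply leaves your machine, which is a big part of the appeal.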
| |
▲ | loyalcinnamon 2 days ago | parent | next [-] | | As the hype dies down it's becoming a little bit clearer that AI isn't like blockchain and might actually be useful (for non-generative purposes at least). I'm curious what counts as a midsize model; 4B, 8B, or something larger/smaller? What models would you recommend? I have 12GB of VRAM, so anything larger than 8B might be really slow, but I am not sure. | |
▲ | riskable 2 days ago | parent | next [-] | | My take: Large: requires >128GB VRAM. Medium: 32-128GB VRAM. Small: 16GB VRAM. Micro: runs on a microcontroller or GPUs with just 4GB of VRAM. There's really nothing worthwhile for general use cases that runs in under 16GB (from my testing) except a grammar-checking model that I can't remember the name of at the moment. gpt-oss:20b runs on 16GB of VRAM and it's actually quite good (for coding, at least)! Especially with Python. Prediction: the day that your average gaming PC comes with 128GB of VRAM is the day developers will stop bothering with cloud-based AI services. gpt-oss:120b is nearly as good as gpt5 and we're still at the beginning of the AI revolution. | |
| ▲ | DSingularity 2 days ago | parent | prev [-] | | It can depend on your use case. Are you editing a large code base and will thus make lots of completion requests with large contexts? |
| |
▲ | FitchApps 2 days ago | parent | prev [-] | | Try WebLLM - it's pretty decent and all in-browser/offline, even for light tasks with 1B-1.5B models like Qwen2.5-Coder-1.5B-Instruct. I put together a quick prototype - CodexLocal.com - but you can essentially run a local nginx and use WebLLM as an offline app. Of course, you can just use Ollama / LM Studio, but that would be a more technical solution. |
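For anyone curious what the in-browser route looks like, here's a rough sketch against web-llm's OpenAI-style chat API as I understand it - the model id is an assumption based on the kind of prebuilt list web-llm ships, so check the library's docs for exact names:

```typescript
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// First run downloads the weights into the browser cache; after that, inference
// runs locally via WebGPU with no server round-trips.
// The model id below is an assumption -- pick one from web-llm's prebuilt model list.
const engine = await CreateMLCEngine("Qwen2.5-Coder-1.5B-Instruct-q4f16_1-MLC", {
  initProgressCallback: (p) => console.log(p.text), // surface download/compile progress
});

const reply = await engine.chat.completions.create({
  messages: [{ role: "user", content: "Write a debounce helper in TypeScript." }],
});
console.log(reply.choices[0].message.content);
```

Served from a plain static file server (nginx included), this keeps working offline once the model is cached.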
|
|
| ▲ | consumer451 2 days ago | parent | prev | next [-] |
| I like to think of it more broadly, and that we are currently in the era of the first automobile. [0] LLMs are the internal combustion engine, and chatbot UIs are at the "horseless carriage" phase. My personal theory is that even if models stopped making major advancements, we would find cheaper and more useful ways to use them. In the end, our current implementations will look like the automobile pictured below. [0] https://group.mercedes-benz.com/company/tradition/company-hi... |
|
| ▲ | falcor84 2 days ago | parent | prev | next [-] |
| I'm not a big google fan, but I really like the "Google AI Edge Gallery" android app [0]. In particular, I've been chatting with the "Gemma-3n-E2B-it" model when I don't have an internet connection, and it's really decent! [0] https://play.google.com/store/apps/details?id=com.google.ai.... |
|
| ▲ | js8 2 days ago | parent | prev | next [-] |
Mainframes still exist, and they actually make a lot of sense from a physics perspective. It's a good idea to run transactions on one big machine rather than distributing them; the latter is less energy efficient. I think the misconception is that things cannot be overpriced for reasons other than inefficiency. |
|
| ▲ | onlyrealcuzzo 3 days ago | parent | prev | next [-] |
| Don't we already have small models highly distributed? |
| |
| ▲ | saltysalt 3 days ago | parent | next [-] | | We do, but the vast majority of users interact with centralised models from OpenAI, Google Gemini, Grok... | |
| ▲ | onlyrealcuzzo 3 days ago | parent | next [-] | | I'm not sure we can look forward to self-hosted models ever being mainstream. Like 50% of internet users are already interacting with one of these daily. You usually only change your habit when something is substantially better. I don't know how free versions are going to be smaller, run on commodity hardware, take up trivial space and RAM, etc., AND be substantially better. | |
| ▲ | oceanplexian 3 days ago | parent | next [-] | | > I'm not sure we can look forward to self-hosted models ever being mainstream. If you are using an Apple product chances are you are already using self-hosted models for things like writing tools and don't even know it. | |
| ▲ | ryanianian 3 days ago | parent | prev | next [-] | | The "enshittification" hasn't happened yet. They'll add ads and other gross stuff to the free or cheap tiers. Some will continue to use it, but there will be an opportunity for self-hosted models to emerge. | |
| ▲ | o11c 3 days ago | parent | prev | next [-] | | > Like 50% of internet users are already interacting with one of these daily. You usually only change your habit when something is substantially better. No, you usually only change your habit when the tools you are already using are changed without consulting you, and the statistics are then used to lie. | |
| ▲ | saltysalt 3 days ago | parent | prev | next [-] | | You make a fair point. I'm just hoping this will happen, but I'm not confident either, to be frank. | |
| ▲ | 3 days ago | parent | prev [-] | | [deleted] |
| |
| ▲ | raincole 3 days ago | parent | prev | next [-] | | Because small models are just not that good. | |
| ▲ | positron26 3 days ago | parent | prev [-] | | The vast majority won't switch until there's a 10x use case. We know they are coming. Why bother hopping? |
| |
| ▲ | godshatter 2 days ago | parent | prev [-] | | I use gpt4all and have downloaded some models. It's not that hard, and it doesn't take a huge system. It's pretty easy if you use models on their list; it's a bit more work to find the right chat template (forgot what they're called exactly) for your class of model if you downloaded one that's not on the list, but still not that hard. I have 8GB of VRAM, and it's slow, but they work. I like that it's local and private, but then I don't use them for much other than as an oddity that is fun to interact with. |
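For what it's worth, the same gpt4all install can be driven from Python instead of the GUI, which makes it easy to wire into small scripts. A minimal sketch using the gpt4all bindings; the model filename is illustrative and would be fetched on first use if it isn't already present:

```python
# Minimal sketch with the gpt4all Python bindings rather than the GUI.
# The model filename is an example; any compatible GGUF model that
# fits your RAM/VRAM should behave similarly.
from gpt4all import GPT4All

model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")  # downloaded on first use
with model.chat_session():  # applies the model's chat/prompt template for you
    reply = model.generate("Explain, briefly, what a chat template does.",
                           max_tokens=200)
    print(reply)
```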
|
|
| ▲ | sixtyj 3 days ago | parent | prev | next [-] |
Dial-up + mainframe. Mainframe in the sense of centralised silos; dial-up in the sense that, looking back at 2025 from 2035, the speeds we have now will feel like dial-up internet. |
|
| ▲ | gowld 3 days ago | parent | prev | next [-] |
| We are also in the mainframe period of computing, with large centralised cloud services. |
|
| ▲ | runarberg 3 days ago | parent | prev | next [-] |
| I actually think we are much closer to the sneaker era of shoes, or the monorail era of public transit. |
|
| ▲ | dzonga 3 days ago | parent | prev | next [-] |
This -- chips are getting fast enough, both ARM and x86. Unified memory architectures mean we can get more RAM on devices at higher throughput. We're already seeing local models; it's just that their capability is limited by RAM. |
|
| ▲ | cyanydeez 3 days ago | parent | prev | next [-] |
| I think we are in the dotcom boom era where investment is circular and the cash investments all depend on the idea that growth is infinite. Just a bunch of billionaires jockeying for not being poor. |
|
| ▲ | jijji 3 days ago | parent | prev | next [-] |
Ollama and other projects already make this possible. |
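As a sketch of how little is involved: once Ollama is installed and a model has been pulled (e.g. with `ollama pull`), it serves a local REST API on port 11434 that any program can call. The model name below is illustrative:

```python
# Minimal sketch against Ollama's local REST API (default port 11434).
# Assumes the Ollama daemon is running and the named model has been
# pulled already; swap in whatever model you actually have.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.2",  # illustrative; use any locally pulled model
        "messages": [{"role": "user", "content": "What hardware are you running on?"}],
        "stream": False,
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```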
|
| ▲ | raincole 2 days ago | parent | prev | next [-] |
| > "personal computing" period The period when you couldn't use Linux as your main OS because your organization asked for .doc files? No thanks. |
|
| ▲ | EGreg 3 days ago | parent | prev | next [-] |
I actually don't look forward to this period. I have always been for open source software and distributism — until AI. Because if there's one thing worse than governments having nuclear weapons, it's everyone having them. It would be chaos. And with physical drones and robots coming, it would be even worse. Think "shitcoins and memecoins", but unlike those, you don't just lose the money you put in, and you can't opt out. They'd affect everyone, and you can never escape the chaos ever again. They'd be posting around the whole Internet (including here, YouTube deepfakes, extortion, annoyance, constantly trying to rewrite history, getting published, reputational destruction at scale, etc.), and constant armies of bots fighting. A dark forest. And if AI can pay for its own propagation via decentralized hosting and inference, then the chance of a runaway advanced persistent threat compounds. It just takes a few bad apples, or even practical jokers, to unleash crazy stuff. And it will never be shut down, just build and build like some kind of Kessler syndrome. And I'm talking about just CURRENT AI agent and drone technology. |
|
| ▲ | supportengineer 2 days ago | parent | prev | next [-] |
| Imagine small models on a cheap chip that can be added to anything (alarm clock, electric toothbrush, car keys...) |
|
| ▲ | giancarlostoro 2 days ago | parent | prev [-] |
I mean, people can self-host plenty off of a 5090; heck, even Macs with enough RAM can run larger models that I can't run on a 5090. |