| ▲ | chemotaxis 3 days ago |
| > I look forward to the "personal computing" period, with small models distributed everywhere... One could argue that this period was just a brief fluke. Personal computers really took off only in the 1990s, web 2.0 happened in the mid-2000s. Now, for the average person, 95%+ of screen time boils down to using the computer as a dumb terminal to access centralized services "in the cloud". |
|
| ▲ | wolpoli 3 days ago | parent | next [-] |
The personal computing era happened partly because, while there was demand for computing, users' connectivity to the internet was poor or limited, so they couldn't just connect to the mainframe. We now have high-speed internet access everywhere - I don't know what would drive the equivalent of the era of personal computing this time. |
| |
| ▲ | ruszki 2 days ago | parent | next [-] | | > We now have high speed internet access everywhere As I travel a ton, I can confidently tell you that this is still not true at all, and I’m kinda disappointed that the general rule of optimizing for bad reception died. | | |
| ▲ | bartread 2 days ago | parent | next [-] | | > the general rule of optimizing for bad reception died. Yep, and people will look at you like you have two heads when you suggest that perhaps we should take this into account, because it adds both cost and complexity. But I am sick to the gills of using software - be that on my laptop or my phone - that craps out constantly when I'm on the train, or in one of the many mobile reception black spots in the areas where I live and work, or because my rural broadband has decided to temporarily give up, because the software wasn't built with unreliable connections in mind. It's not that bleeding difficult to build an app that stores state locally and can sync with a remote service when connectivity is restored, but companies don't want to make the effort because it's perceived to be a niche issue that only affects a small number of people a small proportion of the time and therefore not worth the extra effort and complexity. Whereas I'd argue that it affects a decent proportion of people on at least a semi-regular basis so is probably worth the investment. | | |
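A minimal sketch of the kind of offline-first design the comment above describes - record changes locally so the app keeps working, then replay them against the server once connectivity returns. The `/api/sync` endpoint and the change shape are hypothetical placeholders, and a real app would likely use IndexedDB rather than localStorage:

```typescript
// Hypothetical offline-first sync queue (a sketch, not any particular product's implementation).
type PendingChange = { id: string; payload: unknown; queuedAt: number };

const QUEUE_KEY = "pending-changes";

function loadQueue(): PendingChange[] {
  return JSON.parse(localStorage.getItem(QUEUE_KEY) ?? "[]");
}

function saveQueue(queue: PendingChange[]): void {
  localStorage.setItem(QUEUE_KEY, JSON.stringify(queue));
}

// Record a change locally first; the UI can treat it as committed immediately.
export function recordChange(id: string, payload: unknown): void {
  const queue = loadQueue();
  queue.push({ id, payload, queuedAt: Date.now() });
  saveQueue(queue);
}

// Replay queued changes against the (assumed) remote endpoint whenever we're online.
export async function syncWhenConnected(): Promise<void> {
  if (!navigator.onLine) return; // no connection: keep working locally
  let queue = loadQueue();
  while (queue.length > 0) {
    const res = await fetch("/api/sync", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(queue[0]),
    });
    if (!res.ok) break; // server error or dropped connection: retry on the next "online" event
    queue = queue.slice(1);
    saveQueue(queue);
  }
}

// Re-attempt whenever the browser reports that connectivity came back.
window.addEventListener("online", () => void syncWhenConnected());
```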
| ▲ | asa400 2 days ago | parent | next [-] | | We ignore the fallacies of distributed computing at our peril: https://en.wikipedia.org/wiki/Fallacies_of_distributed_compu... | |
| ▲ | visarga 2 days ago | parent | prev | next [-] | | It's always a small crisis deciding what app/book to install on my phone to give me 5-8 hours of reading while on a plane. I found one - Newsify - and combine it with YT caching. | |
| ▲ | donkeybeer 2 days ago | parent | prev | next [-] | | Usually it reduces rather than adds complexity. Simpler pages without a hundred different JS frameworks are faster. | |
| ▲ | LogicFailsMe 2 days ago | parent | prev [-] | | Moving services to the cloud unfortunately relieves a lot of the complexity of software development with respect to the menagerie of possible hardware environments. It of course leads to a crappy user experience if they don't optimize for low bandwidth, but they don't seem to care about that. Have you ever checked out how useless your algorithmic Facebook feed is now? Tons of bandwidth, very little information. It seems like their measure is that time on their website equals money in their pocket, and baffling you with BS is a great way to achieve that, until you never visit again in disgust and frustration. | | |
| ▲ | wtallis 2 days ago | parent [-] | | I don't think the "menagerie of possible hardware environments" excuse holds much water these days. Even web apps still need to accommodate various screen sizes and resolutions and touch vs mouse input. Native apps need to deal with the variety in software environments (not to say that web apps are entirely insulated from this), across several mobile and desktop operating systems. In the face of that complexity, having to compile for both x86-64 and arm64 is at most a minor nuisance. | | |
| ▲ | bartread 4 hours ago | parent | next [-] | | I don't know that it ever held that much water. I used to work for a company building desktop tools that were distributed to, depending on the tool, on the low end tens of thousands of users, and on the high end, hundreds of thousands. We had one tool that was nominally used by about a million people but, in actuality, the real number of active users each month was more like 300k. I was at the company for 10 years and I can only remember one issue that we could not reproduce or figure out on tools that I worked on. There may have been others for other tools/teams, but the number would have been tiny because these things always got talked about. In my case the guy with the issue - who'd been super-frustrated by it for a year or more - came up to our stand when we were at a conference in the US, introduced himself, and showed me the problem he was having. He then lent me his laptop overnight[0], and I ended up installing Wireshark to see why he was experiencing massive latency on every keystroke, and what might be going on with his network shares. In the end we managed to apply a fix to our code that sidestepped the issue for users with his situation (to this day, he's been the only person - as far as I'm aware - to report this specific problem). Our tools all ran on Windows, but obviously there were multiple extant versions of both the desktop and server OS that they were run on, different versions of the .NET runtime, at the time everyone had different AV, plus whatever other applications, services, and drivers they might have running. I won't say it was a picnic - we had a support/customer success team, after all - but the vast majority of problems weren't a function of software/OS configuration. These kinds of issues did come up, and they were a pain in the ass, but except in very rare cases - as I've described here - we were always able to find a fix or workaround. Nowadays, with much better screensharing and remote control options, it would be way easier to deal with these sorts of problems than it was 15 - 20 years ago. [0] Can't imagine too many organisations being happy with that in 2025. | |
| ▲ | LogicFailsMe 2 days ago | parent | prev [-] | | Have you ever distributed an app on the PC to more than a million people? It might change your view. Browser issues are a different argument and I agree with you 100% there. I really wish people would pull back and hold everyone to consistent standards but they won't. |
|
|
| |
| ▲ | ChadNauseam 2 days ago | parent | prev | next [-] | | I work on a local-first app for fun and someone told me I was simply creating problems for myself and that I could just be using a server. But I'm in the same boat as you. I regularly don't have good internet and I'm always surprised when people act like an internet connection is a safe assumption. Even in everyday life I go up and down in an elevator where I have no internet, I travel regularly, I go to concerts and music festivals, and so on. | |
| ▲ | sampullman 2 days ago | parent | prev | next [-] | | I don't even travel that much, and still have trouble. Tethering at the local library or coffee shops is hit or miss, everything slows down during storms, etc. | | |
| ▲ | BoxOfRain 2 days ago | parent [-] | | > everything slows down during storms One problem I've found in my current house is that the connection becomes flakier in heavy rain, presumably due to poor connections between the cabinet and houses. I live in Cardiff which for those unaware is one of Britain's rainiest cities. Fun times. |
| |
| ▲ | BoxOfRain 2 days ago | parent | prev | next [-] | | Yeah British trains are often absolutely awful for this, I started putting music on my phone locally to deal with the abysmal coverage. | |
| ▲ | mlrtime 2 days ago | parent | prev [-] | | Not true because of cost or access? If you consider starlink high speed, it truly is available everywhere. | | |
| ▲ | ruszki 2 days ago | parent | next [-] | | Access. You cannot use Starlink on a train, flight, inside buildings, etc. Starlink is also not available everywhere: https://starlink.com/map. Also, it’s not feasible to bring that with me a lot of time, for example on my backpack trips; it’s simply too large. | |
| ▲ | virgilp 2 days ago | parent | prev [-] | | Because of many reasons. It's not practical to have a Starlink antenna with you everywhere. And then yes, cost is a significant factor too - even in the dialup era satellite internet connection was a thing that existed "everywhere", in theory.... |
|
| |
| ▲ | threetonesun 2 days ago | parent | prev | next [-] | | Privacy. I absolutely will not ever open my personal files to an LLM over the web, and even with my mid-tier M4 MacBook I’m close to a point where I don’t have to. I wonder how much the cat is out of the bag for private companies in this regard. I don’t believe the AI companies founded on stealing IP have stopped. | | |
| ▲ | AlecSchueler 2 days ago | parent [-] | | Privacy is a niche concern sadly. | | |
| ▲ | jimbokun 2 days ago | parent [-] | | I believe Apple has made a significant number of iPhone sales due to a perception of better privacy than Android. | | |
| ▲ | AlecSchueler a day ago | parent | next [-] | | I believe you could be in a bubble. | |
| ▲ | kakacik a day ago | parent | prev [-] | | Not a single person I know who has any Apple device would claim that; nobody cares or even knows in detail the stuff we discuss here. It's the HN bubble at its best. Another point is that, subjectively, the added privacy compared to, say, South Korean products is mostly a myth. It 100% doesn't apply if you are not a US citizen, and even then, keeping your fingers crossed that all 3-letter agencies and the device creator are not over-analyzing every single data point about you continuously is naive. What may be better is that the devices are harder to steal and take ownership of, but for that I would need to see some serious independent comparison, not some paid PR, to which HN is not completely immune. |
|
|
| |
| ▲ | Razengan 2 days ago | parent | prev | next [-] | | > I don't know what would drive the equivalent of the era of personal computing this time. Space. You don't want to wait 3-22 minutes for a ping from Mars. | | |
| ▲ | AlecSchueler 2 days ago | parent [-] | | I'm not sure if the handful of people in space stations are a big enough market to drive such changes. |
| |
| ▲ | almostnormal 2 days ago | parent | prev | next [-] | | Centralized services only became mainstream when everything started to be offered "for free". When the choice was between buying outright and paying recurringly, people more often chose to buy. | |
| ▲ | troupo 2 days ago | parent | next [-] | | There are no longer options to buy. Everything is a subscription | | |
| ▲ | rightbyte 2 days ago | parent [-] | | Between mobile phone service (including SMS) and an ISP service, which usually includes mail, I don't see the need for any hosted service. There are FOSS alternatives for just about everything for hobbyist and consumer use. | |
| ▲ | api 2 days ago | parent [-] | | There are no FOSS alternatives for consumer use unless the consumer is an IT pro or a developer. Regular people can’t use most open source software without help. Some of it, like Linux desktop stuff, has a nice enough UI that they can use it casually but they can’t install or configure or fix it. Making software that is polished and reliable and automatic enough that non computer people can use it is a lot harder than just making software. I’d say it’s usually many times harder. | | |
| ▲ | rightbyte 2 days ago | parent [-] | | I don't think that is a software issue but a social issue nowadays. FOSS alternatives have become quite OK in my opinion. If computers came with Debian, Firefox and LibreOffice preinstalled instead of only W11, Edge and some Office 365 trial, the relative difficulty would be gone, I think. Same thing with most IT departments only dealing with Windows in professional settings. If you are even allowed to use something different, you are on your own. |
|
|
| |
| ▲ | torginus 2 days ago | parent | prev [-] | | I think people have seen enough of this 'free' business model to know that the things being sold as free are, in fact, not. | |
| ▲ | mlrtime 2 days ago | parent [-] | | Some people, but the majority see it as free. Go to your local town center and randomly poll people on how much they pay for email or Google search; 99% will say it is free and stop there. |
|
| |
| ▲ | unethical_ban 2 days ago | parent | prev | next [-] | | Privacy, reliable access when not connected to the web, and for some, the principle of decentralization. Less supply chain risk for private enterprise. | |
| ▲ | netdevphoenix 2 days ago | parent | prev [-] | | > We now have high speed internet access everywhere This is such an HN comment, illustrating how little the average HN user knows of the world beyond their tech bubble. Internet everywhere - you might have something of a point. But "high speed internet access everywhere" sounds like "I haven't travelled much in my life". |
|
|
| ▲ | jayd16 3 days ago | parent | prev | next [-] |
I don't know - I think you're conflating content streaming with central compute. Also, is percentage of screen time the relevant metric? We moved TV consumption to the PC; does that take away from PCs? Many apps moved to the web, but that's basically just streamed code to be run in a local VM. Is that a dumb terminal? It's not exactly independent of local compute... |
| |
| ▲ | kamaal 3 days ago | parent | next [-] | | Nah, your parent comment has a valid point. Nearly the entirety of the use cases of computers today don't involve running things on a 'personal computer' in any way. In fact these days, everyone kind of agrees that even something as small as hosting a spreadsheet on your own computer is a bad idea. The cloud, where everything is backed up, is the way to go. | |
| ▲ | jayd16 3 days ago | parent [-] | | But again, that's conflating web connected or even web required with mainframe compute and it's just not the same. PC was never 'no web'. No one actually 'counted every screw in their garage' as the PC killer app. It was always the web. | | |
| ▲ | morsch 2 days ago | parent | next [-] | | One of the actual killer apps was gaming. Which still "happens" mostly on the client, today, even for networked games. | | |
| ▲ | jhanschoo 2 days ago | parent [-] | | Yet the most popular games are online-only and even more have their installation base's copies of the game managed by an online-first DRM. | | |
| ▲ | morsch 2 days ago | parent | next [-] | | That's true, but beside the point: even online only games or those gated by online DRM are not streamed or resemble a thin client architecture. That exists, too, with GeForce Now etc, which is why I said mostly. | |
| ▲ | jayd16 2 days ago | parent | prev [-] | | This is just factually inaccurate. | | |
| ▲ | jhanschoo a day ago | parent [-] | | Please provide a more comprehensive response. I suppose I should be more specific as well. Some of the online only games I am thinking of are CoD, Fortnite, LoL and Minecraft. The online-first DRM I am thinking of is Steam. |
|
|
| |
| ▲ | eru 3 days ago | parent | prev | next [-] | | You know that the personal computer predates the web by quite a few years? | | |
| ▲ | jayd16 2 days ago | parent | next [-] | | Sure, I was too hyperbolic. I simply meant connecting to the web didn't make it not a PC. The web really pushed adoption, much more than a personal computation machine. It was the main use case for most folks. | |
| ▲ | rambambram 2 days ago | parent | prev [-] | | This. Although brief, there were at least a couple of years of using PCs without an internet connection. It's unthinkable now. And even back then, if you blinked this time period was over. | |
| ▲ | eru 2 days ago | parent [-] | | That was a pretty long blink? The personal computer arguably begins with VisiCalc in 1979. > Through the 1970s, personal computers had proven popular with electronics enthusiasts and hobbyists, however it was unclear why the general public might want to own one. This perception changed in 1979 with the release of VisiCalc from VisiCorp (originally Personal Software), which was the first spreadsheet application. https://en.wikipedia.org/wiki/History_of_personal_computers#... Mainstream use of the web really took off in the second half of the 1990s. Arbitrarily, let's say with the release of Windows 95. That's a quarter of a century you'd be blinking for. |
|
| |
| ▲ | kamaal 3 days ago | parent | prev | next [-] | | In time, the mainframes of this age will make a comeback. This whole idea that you can connect lots of cheap, low-capacity boxes and drive down compute costs is already going away. In time people will go back to thinking of compute as a variable of the time taken to finish processing. That's the paradigm in the cloud compute world - you are billed for the TIME you use the box. Eventually people will just want to use something bigger that gets things done faster, so that they don't have to rent it for long. | |
| ▲ | galaxyLogic 2 days ago | parent [-] | | It's also interesting that computing capacity is no longer discussed in instructions per second, but in gigawatts. |
| |
| ▲ | bandrami 2 days ago | parent | prev [-] | | Umm... I had a PC a decade before the web was invented, and I didn't even use the web for like another 5 years after it went public ("it's an interesting bit of tech but it will obviously never replace gopher...") The killer apps in the 80s were spreadsheets and desktop publishing. |
|
| |
| ▲ | eru 3 days ago | parent | prev [-] | | > I don't know, I think you're conflating content streaming with central compute. Would you classify eg gmail as 'content streaming'? | | |
| ▲ | mikepurvis 3 days ago | parent | next [-] | | But gmail is also a relatively complicated app, much of which runs locally on the client device. | | |
| ▲ | MobiusHorizons 2 days ago | parent | next [-] | | It is true that browsers do much more computation than "dumb" terminals, but there are still non-trivial parallels. Terminals do contain a processor and memory in order to handle settings menus, handle keyboard input and convert incoming sequences into a character array that is then displayed on the screen. A terminal is mostly useless without something attached to the other side, but not _completely_ useless. You can browse the menus, enable local echo, and use the device as something like a scratchpad. I once drew up a schematic as ASCII art this way. The contents are ephemeral and you have to take a photo of the screen or something in order to retain the data. Web browsers aren't quite that useless with no internet connection; some sites do offer offline capabilities (for example Gmail), but even then, the vast majority of offline experiences exist to tide the user over until the network can be re-established, instead of truly offering something useful to do locally. Probably the only mainstream counter-examples would be games. |
| ▲ | WalterSear 2 days ago | parent | prev [-] | | It's still a SAAS, with components that couldn't be replicated client-side, such as AI. | | |
| ▲ | galaxyLogic 2 days ago | parent | next [-] | | Right. But does it matter whether computation happens on the client or the server? Probably on both in the end. But yes, I am looking forward to having my own LLM on my PC which only I have access to. | |
| ▲ | fragmede 2 days ago | parent | prev [-] | | Google's own Gemma models are runnable locally on a Pixel 9 Max, so some level of AI is replicable client-side. As for Gmail running locally, it wouldn't be impossible for Gmail to be locally hosted and hit a local cache which syncs with a server only periodically over IMAP/JMAP/whatever, if Google actually wanted to do it. | |
| ▲ | eru a day ago | parent [-] | | Yes, but seems like a lot of hassle for not much gain (for Google). | | |
| ▲ | fragmede a day ago | parent [-] | | The gain, as far as local AI goes for Google, is that, at Google scale, the CPU/GPU time to run even a small model like Gemma will add up across Gmail's millions of users. If clients have the hardware for it (which Pixel 9's have) it means Gmail's servers aren't burning CPU/GPU time on it. As far as how Gmail's existing offline mode works, I don't know. |
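For a concrete sense of what "some level of AI replicable client-side" can look like today, here is a rough sketch that prompts a locally hosted small model over Ollama's HTTP API; Ollama, the `gemma3` model name, and the email-summarization use case are assumptions for illustration, not details from this thread:

```typescript
// Sketch: query a small model served locally (e.g. by Ollama) so no text leaves the machine.
// Assumes something like `ollama pull gemma3` has been run beforehand.
type GenerateResponse = { response: string };

async function summarizeLocally(emailBody: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "gemma3", // assumed locally available model
      prompt: `Summarize this email in one sentence:\n\n${emailBody}`,
      stream: false, // return a single JSON object rather than a token stream
    }),
  });
  if (!res.ok) throw new Error(`local model unavailable: ${res.status}`);
  const data = (await res.json()) as GenerateResponse;
  return data.response.trim();
}

// Usage: summarizeLocally("Hi team, the deploy moved to Friday...").then(console.log);
```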
|
|
|
| |
| ▲ | jayd16 2 days ago | parent | prev [-] | | Well, app code is streamed, content is streamed. The app code is run locally. Content is pulled periodically. The mail server is the mail server even for Outlook. Outlook gives you a way to look through email offline. Gmail apps and even Gmail in Chrome have an offline mode that let you look through email. It's not easy to call it fully offline, nor a dumb terminal. | | |
| ▲ | eru 2 days ago | parent [-] | | Oh, GMail is definitely a cloud offering---even if they have some offline functionality. I was just probing the 'content _streaming_' term. As you demonstrate, you'd have to squint really hard to describe GMail as content streaming. 'Offline' vs 'content streaming' is a false dichotomy. There's more different types of products and services. (Which reminds me a bit of crypto-folks calling everything software that's not in crypto "web2", as if working on stodgy backends in a bank or making Nintendo Switch games has anything to do with the web at all.) | | |
| ▲ | jayd16 a day ago | parent [-] | | Ok sure but there's plenty of simple video streaming in total screen time, which was the context I was replying to. I never claimed it was a dichotomy, simply a large part of screen time that clearly skews the analysis. | | |
| ▲ | eru a day ago | parent [-] | | Yes, it's a large part of screen time at the moment. |
|
|
|
|
|
|
| ▲ | JumpCrisscross 3 days ago | parent | prev | next [-] |
| > using the computer as a dumb terminal to access centralized services "in the cloud" Our personal devices are far from thin clients. |
| |
| ▲ | freedomben 3 days ago | parent | next [-] | | Depends on the app, and the personal device. Mobile devices are increasingly thin clients. Of course hardware-wise they are fully capable personal computers, but ridiculous software-imposed limitations make that increasingly difficult. | |
| ▲ | immutology 3 days ago | parent | prev | next [-] | | "Thin" can be interpreted as relative, no? I think it depends on if you see the browser for content or as a runtime environment. Maybe it depends on the application architecture...? I.e., a compute-heavy WASM SPA at one end vs a server-rendered website. Or is it an objective measure? | |
| ▲ | Cheer2171 3 days ago | parent | prev | next [-] | | But that is what they are mostly used for. | | |
| ▲ | TheOtherHobbes 3 days ago | parent [-] | | On phones, most of the compute is used to render media files and games, and make pretty animated UIs. The text content of a weather app is trivial compared to the UI. Same with many web pages. Desktop apps use local compute, but that's more a limitation of latency and network bandwidth than any fundamental need to keep things local. Security and privacy also matter to some people. But not to most. |
| |
| ▲ | bigyabai 3 days ago | parent | prev | next [-] | | Speak for yourself. Many people don't daily-drive anything more advanced than an iPad. | | |
| ▲ | eru 3 days ago | parent | next [-] | | iPads are incredibly advanced. Though I guess you mean they don't use anything that requires more sophistication from the user (or something like that)? | |
| ▲ | boomlinde 2 days ago | parent | prev [-] | | The iPad is not a thin client, is it? | |
| ▲ | troupo 2 days ago | parent [-] | | It is, for the vast majority of users. Turn off internet on their iPad and see how many of the apps that people use still work. | |
| ▲ | boomlinde 2 days ago | parent | next [-] | | I'm not questioning whether the iPad can be used as a client in some capacity, or whether people tend to use it as a client. I question whether the iPad is a thin client. The answer to that question doesn't lie in how many applications require an internet connection, but in how many applications require local computational resources. The iPad is a high-performance computer, not just because Apple think that's fun, but out of necessity given its ambition: the applications people use on it require local storage and rather heavy local computation. The web browser standards, if nothing else, have pretty much guaranteed that the age of thin clients is over: a client needs to supply a significant amount of computational resources and storage to use the web generally. Not even Chromebooks will practically be anything less than rich clients. Going back to the original topic (and source of the analogy), iOS hosts an on-device large language model. | |
| ▲ | troupo 2 days ago | parent [-] | | As with everything, the lines are a bit blurred these days. We may need a new term for these devices. But despite all the compute and storage and on-device models these supercomputers are barely a step above thin clients. |
| |
| ▲ | mlrtime 2 days ago | parent | prev [-] | | No, it's a poor analogy; I'm old enough to have used a Wyse terminal. That's what I think of when I hear "dumb terminal". It was dumb. Maybe a PC without a hard drive (PXE-booting the OS) qualifies, but if it has storage and can install software, it's not dumb. | |
|
|
| |
| ▲ | bandrami 2 days ago | parent | prev [-] | | I mean, Chromebooks really aren't very far at all from thin clients. But even my monster ROG laptop when it's not gaming is mostly displaying the results of computation that happened elsewhere |
|
|
| ▲ | api 2 days ago | parent | prev | next [-] |
There are more PCs and serious home computing setups today than there were back then. There are just way way way more casual computer users. The people who only use phones and tablets or only use laptops as dumb terminals are not the people who were buying PCs in the 1980s and 1990s, or if they were, they were not serious users. They were mostly non-computer-users. Non-computer-users have become casual, consumer-level computer users because the tech went mainstream, but there's still a massive serious computer user market. I know many people with home labs or even small cloud installations in their basements, but there are about as many of them as serious PC users with top-end PC setups in the late 1980s. |
|
| ▲ | torginus 2 days ago | parent | prev | next [-] |
I dislike the view of individuals as passive sufferers of the preferences of big corporations. You can, and people do, self-host stuff that big tech wants pushed into the cloud. You can have a NAS or a private media player, and Home Assistant has been making waves in the home automation sphere. Turns out people don't like buying overpriced devices only to have to pay a $20 subscription, find out their devices don't talk to each other and upload footage from inside their homes to the cloud, and then have them get bricked once the company selling them goes under and turns off the servers. |
| |
| ▲ | rambambram 2 days ago | parent | next [-] | | This. And the hordes of people reacting with some explanation for why this is. The 'why' is not the point, we already know the 'why'. The point is that you can if you want. Might not be easy, might not be convenient, but that's not the point. No one has to ask someone else for permission to use other tech than big tech. The explanation of 'why' is not an argument. Big tech is not making it easy != it's impossible. Passive sufferers indeed. Edit: got a website with an RSS feed somewhere maybe? I would like to follow more people with a point of view like yours. | |
| ▲ | __alexs 2 days ago | parent | prev | next [-] | | You can dislike it but it doesn't make it less true and getting truer. | |
| ▲ | jhanschoo 2 days ago | parent | prev | next [-] | | You can likewise host models if you so choose. Still the vast majority of people use online services both for personal computing or for LLMs. | |
| ▲ | api 2 days ago | parent | prev [-] | | Things are moving this way because it’s convenient and easy and most people today are time poor. | | |
| ▲ | torginus 2 days ago | parent [-] | | I think it has more to do with the 'common wisdom' dictating that this is the way to do it, as in 'we've always done it like this'. Which might even be true, since cloud-based software might offer conveniences that local substitutes don't. However, this is not an inherent property of cloud software; it's just that some effort needs to go into a local alternative. That's why I mentioned Home Assistant - a couple of years ago, smart home stuff was all the rage, and not only was it expensive, the backend ran in the cloud, and you usually paid a subscription for it. Nowadays, you can buy a local Home Assistant hub (or make one using a Pi) and have all your stuff connect only to a local server. The same is true for routers, NAS, media sharing and streaming to TV, etc. You do need to get a bit technical, but you don't need to do anything you couldn't figure out by following a 20-minute YouTube video. |
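To make the "everything connects to a local server" point concrete, here is a rough sketch of reading one sensor from a Home Assistant hub on the LAN via its REST API; the host name, entity id, and token are placeholders (a long-lived access token has to be created in Home Assistant first):

```typescript
// Sketch: poll one entity's state from a local Home Assistant instance; nothing leaves the LAN.
const HA_URL = "http://homeassistant.local:8123"; // placeholder host for the local hub
const HA_TOKEN = "<long-lived access token>";     // placeholder; created in the HA user profile

async function getEntityState(entityId: string): Promise<string> {
  const res = await fetch(`${HA_URL}/api/states/${entityId}`, {
    headers: { Authorization: `Bearer ${HA_TOKEN}` },
  });
  if (!res.ok) throw new Error(`Home Assistant returned ${res.status}`);
  const data = (await res.json()) as { state: string };
  return data.state;
}

// Usage: getEntityState("sensor.living_room_temperature").then(console.log);
```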
|
|
|
| ▲ | MSFT_Edging 2 days ago | parent | prev | next [-] |
I look forward to a possibility where the dumb terminal is less centralized in the cloud, and more like how it seems to work in The Expanse. They all have hand terminals that seem to automatically interact with the systems and networks of the ship/station/building they're in, linking up with local resources and likely having default permissions set to restrict weird behavior. Not sure it could really work like that IRL, but I haven't put a ton of thought into it. It'd make our always-online devices make a little more sense. |
|
| ▲ | npilk 2 days ago | parent | prev | next [-] |
| But for a broader definition of "personal computer", the number of computers we have has only continued to skyrocket - phones, watches, cars, TVs, smart speakers, toaster ovens, kids' toys... I'm with GP - I imagine a future when capable AI models become small and cheap enough to run locally in all kinds of contexts. https://notes.npilk.com/ten-thousand-agents |
| |
| ▲ | seniorThrowaway 2 days ago | parent [-] | | Depending on how you are defining AI models, they already do. Think of the $15 security camera that can detect people and objects. That is AI-model driven. LLMs are another story, but smaller, less effective ones can and do already run at the edge. |
|
|
| ▲ | seemaze 3 days ago | parent | prev | next [-] |
I think that speaks more to the fact that software ate the world than to the locality of compute. It's a breadth-first, depth-last game. |
|
| ▲ | positron26 3 days ago | parent | prev | next [-] |
| Makes me want to unplug and go back to offline social media. That's a joke. The dominant effect was networked applications getting developed, enabling community, not a shift back to client terminals. |
| |
| ▲ | grumbel 2 days ago | parent [-] | | Once upon a time social media was called Usenet and worked offline in a dedicated client with a standard protocol. You only went online to download and send messages, but could then go offline and read them in an app of your choice. Web 2.0 discarded the protocol approach and turned your computer into a thin client that does little more than render webapps that require you to be permanently online. | |
| ▲ | cesarb 2 days ago | parent | next [-] | | > Once upon a time social media was called Usenet and worked offline in a dedicated client with a standard protocol. There was also FidoNet with offline message readers. | |
| ▲ | positron26 2 days ago | parent | prev [-] | | > called Usenet and worked offline People must have been pretty smart back then. They had to know to hang up the phone to check for new messages. |
|
|
|
| ▲ | WhyOhWhyQ 2 days ago | parent | prev | next [-] |
I guess we're in the KIM-1 era of local models, or is that already done? |
|
| ▲ | pksebben 3 days ago | parent | prev | next [-] |
That 'average' is doing a lot of work to obfuscate the landscape. Open source continues to grow (indicating a robust ecosystem of individuals who use their computers for local work) and, more importantly, the 'average' looks the way it does not necessarily because of a reduction in local use, but because of an explosion of users that did not previously exist (mobile-first users, SaaS customers, etc.). The thing we do need to be careful about is regulatory capture. We could very well end up with nothing but monolithic centralized systems simply because it's made illegal to distribute, use, and share open models. They hinted quite strongly that they wanted to do this with DeepSeek. There may even be a case to be made that at some point in the future, small local models will outperform monoliths - if distributed training becomes cheap enough, or if we find an alternative to backprop that allows models to learn as they infer (like a more developed forward-forward or something like it), we may see models that do better simply because they aren't a large centralized organism behind a walled garden. I'll grant that this is a fairly Pollyanna take and represents the best possible outcome, but it's not outlandishly fantastic - and there is good reason to believe that any system based on a robust decentralized architecture would be more resilient to problems like platform enshittification and overdeveloped censorship. At the end of the day, it's not important what the 'average' user is doing, so long as there are enough non-average users pushing the ball forward on the important stuff. |
| |
| ▲ | TheOtherHobbes 3 days ago | parent | next [-] | | We already have monolithic centralised systems. Most open source development happens on GitHub. You'd think non-average developers would have noticed their code is now hosted by Microsoft, not the FSF. But perhaps not. The AI end game is likely some kind of post-Cambrian, post-capitalist soup of evolving distributed compute. But at the moment there's no conceivable way for local and/or distributed systems to have better performance and more intelligence. Local computing has latency, bandwidth, and speed/memory limits, and general distributed computing isn't even a thing. | |
| ▲ | idiotsecant 3 days ago | parent | prev [-] | | I can't imagine a universe where a small mind with limited computing resources has an advantage against a datacenter mind, no matter the architecture. | | |
| ▲ | bee_rider 3 days ago | parent | next [-] | | The small mind could have an advantage if it is closer or more trustworthy to users. It only has to be good enough to do what we want. In the extreme, maybe inference becomes cheap enough that we ask “why do I have to wake up the laptop’s antenna?” | | |
| ▲ | galaxyLogic 2 days ago | parent [-] | | I would like to have a personal AI agent which basically has a copy of my knowledge, a reflection of me, so it could help me multiply my mind. |
| |
| ▲ | heavyset_go 3 days ago | parent | prev | next [-] | | I don't want to send sensitive information to a data center, I don't want it to leave my machine/network/what have you. Local models can help in that department. You could say the same about all self-hosted software, teams with billions of dollars to produce and host SaaS will always have an advantage over smaller, local operations. | |
| ▲ | pksebben 2 days ago | parent | prev | next [-] | | The advantage it might have won't be in the form of "more power", it would be in the form of "not burdened by sponsored content / training or censorship of any kind, and focused on the use-cases most relevant to the individual end user." We're already very, very close to "smart enough for most stuff". We just need that to also be "tuned for our specific wants and needs". | |
| ▲ | hakfoo 2 days ago | parent | prev | next [-] | | Abundant resources could enable bad designs. I could in particular see a lot of commercial drive for huge models that can solve a bazillion different use cases, but aren't efficient for any of them. There might also be local/global bias strategies. A tiny local model trained on your specific code/document base may be better aligned to match your specific needs than a galaxy-scale model. If it only knows about one "User" class, the one in your codebase, it might be less prone to borrowing irrelevant ideas from fifty other systems. | |
| ▲ | gizajob 3 days ago | parent | prev | next [-] | | The only difference is latency. | |
| ▲ | bigfatkitten 3 days ago | parent | prev [-] | | Universes like ours where the datacentre mind is completely untrustworthy. |
|
|
|
| ▲ | btown 3 days ago | parent | prev [-] |
| Even the most popular games (with few exceptions) present as relatively dumb terminals that need constant connectivity to sync every activity to a mainframe - not necessarily because it's an MMO or multiplayer game, but because it's the industry standard way to ensure fairness. And by fairness, of course, I mean the optimization of enforcing "grindiness" as a mechanism to sell lootboxes and premium subscriptions. And AI just further normalizes the need for connectivity; cloud models are likely to improve faster than local models, for both technical and business reasons. They've got the premium-subscriptions model down. I shudder to think what happens when OpenAI begins hiring/subsuming-the-knowledge-of "revenue optimization analysts" from the AAA gaming world as a way to boost revenue. But hey, at least you still need humans, at some level, if your paperclip optimizer is told to find ways to get humans to spend money on "a sense of pride and accomplishment." [0] We do not live in a utopia. [0] https://www.guinnessworldrecords.com/world-records/503152-mo... - https://www.reddit.com/r/StarWarsBattlefront/comments/7cff0b... |
| |
| ▲ | throw23920 2 days ago | parent [-] | | I imagine there are plenty of indie single-player games that work just fine offline. You lose cloud saves and achievements, but everything else still works. |
|