| ▲ | jayd16 3 days ago |
| I don't know, I think you're conflating content streaming with central compute. Also, is percentage of screen time even the relevant metric? We moved TV consumption to the PC; did that take away from PCs? Many apps moved to the web, but that's basically just streamed code run in a local VM. Is that a dumb terminal? It's not exactly independent of local compute... |
|
| ▲ | kamaal 3 days ago | parent | next [-] |
| Nah, your parent comment has a valid point. Nearly the entirety of computer use cases today don't involve running things on a 'personal computer' in any way. In fact, these days everyone more or less agrees that even hosting a spreadsheet on your own computer is a bad idea. Cloud, where everything is backed up, is the way to go. |
| |
| ▲ | jayd16 3 days ago | parent [-] | | But again, that's conflating web-connected, or even web-required, with mainframe compute, and it's just not the same. The PC was never 'no web'. No one actually 'counted every screw in their garage' as the PC killer app. It was always the web. | | |
| ▲ | morsch 2 days ago | parent | next [-] | | One of the actual killer apps was gaming. Which still "happens" mostly on the client, today, even for networked games. | | |
| ▲ | jhanschoo 2 days ago | parent [-] | | Yet the most popular games are online-only and even more have their installation base's copies of the game managed by an online-first DRM. | | |
| ▲ | morsch 2 days ago | parent | next [-] | | That's true, but beside the point: even online-only games, or those gated by online DRM, are not streamed and do not resemble a thin-client architecture. That does exist too, with GeForce Now etc., which is why I said mostly. | |
| ▲ | jayd16 2 days ago | parent | prev [-] | | This is just factually inaccurate. | | |
| ▲ | jhanschoo a day ago | parent [-] | | Please provide a more comprehensive response. I suppose I should be more specific as well. Some of the online only games I am thinking of are CoD, Fortnite, LoL and Minecraft. The online-first DRM I am thinking of is Steam. |
|
|
| |
| ▲ | eru 3 days ago | parent | prev | next [-] | | You know that the personal computer predates the web by quite a few years? | | |
| ▲ | jayd16 2 days ago | parent | next [-] | | Sure, I was too hyperbolic. I simply meant that connecting to the web didn't make it not a PC. The web drove adoption far more than personal computation did; it was the main use case for most folks. | |
| ▲ | rambambram 2 days ago | parent | prev [-] | | This. Although brief, there was at least a couple of years of using PCs without an internet connection. It's unthinkable now. And even back then, the period was over in the blink of an eye. | | |
| ▲ | eru 2 days ago | parent [-] | | That was a pretty long blink? The personal computer arguably begins with VisiCalc in 1979. > Through the 1970s, personal computers had proven popular with electronics enthusiasts and hobbyists, however it was unclear why the general public might want to own one. This perception changed in 1979 with the release of VisiCalc from VisiCorp (originally Personal Software), which was the first spreadsheet application. https://en.wikipedia.org/wiki/History_of_personal_computers#... Mainstream use of the web really took off in the second half of the 1990s. Arbitrarily, let's say with the release of Windows 95. That's a good decade and a half you'd be blinking for. |
|
| |
| ▲ | kamaal 3 days ago | parent | prev | next [-] | | In time, the mainframes of this age will make a comeback. The whole idea that you can connect lots of cheap, low-capacity boxes and drive down compute costs is already going away. In time people will go back to thinking of compute as a variable of the time taken to finish processing. That's the paradigm in the cloud-compute world: you are billed for the TIME you use the box. Eventually people will just want to use something bigger that gets things done faster, so they don't have to rent it for long. | | |
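The billed-for-time trade-off described above can be sketched as a back-of-envelope calculation. All prices and speedups here are hypothetical, made-up numbers for illustration only:

```python
# Sketch of the "billed for time" trade-off: under time-based billing,
# a pricier but faster box can end up cheaper overall.
# All rates and durations below are hypothetical.

def job_cost(hourly_rate: float, hours: float) -> float:
    """Cloud billing model: cost = hourly rate * time the box is rented."""
    return hourly_rate * hours

# A small box: cheap per hour, but the job runs a long time.
small = job_cost(hourly_rate=0.50, hours=40)  # 20.0

# A big box: 8x the hourly price, but finishes 10x faster.
big = job_cost(hourly_rate=4.00, hours=4)     # 16.0

print(small, big)  # the bigger, faster box is cheaper overall here
```

Whether the bigger box wins depends entirely on how well the workload scales with the larger machine; the point is only that time, not box count, is the billed quantity.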
| ▲ | galaxyLogic 2 days ago | parent [-] | | It's also interesting that computing capacity is no longer discussed in instructions per second, but in gigawatts. |
| |
| ▲ | bandrami 2 days ago | parent | prev [-] | | Umm... I had a PC a decade before the web was invented, and I didn't even use the web for like another 5 years after it went public ("it's an interesting bit of tech but it will obviously never replace gopher...") The killer apps in the 80s were spreadsheets and desktop publishing. |
|
|
|
| ▲ | eru 3 days ago | parent | prev [-] |
| > I don't know, I think you're conflating content streaming with central compute. Would you classify eg gmail as 'content streaming'? |
| |
| ▲ | mikepurvis 3 days ago | parent | next [-] | | But gmail is also a relatively complicated app, much of which runs locally on the client device. | | |
| ▲ | MobiusHorizons 2 days ago | parent | next [-] | | It is true that browsers do much more computation than "dumb" terminals, but there are still non-trivial parallels. Terminals do contain a processor and memory, in order to handle settings menus, process keyboard input, and convert incoming sequences into a character array that is then displayed on the screen. A terminal is mostly useless without something attached to the other side, but not _completely_ useless: you can browse the menus, enable local echo, and use the device as something like a scratchpad. I once drew up a schematic as ASCII art this way. The contents are ephemeral, though; you have to take a photo of the screen or something in order to retain the data. Web browsers aren't quite that useless with no internet connection, and some sites do offer offline capabilities (for example Gmail). But even then, the vast majority of offline experiences exist to tide the user over until the network can be re-established, rather than offering something truly useful to do locally. Probably the only mainstream counter-examples are games. | |
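The terminal's core job described above, turning an incoming character stream into a character array for display, can be sketched as a toy model. Real terminals also parse full ANSI escape sequences, handle scrolling, settings, and so on; this sketch handles only carriage return and line feed:

```python
# Toy model of a terminal's core job: consume an incoming character
# stream and maintain a 2D character array (the screen).
# Only CR and LF are handled; real terminals parse far more.

class ToyTerminal:
    def __init__(self, cols: int = 8, rows: int = 3):
        self.cols, self.rows = cols, rows
        self.screen = [[" "] * cols for _ in range(rows)]
        self.row = self.col = 0  # cursor position

    def feed(self, text: str) -> None:
        for ch in text:
            if ch == "\r":      # carriage return: cursor to column 0
                self.col = 0
            elif ch == "\n":    # line feed: cursor down one row
                self.row = min(self.row + 1, self.rows - 1)
            elif self.col < self.cols:
                self.screen[self.row][self.col] = ch
                self.col += 1

    def render(self) -> str:
        return "\n".join("".join(r) for r in self.screen)

term = ToyTerminal()
term.feed("hi\r\nthere")
print(term.render())
```

With local echo enabled, keyboard input is fed straight into a buffer like this, which is exactly the "scratchpad with nothing attached" behavior described above.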
| ▲ | WalterSear 2 days ago | parent | prev [-] | | It's still a SAAS, with components that couldn't be replicated client-side, such as AI. | | |
| ▲ | galaxyLogic 2 days ago | parent | next [-] | | Right. But does it matter whether computation happens on the client or the server? Probably on both, in the end. But yes, I am looking forward to having my own LLM on my PC which only I have access to. | |
| ▲ | fragmede 2 days ago | parent | prev [-] | | Google's own Gemma models are runnable locally on a Pixel 9 Max, so some level of AI is replicable client-side. As far as Gmail running locally goes, it wouldn't be impossible for Gmail to be locally hosted and hit a local cache which syncs with a server only periodically over IMAP/JMAP/whatever, if Google actually wanted to do it. | | |
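The "local cache that syncs periodically" idea can be sketched as below. The fetch function is a stub standing in for a real IMAP/JMAP fetch, and none of this reflects how Gmail actually works; it only illustrates the offline-first pattern being proposed:

```python
# Sketch of an offline-first mail client: reads are answered from a
# local cache, and a periodic sync pulls new messages from the server.
# fetch_since() is a stand-in for a real IMAP/JMAP fetch.

from typing import Callable

class LocalMailCache:
    def __init__(self, fetch_since: Callable[[int], list]):
        self.fetch_since = fetch_since  # server fetch, e.g. over IMAP
        self.messages: list = []        # locally stored copies
        self.last_uid = 0               # high-water mark for syncing

    def sync(self) -> int:
        """Pull messages newer than the high-water mark; return count."""
        new = self.fetch_since(self.last_uid)
        self.messages.extend(new)
        if new:
            self.last_uid = max(m["uid"] for m in new)
        return len(new)

    def inbox(self) -> list:
        """Reads come from the local cache, so they work offline."""
        return [m["subject"] for m in self.messages]

# Stub server with two messages; a real client would talk IMAP here.
server = [{"uid": 1, "subject": "hello"}, {"uid": 2, "subject": "world"}]
cache = LocalMailCache(lambda uid: [m for m in server if m["uid"] > uid])
cache.sync()
print(cache.inbox())  # ['hello', 'world']
```

A second `sync()` with no new mail on the server adds nothing, which is the cheap steady-state that makes periodic syncing viable.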
| ▲ | eru a day ago | parent [-] | | Yes, but seems like a lot of hassle for not much gain (for Google). | | |
| ▲ | fragmede a day ago | parent [-] | | The gain, as far as local AI goes for Google, is that at Google's scale the CPU/GPU time to run even a small model like Gemma adds up across Gmail's millions of users. If clients have the hardware for it (which Pixel 9s do), Gmail's servers aren't burning CPU/GPU time on it. As far as how Gmail's existing offline mode works, I don't know. |
|
|
|
| |
| ▲ | jayd16 2 days ago | parent | prev [-] | | Well, app code is streamed and content is streamed, but the app code runs locally and content is pulled periodically. The mail server is the mail server even for Outlook, and Outlook gives you a way to look through email offline. The Gmail apps, and even Gmail in Chrome, have an offline mode that lets you look through email. It's not easy to call it fully offline, nor a dumb terminal. | |
| ▲ | eru 2 days ago | parent [-] | | Oh, GMail is definitely a cloud offering, even if it has some offline functionality. I was just probing the 'content _streaming_' term. As you demonstrate, you'd have to squint really hard to describe GMail as content streaming. 'Offline' vs 'content streaming' is a false dichotomy; there are many more types of products and services. (Which reminds me a bit of crypto folks calling all software that's not in crypto "web2", as if working on stodgy backends in a bank or making Nintendo Switch games has anything to do with the web at all.) | |
| ▲ | jayd16 a day ago | parent [-] | | Ok sure but there's plenty of simple video streaming in total screen time, which was the context I was replying to. I never claimed it was a dichotomy, simply a large part of screen time that clearly skews the analysis. | | |
| ▲ | eru a day ago | parent [-] | | Yes, it's a large part of screen time at the moment. |
|
|
|
|