| ▲ | vee-kay 21 hours ago |
| For the last 2 years, I've noticed a worrying trend: typical budget PCs (especially laptops) are being sold at higher prices with less RAM (just 8GB) and lower-end CPUs (and no dedicated GPUs). The industry mandate should have become 16GB RAM for PCs and 8GB for mobiles years ago, but instead it is as if the computing/IT industry is regressing. New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6, UFS 2.2). Meanwhile, features that were being offered in budget phones, e.g., wireless charging, NFC and UFS 3.1, have silently moved to the premium mobile segment. Meanwhile the OSes and software are becoming more and more complex, bloated, unstable (bugs) and insecure (security loopholes ready for exploits). It is as if the industry has decided to focus on AI and nothing else. And this will be a huge setback for humanity, especially for students and the scientific communities. |
|
| ▲ | koito17 14 hours ago | parent | next [-] |
| This is what I find a bit alarming, too. My M3 Max MacBook Pro takes 2 full seconds to boot Slack, a program that used to literally be an IRC client. Many people still believe client-side compute is cheap and worrying about it is premature optimization. Of course, some performance-focused software (e.g. Zed) does start near-instantly on my MacBook, and it makes other software feel sluggish in comparison. But this is the exception, not the rule. Even as specs regress, I don't think most people in software will care about performance. In my experience, product managers never act on the occasional "[X part of an app] feels clunky" feedback from clients. I don't expect that to change in the near future. |
| |
| ▲ | Workaccount2 8 hours ago | parent [-] | | Software has an unfortunate attribute (compared to hardware) where it's largely bound by what users will tolerate, as opposed to what is practically possible. Imagine Ford, upon the invention of push-button climate controls, just layered those buttons on top of the legacy sliders, using arms and actuators so pressing "Heat Up" moved an actuating arm that moved the underlying legacy "Heat" slider up. Then when touch screens came about, they just put a tablet over those buttons (which are already over the sliders), so selecting "Heat Up" fired a solenoid that pressed the "Heat Up" button that moved the arm to slide the "Heat Up" slider. Ford, or anyone else doing hardware, would never implement this or its analog, for a long, obvious list of reasons. But in software? That's just Thursday. Hence software has seemed stuck in time for 30 years while processing speed has improved 10,000x. No need to redesign the whole system, just type out a few lines of "actuating arm" code.
|
|
| ▲ | SXX 18 hours ago | parent | prev | next [-] |
| No dedicated GPU is certainly unrelated to whatever has been happening for the last two years. It's just that in the last 5 years integrated GPUs have become good enough even for mid-tier gaming, let alone running a browser and hardware acceleration in a few work apps. And even before that, the majority of dedicated GPUs in relatively cheap laptops were garbage, barely better than the integrated one. Manufacturers mostly put them in for the marketing value of having, e.g., an Nvidia dGPU. |
| |
| ▲ | amiga-workbench 18 hours ago | parent | next [-] | | A dedicated GPU is a red flag for me in a laptop. I do not want the extra power draw or the hybrid graphics silliness. The Radeon Vega in my ThinkPad is surprisingly capable. | | |
| ▲ | vee-kay 10 hours ago | parent | next [-] | | Dedicated GPUs in gaming laptops are a necessity for the IT industry, as they force manufacturers, assemblers and software makers to be more creative and ambitious about power draw, graphics software, and better use of the available hardware (e.g., bigger batteries and different performance modes to compensate for the GPU's higher power consumption: a low-power mode enabled by a casual user disables the dedicated GPU and makes the OS and apps rely on the integrated GPU, while the same user on the same PC can switch back to the dedicated GPU when playing a game or doing VFX or modeling). Without dedicated GPUs, we consumers will get only weaker hardware, slower software, and the slow death of the graphics software market. See the fate of the Chromebook market segment - it is almost dead, and ChromeOS itself got abandoned. Meanwhile, the same Google that made ChromeOS as a fresh alternative to Windows, Mac and Linux is trying to gobble up the AI market. And the AI race is on. The result of all this AI focus and veering away from dedicated GPUs (even by market leader Nvidia, for which GPUs are no longer a priority) is not only skyrocketing price hikes in hardware components, but other side effects too: new laptops are being launched with NPUs, which are good for AI but bad for gaming and VFX/CAD-CAM work, yet they cost a bomb. The budget laptop segment has suffered as a result - new budget laptops ship with just 8GB RAM, a 250GB/500GB SSD and a poor CPU, hardware so weak that even basic software (MS Office) struggles, and yet even such poor laptops cost more these days. This kind of deliberate market crippling affects hundreds of millions of students and middle-class customers who need affordable yet decent-performance PCs. | |
| ▲ | iancmceachern 15 hours ago | parent | prev | next [-] | | For me it's a necessity to run the software I need to do my work (CAD design) | |
| ▲ | silon42 13 hours ago | parent | prev | next [-] | | Same here... I do not wish for a laptop with >65W USB-C power requirements. | |
| ▲ | trinsic2 16 hours ago | parent | prev [-] | | Yeah, I agree it's not worth it to have both an iGPU and a dedicated GPU, if I'm understanding what you are talking about. There are always issues with that setup in laptops. But I'd stay away from all laptops at this point until we get an Administration that enforces antitrust. All manufacturers have been cutting so many corners that you're likely to have hardware problems within a year unless it's a MacBook or a business-class laptop. |
| |
| ▲ | trinsic2 16 hours ago | parent | prev | next [-] | | Yea mid-tier is a stretch. Maybe low-end gaming | | |
| ▲ | SXX 14 hours ago | parent [-] | | Low-end gaming is 2D indie titles, and they now run on toasters. All the popular mass-market games work on an iGPU: Fortnite, Roblox, MMOs, arena shooters, battle royales. A good chunk of cross-platform console titles also work just fine. You can play Cyberpunk or BG3 on a damn Steam Deck. I won't call this low end. The number of games that don't run to some extent without a dGPU is limited to heavy AAA titles and niche PC-only genres. | |
| ▲ | ChoGGi 8 hours ago | parent [-] | | You can play Doom: The Dark Ages on a Steam Deck. Granted, at 30 fps, but it's still Doom with ray tracing. |
|
| |
| ▲ | Fabricio20 16 hours ago | parent | prev [-] | | I'm gonna be honest, that's not my experience at all. I got a laptop with a modern Ryzen 5 CPU four years ago that had an iGPU because "it's good enough for even mid-tier gaming!", and it was so bad that I couldn't play 1440p on YouTube without it skipping frames. Tried Parsec to my desktop PC and it was failing at that as well. I returned it and bought a laptop with an Nvidia dGPU (low end still, I think it was like a 1050-refresh-refresh equivalent) and haven't had any of those problems. That AMD Vega GPU just couldn't do it. | |
| ▲ | copx 13 hours ago | parent | next [-] | | No Ryzen 5 system should have any trouble playing YouTube videos, there must have been something wrong with your system. | |
| ▲ | rabf 9 hours ago | parent | prev | next [-] | | That's a problem with YouTube, not your GPU! |
| ▲ | Iulioh 16 hours ago | parent | prev [-] | | What processor are we talking about? Your experience is extremely weird |
|
|
|
| ▲ | 999900000999 17 hours ago | parent | prev | next [-] |
| Or, we can expect better from software. Maybe someone can fork Firefox and make it run better, hard cap how much a browser window can use. The pattern of lazy, almost nonexistent optimization, combined with blaming consumers for having weak hardware, needs to stop. On my 16GB RAM Lunar Lake budget laptop, CachyOS (Arch) runs so much smoother than Windows. This is very unscientific, but going by htop, while running Chrome/YouTube playing music, 2 browser games, and VS Code with GitHub Copilot reviewing a small project, I was only using 6GBs of ram. For the most part I suspect I could do normal consumer stuff (filing paperwork and watching cat videos) on an 8GB laptop just fine, assuming I'm using Linux. All this Windows 11 bloat makes computers slower than they should be. A part of me hopes this pushes Microsoft to at least create a low ram mode that just runs the OS and display manager, then lets me use my computer as I see fit instead of constantly doing a million other weird things. We don't *need* more ram. We need better software. |
| |
| ▲ | walterbell 17 hours ago | parent | next [-] | | > hopes this pushes Microsoft to at least create a low ram mode Windows OS and Surface (CoPilot AI-optimized) hardware have been combined in the "Windows + Devices" division. > We don't *need* more ram RAM and SSDs both use memory wafers and are equally affected by wafer hoarding, strategic supply reductions and market price manipulation. Nvidia is re-inventing Optane for AI storage with higher IOPS, and paid $20B for Groq LPUs using SRAM for high memory bandwidth. The architectural road ahead has tiers of memory, storage and high-speed networking, which could benefit AI & many other workloads. How will industry use the "peace dividend" of the AI wars? https://www.forbes.com/sites/robtoews/2020/08/30/the-peace-d... The rapid growth of the mobile market in the late 2000s and early 2010s led to a burst of technological progress.. core technologies like GPS, cameras, microprocessors, batteries, sensors and memory became dramatically cheaper, smaller and better-performing.. This wave of innovation has had tremendous second-order impacts on the economy. Over the past decade, these technologies have spilled over from the smartphone market to transform industries from satellites to wearables, from drones to electric vehicles.
| | |
| ▲ | PunchyHamster 15 hours ago | parent [-] | | > RAM and SSDs both use NAND flash and are equally affected by wafer hoarding, strategic supply reductions and market price manipulation. Why on earth do you think RAM uses NAND flash? | |
| |
| ▲ | citrin_ru 8 hours ago | parent | prev | next [-] | | > Maybe someone can fork Firefox and make it run better, hard cap how much a browser window can use
Browsing the web requires more and more RAM each year, but I don't think browsers are the main reason - sites use more and more JS code. With a hard cap, many sites will stop working. Software bloat is a natural tendency, the path of least resistance. Trimming weight requires a significant effort, and in the case of the web, a coordinated effort. I don't believe it could happen unless Google (having a browser with >60% market share) forces it, but Google's own sites are among the worst offenders in terms of hardware requirements. |
| ▲ | tazjin 12 hours ago | parent | prev | next [-] | | > Or, we can expect better from software. Maybe someone can fork Firefox and make it run better, hard cap how much a browser window can use. You can already do this. For example, I use `systemd-run` to run browsers with CPU quotas applied. Firefox gets 400% CPU (i.e. up to 4 cores), and no more. Example command: systemd-run --user --scope -p CPUQuota=400% firefox | | |
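A similar hard memory cap is possible with the same mechanism (a sketch, assuming cgroup v2 with the memory controller delegated to the user slice, which recent systemd-based distros do by default; the 4G figure is just an example):
systemd-run --user --scope -p MemoryMax=4G -p CPUQuota=400% firefox
When the scope hits the cap, reclaim and the OOM killer act inside that scope rather than taking down the whole session.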
| ▲ | vee-kay 10 hours ago | parent [-] | | You can impose CPU restrictions in Windows 10 or 11 too. You can limit overall CPU usage by lowering the "Maximum processor state" in the power options to a percentage such as 80% (note that this throttles the whole system, not a single program). Additionally, you can set a program's CPU affinity in Task Manager; this only affects process scheduling. You can also use a free tool like Process Lasso or BES to limit CPU for a specific Windows application, and free tools like HWiNFO or Sysinternals (ProcMon, Sysmon, ProcDump) to monitor CPU usage, especially to investigate CPU spikes caused by rogue (malware or poorly performing) apps. | |
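For a scriptable version of the Task Manager affinity route, PowerShell exposes the same knobs (a minimal sketch; "notepad" is only a placeholder process name, and you need sufficient rights over the target process):
Get-Process notepad | ForEach-Object { $_.ProcessorAffinity = 0x3; $_.PriorityClass = 'BelowNormal' }
This pins the process to cores 0-1 and lowers its scheduling priority; like affinity in Task Manager, it limits contention rather than enforcing a strict percentage cap.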
| ▲ | cellular 8 hours ago | parent [-] | | CPU affinity?
I haven't been able to change priority in Task Manager since Windows 8, I think. CPU affinity only seems to allow choosing which cores get assigned... not really good management. | |
| ▲ | vee-kay 8 hours ago | parent [-] | | Process Lasso worked for me a few years back when I needed to restrict CPU cores for an old program. |
|
|
| |
| ▲ | viccis 17 hours ago | parent | prev | next [-] | | Yeah I'm sure that will happen, just like prices will go back down when the stupid tariffs are gone. | |
| ▲ | MangoToupe 15 hours ago | parent | prev | next [-] | | > I was only using 6GBs of ram. Insane that this is seen as "better software". I could do basically the same functionality in 2000 with 512mb. I assume this is because everything runs through chrome with dozens more layers of abstraction but | | |
| ▲ | kasabali 14 hours ago | parent | next [-] | | More like 128MB. 512MB in 2000 was like HEDT level (though I'm not sure that acronym existed back then) | | |
| ▲ | anthk 13 hours ago | parent | next [-] | | 512MB wasn't that odd for a multimedia PC from 2002, barely a few years later. By 2002, 256MB of RAM was the standard, almost the new low end.
64MB = Win98SE OK; XP will swap a lot under high load; nixlikes really fast with fvwm/wmaker and the like. KDE3 needs 128MB to run well, so knock that down a bit. No issues with old XFCE releases. Mozilla will crawl, other browsers will run fine.
128MB = Win98SE really well; XP will run fine, SP2-3 will lag. Nixlikes will fly with wmaker/icewm/fvwm/blackbox and the like. Good enough for Mozilla.
192MB = Really decent for a full KDE3 desktop or for Windows XP at real-life speeds.
256MB = Like having 8GB today for Windows 10, GNOME 3 or Plasma 6. Yes, you can run them with 2GB and zram, but realistically, with the modern bloated tools, 8GB for a 1080p desktop is mandatory, even with uBlock Origin in the browser. Ditto back in the day: with 256MB, XP and KDE3 flew, and they ran much faster than even Win98 with 192MB of RAM. | |
| ▲ | vee-kay 6 hours ago | parent [-] | | Win10 can work with 8GB of DDR4 RAM. Win11, on the other hand, meh.. Win10 will stop getting updates, but M$ is mistaken if it thinks it can force customers to switch to the more expensive, buggy, poorly performing Win11. That's why I switched to Linux on my old PC (a cute little Sony Vaio), even though it worked well with Win10. Especially after I upgraded it with a 1TB SATA SSD (even an old SATA 1.0 socket works with newer SATA SSDs, since the SATA interface is backward compatible; it felt awesome to see a new SSD work perfectly in a 15-year-old laptop), some additional RAM (24GB (8+16), with the 16GB repurposed from another PC), and a new battery (from Amazon - simply eject the old battery from its slot and plug in the new one). I find it refreshing to see how easy it is to upgrade old PCs. I think manufacturers are deliberately making devices harder to repair, especially mobile phones. That's why the EU and India were forced to mandate the Right to Repair. |
| |
| ▲ | MangoToupe 9 hours ago | parent | prev [-] | | You're right. I thought I was misremembering.... |
| |
| ▲ | 999900000999 3 hours ago | parent | prev [-] | | Let's not let perfect be the enemy of good. I was being lazy, but with some optimization I guess I could get down to 4GB of RAM. |
| |
| ▲ | dottjt 17 hours ago | parent | prev [-] | | That's not happening though, hence why we need more ram. | | |
| ▲ | ssl-3 16 hours ago | parent [-] | | Eh? As I see it, we've got options. Option A: We do a better job at optimizing software so that good performance requires less RAM than might otherwise be required Option B: We wish that things were different, such that additional RAM were a viable option like it has been at many times in the past. Option C: We use our time-benders to hop to a different timeline where this is all sorted more favorably (hopefully one where the Ballchinians are friendly) --- To evaluate these in no particular order: Option B doesn't sound very fruitful. I mean: It can be fun to wish, but magical thinking doesn't usually get very far. Option C sounds fun, but my time-bender got roached after the last jump and the version of Costco we have here doesn't sell them. (Maybe someone else has a working one, but they seem to be pretty rare here.) That leaves option A: Optimize the software once, and duplicate that optimized software to whomever it is useful using that "Internet" thing that the cool kids were talking about back in the 1980s. | | |
| ▲ | rabf 8 hours ago | parent [-] | | There is plenty of well-optimised software out there already; hopefully a RAM shortage can encourage people to seek it out. It would be nice if there were some well-curated lists of such apps, sort of like suckless but perhaps a little less extreme. A long-standing problem in the software industry is developers having insanely overspecced machines and fat internet pipes, leading to performance issues going unnoticed by the people who should be fixing them. The claim that they need that power to run their code editor and compiler is really only a need for code editors and compilers that suck less. I've always run a 10-year-old machine (I'm cheap) and had the expectation that my debug builds run acceptably fast! |
|
|
|
|
| ▲ | zahlman 20 hours ago | parent | prev | next [-] |
| I'm reading this thread on an 11-year-old desktop with 8GB of RAM and not feeling any particular reason to upgrade, although I've priced it out a few times just to see. Mint 22.x doesn't appear to be demanding any more of my machine than Mint 20.x. Neither is Firefox or most websites, although YouTube chat still leaks memory horrendously. (Of course, download sizes have increased.) |
| |
| ▲ | Groxx 18 hours ago | parent | next [-] | | I've been enjoying running Mint on my terrible spec chromebook - it only has 3GB of RAM, but it rarely exceeds 2GB used with random additions and heavy firefox use. The battery life is obscenely good too, I easily break 20 hours on it as long as I'm not doing something obviously taxing. Modern software is fine for the most part. People look at browsers using tens of gigabytes on systems with 32GB+ and complain about waste rather than being thrilled that it's doing a fantastic job caching stuff to run quickly. | |
| ▲ | callc 19 hours ago | parent | prev | next [-] | | Mint is probably around 0.05% of desktop/laptop users. I think the root comment is looking at the overall picture of what all customers can get for their money, and sees it getting worse. This wasn't mentioned, but it's a new thing for everyone to experience, since the general trend is for computer hardware to get cheaper and more powerful over time. Maybe not exponentially any more, but at least linearly cheaper and more powerful. | |
| ▲ | OGEnthusiast 17 hours ago | parent [-] | | > I think root comment is looking at the overall picture of what all customers can get for their money, and see it getting worse. A $999 MacBook Air today is vastly better than the same $999 MacBook Air 5 years ago (and even more so once you count inflation). | | |
| ▲ | chii 16 hours ago | parent [-] | | The OP is not looking at the static point (the price of that item), but at the trend - i.e., the derivative of price vs quality. It was on a steep upward incline, and now it's flattening. |
|
| |
| ▲ | cons0le 8 hours ago | parent | prev | next [-] | | This is a text based website though. It should be fast on everything. Most websites are a bloated mess, and a lot slower | |
| ▲ | vee-kay 6 hours ago | parent | prev [-] | | Try Win11 on that old PC, and you'll really feel the need for more RAM and a better CPU. I sometimes feel M$ is deliberately making its Windows OS clunkier, so it can turn into a SaaS offering with a pricey subscription, like it has already successfully done with its MS-Office suite (Office 365 is the norm in corporates these days, though individuals have to shell out $100 per year for MS Office 365 Personal edition). We can still buy MS Office 2024 as standalone editions, but they are not cheap, because Micro$oft knows the alternatives on the market aren't good enough to be a serious threat. |
|
|
| ▲ | chii 17 hours ago | parent | prev | next [-] |
| > Industry mandate should have become 16GB RAM for PCs
It was only less than 10 years ago that a high-end PC would have this much RAM. I think the last decade of cheap RAM and increasing core counts (and clock speeds) has spoiled a lot of people. We are just returning to trend. Maybe software will be written better now that you cannot expect the average low-budget PC to have 32GB of RAM and 8 cores. |
|
| ▲ | GuB-42 14 hours ago | parent | prev | next [-] |
| I think it is the result of specs like RAM and CPU no longer being the selling points they once were, except for gaming PCs. Instead people want thin laptops, good battery life, nice screens, premium materials, etc... We have got to the point where RAM and CPU are no longer a limiting factor for most tasks, or at least until software becomes bloated enough to matter again. If you want a powerful laptop for cheap, get a gaming laptop. The build quality and battery life probably won't be great, but you can't be cheap without making compromises. Same idea for budget mobiles. A Snapdragon Gen 6 (or something by MediaTek) with UFS 2.2 is more than what most people need. |
|
| ▲ | linguae 21 hours ago | parent | prev | next [-] |
| I wonder what we can do to preserve personal computing, where users, not vendors, control their computers? I’m tired of the control Microsoft, Apple, Google, OpenAI, and some other big players have over the entire industry. The software has increasingly become enshittified, and now we’re about to be priced out of hardware upgrades. The problem is coming up with a viable business model for providing hardware and software that respect users’ ability to shape their environments as they choose. I love free, open-source software, but how do developers make a living, especially if they don’t want to be funded by Big Tech? |
| |
| ▲ | Saris 21 hours ago | parent [-] | | Run a lightweight Linux distro on older hardware maybe? | | |
| ▲ | ta9000 20 hours ago | parent | next [-] | | This is it. Buy used Dell and HP hardware with 32GB of RAM and swap the PCIe SSD for a 4TB one. | |
| ▲ | dartharva 12 hours ago | parent [-] | | No, this is not it. It only worked when there were a small number of buyers for used hardware, who were largely enthusiasts. The moment it becomes mainstream you're going to face the same scarcity in the used/refurbished market as well. | | |
| ▲ | intrasight 10 hours ago | parent [-] | | There are lots of such computers to be repurposed. It'll relieve price pressure and avoid e-waste. |
|
| |
| ▲ | thatguy0900 18 hours ago | parent | prev [-] | | Exclusively using a ever dwindling stock of old hardware is not really a practical solution to preserving hardware rights in the long term | | |
| ▲ | jjgreen 11 hours ago | parent [-] | | The future is ragged shoeless and grimy humans fighting over the last few 1990's pocket calculators. |
|
|
|
|
| ▲ | mmsimanga 11 hours ago | parent | prev | next [-] |
| It is even worse for those of us in Africa. The equivalent phone I can buy for USD $150-250 back home is absolutely shocking in terms of how little RAM and disk space it has, and it often ships with an outdated version of Android. I buy my phones on Amazon US, which delivers here, and I get a much better phone for the same price range. |
| |
| ▲ | lysace 11 hours ago | parent [-] | | In that price range, paid bundling of pre-installed apps becomes a major economic driver. Those installs are worth a lot more in the US. | |
|
|
| ▲ | HexPhantom 12 hours ago | parent | prev | next [-] |
| I think a lot of this rings true, but what makes it especially frustrating is that none of it feels technically necessary |
|
| ▲ | cons0le 8 hours ago | parent | prev | next [-] |
| > It is as if the industry has decided to focus on AI and nothing else. I mean, isn't this exactly what happened? I could be wrong about this, but didn't Y Combinator themselves say they weren't accepting any ideas that didn't include AI? 80 percent of the jobs people reach out to me for are shady AI jobs that I have no interest in. Hiring for other, non-AI jobs seems to have slowed. When I talk to "computer" people, they only want to talk about AI. I wish I could quit computers and never look at a screen again. |
|
| ▲ | deadbabe 18 hours ago | parent | prev [-] |
| I think it will be a good thing actually. Engineers, no longer having the luxury of assuming that users have high end system specs, will be forced to actually write fast and efficient software. No more bloated programs eating up RAM for no reason. |
| |
| ▲ | rurp 18 hours ago | parent | next [-] | | The problem is that higher performing devices will still exist. Those engineers will probably keep using performant devices and their managers will certainly keep buying them. We'll probably end up in an even more bifurcated world where the well off have access to lot of great products and services that most of humanity is increasingly unable to access. | | |
| ▲ | anigbrowl 17 hours ago | parent | next [-] | | Have the laws of supply and demand been suspended? Capital is gonna pour into memory fabrication over the next year or two, and there will probably be a glut 2-3 years from now, followed by retrenchment and wails that innovation has come to a halt because demand has stalled. | | |
| ▲ | kasabali 14 hours ago | parent | next [-] | | We're not talking about growing tomatoes in your backyard. | |
| ▲ | re-thc 16 hours ago | parent | prev [-] | | > Have the laws of supply and demand been suspended? There is the law of uncertainty overriding them, e.g. trade wars, tariffs, etc. No one is going all in on new capacity. |
| |
| ▲ | charcircuit 17 hours ago | parent | prev | next [-] | | If "performant" devices are not widespread, then telemetry will reveal that the app is performing poorly for most users. If a new feature uses more memory and significantly increases the crash rate, it will be disabled. Apps are optimized for the install base, not for the engineer's own hardware. | |
| ▲ | DeepSeaTortoise 16 hours ago | parent [-] | | What is the point of telemetry if your IDE launching in under 10s is considered the pinnacle of optimization? That's like 100B+ instructions on a single core of your average superscalar CPU. I can't wait for map loading times being measured as a percentage of trip time. | |
| ▲ | charcircuit 14 hours ago | parent | next [-] | | Because you don't want any of the substeps of such a loading process to regress and turn it back into 10+ seconds of loading. |
| ▲ | deadbabe 11 hours ago | parent | prev [-] | | If your IDE isn’t launching instantly you have a bad IDE. | | |
|
| |
| ▲ | hansvm 18 hours ago | parent | prev | next [-] | | Can confirm, I'm currently requesting as much RAM as can fit in the chassis and permission to install an OS not too divorced from what we run in prod. On the bright side, I'm not responsible for the UI abominations people seem to complain about WRT laptop specs. | |
| ▲ | incompatible 18 hours ago | parent | prev [-] | | How many "great products and services" even need a lot of RAM, assuming that we can live without graphics-intensive games? | | |
| ▲ | TheDong 17 hours ago | parent | next [-] | | Some open source projects use Slack to communicate, which is a real ram hog. Github, especially for viewing large PR discussions, takes a huge amount of memory. If someone with a low-memory laptop wants to get into coding, modern software-development-related services are incredible memory hogs. | | |
| ▲ | deadbabe 11 hours ago | parent [-] | | IRC is far superior to Slack when it comes to RAM usage. Projects should just switch to that. | |
| ▲ | anthk 23 minutes ago | parent [-] | | Or even Jabber/XMPP, which has video calls and inline image support, and would run on machines an order of magnitude slower. |
|
| |
| ▲ | rhdunn 15 hours ago | parent | prev [-] | | Image, video, and music editing. Developing, running, and debugging large applications. | | |
| ▲ | Ekaros 13 hours ago | parent [-] | | The last three sound to me like self-inflicted issues. If applications weren't so large, wouldn't fewer resources be needed? |
|
|
| |
| ▲ | Insanity 18 hours ago | parent | prev | next [-] | | I'm not optimistic that this would be the outcome. You likely will just have poor running software instead. After all, a significant part of the world is already running lower powered devices on terrible connection speeds (such as many parts of Africa). | | |
| ▲ | chii 16 hours ago | parent [-] | | > a significant part of the world is already running lower powered devices
But you cannot consider this in isolation. The developed markets have vastly higher-spending consumers, which means companies cater to those higher-spending customers proportionately more (as profits demand it). Therefore, the implication is that lower-spending markets get less investment and are catered to less; after all, R&D spending is still a limiting factor. If the entirety of the market were running on lower-powered devices, then it would get catered for - because there'd be no (or not enough) customers with high-powered devices to profit from. | |
| ▲ | Insanity 8 hours ago | parent [-] | | In general what you are saying makes sense. But there are specific counterexamples, such as Crysis in 2007 or Cyberpunk 2077 some years ago. Neither ran great on the "average consumer hardware" of the time. But I'll admit this is cherry-picking on my side :) |
|
| |
| ▲ | trinsic2 16 hours ago | parent | prev | next [-] | | I don't think it's going to happen in this day and age. Some smart people will, but most barely know how to write their own code, let alone write efficient code. |
| ▲ | TheDong 18 hours ago | parent | prev | next [-] | | At the same time, AI has made it easier than ever to produce inefficient code, so I expect to rather see an explosion of less efficient software. | | |
| ▲ | rabf 7 hours ago | parent [-] | | It has also made it easier than ever to build native applications that use the os provided toolkits and avoid adding a complete web-tech stack to everything. |
| |
| ▲ | laterium 17 hours ago | parent | prev | next [-] | | Why are you celebrating compute becoming more expensive? Do you actually think it will be good? | |
| ▲ | thatguy0900 18 hours ago | parent | prev | next [-] | | I think the actual outcome is they will expect you to rent servers to conduct all your computing on and your phone and pc will be a dumb terminal. | |
| ▲ | jghn 17 hours ago | parent | prev [-] | | Good luck with that. |
|