nickjj 7 hours ago

I'm running a box I put together in 2014 with an i5-4460 (3.2ghz), 16 GB of RAM, GeForce 750ti, first gen SSD, ASRock H97M Pro4 motherboard with a reasonable PSU, case and a number of fans. All of that parted out at the time was $700.

I've never been more fearful of components breaking than current day. With GPU and now memory prices being crazy, I hope I never have to upgrade.

I don't know how, but the box is still great for everyday web development with heavy Docker usage, and for video recording/editing with a 4K monitor and a second 1440p monitor hooked up. Minor gaming is OK too; for example, I picked up Silksong last week and it runs very well at 2560x1440.

For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.

Cerium 6 hours ago | parent | next [-]

Don't worry, if you are happy with those specs you can get corporate ewaste Dell towers on eBay for low prices. Search "Dell Precision tower"; I just saw a listing with 32GB RAM and a 3.6GHz Xeon for about 300 USD.

Personally, at work I use the latest hardware; at home I use ewaste.

silverquiet 5 hours ago | parent | next [-]

I got a junk Precision workstation last year as a "polite" home server (it's quiet and doesn't look like industrial equipment, but still has some server-like qualities, particularly the use of ECC RAM). I liked it so much that it ended up becoming my main desktop.

trollbridge 2 hours ago | parent | prev | next [-]

I have some Dell server with dual Xeons and 192GB RAM. It is NUMA but that’s fine for Docker workloads where you can just associate them with a CPU.

The RAM for that is basically ewaste at this point, yet it runs the workloads it needs to do just fine.
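The CPU association mentioned above is just a pair of cpuset flags in Docker. A minimal sketch, assuming NUMA node 0 owns cores 0-7 (the image name, container name, and ranges are illustrative; check `lscpu` for the real topology):

```shell
# Show which cores and memory ranges belong to each NUMA node
lscpu | grep -i '^NUMA'

# Pin a container to node 0's cores and its local memory,
# so allocations never have to cross the socket interconnect
docker run -d --name pinned-app \
  --cpuset-cpus="0-7" \
  --cpuset-mems="0" \
  nginx:alpine
```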

vondur 5 hours ago | parent | prev | next [-]

Ha, I bought one of those for $500 from eBay. It's a dual Xeon Silver workstation with an Nvidia Quadro P400 8GB, 128GB of RAM and a 256GB SSD. I threw in a 1TB SSD and it's been working pretty well.

Forgeties79 3 hours ago | parent [-]

What are the limitations of machines like these?

snerbles 3 hours ago | parent | next [-]

I too have a crippling dual CPU workstation hoarding habit. Single thread performance is usually worse than enthusiast consumer desktops, and gaming performance will suffer if the game isn't constrained to a single NUMA domain that also happens to have the GPU being used by that game.

On the other hand, seeing >1TiB RAM in htop always makes my day happier.

t0mas88 3 hours ago | parent | prev | next [-]

Power usage is the main limitation of using these as a home server. They have a high idle power use.

trollbridge an hour ago | parent [-]

One of the reasons I use these is that it's cold here half the year, so it's not hard to use them to supplement the heat.

bri3d an hour ago | parent | prev | next [-]

Very bad performance per watt and higher maintenance needs. Bad performance per watt generally means a larger form factor and more noise as well.

arprocter 3 hours ago | parent | prev | next [-]

On Dell you'll probably be stuck with the original mobo, and their SFFs don't take standard PSUs

sevensor an hour ago | parent [-]

In favor of their SFFs, they get retired 10k at a time, so you might as well pick up a second one for spares.

arprocter 26 minutes ago | parent [-]

Not a bad call, although you'll probably need to upgrade the PSU to add a GPU (if you can find one small enough to fit the SFF case)

MrVitaliy 3 hours ago | parent | prev [-]

Just performance when compared to current-generation hardware. Not significantly worse, but things like DDR4 RAM and single-thread performance show the signs of aging. Frankly, for similar $$$ you can get new hardware from Beelink or equivalent.

Forgeties79 2 hours ago | parent [-]

Got it, so basically it's one of those things you do if 1) the project interests you and/or 2) you get one dirt cheap and don't have high expectations for certain tasks.

cestith 5 hours ago | parent | prev | next [-]

At home some of my systems are ewaste from former employers who would just give it to employees rather than paying for disposal. A couple are eBay finds. I do have one highish-end system at a time specifically for games. Some of my systems are my old hardware reassembled after all the parts for gaming have been upgraded over the years.

ge96 4 hours ago | parent | prev | next [-]

Optiplexes used to be my go-to SFF. I had a 1050 Ti in there; nothing crazy, but it worked for basic gaming.

gpderetta 5 hours ago | parent | prev | next [-]

surely these will soon be scavenged for ram? Arbitrage opportunity?

legobmw99 5 hours ago | parent [-]

If they’re DDR4 (or even DDR3), it has no value to e.g. OpenAI so it shouldn’t really matter

noboostforyou 2 hours ago | parent | next [-]

But it's a cascading effect: OpenAI gobbled up so much DDR5 production that consumers are choosing to upgrade their older DDR4 systems instead of paying even more for a new system that uses DDR5. As a result, DDR4 RAM is at a new all-time high - https://pcpartpicker.com/trends/price/memory/

auspiv 4 hours ago | parent | prev | next [-]

DDR4 prices are up 2-6x in the last couple months depending on frequency. High end, high speed modules (e.g. 128GB 3200MHz LRDIMM) are super expensive.

legobmw99 3 hours ago | parent [-]

Isn’t that due to different reasons (like the end of production for older standards)? I recall the same happening shortly after manufacturing for DDR3 ceased, before eventually demand essentially went to 0

jl6 3 hours ago | parent | prev | next [-]

Demand spills over to substitutes.

gpderetta 4 hours ago | parent | prev [-]

The price of DDR4 is also going up!

zzzeek 3 hours ago | parent | prev [-]

I've dealt a bit with ewaste kinds of machines, old Dells and such, and have two still running here. The issue is they use a crapton of power. I had one such ewaste Dell machine that I just had to take to the dump; it was so underperforming while using 3x more power than my other two Dells combined.

snickerbockers 37 minutes ago | parent | prev | next [-]

>For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.

I only ever noticed it on my Windows partition. IIRC on my Linux partition it was hardly noticeable, because Linux is far better at caching disk contents than Windows. Linux can also boot surprisingly fast even on HDDs if you only install the modules you actually need, so the autoconfiguration doesn't waste time probing dozens of modules in search of the best one.

kube-system 6 hours ago | parent | prev | next [-]

> I've never been more fearful of components breaking than current day.

The mid 90s was pretty scary too. Minimum wage was $4.25 and a new Pentium 133 was $935 in bulk.

tossandthrow 6 hours ago | parent | next [-]

If you were on minimum wage in the 90s, your livelihood likely didn't rely on Pentium processors.

Also, it is frightening how close that is to the current-day minimum wage.

kube-system 6 hours ago | parent | next [-]

I was an unemployed student then -- a generous family member gifted me my first Windows PC, and it cost about the same as a used car.

silisili 4 hours ago | parent | prev | next [-]

1990-1997 averaged >4% yearly compounded minimum wage hikes, which is probably about where it should have been. The late 90s to today has been <1.25%.

briffle 5 hours ago | parent | prev | next [-]

Yep, I had a Cyrix processor in mine during that time. Slackware didn't care.

pixl97 4 hours ago | parent [-]

It also worked as a very good space heater.

immibis 3 hours ago | parent | prev | next [-]

If you account for inflation it's actually higher than current minimum wage.
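A rough check of that claim, assuming approximate CPI-U values of about 152 for mid-1995 and about 320 for 2025 (both are assumptions, not figures from this thread; the federal minimum wage is currently $7.25):

```shell
# Inflation-adjust the 1995 federal minimum wage of $4.25
# using approximate CPI-U values (152.4 then, ~320 now)
awk 'BEGIN { printf "$%.2f\n", 4.25 * 320 / 152.4 }'
# prints $8.92, i.e. above the current $7.25 federal minimum
```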

adventured 5 hours ago | parent | prev [-]

Except nobody earns the minimum wage today, it's less than 1/2 of 1% of US labor.

The median full-time wage is now $62,000. You can start at $13 at almost any national retailer, and $15 or above at CVS / Walgreens / Costco. The cashier positions require zero work background, zero skill, zero education. You can make $11-$13 at what are considered bad jobs, like flipping pizzas at Little Caesars.

jfindper 4 hours ago | parent | next [-]

>You can make $11-$13 at what are considered bad jobs, like flipping pizzas at Little Caesars.

Holy moly! 11 whole dollars an hour!?

Okay, so we went from $4.25 to $11.00. That's a 159% change. Awesome!

Now, lets look at... School, perhaps? So I can maybe skill-up out of Little Caesars and start building a slightly more comfortable life.

Median in-state tuition in 1995: $2,681. Median in-state tuition in 2025: $11,610. Wait a second! That's a 333% change. Uh oh.

Should we do the same calculation with housing...? Sure, I love making myself more depressed. 1995: $114,600. 2025: $522,200. 356% change. Fuck.

reissbaker 5 minutes ago | parent | next [-]

This will probably be an unpopular reply, but "real median household income" — aka, inflation-adjusted median income — has steadily risen since the 90s and is currently at an all-time high in the United States. [1] Inflation includes the cost of housing (by measuring the cost of rent).

However, we are living through a housing supply crisis, and while overall cost of living hasn't gone up, housing's share of that has massively multiplied. We would all be living much richer lives if we could bring down the cost of housing — or at least have it flatline, and let inflation take care of the rest.

1: https://fred.stlouisfed.org/series/MEHOINUSA672N

AnthonyMouse 2 hours ago | parent | prev | next [-]

You're identifying the right problem (school and housing costs are completely out of hand) but then resorting to an ineffective solution (minimum wage) when what you actually need is to get those costs back down.

The easy way to realize this is to notice that the median wage has increased by proportionally less than the federal minimum wage has. The people in the middle can't afford school or housing either. And what happens if you increase the minimum wage faster than overall wages? Costs go up even more, and so does unemployment when small businesses who are also paying those high real estate costs now also have to pay a higher minimum wage. You're basically requesting the annihilation of the middle class.

Whereas you make housing cost less and that helps the people at the bottom and the people in the middle.

jfindper 2 hours ago | parent [-]

>resorting to an ineffective solution (minimum wage) when what you actually need is to get those costs back down.

I'm not really resorting to any solution.

My comment is pointing out that when you only do one side of the equation (income) without considering the other side (expenses), it's worthless. Especially when you are trying to make a comparison across years.

How we go about fixing the problem, if we ever do, is another conversation. But my original comment doesn't attempt to suggest any solution, especially not one that "requests the annihilation of the middle class". It's solely to point out that adventured's comment is a bunch of meaningless numbers.

AnthonyMouse 2 hours ago | parent [-]

> It's solely to point out that adventured's comment is a bunch of meaningless numbers.

The point of that comment was to point out that minimum wage is irrelevant because basically nobody makes that anyway; even the entry-level jobs pay more than the federal minimum wage.

In that context, arguing that the higher-than-minimum wages people are actually getting still aren't sufficient implies an argument that the minimum wage should be higher than that. And people could read it that way even if it's not what you intended.

So what I'm pointing out is that that's the wrong solution and doing that rather than addressing the real issue (high costs) is the thing that destroys the middle class.

jfindper 2 hours ago | parent [-]

>implies an argument that the minimum wage should be higher than that.

It can also imply that expenses should come down, you just picked the implication you want to argue against.

AnthonyMouse 2 hours ago | parent [-]

Exactly. When it's ambiguous at best it's important that people not try to follow the bad fork.

genewitch 3 hours ago | parent | prev | next [-]

1980 Mustang vs 2025 Mustang is what I usually use. In the past 12 years my price per kWh for electricity has doubled.

In the mid 90s you could open a CD (certificate of deposit at a bank or credit union) and get 9% or more APY. Savings accounts had ~4% interest.

In the mid 90s a gallon of gasoline in Los Angeles county was $0.899 in the summer and less than that any other time. It's closer to $4.50 now.

mrits 3 hours ago | parent | prev [-]

The BBQ place across the street from me pays $19/hour to be a cashier in Austin. Or the sign says it does anyways

mossTechnician 2 hours ago | parent | next [-]

Does the sign happen to have the words "up to" before the dollar amount?

jfindper 3 hours ago | parent | prev [-]

sweet! according to austintexas.gov, that's only $2.63 below the 2024 living wage. $5.55 below, if you use the MIT numbers for 2025.

As long as you don't run into anything unforeseen like medical expenses, car breakdowns, etc., you can almost afford a bare-bones, mediocre life with no retirement savings.

hylaride 7 minutes ago | parent [-]

I don't disagree that there has been a huge issue with stagnant wages, but not everybody who works minimum wage needs to make a living wage. Some are teenagers, people just looking for part time work, etc. Pushing up minimum wage too high can risk destroying jobs that are uneconomical at that level that could have been better than nothing for many people.

That being said, there's been an enormous push by various business groups to do everything they can to keep wages low.

It's a complicated issue and one can't propose solutions without acknowledging that there's a LOT of nuance...

GoatInGrey 5 hours ago | parent | prev | next [-]

Counterpoint: affording average rent for a 1-bedroom apartment (~$1,675) requires that exact median full-time wage. $15 an hour affords you about $740 for monthly housing expenses. One can suggest getting two roommates for a one-bedroom apartment, but they would be missing the fact that this is very unusual for the last century. It's more in line with housing economics from the early-to-mid 19th century.

mossTechnician 4 hours ago | parent | prev | next [-]

In addition to the other comments, I presume the big box retailers do not hire for full-time positions when they don't have to, and gig economy work is rapidly replacing jobs that used to be minimum wage.

yndoendo 4 hours ago | parent | prev | next [-]

My uncle was running a number of fast food restaurants for a franchise owner making millions. His statement about this topic is simple, "they are not living wage jobs ... go into manufacturing if you want a living wage".

I don't like my uncle at all and find him and people like him to be terrible human beings.

The-Bus 4 hours ago | parent [-]

If a business can't pay a living wage, it's not really a successful business. I, too, could become fabulously wealthy selling shoes if someone just gave me shoes for $1 so I could resell them for $50.

AnthonyMouse 2 hours ago | parent | next [-]

> If a business can't pay a living wage, it's not really a successful business.

Let's consider the implications of this. We take an existing successful business, change absolutely nothing about it, but separately and for unrelated reasons the local population increases and the government prohibits the construction of new housing.

Now real estate is more scarce and the business has to pay higher rent, so they're making even less than before and there is nothing there for them to increase wages with. Meanwhile the wages they were paying before are now "not a living wage" because housing costs went way up.

Is it this business who is morally culpable for this result, or the zoning board?

raw_anon_1111 3 hours ago | parent | prev | next [-]

Can we use the same argument for all of the businesses that are only surviving because of VC money?

I find it rich how many tech people are working for money losing companies, using technology from money losing companies and/or trying to start a money losing company and get funding from a VC.

Every job is not meant to support a single person living on their own raising a family.

dpkirchner 2 hours ago | parent [-]

That's what VC money is for. When it comes to paying below a living wage, we typically expect the government to provide support to make up the difference (so they're not literally homeless). Businesses that rely on government to pay their employees should not exist.

raw_anon_1111 2 hours ago | parent [-]

That’s kind of the point, a mom and pop restaurant or a McDonald’s franchise owner doesn’t have the luxury of burning $10 for every $1 in revenue for years and being backed by VC funding.

Oh and the average franchise owner is not getting rich. They are making $100K a year to $150K a year depending on how many franchises they own.

Also tech companies can afford to pay a tech worker more money because you don’t have to increase the number of workers when you get more customers.

YC is not going to give the aspiring fast food owner $250K to start their business like they are going to give “pets.ai - AI for dog walkers”

dpkirchner a few seconds ago | parent [-]

In that case they probably shouldn't be running a McDonald's. They aren't owed that and they shouldn't depend on their workers getting government support just so the owners can "earn" their own living wage.

CamperBob2 3 hours ago | parent | prev [-]

Classically, not all jobs are considered "living wage" jobs. That whole notion is something some people made up very recently.

A teenager in his/her first job at McDonald's doesn't need a "living wage." As a result of forcing the issue, now the job doesn't exist at all in many instances... and if it does, the owner has a strong incentive to automate it away.

autoexec 2 hours ago | parent | next [-]

> A teenager in his/her first job at McDonald's doesn't need a "living wage." As a result of forcing the issue, now the job doesn't exist at all in many instances

The majority of minimum wage workers are adults, not teenagers. This is also true for McDonald's employees. The idea that these jobs are staffed by children working summer jobs is simply not reality.

Anyone working for someone else, doing literally anything for 40 hours a week, should be entitled to enough compensation to support themselves at a minimum. Any employer offering less than that is either a failed business that should die off and make room for one that's better managed or a corporation that is just using public taxpayer money to subsidize their private labor expenses.

kube-system an hour ago | parent | prev | next [-]

A teenager is presumably also going to school full time and works their job part time, not ~2000 hours per year.

If we build a society where someone working a full time job is not able to afford to reasonably survive, we are setting ourselves up for a society of crime, poverty, and disease.

swiftcoder 2 hours ago | parent | prev | next [-]

> A teenager in his/her first job at McDonald's doesn't need a "living wage."

Turns out our supply of underage workers is neither infinite, nor even sufficient to staff all fast food jobs in the nation

jfindper 2 hours ago | parent | prev | next [-]

>A teenager in his/her first job at McDonald's doesn't need a "living wage."

Wow, a completely bad-faith argument.

Can you try again, but this time, try "steelman" instead of "strawman"?

zzzeek 3 hours ago | parent | prev [-]

in that case it should be completely uncontroversial to raise the minimum wage and help that .5% of labor out. yet somehow, it's a non-starter. (btw, googling says the number is more like 1.1%. in 1979, 13.4% of the labor force made minimum wage. this only shows how obsolete the current minimum wage level is).

trollbridge an hour ago | parent | prev | next [-]

In the mid 90s mere mortals ran a 486DX2 or DX4.

Pentium 60/66s were in the same price tier as expensive Alpha or SPARC workstations.

microtonal 5 hours ago | parent | prev | next [-]

That's kinda like saying the mid-20s were pretty scary too, minimum wage was AMOUNT and a MacBook M4 Max was $3000..

In the mid-90s my brother and I were around 14 and 10, earning nothing but a small amount of monthly pocket money. We were fighting so much over our family PC that we decided to save up and put together a machine from second-hand parts we could get our hands on. We built him a 386 DX 40 or 486SX2 50 or something like that, and it was fine enough for him to play most DOS games. Heck, you could even run Linux (I know because I ran Linux in 1994 on a 386SX 25, with 5MB RAM and 20MB of disk space).

sejje 28 minutes ago | parent | next [-]

Linux notoriously runs on worse hardware than almost anything, especially in the 90s

kube-system 4 hours ago | parent | prev [-]

> That's kinda like saying the mid-20s were pretty scary too, minimum wage was AMOUNT and a MacBook M4 Max was $3000..

A PowerBook 5300 was $6500 in 1995, which is $13,853 today.

kergonath an hour ago | parent [-]

> A powerbook 5300 was $6500 in 1995

The TCO was much higher, considering how terrible and flimsy this laptop was. The power plug would break if you looked at it funny and the hinge was stiff and brittle. I know that’s not the point you are making but I am still bitter about that computer.

nickjj 6 hours ago | parent | prev | next [-]

> The mid 90s was pretty scary too.

If you fast forward just a few years though, it wasn't too bad.

You could put together a decent fully parted out machine in the late 90s and early 00s for around $600-650. These were machines good enough to get a solid 120 FPS playing Quake 3.

morsch 6 hours ago | parent | prev [-]

Are you sure? From what I can tell it's more like 500 USD RRP on release, boxed.

Either way, it was the 90s: two years later that was a budget CPU because the top end was two to three times the speed.

Barathkanna 7 hours ago | parent | prev | next [-]

I agree with you on SSDs, that was the last upgrade that felt like flipping the “modern computer” switch overnight. Everything since has been incremental unless you’re doing ML or high-end gaming.

asenna 6 hours ago | parent | next [-]

I know it's not the same. But I think a lot of people had a similar feeling going from Intel-Macbooks to Apple Silicon. An insane upgrade that I still can't believe.

crazygringo 5 hours ago | parent | next [-]

This. My M1 MacBook felt like a similarly shocking upgrade -- probably not quite as much as my first SSD did, but still the only other time when I've thought, "holy sh*t, this is a whole different thing".

wongarsu 5 hours ago | parent | prev | next [-]

The M1 was great. But the jump felt particularly great because Intel Macbooks had fallen behind in performance per dollar. Great build quality, great trackpad, but if you were after performance they were not exactly the best thing to get

skylurk 3 hours ago | parent [-]

For as long as I can remember, before M1, Macs were always behind in the CPU department. PC's had much better value if you cared about CPU performance.

After the M1, my casual home laptop started outperforming my top-spec work laptops.

kergonath an hour ago | parent [-]

> For as long as I can remember, before M1, Macs were always behind in the CPU department. PC's had much better value if you cared about CPU performance.

But not if you cared about battery life, because that was the tradeoff Apple was making. Which worked great until about 2015-2016. The parts they were using were not Intel’s priority and it went south basically after Broadwell, IIRC. I also suppose that Apple stopped investing heavily into a dead-end platform while they were working on the M1 generation some time before it was announced.

redwall_hp 4 hours ago | parent | prev | next [-]

I usually use an M2 Mac at work, and haven't really touched Windows since 2008. Recently I had to get an additional Windows laptop (Lenovo P series) for a project my team is working on, and it is such a piece of shit. It's unfathomable that people are tolerating Windows or Intel (and then still have the gall to talk shit about Macs).

It's like time travelling back to 2004. Slow, loud fans, random brief freezes of the whole system, a shell that still feels like a toy, a proprietary 170W power supply and mediocre battery life, subpar display. The keyboard is okay, at least. What a joke.

Meanwhile, my personal M3 Max system can render DaVinci Resolve timelines with complex Fusion compositions in real time and handle whole stacks of VSTs in a DAW, while the Lenovo chokes on an IDE.

ponector an hour ago | parent [-]

There won't be that big a difference if you compare laptops in the same price bracket. Cheap PCs are crap.

bigyabai 4 hours ago | parent | prev [-]

It's a lot more believable if you'd tried some of the other Wintel machines at the time. Those MacBook chassis were the hottest of the bunch; it's no surprise the MacBook Pro was among the first to be redesigned.

simlevesque 6 hours ago | parent | prev | next [-]

I've had this with gen5 PCIe SSDs recently. My T710 is so fast it's hard to believe. But you need a lot of data to make it worth it.

Example:

    > time du -sh .
    737G .
    ________________________
    Executed in   24.63 secs
And on my laptop that has a gen3, lower spec NVMe:

    > time du -sh .
    304G .
    ________________________
    Executed in   80.86 secs

It's almost 10 times faster. The CPU must have something to do with it too, but they're both Ryzen 9.
adgjlsfhk1 5 hours ago | parent | next [-]

To me that reads 3x, not "almost 10x". The main difference here is probably power. A desktop/server is happy to send 15W to the SSD and hundreds of watts to the CPU, while a laptop wants the SSD running in the ~1 watt range and the CPU in the 10s of watts range.

simlevesque 5 hours ago | parent [-]

There's over twice as much content in the first test. It's around 3.8 GB/s vs 30 GB/s if you divide each folder size by its du duration. That makes it 7.9 times faster, and I'm comfortable calling that "almost 10 times".
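For what it's worth, the 7.9x figure falls straight out of dividing each size by its wall time (sizes and times copied from the two du runs above; since du walks metadata rather than reading file contents, treat this as a rough proxy, not true read bandwidth):

```shell
# Effective GiB scanned per second for each du run, and the ratio
desktop=$(awk 'BEGIN { printf "%.1f", 737 / 24.63 }')  # gen5 desktop
laptop=$(awk 'BEGIN { printf "%.1f", 304 / 80.86 }')   # gen3 laptop
awk -v d="$desktop" -v l="$laptop" \
  'BEGIN { printf "%s vs %s GiB/s: %.1fx\n", d, l, d / l }'
# prints 29.9 vs 3.8 GiB/s: 7.9x
```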

ls65536 4 hours ago | parent | next [-]

The total size isn't what matters in this case but rather the total number of files/directories that need to be traversed (and their file sizes summed).

simlevesque 3 hours ago | parent [-]

I responded here, it's essentially the same content: https://news.ycombinator.com/item?id=46150030

adgjlsfhk1 4 hours ago | parent | prev [-]

oops. I missed the size diff. that's a solid 8x. that's cool!

taneliv 4 hours ago | parent | prev [-]

I believe you, but your benchmark is not very useful. I get this on two 5400rpm 3T HDDs in a mirror:

    $ time du -sh .
    935G    .
                                                                                                                          
    real    0m1.154s
Simply because there's less than 20 directories and the files are large.
simlevesque 4 hours ago | parent [-]

I should have been more clear: It's my http cache for my crawling jobs. Lots of files in many shapes.

My new setup: gen5 ssd in desktop:

    > time find . -type f | wc -l
    5645741
    ________________________
    Executed in    4.77 secs
My old setup, gen3 ssd in laptop:

    > time find . -type f | wc -l
    2944648
    ________________________
    Executed in   27.53 secs
Both are running pretty much non-stop, very slowly.
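Since a metadata walk is bounded by how fast the drive services stat calls rather than by raw bandwidth, dividing file count by wall time (numbers from the two find runs above) is the more telling comparison:

```shell
# Files stat'ed per second for each find run, and the ratio
awk 'BEGIN {
  gen5 = 5645741 / 4.77    # desktop run, roughly 1.2M files/s
  gen3 = 2944648 / 27.53   # laptop run, roughly 107k files/s
  printf "ratio: %.1fx\n", gen5 / gen3
}'
# prints ratio: 11.1x
```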
jug 3 hours ago | parent | prev | next [-]

I thought so too on my mini PC. Then I got my current Mac mini M4 and I have to give it to Apple, or maybe in part to ARM... It was like another SSD moment. It still hasn't spun up the fan and runs literally lukewarm at most during my office, coding, and photo work.

pstadler 6 hours ago | parent | prev | next [-]

This and high resolution displays, for me at least.

wdfx 6 hours ago | parent | prev [-]

The only time I had this other than changing to SSD was when I got my first multi-core system, a Q6600 (confusingly labeled a Core 2 Quad). Had a great time with that machine.

genewitch 3 hours ago | parent [-]

"Core" was/is like "PowerPC" or "Ryzen", just a name. Intel Core i9, for instance, as opposed to Intel Pentium D, both x86_x64, different chip features.

prmoustache 6 hours ago | parent | prev | next [-]

As others mentioned, there's plenty of refurbished stuff and second-hand parts out there, so there isn't any risk of finding yourself having to buy something at insane prices if your computer were to die today.

If you don't need a GPU for gaming, you can get a decent computer with an i5, 16GB of RAM and an NVMe drive for USD 50. I bought one a few weeks ago.

forinti 6 hours ago | parent | prev | next [-]

You can still get brand new generic motherboards for old CPUs.

I swapped out old ASUS MBs for an i3-540 and an Athlon II X4 with brand new motherboards.

They are quite a bit cheaper than getting a new kit, so I guess that's the market they cater to: people who don't need an upgrade but whose MBs gave in.

You can get these for US$20-US$30.

davely 4 hours ago | parent | prev | next [-]

About a month ago, the mobo for my 5950x decided to give up the ghost. I decided to just rebuild the whole thing and update from scratch.

So I went crazy and bought a 9800X3D, plus a ridiculous amount of DDR5 RAM (96GB, which matches my old machine’s DDR4 quantity). At the time, it was about $400 USD or so.

I’ve been living in blissful ignorance since then. Seeing this post, I decided to check Amazon. The same amount of RAM is currently $1200!!!

VHRanger 4 hours ago | parent | next [-]

Same, I got 96GB of high end 6000MHz DDR5 this summer for $600CAD and now it's nearly triple at $1500CAD

genewitch 3 hours ago | parent | prev [-]

what are you doing with that old 5950x?

mikepurvis 6 hours ago | parent | prev | next [-]

For a DDR3-era machine, you'd be buying RAM for that on Ebay, not Newegg.

I have an industrial Mini-ITX motherboard of similar vintage that I use with an i5-4570 as my Unraid machine. It doesn't natively support NVMe, but I was able to get a dual-m2 expansion card with its own splitter (no motherboard bifurcation required) and that let me get a pretty modern-feeling setup with nice fast cache disks.

phantasmish 3 hours ago | parent | prev | next [-]

I’m worried about the Valve mini PC coming out next year.

Instant buy at $700 or under. Probably a buy up to $850. At, like, $1,100, though… solid no. And I'm counting on that thing to get the power-hog older Windows PC tower (so bulky it's unplugged and in a closet half the time) out of my house.

acters 4 hours ago | parent | prev | next [-]

I am still running an i5 4690k; really, all I need is a better GPU, but those prices are criminal. I wish I'd gotten a 4090 when I had the chance, rip

genewitch 3 hours ago | parent [-]

The Intel Arc B580 (I think that's the latest one) isn't obnoxiously priced, but you're going to have to face the fact that your PCIe is really very slow. It should still work, though.

If you want to save even more money, get the older Arc Alchemist GPUs. I used one and it was comparable with an RTX 3060; I returned it because the machine I was running it in had a bug that was fixed 2 days before I returned it, but I didn't know that at the time.

I was seriously considering getting a B580 or waiting until the b*70 came out with more memory, although at this point I doubt it will be very affordable considering VRAM prices are going up as well. A friend is supposedly going to ship me a few GTX 1080 Ti cards so I can delay buying newer cards for a bit.

aposm 6 hours ago | parent | prev | next [-]

A few years later but similarly: I am still running a machine built spur-of-the-moment in a single trip to Micro Center for about $500 in late 2019 (little did we know what was coming in a few months!). I made one small upgrade in probably ~2022 to a Ryzen 5800X w/ 64GB of RAM, but it's otherwise untouched. It still flies through basically anything and does everything I need, but I'm dreading when any of the major parts go and I have to fork out double or triple the original cost for replacements...

KronisLV 2 hours ago | parent | prev | next [-]

If I needed a budget build, I'd probably look in the direction of used parts on AliExpress, you can sometimes find good deals on AM4 CPUs (that platform had a lot of longevity, even now my main PC has a Ryzen 7 5800X) and for whatever reason RX 580 GPUs were really, really widespread (though typically the 2048SP units). Not amazing by any means, but a significant upgrade from your current setup and if you don't get particularly unlucky, it might last for years with no issues.

Ofc there's also the alternate strategy of going for a mid/high end rig and hoping it lasts a decade, but the current DDR5 prices make me depressed so yeah maybe not.

I genuinely hope that in the next gens the market gets flooded again with good, long-lived, reasonably priced components: things like AM4 CPUs, that RX 580, or the GTX 1080 Ti. But I fear Nvidia has learned its lesson and now releases stuff that pushes you toward incremental upgrades rather than making something really good for the time. Same with Intel's LGA1851 being basically dead on arrival once the reviews started rolling in (who knows, maybe at least the mobos and Core Ultra chips will eventually be cheap as old stock). On the other hand, at least the Arc B580 GPUs were a step in the right direction: competent and not horribly overpriced, at MSRP anyway, though unfortunately the merchants were scumbags and often ignored it.

hnu0847 5 hours ago | parent | prev | next [-]

Don't all RAM manufacturers offer a lifetime warranty?

That said, if the shortage gets bad enough then maybe they could find themselves in a situation where they were unable/unwilling to honor warranty claims?

SoftTalker 4 hours ago | parent [-]

I've never heard of a lifetime warranty on anything in the enterprise space. Maybe consumer stuff, where it's just a marketing gimmick.

chihuahua 4 hours ago | parent [-]

Oh, your RAM died? That means its lifetime ended at that moment, and so did the lifetime warranty. Is there anything else we can help you with today?

the__alchemist 6 hours ago | parent | prev | next [-]

Man, it was just GPU for a while. But same boat. I regret not getting the 4090 for $1600 direct from Nvidia. "That's too much for a video card", and got the 4080 instead. I dread the day when I need to replace it.

jakogut 5 hours ago | parent [-]

The Radeon RX 9070 XT performs at a similar level to the RTX 5070, and is retailing around $600 right now.

the__alchemist 5 hours ago | parent [-]

No CUDA means not an option for me.

the__alchemist 5 hours ago | parent | next [-]

> What kinds of applications do you use that require CUDA?

Molecular dynamics simulations, and related structural bio tasks.

vlovich123 2 hours ago | parent [-]

Is the CUDA compat layer AMD has, which transparently compiles existing CUDA just fine, somehow insufficient or buggy? Or are you just stuck in the mindshare game and haven't reevaluated whether the AMD situation has changed this year?

the__alchemist an hour ago | parent [-]

I haven't checked out AMD's compatibility layer and know nothing about it. I tried to get vkFFT working in addition to cuFFT for a specific computation, but couldn't get it working right; crickets on the GH issue I posted.

I use Vulkan for graphics, but Vulkan compute is a mess.

I'm not stuck in mindshare, and this isn't a political thing. I am just trying to get the job done, and I have observed that no alternative has stepped up to Nvidia's CUDA from a usability perspective.

vlovich123 7 minutes ago | parent [-]

I didn’t talk about Vulkan compute.

> have observed that no alternative has stepped up to nvidia's CUDA from a usability perspective.

I’m saying this is a mindshare thing if you haven’t evaluated ROCm / HIP. HIPify can convert CUDA source to HIP automatically and HIP is very similar syntax to CUDA.
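For anyone curious, the conversion step is roughly this (a sketch assuming a working ROCm install with hipify-perl and hipcc on PATH; `saxpy.cu` is a made-up filename):

```shell
# Rewrite CUDA API calls (cudaMalloc, cudaMemcpy, etc.) to their
# hip* equivalents in a new source file.
hipify-perl saxpy.cu > saxpy.hip.cpp

# Compile for an AMD GPU with the HIP compiler driver and run it.
hipcc saxpy.hip.cpp -o saxpy
./saxpy
```

hipify-clang exists as well and does a full compiler-based translation, but hipify-perl is the quick first thing to try.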

jakogut 5 hours ago | parent | prev [-]

What kinds of applications do you use that require CUDA?

square_usual 5 hours ago | parent | prev | next [-]

You can still buy DDR4 for pretty cheap, and if you're replacing a computer that old any system built around DDR4 will still be a massive jump in performance.

ls612 5 hours ago | parent | prev | next [-]

GPU prices are actually at MSRP now for most cards other than the 5090.

adventured 5 hours ago | parent | prev | next [-]

You could still easily build a $800-$900 system that would dramatically jump forward from that machine.

$700 in 2014 is now $971 inflation adjusted (BLS calculator).
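That adjustment is just a ratio of CPI index levels; a quick sanity-check sketch (the index values here are approximations, not BLS-exact, hence the slightly different result):

```python
# Rough CPI-based inflation adjustment. The index levels below are
# assumed approximations; the BLS calculator uses exact monthly data.
CPI_2014 = 236.7  # approximate 2014 annual average CPI-U
CPI_NOW = 324.0   # approximate late-2025 CPI-U

def adjust(amount: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a dollar amount by the ratio of CPI index levels."""
    return amount * cpi_now / cpi_then

print(round(adjust(700, CPI_2014, CPI_NOW)))  # ~958, same ballpark as $971
```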

RTX 3060 12gb $180 (eBay). Sub $200 CPU (~5-7 times faster than yours). 16gb DDR4 $100-$120. $90 PSU. $100 motherboard. WD Black 1tb SSD $120. Roughly $800 (which inflation adjusted beats your $700).

Right now is a rather amazing time for CPUs, even though RAM prices have gone crazy.

Assuming you find some deals somewhere in there, you could do slightly better on either pricing or components.

TacticalCoder 6 hours ago | parent | prev | next [-]

> For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.

The last one where I really remember seeing a huge speed bump was going from a regular SSD to an NVMe M.2 PCIe SSD... Around 2015 I bought one of the very first consumer motherboards with an NVMe M.2 slot and put a Samsung 950 Pro in it: that was quite something (though I was upgrading the entire machine, not just the SSD, so there's that too). Before that, I don't remember when I switched from a SATA HDD to a SATA SSD.

I'm now running one of those WD Black SN850X NVMe SSDs, but my good old trusty, now ten-year-old Samsung 950 Pro is still kicking (in the wife's PC). There's likely even better out there and they're easy to find: they're still reasonably priced.

As for my 2015 Core i7-6700K: it's happily running Proxmox and Docker (but not always on).

Even consumer parts are exceptionally reliable: the only failures I remember in 15 years (and I've got lots of machines running) are a desktop PSU (replaced by a Be Quiet! one), a no-name NVMe SSD and a laptop's battery.

Oh, and my MacBook Air M1's screen died overnight for no reason after precisely 13 months, when I had a 12-month warranty (some refer to it as the "bendgate"), but that's because first-gen MacBook Air M1s were indescribable pieces of fragile shit. I think Apple got their act together and came up with better screens in later models.

Don't worry too much: PCs are quite reliable things. And used parts for your PC from 2014 wouldn't be expensive on eBay anyway. You're not forced to upgrade to a last gen PC with DDR5 (atm 3x overpriced) and a 5090 GPU.

genewitch 3 hours ago | parent [-]

FYI, someone or something is downvoting your recent posts to oblivion, and I didn't see any obvious reason.

testing22321 4 hours ago | parent | prev | next [-]

I got a used M1 MacBook Air a year ago.

By far the fastest computer I’ve ever used. It felt like the SSD leap of years earlier.

sneak 7 hours ago | parent | prev [-]

Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum? I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year but compared to revenue it seems silly to be worrying about a few thousand dollars per year delta.

I buy the best phones and desktops money can buy, and upgrade them often, because, why take even the tiniest risk that my old or outdated hardware slows down my revenue generation which is orders of magnitude greater than their cost to replace?

Even if you don’t go the overkill route like me, we’re talking about maybe $250/month to have an absolutely top spec machine which you can then use to go and earn 100x that.

Spend at least 1% of your gross revenue on your tools used to make that revenue.

macNchz 6 hours ago | parent | next [-]

What is the actual return on that investment, though? This is self-indulgence justified as « investment ». I built a pretty beefy PC in 2020 and have made a couple of upgrades since (Ryzen 5950X, 64GB RAM, Radeon 6900XT, a few TB of NVMe) for like $2k all-in. Less than $40/month over that time. It was a game-changing upgrade from an aging laptop for my purposes of running multiple VMs and a complex dev environment, but I really don't know what I would have gotten out of replacing it every year since. It's still blazing fast.

Even recreating it entirely with newer parts every single year would have cost less than $250/mo. Honestly it would probably be negative ROI just dealing with the logistics of replacing it that many times.

crazygringo 5 hours ago | parent | next [-]

> This is self indulgence justified as « investment ».

Exactly that. There's zero way that level of spending is paying for itself in increased productivity, considering they'll still be 99% as productive spending something like a tenth of that.

It's their luxury spending. Fine. Just don't pretend it's something else, or tell others they ought to be doing the same, right?

londons_explore 6 hours ago | parent | prev | next [-]

Every hardware update for me involves hours or sometimes days of faffing with drivers and config and working round new bugs.

Nobody is paying for that time.

And whilst it is 'training', my training time is better spent elsewhere than battling with why CUDA won't work after a GPU upgrade.

Therefore, I avoid hardware and software changes merely because a tiny bit more speed isn't worth the hours I'll put in.

mikepurvis 6 hours ago | parent | prev [-]

My main workstation is similar, basically a top-end AM4 build. I recently bumped from a 6600 XT to a 9070 XT to get more frames in Arc Raiders, but looking at what the cost would be to go to the current-gen platform (AM5 mobo + CPU + DDR5 RAM) I find myself having very little appetite for that upgrade.

Clent 6 hours ago | parent | prev | next [-]

This is a crazy out of touch perspective.

Depending on salary, two orders of magnitude above $5k is $500k.

That amount of money for the vast majority of humans across the planet is unfathomable.

No one is worried about if the top 5% can afford DRAM. Literally zero people.

sneak 6 hours ago | parent [-]

[flagged]

jfindper 6 hours ago | parent | prev | next [-]

>I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year

>I buy the best phones and desktops money can buy

Sick man! Awesome, you spend 1/3 of the median US salary on a laptop and desktop every year. That's super fucking cool! Love that for you.

Anyways, please go brag somewhere else. You're rich, you shouldn't need extra validation from an online forum.

mitthrowaway2 6 hours ago | parent | prev | next [-]

Yes? I think that's crazy. I just maxed out my new Thinkpad with 96 GB of RAM and a 4 TB SSD and even at today's prices, it still came in at just about $2k and should run smoothly for many years.

Prices are high but they're not that high, unless you're buying the really big GPUs.

sgerenser 6 hours ago | parent [-]

Where can you buy a new Thinkpad with 96GB and 4TB SSD for $2K? Prices are looking quite a bit higher than that for the P Series, at least on Lenovo.com in the U.S. And I don't see anything other than the P Series that lets you get 96GB of RAM.

mitthrowaway2 6 hours ago | parent | next [-]

You have to configure it with the lowest-spec SSD and then replace that with an aftermarket 4 TB SSD at around $215. The P14s I bought last week, with that and the 8 GB Nvidia GPU, came to a total of USD $2150 after taxes, including the SSD. Their sale price today is not quite as good as it was last week, but it's still in that ballpark: with the 255H CPU, iGPU, and a decent screen, the Intel P14s goes for $2086 USD, which becomes $1976 after the $110 taken off at checkout. Throw in the aftermarket SSD and it'll be around $2190. And if you log in as a business customer you'll get another couple percent off as well.

The AMD P14s, with 96 GB, an upgraded CPU, the nice screen, and Linux, still goes for under $1600 at checkout, which becomes about $1815 when you add the aftermarket SSD upgrade.

It's still certainly a lot to spend on a laptop if you don't need it, but it's a far cry from $5k/year.

lionkor 6 hours ago | parent | prev [-]

Typing this on a similar-spec P16s that was around 2.6k or so. So if you call anything under 3k simply 2k, then it was 2k.

That's in Germany, from a corporate supplier.

gr4vityWall 6 hours ago | parent | prev | next [-]

> maybe $250/month (...) which you can then use to go and earn 100x that.

25k/month? Most people will never come close to earning that much. Most developers in the third world don't make that in a full year, but they are still affected by rises in PC part prices.

I agree with the general principle of having savings for emergencies. For a Software Engineer, that should probably include buying a good enough computer for them, in case they need a new one. But the figures themselves seem skewed towards the reality of very well-paid SV engineers.

Dibby053 5 hours ago | parent | next [-]

>Most developers in the third world don't make that in a full year

And many in the first world haha

londons_explore 6 hours ago | parent | prev [-]

> But the figures themselves seem skewed towards the reality of very well-paid SV engineers.

The soon-to-be-unemployed SV engineers, when LLMs mean anyone can design an app and backend with no coding knowledge.

genewitch 3 hours ago | parent [-]

And you can code from an RPi / cellphone and use a cloud computer to run it, so you actually don't really need an expensive PC at all.

ceejayoz 6 hours ago | parent | prev | next [-]

> Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum?

Yes. This is how we get websites and apps that don't run on a normal person's computer, because the devs never noticed their performance issues on their monster machines.

Modern computing would be a lot better if devs had to use old phones, basic computers, and poor internet connections more often.

vultour 6 hours ago | parent | prev | next [-]

Yes, that's an absolutely deranged opinion. Most tech jobs can be done on a $500 laptop. You realise some people don't even make your computer budget in net income every year, right?

sneak 6 hours ago | parent [-]

Most tech jobs could be done on a $25 ten year old smartphone with a cracked screen and bulging battery.

That’s exactly my point. Underspending on your tools is a misallocation of resources.

pqtyw 4 hours ago | parent | next [-]

That's a bizarrely extreme position. For almost everyone, a ~$2000-3000 PC from several years ago is indistinguishable from one they can buy now from a productivity standpoint. Nobody is talking about $25 ten-year-old smartphones. Of course, claiming that a $500 laptop is sufficient is also a severe exaggeration; a used desktop, perhaps...

jermaustin1 5 hours ago | parent | prev | next [-]

Overspending on your tools is a misallocation of resources. An annual $22k spend on computing is around a 10-20x overspend even for a wealthy individual. I'm in the $200-300k/year, self-employed, buys-my-own-shit camp, and I can't imagine spending 1% of my income on computing needs, let alone close to 10%. There is no way to make that make sense.

antiframe 5 hours ago | parent | prev [-]

Yes, you don't want to underspend on your tools to the point where you suffer. But I think you are missing the flip side: I can do my work comfortably with 32GB of RAM, and my 1%-a-year budget could get me more, but why not pocket the difference?

The goal is the right tool for the job, not the best tool you can afford.

kube-system 6 hours ago | parent | prev | next [-]

I agree with the general sentiment - that you shouldn't pinch pennies on tools you use every day. But at the same time, someone who makes their money writing with a pen shouldn't need to spend thousands on pens. Once you have adequate professional-grade tools, you don't need to throw more money at the problem.

dghlsakjg 4 hours ago | parent | prev | next [-]

If you are consistently maxing out your computer's performance in a way that limits your ability to earn money at a rate greater than the cost of upgrades, and you can't offload that work to the cloud, then I guess it might make sense.

If you are, like every developer I have ever met, constrained by your own time, motivation, and skills, then spending $22k per year is a pretty interesting waste of resources.

Does it make sense to buy good tools for your job? Yes. Does it make sense to buy the most expensive version of a tool when you already own last year's most expensive version? Rarely.

hansvm 5 hours ago | parent | prev | next [-]

Most people who use computers for the main part of their jobs literally can't spend that much if they don't want to be homeless.

Most of the rest arguably shouldn't. If you have $10k/yr in effective pay after taxes, healthcare, rent, food, transportation to your job, etc, then a $5k/yr purchase is insane, especially if you haven't built up an emergency fund yet.

Of the rest (people who can relatively easily afford it), most still probably shouldn't. Unless the net present value of your post-tax future incremental gains (raises, promotions, etc.) derived from that expenditure exceeds $5k/yr, you're better off financially doing almost anything else with that cash. That's doubly true when you consider that truly amazing computers cost $2k total nowadays, without substantial improvements year-to-year.

Contrast buying one of those every 2 yrs vs your proposal: you'd need a $4k/yr net expenditure to pay off somehow, making use of the incremental CPU/RAM/etc. to achieve that value. If it doesn't pay off, then it's just a toy you're buying for personal enjoyment, not something that you should nebulously tie to revenue generation potential with an arbitrary 1% rule. Still maybe buy it, but be honest about the reason.

So, we're left with people who can afford such a thing and whose earning potential actually does increase enough with that hardware compared to a cheaper option for it to be worth it. I'm imagining that's an extremely small set. I certainly use computers heavily for work and could drop $5k/yr without batting an eye, but I literally have no idea what I could do with that extra hardware to make it pay off. If I could spend $5k/yr on internet worth a damn I'd do that in a heartbeat (moving soon I hope, which should fix that), but the rest of my setup handily does everything I want it to.

Don't get me wrong, I've bought hardware for work before (e.g., nobody seems to want to procure Linux machines for devs even when they're working on driver code and whatnot), and it's paid off, but at the scale of $5k/yr I don't think many people do something where that would have positive ROI.

ChromaticPanic 6 hours ago | parent | prev | next [-]

That's crazy spend for anyone making sub-$100k.

jermaustin1 5 hours ago | parent | next [-]

It is crazy for anyone making any amount. A $15k desktop is overkill for anything but the most demanding ML or 3D workloads, and the majority of the cost will be in GPUs or dedicated specialty hardware and software.

A developer using even the clunkiest IDE (Visual Studio - I'm still a fan and daily user, it's just the "least efficient") can get away without a dedicated graphics card and with only 32GB of RAM.

red-iron-pine 5 hours ago | parent | prev [-]

That's a crazy spend for sub-$200k or even sub-$500k.

you're just building a gaming rig with a flimsy work-related justification.

neogodless 5 hours ago | parent | prev | next [-]

Have you ever heard of the term "efficiency"?

It's when you find ways to spend the minimum amount of resources in order to get the maximum return on that spend.

With computer hardware, often buying one year old hardware and/or the second best costs a tiny fraction of the cost of the bleeding edge, while providing very nearly 100% of the performance you'll utilize.

That and your employer should pay for your hardware in many cases.

crote an hour ago | parent | prev | next [-]

Sorry, but that's delusional.

For starters, hardware doesn't innovate quickly enough to buy a new generation every year. There was a 2-year gap between Ryzen 7000 and Ryzen 9000, for example, and a 3-year gap between Ryzen 5000 and Ryzen 7000. On top of that, most of the parts can be reused, so you're at best dropping in a new CPU and some new RAM sticks.

Second, the performance improvement just isn't there. Sure, there's a 10% performance increase in benchmarks, but that does not translate to a 10% productivity improvement for software development. Even a 1% increase is unlikely, as very few tasks are compute-bound for any significant amount of time.

You can only get to $15k by doing something stupid like buying a Threadripper, or putting an RTX 4090 into it. There are genuine use-cases for that kind of hardware - but it isn't in software development. It's like buying a Ferrari to do groceries: at a certain point you've got to admit that you're just doing it to show off your wealth.

You do you, but in all honesty you'd probably get a better result spending that money on a butler to bring your coffee to your desk instead of wasting time by walking to the coffee machine.

iberator an hour ago | parent | prev | next [-]

An extremist point of view, and NOT optimal, given diminishing performance per dollar.

The proper calculation is the cost/performance ratio. Then buy the second one on the list :)

nickjj 6 hours ago | parent | prev | next [-]

I try to come at it with a pragmatic approach. If I feel pain, I upgrade and don't skimp out.

======== COMPUTER ========

I feel no pain yet.

Browsing the web is fast enough where I'm not waiting around for pages to load. I never feel bound by limited tabs or anything like that.

My Rails / Flask + background worker + Postgres + Redis + esbuild + Tailwind based web apps start in a few seconds with Docker Compose. When I make code changes, I see the results in less than 1 second in my browser. Tests run fast enough (seconds to tens of seconds) for the size of apps I develop.

Programs open very quickly. Scripts I run within WSL 2 also run quickly. There's no input delay when typing or performance related nonsense that bugs me all day. Neovim runs buttery smooth with a bunch of plugins through the Windows Terminal.

I have no lag when I'm editing 1080p videos even with a 4k display showing a very wide timeline. I also record my screen with OBS to make screencasts with a webcam and have live streamed without perceivable dropped frames, all while running programming workloads in the background.

I can mostly play the games I want, but this is by far the weakest link. If I were more into gaming I would upgrade, no doubt about it.

======== PHONE ========

I had a Pixel 4a until Google busted the battery. It ran all of the apps (no games) I cared about, and Google Maps was fast. The camera was great.

I recently upgraded to a Pixel 9a because the repair center that broke my 4a in a number of ways gave me $350, and the 9a was $400 a few months ago. It also runs everything well and the camera is great. In my day-to-day it makes no difference from the 4a, literally none. It even has the same storage space, of which I have around 50% left, with around 4,500 photos saved locally.

======== ASIDE ========

I have a pretty decked out M4 MBP laptop issued by my employer for work. I use it every day and for most tasks I feel no real difference vs my machine. The only thing it does noticeably faster is heavily CPU bound tasks that can be parallelized. It also loads the web version of Slack about 250ms faster, that's the impact of a $2,500+ upgrade for general web usage.

I'm really sensitive to skips, hitches and performance related things. For real, as long as you have a decent machine with an SSD using a computer feels really good, even for development workloads where you're not constantly compiling something.

Krssst 6 hours ago | parent | prev | next [-]

One concern I'd have: if the short-term supply of RAM is fixed anyway, then even if every daily computer user increased their budget to match the new pricing, demand would exceed supply again and prices would just rise further, until they got unreasonable enough that demand fell back to supply.

ambicapter 6 hours ago | parent | prev | next [-]

I don't spend money on my computers from a work or "revenue-generating" perspective because my work buys me a computer to work on. Different story if you freelance/consult ofc.

pharrington 5 hours ago | parent | prev | next [-]

are you paid by the FLOP?

kotaKat 6 hours ago | parent | prev [-]

I mean, as a frontline underpaid rural IT employee with no way to move outward from where I currently live, show me where I’m gonna put $5k a year into this budget out of my barren $55k/year salary. (And, mind you - this apparently is “more” than the local average by only around $10-15k.)

I’m struggling to buy hardware already as it is, and all these prices have basically fucked me out of everything. I’m riding rigs with 8 and 16GB of RAM and I have no way to go up from here. The AI boom has basically forced me out of the entire industry at this point. I can’t get hardware to learn, subscriptions to use, anything.

Big Tech has made it unaffordable for everyone.

zozbot234 5 hours ago | parent | next [-]

8GB or 16GB of RAM is absolutely a usable machine for many software development and IT tasks, especially if you set up compressed swap to stretch it further. Of course, you need to run something other than Windows or macOS. Only very niche use cases, such as media production or running local LLMs, will absolutely require more RAM.
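For reference, a minimal zram compressed-swap setup on Linux looks something like this (a sketch of one common approach; the zstd choice, 8G size, and swap priority are arbitrary, and the paths assume the standard zram sysfs interface):

```shell
# Load the zram module (creates /dev/zram0 by default).
sudo modprobe zram

# Pick a compression algorithm and a device size, then enable it as
# high-priority swap so it's used before any disk-backed swap.
echo zstd | sudo tee /sys/block/zram0/comp_algorithm
echo 8G   | sudo tee /sys/block/zram0/disksize
sudo mkswap /dev/zram0
sudo swapon -p 100 /dev/zram0
```

Most distros also ship a packaged version of this (zram-generator, zram-tools) that does the same thing at boot.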

pqtyw 4 hours ago | parent [-]

> something other than Windows or macOS

> 8GB

No modern IDE either. Nor a modern Linux desktop environment (they are not that much more memory efficient than macOS or Windows). Yes, you can work with not much more than a text editor. But why?

ecshafer 6 hours ago | parent | prev | next [-]

The bright side is that the bust is going to create a glut of cheap used parts.

sneak 6 hours ago | parent | prev [-]

[flagged]

kotaKat 6 hours ago | parent [-]

Oh. I’m not allowed to own a home computer to try to further my own learning and education and knowledge then.

Guess I’ll go fuck myself now then.

jfindper 6 hours ago | parent [-]

They're just using this comment section to brag about how well off they are, I wouldn't worry too much. They're completely out of touch.

bombcar 5 hours ago | parent [-]

It's the "how much can the banana cost, $10?" of HN.

The point they're trying to make is a valid one - a company should be willing to spend "some money" if it saves time of the employee they're paying.

The problem is usually that the "IT budget" is a separate portion/group of the company from the "salary" budget, and the "solution" can be to force a certain dollar amount to be spent each year (with one year of carry-forward, perhaps) so that employees always have access to good equipment.

(Some companies are so bad at this that a senior engineer of 10+ years will have a ten year old PoS computer, and a new intern will get a brand new M5 MacBook.)