| ▲ | If You Need a Laptop, Buy It Now(theatlantic.com) |
| 16 points by fortran77 17 hours ago | 20 comments |
| |
|
| ▲ | cousinbryce 16 hours ago | parent | next [-] |
| Anyone else expect prices to go down after the AI pop? |
| |
| ▲ | nerdsniper 15 hours ago | parent | next [-] | | Yes, especially if CXMT is able to continue scaling their production and if China is able to crack EUV mass production. I see RAM prices dropping to new lows in 3-5 years. | | |
| ▲ | extraduder_ire 13 hours ago | parent [-] | | Do they need EUV to make RAM? Doing a small amount of searching leads me to 2025 press releases from companies saying they're first to a new process node, and mentioning EUV like it's an innovation. I assume they could scale faster with more machines of the older, more understood, lithography technology. |
| |
| ▲ | vaylian 14 hours ago | parent | prev | next [-] | | Yes. However, the economy is also bad due to other factors like unnecessary wars. Things can still get worse outside of the AI bubble. | | | |
| ▲ | lijok 15 hours ago | parent | prev | next [-] | | Where do you envision the pop will come from? | | |
| ▲ | nostrademons 13 hours ago | parent | next [-] | | Two possible sources: 1. People who are currently buying AI services realizing they aren't all that useful and discontinuing their subscriptions. Note that this can come from a changing ecosystem as much as from the products themselves. I know a couple of people running AI propaganda operations where a single person can now do what previously took a major media conglomerate. That's great for them, but if I personally know a couple of folks doing this, there are probably hundreds of thousands worldwide, and at that point people simply stop trusting anything they read on the Internet. 2. Rising interest rates from the Iran war. Suddenly the cash flows needed to finance all this datacenter and AI model expansion are much higher, and combined with #1, may not be viable. | | |
| ▲ | pants2 10 hours ago | parent [-] | | 1. Most AI datacenter plans and valuations are tied not to subscriptions but to a vaguer promise of "AGI," so this isn't likely to pop the bubble IMO (even if it does happen). 2. Historical precedent holds that governments are more likely to suppress rates to spur the economy during wartime. |
| |
| ▲ | andriy_koval 12 hours ago | parent | prev [-] | | > Where do you envision the pop will come from? A sudden end to overinvestment in hardware procurement by the big players. It's unclear whether Google, for example, will sustain $50B/yr in investments. |
| |
| ▲ | dyauspitr 15 hours ago | parent | prev [-] | | With how good Claude Code and Codex are, there might not be one. |
|
|
| ▲ | vibe42 16 hours ago | parent | prev | next [-] |
Higher-end gaming laptops are still decently priced and work well for local AI inference. And Linux runs better than ever on them; I'm running Debian 13 with almost no driver issues. For $2k you can get 32 GB of DDR5 RAM and 16 GB of fast VRAM. Bump the RAM to 64 GB and you're still below $3k. |
| |
| ▲ | solstice 15 hours ago | parent [-] | | What models or classes of models would I be able to run on that hardware? I've asked myself that question while looking at some of the models on this site: https://laptopparts4less.frl/index.php?route=common/home | | |
| ▲ | vibe42 14 hours ago | parent [-] | | With 16 GB of VRAM you can run a decent quant (Q4-Q8) of the newer, smaller dense models, leaving room for a 32-256k context. That might not be enough to chew through a large code base, but for smaller projects it can easily fit most if not all of the code base and drive a good coding agent. I don't recommend specific models or model providers given how much hype and BS there is around benchmarks; the easiest approach is to check the latest generation of open models and look for a dense one whose decent quant fits within the VRAM. Some models run fast enough that part of the weights can spill over from VRAM to RAM while maintaining a usable prompt/token-generation speed. | | |
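The "does a decent quant fit in VRAM" check above can be sketched as back-of-envelope arithmetic: weight memory is roughly parameters times bits-per-weight, plus the KV cache for the chosen context length. The model size, bits-per-weight, and architecture numbers below are hypothetical illustrations, not recommendations from the thread:

```python
def weights_gb(params_b: float, bits_per_weight: float) -> float:
    """Approximate VRAM for model weights alone, in GB."""
    return params_b * 1e9 * bits_per_weight / 8 / 1e9

def kv_cache_gb(layers: int, kv_heads: int, head_dim: int,
                context: int, bytes_per_elem: int = 2) -> float:
    """Approximate KV-cache size: 2 tensors (K and V) per layer,
    each kv_heads * head_dim wide, one entry per context token."""
    return 2 * layers * kv_heads * head_dim * context * bytes_per_elem / 1e9

# Hypothetical 14B-parameter dense model at ~Q4 (about 4.5 bits/weight
# once quantization overhead is included):
w = weights_gb(14, 4.5)
# Hypothetical architecture: 40 layers, 8 KV heads of dim 128,
# 32k context, fp16 KV cache:
kv = kv_cache_gb(40, 8, 128, 32_768)
print(f"weights ~{w:.1f} GB + KV cache ~{kv:.1f} GB = ~{w + kv:.1f} GB")
```

Under these assumed numbers the total lands a bit above 13 GB, which is why a mid-sized dense model at Q4 with a moderate context is plausible on a 16 GB card, while a Q8 quant of the same model would already spill into system RAM.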
|
|
|
| ▲ | axiologist 12 hours ago | parent | prev | next [-] |
| https://archive.ph/FoWIt |
|
| ▲ | Lord_Zero 15 hours ago | parent | prev | next [-] |
| Everyone panic. |
| |
|
| ▲ | fortran77 17 hours ago | parent | prev | next [-] |
| Share link: https://www.theatlantic.com/technology/2026/03/laptop-electr... |
|
| ▲ | fortran77 12 hours ago | parent | prev | next [-] |
| Technology for things like laptops generally gets better. I still think it's best to buy technology when you need it, not sooner. |
|