| ▲ | giwook 7 hours ago |
| Lots of us have noticed that usage limits for Claude have been nerfed in recent weeks/months. If anything, these new multipliers are more transparent than anything OpenAI or Anthropic have communicated about actual costs, and they give us a more realistic picture of what it's costing these providers. The amount of usage we got for $20/$100/$200 a month was never meant to last, and to think otherwise was perhaps a bit naive. This feels like a strategy from the ZIRP era of tech growth, when companies burned investor capital and gave away their products and services for free (or heavily subsidized them) to prioritize user acquisition. Then, once they'd gained enough traction and stickiness, they'd implement a monetization strategy to capitalize on that user base. |
|
| ▲ | dualvariable 6 hours ago | parent | next [-] |
| However, inference costs for good-enough models are likely to keep declining. We're probably hitting diminishing returns on model size and training: the new generations aren't quantum leaps anymore, and newer open-source models like DeepSeek are starting to be good enough. There's a limit to how much providers can raise prices, because someone can always build out a datacenter, fill it with open-source DeepSeek inference, and undercut your prices by 10x while still making a very good ROI--that's a business model right there. Right now plenty of people will protest that they couldn't do their jobs with lesser models, but over time that group will shrink. Consumers who are already using AI for writing presentations, generating cooking recipes, and getting ELI5 answers to common questions won't miss much from a lesser model, and that use case will only get cheaper. On the business side, as AI inference costs escalate there comes a point where companies rediscover human intelligence and start hiring/training people to do more of the work with lesser models--if that ends up more productive than shelling out large amounts of cash for inference on the latest models. [Although given how much companies waste on AWS, there's a lot of tolerance for overspending in corporations...] |
| |
| ▲ | geodel 6 hours ago | parent | next [-] | | > because someone can always build out a datacenter and fill it up with open source DeepSeek inference and undercut your prices by 10x while still making a very good ROI- Not sure how it all works out. Currently, trillion-dollar companies can't make a native app for their platforms. Everything is just JS/Electron because the economics don't work for them. And yet companies will supposedly build GW data centers running very expensive GPUs and sell inference at 1/10th of current prices. Sounds a little fanciful to me. | | |
| ▲ | bootsmann 4 hours ago | parent [-] | | The price you pay for Anthropic must include the price of training new and better models, which is incredibly costly. If you use models someone else already spent money to develop, you don't need to pay that price. |
| |
| ▲ | giwook 6 hours ago | parent | prev | next [-] | | I think so too. And at some point even frontier model costs will hopefully come down (if there is still a meaningful difference between closed and open source models at that point) as all of the compute that's being built out right now comes online. | |
| ▲ | croes 6 hours ago | parent | prev | next [-] | | I guess the new models will still be quantum leaps, but literally: "The smallest possible change in a system" | | |
| ▲ | ctoth 5 hours ago | parent [-] | | Yups... Mythos is the smallest possible leap. Not a standard model generation advance, not even a version point advance. Just the smallest possible quantum of a change. We are absolutely hitting a plateau any day now. Any day. Any time. Any second now. Yup. Right now! Surely! | | |
| ▲ | cubefox 5 hours ago | parent [-] | | Yeah. AI progress is insanely fast if you compare it to anything else. Where else is a one year old technology already hopelessly outdated? 10 years ago is basically stone age. | | |
| ▲ | madamelic 4 hours ago | parent [-] | | I am continually tripped out by the fact that when I was 16, I didn't have a 'smartphone' beyond a Windows Mobile 6 phone that had no internet on it. Now, I have this high-resolution shiny object that can near-instantaneously get any information I want along with _streaming HD video to it_ *anywhere*. Even 15 years feels like a stone age. I can't fathom what it has to feel like for people in their 60s and 70s. | | |
| ▲ | nonameiguess 4 hours ago | parent [-] | | I'm not quite 60, but it's always interesting to me that I feel quite the opposite of this. When I was 16, I didn't have a computer, didn't have a phone, had never used the Internet, but when I think of how life has changed, it's frankly not much. I woke up this morning, scooped my cats' litter boxes, took out some trash, made myself breakfast, ate that, read some news while eating, then lifted weights in my garage, had some work meetings, wrote up some instructions per a customer request from Friday, and am about to drive to the lake to do a 9-mile longboard loop. That's very close to a normal day in 1996. The biggest difference is I read the news on my phone instead of a physical newspaper. The news was not any more interesting or informative because of that. I guess I can also still do the loop reasonably well, but I'm a lot slower than I was in 1996 when I was a cross-country state champion. My parents are closing in on 70 and I guess I can't speak for them, but I'm at least aware of the daily routines of their lives, too. Walk the dog, do housework, DIY building projects, visit kids and grandkids. Seems much the same, too, with the biggest difference being they're now teaching my sister's sons to play baseball rather than me, but shit, one of her sons even looks exactly the way I looked when I was 7! The more things change, the more they stay the same. | | |
| ▲ | madamelic an hour ago | parent | next [-] | | Thank you for this insight! I always wonder about the views of older people. My parents are very technology-forward and have been my entire life, so it is difficult to gauge how different life is compared to when they were growing up. It's easy to hear "Oh well I only had 640kb of memory and typed programs out of a magazine I got in the mail!" and see it as distinct from having 'unlimited' resources and the internet. Your insight is good ("The biggest difference is I read the news on my phone instead of a physical newspaper") that life sort of stays the same but the modality changes. People still go to the store like they did in the mid-1800s, but now it is by car. I wonder what our "industrial revolution" will be, where the previous generation lived totally different lives (i.e., out in the country on a farm) from the current one (i.e., in the city in a factory). Maybe when space travel and multi-planetary living is normalized? | |
| ▲ | bobthepanda 31 minutes ago | parent | next [-] | | To some degree this already happened with the move from the industrial city to suburbanization and then re-urbanization. In particular one of the most notable recent developments is that urban waterways are now pretty desirable places to be with parks and recreation; in most industrializing cities the waterfront was actively avoided because the industrial use made it polluted, smelly etc. | |
| ▲ | saulpw an hour ago | parent | prev [-] | | > It's easy to hear "Oh well I only had 640kb of memory and typed programs out of a magazine I got in the mail!" Since I was there (young, but there), I want to point out that this crosses three eras which all felt quite different: 1978: typed programs in from a magazine or loaded from a cassette (16kB, TRS-80)
1983: loaded programs from a floppy (64kB, Apple ][ and C64 etc)
1988: loaded programs from a hard disk (640kB, IBM PC and Mac).
Exact years vary but these eras were only about 5 years each. Nobody had a floppy in 1978 but almost every computer user did by 1983; nobody had a hard drive in 1983 but almost everyone did by 1988. |
| |
| ▲ | zdragnar 3 hours ago | parent | prev | next [-] | | Depends on where you live. My dad is almost 80, grew up in a very rural area, and when he was 16 they'd just gotten indoor plumbing. Up until he was 14, his school was a one-room school house with no heating other than a wood stove. If you were the first kid to arrive for the day, it was your job to get the fire going in winter months. Housework meant no laundry machine, no dishwasher, and possibly no vacuum cleaner. That means hand washing everything, and beating rugs with sticks and brushes to get the dust off of them. | |
| ▲ | rootusrootus 3 hours ago | parent | prev [-] | | If your parents are closing in on 70, I would have expected you to be closer to not quite 50 than not quite 60. I am just over 50 myself and I agree with your points. Technology has changed but life is largely very similar to what it was in the 90s. At least day to day. Attitudes are way worse now. |
|
|
|
|
| |
| ▲ | Fire-Dragon-DoL 6 hours ago | parent | prev [-] | | I hope it's true, but right now hardware prices are insane |
|
|
| ▲ | hirako2000 5 hours ago | parent | prev | next [-] |
| It does feel like the music is about to stop. It has been years of cash injections now; investors can't keep feeding the beast forever. |
| |
| ▲ | Gigachad an hour ago | parent | next [-] | | This is the best AI programming will ever be. From here on the enshittification starts and the prices go up. | |
| ▲ | ctoth 5 hours ago | parent | prev [-] | | It has been years now of reading this same comment... Surely people can't keep typing it forever. | | |
| ▲ | stuartq 3 hours ago | parent [-] | | But the prices haven't been going up by multiples of 6 for the past few years. Things are actually changing now. I don't think it's over, but in the short term, it's going to be considerably more expensive. | | |
| ▲ | hirako2000 2 hours ago | parent [-] | | They will smooth out the spike. Or be subtle and quietly transform the existing quotas so that they run out more quickly, calling it caching, compression, optimisation--all for the sacred benefit of the users, of course. That would be--even already is--the smart thing to do. |
|
|
|
|
| ▲ | stefan_ 3 hours ago | parent | prev | next [-] |
| Dunno, if in this day and age you are making inference more expensive and more scarce, you are honestly moving in the wrong direction, and DeepSeek and others will gladly take your lunch. |
| |
| ▲ | Gigachad an hour ago | parent [-] | | The hardware to run deepseek is still incredibly expensive. | | |
| ▲ | cheema33 26 minutes ago | parent [-] | | > The hardware to run deepseek is still incredibly expensive. DeepSeek API pricing is very low compared to Anthropic/OpenAI API pricing. For many, a 300% difference in pricing may be difficult to justify if the quality difference is small. And there will be many tasks where the most expensive/best model is not needed. Currently, many people end up using Opus 4.7/GPT 5.5 for tasks without thinking about it. |
|
|
|
| ▲ | glicvdfhsdf an hour ago | parent | prev | next [-] |
| [dead] |
|
| ▲ | bluescrn 5 hours ago | parent | prev [-] |
| Did anyone really expect AI to be cheap? If/when it gets to the point where it can replace a skilled worker, the service can be sold for close to the same price as that skilled labour. But the AI can run 24/7, reliably, and scale up/down at a moment's notice. There's not going to be much competition to drive prices down; the barriers to entry are already huge. There'll likely be one clear winner, becoming a near-monopoly, or maybe we'll get a duopoly at best. |
| |
| ▲ | hansmayer 4 hours ago | parent | next [-] | | > Did anyone really expect AI to be cheap? Yes, a lot of people (not me). Why? Well, because that was the whole value proposition of these companies, relentlessly pushed by their PR and most of the media--remember, it was something something Pocket PhDs, massive unemployment, etc.? | |
| ▲ | rwyinuse 4 hours ago | parent | prev | next [-] | | "There's not going to be much competition to drive prices down, the barriers to entry are already huge. There'll likely to be one clear winner, becoming a near-monopoly, or maybe we'll get a duopoly at best." Based on what exactly? So far every time OpenAI, Anthropic or whatever has released a new top performing model, competitors have caught up quickly. Open source models have greatly improved as well. I expect AI to be just like cloud computing in general - AWS, Azure, GCP being the main providers, with dozens of smaller competitors offering similar services as well. | |
| ▲ | flir 4 hours ago | parent | prev [-] | | I do. "Commoditize your complement". Want to sell lots of silicon? Give away good local models to run on that silicon. Even if SOTA models in the cloud are a few percentage points better, most work can be routed to local models most of the time. That leaves the cloud providers fighting over the most computationally intensive tasks. In the long term, I think models are going to be local-first. (Unless providers can figure out a network effect that local models can't replicate). | | |
| ▲ | vanviegen 4 hours ago | parent [-] | | > In the long term, I think models are going to be local-first. Why? There's an inherent efficiency advantage to scale, while the only real advantage for local models (privacy/secrecy) hasn't proven convincing for broader IT either. | | |
| ▲ | solid_fuel 2 hours ago | parent | next [-] | | Local-first models aren't just more private than the API vendors'; they also have the advantages of fixed cost, lower latency, and better stability - local models don't get nerfed/"updated" in the background the way ChatGPT does. Maybe in a world where these AI companies behaved with some semblance of ethics and user-friendliness they would be on even ground, but for anyone paying attention, local models are obviously the future. | |
| ▲ | LtWorf 2 hours ago | parent | prev [-] | | To not depend on an external company that can decide the price. |
|
|
|