hparadiz 2 days ago:
The economics here is spending a few hundred bucks on software for an IC you're already paying over ten grand a month, in order to make them more productive. How are supposedly smart industry experts not seeing this obvious fact? Are these guys actually experts?
|
Yizahi 2 days ago:
It's more like spending potentially a thousand bucks a month (hypothetically: heavy API usage by a developer running top-of-the-line agents at 100% every day, with pricing adjusted to actually be profitable) when you're paying that dev 4 to 6 grand a month before taxes. Now that would be a close call.
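The break-even logic in this comparison can be sketched quickly. All numbers are the thread's hypotheticals, not real pricing:

```python
# Back-of-the-envelope break-even: what fraction of extra output
# does an AI tooling spend need to produce to pay for itself,
# given the developer's monthly cost?
def breakeven_gain(tool_cost: float, dev_cost: float) -> float:
    """Fraction of added productivity needed to break even."""
    return tool_cost / dev_cost

# Scenario from upthread: a few hundred bucks vs. a $10k/month IC
print(f"{breakeven_gain(300, 10_000):.1%}")   # 3.0% more output breaks even

# Lower-salary scenario: $1000/month heavy API use vs. a $5k/month dev
print(f"{breakeven_gain(1_000, 5_000):.1%}")  # 20.0% needed - much closer call
```

The asymmetry is the point: the same tool price implies a very different required productivity gain depending on local salary levels.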
|
Rury a day ago:
NVIDIA execs are now saying otherwise: https://fortune.com/2026/04/28/nvidia-executive-cost-of-ai-i... Maybe Ed is right even if he's wrong on some things?
|
xienze 2 days ago:
> The economics is spending a few hundred bucks on software for an IC you're already paying over ten grand a month

Let's be fair here: the endgame is not "a few hundred bucks a month." Not with how much money has been invested. How much extra you'll have to spend, how much more productive it makes developers, and whether companies will go along with it is the trillion-dollar question.
koliber 2 days ago:
A long time ago the vast majority of people on earth were farmers. They used relatively simple tools like scythes. Over a few centuries, better tools and technology brought it to the point where <5% of the population in rich countries are farmers. They use tools like million-dollar harvesters.
legulere 2 days ago:
It's not the 20x efficiency of harvesting technology over what agrarian societies had that makes these machines make sense. It's the productivity of the other 95% of the population that makes farm labor cost so high that such expensive machines make economic sense.
hparadiz 2 days ago:
You know I can just look up the costs per seat, right? It's not that much, and not everyone at an org is a heavy user. And for code, the cost per compute cycle is falling.
xienze 2 days ago:
First, the key phrase here is "endgame." Whatever pricing you're looking at now isn't where prices will be in short order. Second, it seems hard to believe that hundreds of billions of dollars would be spent and untold numbers of data centers built just to earn a measly couple hundred dollars per seat.
fragmede a day ago:
But it's a lot of seats. If you get 1 billion people to pay $20/month, that's $20 billion a month, or $240 billion a year. Over 10 years that's $2.4 trillion.
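The seat-revenue arithmetic is worth writing out, since month vs. year vs. decade totals are easy to conflate. Seat count and price are the comment's hypotheticals:

```python
# Hypothetical subscription revenue at the scale named in the comment
seats = 1_000_000_000        # 1 billion paying users
price_per_month = 20         # dollars per seat per month

monthly = seats * price_per_month
yearly = monthly * 12
decade = yearly * 10

print(f"${monthly:,}/month")    # $20,000,000,000/month
print(f"${yearly:,}/year")      # $240,000,000,000/year
print(f"${decade:,} per decade")  # $2,400,000,000,000 per decade
```

At those (optimistic) assumptions, gross revenue over a decade lands in the trillions, though whether a billion people will pay $20/month is exactly what's in dispute upthread.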
CodingJeebus 2 days ago:
It's a few hundred bucks per month for now, but that's not going to last. At some point, the industry is going to pivot toward tracking token-based productivity, because it's not going to stay cheap unless FOSS models catch up.
m4rtink 2 days ago:
Please don't call open-weight models FOSS models - that's actually very wrong, unless you have all the training data and can modify the data and training methodology to retrain the model yourself.
zozbot234 2 days ago:
FOSS models have effectively caught up with respect to scale - see e.g. the latest DeepSeek V4 series - but they still require major hardware resources (hundreds of gigabytes of RAM for even a lean deployment targeting single- or few-user inference) to run at acceptable throughput.