incomingpain a day ago

The exponential growth in parameter counts did pause. Open-weight models are catching up, but for the most part that growth ended. Model capability did keep rising exponentially, but where are we now? We hit a hardware limit. Even with datacenters full of huge GPUs, we don't have good use cases for 5-trillion-parameter models.

Then came MoE, which in my opinion is like multiplying the parameters: total counts went up, but at the same time the MoE models shrank what actually runs per token. It's organized better.
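A minimal sketch of the routing idea (made-up sizes, not any real model's): the layer stores many experts, but each token only executes the top-k the router picks, so total parameters and per-token compute decouple.

    import numpy as np

    rng = np.random.default_rng(0)
    D, N_EXPERTS, TOP_K = 128, 16, 2   # hypothetical sizes, not a real model's

    # The layer *stores* every expert's weights...
    w1 = rng.standard_normal((N_EXPERTS, D, 4 * D)) * 0.02   # expert up-projections
    w2 = rng.standard_normal((N_EXPERTS, 4 * D, D)) * 0.02   # expert down-projections
    router = rng.standard_normal((D, N_EXPERTS)) * 0.02

    def moe_layer(x):
        # ...but each token only *runs* the TOP_K experts the router picks.
        scores = x @ router
        picked = np.argsort(scores)[-TOP_K:]                 # indices of top-k experts
        gate = np.exp(scores[picked]) / np.exp(scores[picked]).sum()  # softmax gate
        out = np.zeros_like(x)
        for g, e in zip(gate, picked):
            out += g * (np.maximum(x @ w1[e], 0) @ w2[e])    # ReLU FFN expert
        return out

    moe_layer(rng.standard_normal(D))
    print(f"stored: {w1.size + w2.size:,} expert params; "
          f"run per token: {TOP_K * 2 * D * 4 * D:,} ({TOP_K} of {N_EXPERTS} experts)")

Scale those constants up and you get the modern open-weight giants: huge on disk, modest per token.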

If you're still looking for that exponential growth in raw size, you're looking at giant CAT mining dump trucks and thinking sports cars aren't big enough. The exponential growth is hiding now.

Then reasoning models happened, and the parameter count needed for a given level of quality shrank again.

Qwen3-Coder 480B (an MoE with only 35B active parameters) is a night-and-day better coder than, say, the dense Llama 3.1 405B. Not even comparable, and nobody debates it. The exponential growth is happening, just not in parameters.

How about AI usage? Stats show AI usage is 4x what it was on January 1st of this year. Might not be exponential, but wow! We don't even really know the private stats, but OpenAI has hundreds of billions in committed spend for two Stargate datacenters. They know what's up.
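Back-of-the-envelope on that 4x (a sketch: the multiple is from the stat above; the ten-month window is my assumption, since the exact end date isn't given):

    import math

    growth, months = 4.0, 10          # 4x since Jan 1; ~10 months is an assumption
    monthly = growth ** (1 / months) - 1   # implied compound monthly growth
    print(f"~{monthly:.0%}/month; doubling every "
          f"~{math.log(2) / math.log(1 + monthly):.1f} months")
    # prints ~15%/month; doubling every ~5.0 months

So 4x in about ten months works out to usage doubling roughly every five months.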

> Did top insider tech people and VCs lie to us again?

Yep, all lies. You should ignore AI and stop using it.

hodgehog11 a day ago | parent

> Yep, all lies. You should ignore AI and stop using it.

This is throwing the baby out with the bath water. There are a lot of really good things that we can do with this technology. Unfortunately, most tech companies and VCs are not as interested in these applications because they just want to make digital slaves.