bluescrn 5 hours ago

Did anyone really expect AI to be cheap?

If/when it gets to the point where it can replace a skilled worker, the service can be sold for close to the same price as that skilled labour. But the AI can run 24/7, reliably, and scale up/down at a moment's notice.

There's not going to be much competition to drive prices down; the barriers to entry are already huge. There will likely be one clear winner, becoming a near-monopoly, or maybe we'll get a duopoly at best.

hansmayer 3 hours ago | parent | next [-]

> Did anyone really expect AI to be cheap?

Yes, a lot of people (not me). Why? Because that was the whole value proposition of these companies, relentlessly pushed by their PR and most of the media. Remember? It was something something Pocket PhDs, massive unemployment, etc.

rwyinuse 4 hours ago | parent | prev | next [-]

"There's not going to be much competition to drive prices down, the barriers to entry are already huge. There'll likely to be one clear winner, becoming a near-monopoly, or maybe we'll get a duopoly at best."

Based on what, exactly? So far, every time OpenAI, Anthropic, or another lab has released a new top-performing model, competitors have caught up quickly. Open source models have greatly improved as well.

I expect AI to be just like cloud computing in general - AWS, Azure, GCP being the main providers, with dozens of smaller competitors offering similar services as well.

flir 4 hours ago | parent | prev [-]

I do. "Commoditize your complement". Want to sell lots of silicon? Give away good local models to run on that silicon.

Even if SOTA models in the cloud are a few percentage points better, most work can be routed to local models most of the time. That leaves the cloud providers fighting over the most computationally intensive tasks. In the long term, I think models are going to be local-first.

(Unless providers can figure out a network effect that local models can't replicate).
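The routing idea above can be sketched as follows. Everything here is a hypothetical stand-in (the model functions, the complexity heuristic, the budget threshold), not any real provider's API; it just illustrates sending most requests to a cheap local model and escalating only the expensive ones:

```python
# Hypothetical sketch: route cheap/common prompts to a local model and
# reserve the cloud SOTA model for tasks over a complexity budget.
# Both "models" and the heuristic are illustrative stubs, not real APIs.

def local_model(prompt: str) -> str:
    return f"[local] {prompt[:20]}"

def cloud_model(prompt: str) -> str:
    return f"[cloud] {prompt[:20]}"

def estimate_complexity(prompt: str) -> int:
    # Toy heuristic: very long prompts, or ones asking for heavy
    # reasoning, score higher and get escalated to the cloud.
    score = len(prompt) // 100
    if any(k in prompt.lower() for k in ("prove", "analyze", "refactor")):
        score += 5
    return score

def route(prompt: str, budget: int = 4) -> str:
    model = cloud_model if estimate_complexity(prompt) > budget else local_model
    return model(prompt)
```

In practice the interesting part is the heuristic: if it keeps, say, 90% of traffic local, the cloud provider is left competing only for the hardest 10% of tasks, which is the dynamic the comment describes.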

vanviegen 4 hours ago | parent [-]

> In the long term, I think models are going to be local-first.

Why? There's an inherent efficiency advantage to scale, while the only real advantage for local models (privacy/secrecy) hasn't proven convincing for broader IT either.

solid_fuel 2 hours ago | parent | next [-]

Local-first models aren't just more private than the API vendors' offerings; they also have the advantages of fixed cost, lower latency, and better stability - local models don't get nerfed/"updated" in the background like ChatGPT does.

Maybe in a world where these AI companies behaved with some semblance of ethics and user-friendliness they would be on even ground, but for anyone paying attention, local models are obviously the future.

LtWorf an hour ago | parent | prev [-]

So you don't depend on an external company that can set the price.