Frieren 2 days ago

As financial markets get tighter AI companies will stop subsidizing their services and charge enough money to actually make a profit.

It is time to set up local models. It is cheaper, and you already have a computer. Why keep it idle and pay someone else for their CPU?

tempaccount5050 2 days ago | parent | next [-]

Because it doesn't even come close to frontier models in intelligence/speed/price. I can run my 3090 nonstop and rack up an electricity bill that costs more than a subscription, and get worse results more slowly. Local models are fine for simple tasks, but that's not really what I need AI for.
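A back-of-envelope sketch of that comparison. All numbers here are assumptions (board power, local electricity price, a ~$20/month subscription), not measurements:

```python
# Rough cost of running an RTX 3090 around the clock vs. a typical
# ~$20/month AI subscription. Every number below is an assumption.

GPU_WATTS = 350            # assumed sustained board power for a 3090
HOURS_PER_MONTH = 24 * 30
PRICE_PER_KWH = 0.30       # assumed USD/kWh; varies widely by region

kwh_per_month = GPU_WATTS / 1000 * HOURS_PER_MONTH   # energy used
electricity_cost = kwh_per_month * PRICE_PER_KWH     # monthly bill

print(f"{kwh_per_month:.0f} kWh/month -> ${electricity_cost:.2f}")
```

Under these assumptions the GPU alone costs roughly $75/month in electricity, a few times a typical subscription, before counting hardware depreciation.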

datsci_est_2015 2 days ago | parent | next [-]

I feel the opposite. I do need AI for simple things. Complex things are usually so ill-defined that the actual bottleneck takes place in meatspace, not in my IDE.

Hendrikto 2 days ago | parent | prev | next [-]

Well, it is currently cheaper because it is massively subsidized. That will change when subsidies stop. I don’t think it is a good argument.

ihattendorf 2 days ago | parent | next [-]

The claim was "It is cheaper", not "It will be cheaper". Until it actually _is_ cheaper, it doesn't make much sense to purchase $10k+ in hardware to run local models that are still worse than the frontier offerings.

Chris2048 2 days ago | parent [-]

> Until it actually _is_ cheaper, it doesn't make much sense to purchase

Once it is cheaper, there will be more demand so it will no longer be cheaper. Buying now gets current prices (though demand is still fairly high).

hobofan 2 days ago | parent | prev | next [-]

No it's not. AI products are quite often subsidized. AI inference very certainly is not.

There are more and more independent AI inference providers without VC backing that serve open-weight models on a roughly cost-plus basis, which shows that subsidies are not significant for AI inference.

cultofmetatron 2 days ago | parent | prev [-]

if only there were a place that was naturally cold, with good airflow for cooling and cheap renewable electricity that's always on...

plopz 2 days ago | parent [-]

are you saying aluminum smelters are going to convert to ai datacenters?

yeswecatan 2 days ago | parent | prev | next [-]

I assume because local models are nowhere near as good. Hoping I’m wrong!

datsci_est_2015 2 days ago | parent [-]

The better your code is architected, the less powerful a model you'll need to make sense of it.

E.g. a well-designed deployment (infrastructure-as-code) repository doesn't need a frontier model to be understood well enough to create a new job / service using sibling jobs / services as templates.

And this already saves me dozens of minutes per week, although it’s not a 2x multiplier in my efficiency.
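The "sibling job as template" pattern being described is essentially a mechanical transformation, which is why a small model can handle it. A hypothetical sketch (the job keys, registry URL, and "orders"/"payments" service names are all invented for illustration):

```python
# Hypothetical example: deriving a new deployment job from a sibling
# job's config. All names and keys here are made up for illustration.

import copy

sibling_job = {
    "name": "deploy-orders",
    "image": "registry.example.com/orders:latest",
    "env": {"SERVICE": "orders", "REPLICAS": "3"},
}

def scaffold_job(template: dict, service: str) -> dict:
    """Clone a sibling job and swap in the new service's identifiers."""
    job = copy.deepcopy(template)           # don't mutate the template
    old_service = template["env"]["SERVICE"]
    job["name"] = f"deploy-{service}"
    job["image"] = job["image"].replace(old_service, service)
    job["env"]["SERVICE"] = service
    return job

new_job = scaffold_job(sibling_job, "payments")
print(new_job["name"])  # deploy-payments
```

Nothing here requires deep reasoning, only consistent substitution, which matches the claim that well-structured repos lower the bar for the model.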

lukaslalinsky 2 days ago | parent | next [-]

I disagree, even though I'd love for it to be different. With models like Opus, I can give it a good architecture and expect good results. For many of the less expensive models, that is not the case: they make mistakes, you need to over-specify, they get stuck in loops, etc. By the time you get to the models you can realistically run locally, it's so frustrating I'd rather be writing the code myself.

datsci_est_2015 a day ago | parent [-]

At what point will local inference catch up to today’s cloud inference? Will it ever? If it doesn’t, does that imply a certain dead-end for the LLM inference industry?

lukaslalinsky 5 hours ago | parent [-]

I don't think that at any point in the foreseeable future we will have terabytes of RAM on dedicated LLM chips at home.
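The memory arithmetic behind that claim is straightforward: weight storage scales linearly with parameter count and bytes per parameter. A sketch (the model sizes are illustrative, and this ignores KV cache and activation memory, which add more on top):

```python
# Rough memory needed just to hold model weights, ignoring KV cache
# and activations. Bytes per parameter depends on quantization:
# fp16 = 2 bytes, 4-bit quantization = 0.5 bytes.

def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Weight storage in GB (1 GB = 1e9 bytes) for a given model size."""
    return params_billions * 1e9 * bytes_per_param / 1e9

for params, label in [(70, "70B"), (405, "405B"), (1000, "~1T")]:
    fp16 = weight_memory_gb(params, 2.0)
    q4 = weight_memory_gb(params, 0.5)
    print(f"{label}: {fp16:.0f} GB fp16, {q4:.0f} GB 4-bit")
```

Even aggressively quantized, a 405B-parameter model needs on the order of 200 GB just for weights, which is well beyond a single consumer GPU's VRAM.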

varispeed 2 days ago | parent | prev [-]

The issue is that local models are dumb and tend to make mistakes that look good at first glance. So any "saving" is quickly wiped out by having to do an extensive review. You might as well just write things yourself.

datsci_est_2015 2 days ago | parent [-]

I use it as code scaffolding, which means in a way I’m often rewriting it. For me, writing from scratch isn’t the same amount of effort as using a code scaffolding tool.

aaarrm 2 days ago | parent | prev | next [-]

Won't competition likely keep prices low? At first maybe not, but sooner or later open models will catch up, then it's a completely open market for anyone to host and sell services.

afavour 2 days ago | parent | next [-]

Competition won't keep prices below cost. Only subsidy by investors can do that and they won't do that forever.

Frieren a day ago | parent | prev [-]

> Won't competition likely keep prices low?

If there are 100 companies you can choose from, yes.

If there are 3 oligarchs that own all options, no.

Capitalism only works when there is competition between many players. When there are fewer than a dozen players, prices are too easy to raise to maximize profit. They do not even need to talk to each other; not starting a price war is the only logical strategy, and they all follow it. That is why big tech is so problematic.

In the past, these kinds of companies were highly regulated. Phone companies were not allowed to wiretap calls, prices were capped by law, etc. Internet providers had the same regulations applied to them. But service providers run amok without oversight, abusing their position and hurting the rest of the economy. Do not expect lower prices in such an unregulated environment.

varispeed 2 days ago | parent | prev | next [-]

Local models are nowhere near the performance of frontier models, unless you can fork out something like £100k for hardware that gets you passable performance.

TacticalCoder 2 days ago | parent | prev [-]

> As financial markets get tighter ...

They never really stay tight for very long: the various states are way too busy flooding the world with endless money printing, kicking the can of public debt ever further down the road.

Covid financial crash? We went on to new highs. The 2022 tech crash (Meta and Netflix dropped ~75%, for example)? We then went on to new highs.

The only option for governments that perpetually spend far more than they bring in in taxes is to devalue the currency.

So "financial markets getting tighter": probably won't last.

Hendrikto 2 days ago | parent [-]

As you said yourself: Quantitative easing did not solve anything. We keep kicking the can down the road, and the problems grow exponentially every time. This approach won’t work forever. In fact, we may be past the tipping point already.