TacticalCoder 3 hours ago

I understand that things can go wrong and there can be security issues, but I see at least two other issues:

1. What if, ChadGPT-style, ads are added to the answers (like OpenAI said it would do, hence the new "ChadGPT" nickname)?

2. What if the current prices really are unsustainable and the thing goes 10x?

Are we living some golden age where we can both query LLMs on the cheap and not get ad-infected answers?

I read several comments in different threads from people saying: "I use AI because search results are too polluted and the Web is unusable."

And I now do the same:

"Gemini, compare the HP Z640 and HP Z840 workstations, and list the features in a table" / "Find which Xeon CPUs they support, and list each CPU's release date, launch price, and typical used price now".

How long before I get twelve ads along with paid vendor recommendations?

spiderice 3 hours ago | parent | next [-]

> what if the current prices really are unsustainable and the thing goes 10x?

Where does this idea come from? We know how much it costs to run LLMs. It's not like we're waiting to find out. AI companies aren't losing money on API tokens. What could possibly happen to make prices go 10x when they're already running at a profit? Claude Max might be a different story, but AI is going to get cheaper to run. Not randomly 10x for the same models.

up-n-atom a minute ago | parent | next [-]

Where did you get this notion from? You must not be old enough to know how subscription services play out. Ask your parents about their internet or mobile bills.

Heck, we were spoiled by “memory is cheap”, but here we are today wasting it at every turn while prices keep skyrocketing (PS: they ain't coming down). If you can't see the shift toward forced subscriptions via technologies disguised as “security” (e.g. Secure Boot) and monopolistic distribution (Apple, Google, or the OEM), you're running with blinders on. Computing's future, as it's heading, is closed ecosystems and subscription services, mobile only. They'll nickel-and-dime users for every nuanced freedom of expression they can.

overgard 2 hours ago | parent | prev [-]

From what I've read, every major AI player is losing a lot of money running LLMs, even on inference alone. It's hard to say for sure because they don't publish the financials (or if they do, the figures tend to be obfuscated), but if the screws start being turned on investment dollars, they not only have to raise the prices of their current offerings (2x wouldn't shock me), but some of them also need a massive influx of capital to meet obligations like datacenter build-outs (tens of billions of dollars). So I don't think it's crazy to think that prices might go up quite a bit. We've already seen waves of it, like last summer when Cursor suddenly became a lot more expensive (or less functional, depending on your perspective).

sothatsit a minute ago | parent | next [-]

Dario Amodei has said that their models actually have a good return, even when accounting for training costs [0]. It's R&D and building the next, bigger models that lose them money year-to-year.

[0] https://youtu.be/GcqQ1ebBqkc?si=Vs2R4taIhj3uwIyj&t=1088

lemming 4 minutes ago | parent | prev | next [-]

Sam Altman is on record saying that OpenAI is profitable on inference. He might be lying, but it seems an unlikely thing to lie about.

hyperadvanced 2 hours ago | parent | prev [-]

This is my understanding as well. If GPT made money, wouldn't the companies that run these models be publicly traded?

Furthermore, the companies that are publicly traded show that, overall, these products are not economical. Meta and MSFT are great examples, though investors have recently appraised their results in opposite directions. Notably, OpenAI and MSFT are more closely linked than any other pairing of a Mag7 company and an AI startup.

https://www.forbes.com/sites/phoebeliu/2025/11/10/openai-spe...

fragmede an hour ago | parent [-]

Going public is not a trivial thing for a company to do. You may want to bring in additional facts to support your thesis.

crystaln an hour ago | parent | prev | next [-]

It seems much more likely that the cost will go down 99%. With open-source models and architectural innovations, something like Claude will run on a local machine for free.

FuckButtons 2 hours ago | parent | prev [-]

I asked Gemini deep research to project when that will likely happen based on historical precedent. It guessed October 2027.