jdoliner 5 hours ago
It's not totally obvious to me that you can get the economics of this to work. A Google search costs ~0.04 cents to serve, whereas a frontier reasoning LLM request costs about 2 cents. The revenue from a Google search is also around 2 cents, so the margins on an LLM are dangerously thin. Now, there are lots of variables that can be tweaked here, so it's possible to get it to work. But there's a lot less room for error.
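To make the margin gap concrete, here's a quick back-of-envelope sketch using the rough per-request figures from the comment above (these are the commenter's estimates, not measured costs):

```python
# Per-request economics, in US cents. All values are the rough
# estimates from the comment, not actual measured figures.
google_cost = 0.04  # ~0.04 cents to serve a Google search
llm_cost = 2.0      # ~2 cents for a frontier reasoning LLM request
revenue = 2.0       # ~2 cents of ad revenue per search

google_margin = revenue - google_cost  # ~1.96 cents of profit per query
llm_margin = revenue - llm_cost        # ~0 cents: break-even at best

print(f"Google margin per query: {google_margin:.2f} cents")
print(f"LLM margin per query:    {llm_margin:.2f} cents")
```

Under these numbers Google keeps roughly 98% of each query's revenue as gross margin, while the LLM keeps roughly none, which is the "a lot less room for error" point.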
btheunissen 4 hours ago
The obvious knob to turn here is the floor price of ad auctions, which will be incredibly high, with the justification that AI is expensive. As someone outside the ad-tech space, it blows my mind how much Instagram and Google ads cost these days, and OpenAI would certainly want to price its ad offering as more "premium" (see: $$$).
ggregoire 3 hours ago
Every Google search request triggers a Gemini request, though. Which is great: that's why I don't use ChatGPT at all. Having an LLM summary plus a list of websites to deepen the search if I need to is just a superior user experience, IMO.
magixx 5 hours ago
Maybe they'll gamify it with credits and make you watch ads in order to earn them, so you can use the service for free.
pengaru 44 minutes ago
Presumably they'll be embedding commercial influence in the interaction itself, where you have no clue ad dollars are steering the conversation. That will no doubt be worth more than Google's $.02/search revenue, since the users will be completely incapable of separating the wheat from the chaff.