parliament32 20 hours ago

How could this possibly work? Google is profitable because they can insert ~4 ads into a search query. An LLM query costs about two orders of magnitude more resources to run than a Google search, so I'm not seeing it unless OAI can figure out how to shoehorn 400 ads per prompt into the interface somehow.
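The arithmetic behind that comment can be sketched out directly (the cost ratio and ad count come from the comment itself; everything else is an illustrative assumption, not a published figure):

```python
# Back-of-envelope: if an LLM query costs ~100x a Google search (the
# "2 orders of magnitude" claim), then holding revenue-per-ad constant,
# matching Google's cost-to-revenue ratio needs ~100x the ad load.
google_ads_per_query = 4
cost_ratio = 100  # LLM query cost / search query cost (assumed)

ads_needed = google_ads_per_query * cost_ratio
print(ads_needed)  # 400 -- the "400 ads per prompt" figure
```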

strongpigeon 20 hours ago | parent | next [-]

Longer sessions. You also have more potential for brand advertising compared to Google Search (which is mostly conversion ads). That being said, it's risky and can ruin your product if done wrong, but I think there are ways to do it right.

parliament32 20 hours ago | parent [-]

But longer sessions means more queries, which means more costs, which makes the problem even worse, right?

strongpigeon 19 hours ago | parent | next [-]

Only if the margins are negative. Yes, the cost to serve is most likely higher than Google's SRP, but I think the ads can be even better targeted and potentially have a higher CPM than Goog's.

What I'm saying is that I believe their ARPU could be higher than Google's, while what I think you're saying is that their cost will also be higher. I agree with that; where we differ is that I think that even with a thinner margin, there is still potential to make a ton of money there.

whattheheckheck 10 hours ago | parent | prev [-]

They have a dataset of your deepest thoughts and questions AND a way to ask you stuff. It's literally the most valuable dataset on the planet.

tim333 20 hours ago | parent | prev [-]

Google now adds an LLM result to about half the search results. I think they've figured out how to do that without too many resources.

parliament32 20 hours ago | parent [-]

Google doesn't have to fight context windows. They can cache and store an AI response to a Google query without having to worry about much other than locale etc. You can't do that a dozen messages into an LLM conversation.
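The caching point can be sketched as a cache-key problem (this is a hypothetical illustration of why such caching works for standalone queries, not a claim about how Google actually implements it):

```python
import hashlib

def search_cache_key(query: str, locale: str) -> str:
    """A standalone (query, locale) pair repeats across users, so a cached
    AI response to it gets a high hit rate."""
    return hashlib.sha256(f"{locale}|{query}".encode()).hexdigest()

def conversation_cache_key(messages: list[str], locale: str) -> str:
    """A dozen messages in, the key must cover the whole conversation
    history; two users almost never share it, so hit rates collapse."""
    return hashlib.sha256("|".join([locale] + messages).encode()).hexdigest()

# The same standalone question maps to the same cache entry...
assert search_cache_key("best laptop", "en-US") == search_cache_key("best laptop", "en-US")
# ...but not once it's embedded in two different conversation histories.
a = conversation_cache_key(["hi", "I need a gift", "best laptop"], "en-US")
b = conversation_cache_key(["hello", "budget is $500", "best laptop"], "en-US")
print(a == b)  # False
```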