Wowfunhappy 16 hours ago
> This echoes a lot of the rhetoric around "but how will facebook/twitter/etc make money?" back in the mid 2000s.

The difference is that Facebook costs virtually nothing to run, at least on a per-user basis. (Sure, if you have a billion users, all of those individual rounding errors still add up somewhat.) By contrast, if you're spending lots of money per user... well, look at what happened to MoviePass!

The counterexample here might be YouTube; when it launched, streaming video was really expensive! It's still expensive today, but clearly Google has figured out the economics.
jsnell 15 hours ago
You're either overestimating the cost of inference or underestimating the cost of running a service like Facebook at that scale. Meta's cost of revenue (i.e. just running the service, not R&D, not marketing, not admin, none of that) was about $30B in 2024. According to the OpenAI financials leaked last year, their 2024 inference costs were about 1/10th of that.