▲ | derf_ 8 days ago
> ...it's hardly a robust strategy.

I disagree. Video is such a large percentage of internet traffic, and licensing fees are so high, that it becomes possible for any number of companies to subsidize the development cost of a new codec on their own and still net a profit. Google certainly spends the most money, but they were hardly the only ones involved in AV1. At Mozilla we developed Daala from scratch and had reached performance competitive with H.265 when we stopped to contribute the technology to the AV1 process, and our team's entire budget was a fraction of what the annual licensing fees for H.264 would have been. Cisco developed Thor on their own with just a handful of people and contributed that, as well. Many other companies contributed technology on a royalty-free basis.

Outside of AV1, you regularly see things like Samsung's EVC (or LC-EVC, or APV, or...), or the AVS series from the Chinese.... If the patent situation were more tenable, you would see a lot more of these. The cost of developing the technology is not the limitation. I would argue the cost to get all parties to agree on a common standard, and the cost to deploy it widely enough for people to rely on it, is much higher, but people manage that on a royalty-free basis for many other standards.
▲ | thinkingQueen 8 days ago | parent | next [-]
You’re comparing apples to oranges. Daala was never meant to be widely adopted in its original form — its complexity alone made that unlikely. There’s a reason why all widely deployed codecs end up using similar coding tools and partitioning schemes: they’re proven, practical, and compatible with real-world hardware.

As for H.265, it’s the result of countless engineering trade-offs. I’m sure if you cherry-picked all the most experimental ideas proposed during its development, you could create a codec that far outperforms H.265 on paper. But that kind of design would never be viable in a real-world product — it wouldn’t meet the constraints of hardware, licensing, or industry adoption.

Now the following is a more general comment, not directed at you. There’s often a dismissive attitude toward the work done in the H.26x space. You can sometimes see this even in technical meetings, when someone proposes a novel but impractical idea and gets frustrated when others don’t immediately embrace it. But there’s a good reason for the conservative approach: codecs aren’t just judged by their theoretical performance; they have to be implementable, efficient, and compatible with real-world constraints. They also have to somehow make financial sense and cannot be given away without some form of compensation.
▲ | mike_hearn 8 days ago | parent | prev [-]
Mozilla is just Google from a financial perspective — it's not an independent org — so the financing point stands.

H.264 was something like >90% of all video a few years ago, and wasn't it free for streaming if the end user wasn't paying? IIRC someone also paid the fees for an open source version. There were pretty good licensing terms available, and all the big players have used it extensively.

Anyway, my point was only that expecting Google to develop every piece of tech in the world and give it all away for free isn't a general model for tech development, whereas IP rights and patent pools are. The free ride ends the moment Google decides they need more profit, feels threatened in some way, or gets broken up by the government.