senfiaj a day ago

Yeah, but I think it has much higher CPU usage, at least when there is no native hardware decoder/encoder. Maybe this has more to do with adoption, since H.264 has been an industry standard.

toast0 a day ago

Codec selection is always a complex task. You've got to weigh quality/bitrate vs availability of hardware encode/decode, licensing, and overall resource usage.
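
As a toy illustration of that weighing (every number below is a made-up placeholder, not data from anywhere), you can think of it as scoring candidates on exactly those axes; a minimal sketch in Python:

    from dataclasses import dataclass

    @dataclass
    class CodecProfile:
        name: str
        quality_per_bitrate: float  # relative quality at a fixed bitrate (higher is better)
        hw_decode_share: float      # fraction of target devices with hardware decode, 0..1
        license_cost: float         # relative licensing burden (higher is worse)
        sw_cpu_cost: float          # relative CPU cost of the software fallback (higher is worse)

    def score(c: CodecProfile) -> float:
        # Hypothetical weights: the point is only that the decision mixes several
        # competing axes, not that these particular numbers are right.
        return (2.0 * c.quality_per_bitrate
                + 1.5 * c.hw_decode_share
                - 1.0 * c.license_cost
                - 0.5 * c.sw_cpu_cost)

    candidates = [
        CodecProfile("h264", 1.0, 0.95, 0.5, 0.4),  # illustrative values only
        CodecProfile("vp9",  1.2, 0.70, 0.0, 0.8),  # illustrative values only
    ]
    print(max(candidates, key=score).name)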

The ITU standards have had a much better record of inclusion in the devices people actually have, and using hardware encode/decode often takes care of licensing. But hardware encode doesn't always match software quality at a given bitrate, and may not be able to do fancier things like simulcast or SVC. Some of the hardware decoders are pretty picky about what kinds of streams they'll accept, too.
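
On the availability point, here's a quick way to see which hardware H.264 encoders a given ffmpeg build even exposes (Python calling the ffmpeg CLI; being listed only means the encoder was compiled in, and a test encode is still needed to know the GPU/driver will actually accept your stream):

    import subprocess

    # Common ffmpeg hardware H.264 encoder names; which of these actually work
    # depends on the build and on the GPU/driver present at runtime.
    HW_H264 = {"h264_nvenc", "h264_vaapi", "h264_qsv", "h264_videotoolbox", "h264_amf"}

    out = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                         capture_output=True, text=True, check=True).stdout
    available = {name for name in HW_H264 if name in out}
    print("hardware H.264 encoders:", available or "none (software x264 fallback)")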

IMHO, if you're looking at software H.264 vs software VP9, VP9 is likely to give you better quality at a given bitrate, but it will take more CPU to do it. So, as always, it depends.
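
A rough sketch of how you'd actually measure that claim, assuming an ffmpeg build with libx264, libvpx-vp9, and libvmaf, a local lossless clip named input.y4m, and a placeholder bitrate (the JSON keys follow libvmaf 2.x and may differ across versions):

    import json, subprocess

    SRC = "input.y4m"   # placeholder lossless source clip
    BITRATE = "1500k"   # same target bitrate for both encoders

    def encode(codec_args, out_path):
        subprocess.run(["ffmpeg", "-y", "-i", SRC, *codec_args, "-b:v", BITRATE, "-an", out_path],
                       check=True, capture_output=True)

    def vmaf(distorted):
        # libvmaf filter: distorted clip is the first input, the reference is the second.
        subprocess.run(["ffmpeg", "-i", distorted, "-i", SRC,
                        "-lavfi", "libvmaf=log_fmt=json:log_path=vmaf.json",
                        "-f", "null", "-"], check=True, capture_output=True)
        with open("vmaf.json") as f:
            return json.load(f)["pooled_metrics"]["vmaf"]["mean"]

    encode(["-c:v", "libx264", "-preset", "medium"], "out_h264.mp4")
    encode(["-c:v", "libvpx-vp9", "-deadline", "good", "-row-mt", "1", "-cpu-used", "2"], "out_vp9.webm")
    print("x264 VMAF:", vmaf("out_h264.mp4"))
    print("vp9  VMAF:", vmaf("out_vp9.webm"))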

Dylan16807 16 hours ago

> IMHO, if you're looking at software H.264 vs software VP9, VP9 is likely to give you better quality at a given bitrate, but it will take more CPU to do it. So, as always, it depends.

That's a pretty messy way to measure. H.264 with more CPU can also beat H.264 with less CPU.

How does the quality compare if you hold both bitrate and CPU constant?

How does the CPU compare if you hold both bitrate and quality constant?

AV1 will do significantly better than H.264 on both of those tests. How does VP9 do?
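
One way to answer both questions at once, sketched under the same assumptions as before (ffmpeg with libx264, libvpx-vp9, and libvmaf; a placeholder input.y4m; Unix-only CPU accounting via getrusage): hold bitrate fixed, sweep each encoder's speed knob, and record CPU seconds plus VMAF for every run. Then compare quality at matched CPU, or CPU at matched quality.

    import json, resource, subprocess

    SRC = "input.y4m"   # placeholder source clip
    BITRATE = "1500k"   # bitrate held constant across every run

    def cpu_encode(codec_args, out_path):
        # CPU seconds burned by the child ffmpeg process, via getrusage(RUSAGE_CHILDREN).
        before = resource.getrusage(resource.RUSAGE_CHILDREN)
        subprocess.run(["ffmpeg", "-y", "-i", SRC, *codec_args, "-b:v", BITRATE, "-an", out_path],
                       check=True, capture_output=True)
        after = resource.getrusage(resource.RUSAGE_CHILDREN)
        return (after.ru_utime + after.ru_stime) - (before.ru_utime + before.ru_stime)

    def vmaf(distorted):
        # Same VMAF measurement as in the earlier sketch: distorted first, reference second.
        subprocess.run(["ffmpeg", "-i", distorted, "-i", SRC,
                        "-lavfi", "libvmaf=log_fmt=json:log_path=vmaf.json",
                        "-f", "null", "-"], check=True, capture_output=True)
        with open("vmaf.json") as f:
            return json.load(f)["pooled_metrics"]["vmaf"]["mean"]

    points = []
    for preset in ["veryfast", "medium", "slow"]:        # x264 speed knob
        cpu = cpu_encode(["-c:v", "libx264", "-preset", preset], "h264.mp4")
        points.append((f"x264/{preset}", cpu, vmaf("h264.mp4")))
    for cpu_used in ["5", "3", "1"]:                     # libvpx-vp9 speed knob
        cpu = cpu_encode(["-c:v", "libvpx-vp9", "-deadline", "good", "-row-mt", "1",
                          "-cpu-used", cpu_used], "vp9.webm")
        points.append((f"vp9/cpu-used={cpu_used}", cpu, vmaf("vp9.webm")))

    # At similar CPU, which row has the higher VMAF? At similar VMAF, which used fewer CPU seconds?
    for name, cpu, score in sorted(points, key=lambda p: p[1]):
        print(f"{name:22s} {cpu:8.1f} cpu-s  VMAF {score:.1f}")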