senfiaj | a day ago
Yeah, but I think it has much higher CPU usage, at least when there is no native hardware decoder/encoder. Maybe this has more to do with adoption, since H264 has been an industry standard. | ||||||||
toast0 | a day ago | parent
Codec selection is always a complex task. You've got to weigh quality/bitrate against availability of hardware encode/decode, licensing, and overall resource usage.

The ITU standards have a much better record of inclusion in devices people actually have, and using hardware encode/decode often takes care of licensing. But hardware encode doesn't always match software encode on quality at a given bitrate, and it may not be able to do fancier things like simulcast or SVC. Some hardware decoders are also pretty picky about what kinds of streams they'll accept.

IMHO, if you're comparing software H.264 against software VP9, VP9 is likely to give you better quality at a given bitrate, but it will take more CPU to do it. So, as always, it depends.
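For context, here's a minimal sketch of what "prefer VP9 but fall back to whatever the peer supports" can look like in a browser WebRTC app, using the standard getCapabilities/setCodecPreferences APIs. The setup around it (signaling, tracks) is assumed and omitted, and this is just an illustration, not how any particular product does it:

    // Reorder local codec preferences so VP9 is offered first;
    // negotiation falls back to H.264 etc. if the remote side lacks VP9.
    const pc = new RTCPeerConnection();
    const transceiver = pc.addTransceiver('video', { direction: 'sendrecv' });

    const caps = RTCRtpReceiver.getCapabilities('video');
    if (caps) {
      const vp9 = caps.codecs.filter(c => c.mimeType.toLowerCase() === 'video/vp9');
      const rest = caps.codecs.filter(c => c.mimeType.toLowerCase() !== 'video/vp9');
      // Preferences only express ordering; the actual codec is decided
      // during offer/answer, so hardware-only peers can still pick H.264.
      transceiver.setCodecPreferences([...vp9, ...rest]);
    }

Whether putting VP9 first is actually a win still depends on the tradeoffs above: if the receiving devices only have H.264 hardware decode, the CPU cost of software VP9 may outweigh the bitrate savings.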