lxgr a day ago

I believe YouTube's player is driving codec selection, not the browser (i.e. the player requests a list of supported codecs and then picks the one most beneficial for Google, not the other way around).

That said, I've solved this problem for myself on macOS and Firefox by setting media.av1.enabled to false in about:config (the media.webrtc.codec.video.av1.enabled pref only affects WebRTC calls, not playback), since all the other codecs YouTube uses are hardware accelerated on my Mac.
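
If you want to check what your browser advertises after flipping the pref, something like this in the devtools console works (the codec strings here are just representative examples, not whatever YouTube actually probes):

    // Run in the devtools console to see what MSE claims to support.
    // The codec strings are illustrative examples, not YouTube's exact probes.
    const probes: Array<[string, string]> = [
      ["AV1",   'video/mp4; codecs="av01.0.08M.08"'],
      ["VP9",   'video/webm; codecs="vp09.00.10.08"'],
      ["H.264", 'video/mp4; codecs="avc1.640028"'],
    ];
    for (const [name, type] of probes) {
      console.log(name, MediaSource.isTypeSupported(type));
    }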

zamadatix a day ago | parent [-]

> I believe YouTube's player is driving codec selection, not the browser (i.e. the player requests a list of supported codecs and then picks the one most beneficial for Google, not the other way around).

The browser can still participate in the choice by, e.g., not listing AV1 as supported when the local system has no hardware decoder. Both Safari and Edge took (approximately) that approach, but it comes with the downside that if the server only has AV1 video, the client gets nothing.

Practically, that downside isn't a big deal until codec support is widespread enough that sites start assuming the codec is just supported and stop hosting alternative options.
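
Roughly speaking, a player walks an ordered rendition list and takes the first type the browser claims to support, which is why hiding AV1 only helps while the list still contains a fallback. A minimal sketch in TypeScript (the rendition list and URLs are made up):

    // Hypothetical rendition list; in reality this comes from a server manifest.
    const renditions = [
      { codec: "AV1",   type: 'video/mp4; codecs="av01.0.08M.08"',  url: "video.av1.mp4"  },
      { codec: "VP9",   type: 'video/webm; codecs="vp09.00.10.08"', url: "video.vp9.webm" },
      { codec: "H.264", type: 'video/mp4; codecs="avc1.640028"',    url: "video.h264.mp4" },
    ];

    // First rendition the browser claims to support wins. If the server only
    // shipped AV1 and the UA hides AV1 support, `chosen` is undefined and
    // playback simply fails.
    const chosen = renditions.find(r => MediaSource.isTypeSupported(r.type));
    console.log(chosen?.codec ?? "no playable rendition");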

lxgr a day ago | parent [-]

Yes, for a while I think Safari even did this dynamically based on whether the Mac was plugged into external power, which is a nice compromise.

Apparently, there's even an API attribute that indicates whether a given codec is power efficient (https://developer.mozilla.org/en-US/docs/Web/API/MediaCapabi...), which Google must also be ignoring – not their problem, after all. (I wonder if anybody has done the math on the opportunity cost of losing a few ad impressions to users' batteries dying early vs. the incremental bandwidth cost of serving a heavier codec?)
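
For reference, querying that attribute looks roughly like this (the resolution and bitrate numbers are just placeholders):

    // Ask whether 1080p30 AV1 decode would be supported, smooth, and power
    // efficient; the width/height/bitrate/framerate values are placeholders.
    const info = await navigator.mediaCapabilities.decodingInfo({
      type: "media-source",
      video: {
        contentType: 'video/mp4; codecs="av01.0.08M.08"',
        width: 1920,
        height: 1080,
        bitrate: 2_000_000,
        framerate: 30,
      },
    });
    console.log(info.supported, info.smooth, info.powerEfficient);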