| ▲ | IgorPartola a day ago |
| AV1 is an amazing codec. I really hope it replaces proprietary codecs like h264 and h265. It offers performance similar to, if not better than, h265 while being completely free. Currently, on an Intel-based MacBook it is only supported in some browsers; however, it seems that newer video cards from AMD, Nvidia, and Intel do include hardware decoders. |
|
| ▲ | karn97 a day ago | parent | next [-] |
| The 9070 XT records gameplay in AV1 by default. |
| |
| ▲ | monster_truck a day ago | parent [-] | | RDNA3 cards also have AV1 encode; RDNA2 only has decode. With the bitrate set to 100MB/s it happily encodes 2160p or even 3240p, the maximum resolution available when using Virtual Super Resolution (which renders above native resolution and downsamples; it's awesome for titles without resolution scaling when you don't want to use TAA). | | |
| ▲ | kennyadam a day ago | parent [-] | | Isn't that expected? 4K Blu-rays only encode up to like 128Mbps, which is 16MB/s. 100MB/s seems like complete overkill. | | |
| ▲ | monster_truck 5 hours ago | parent | next [-] | | It isn't overkill for the amount of motion involved. Third-person views in rally simulators and continuous fast flicks in shooters require it. | |
| ▲ | vlovich123 a day ago | parent | prev [-] | | I think OP just didn’t type Mbps properly. 100MB/s, or ~800Mbps, is way higher than the GPU can even encode at a HW level, I would think. | | |
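For anyone tripped up by the bits-versus-bytes mix-up discussed above, a quick back-of-the-envelope conversion (a standalone sketch; the numbers are just the ones mentioned in the thread):

    // Convert a bitrate in megabits per second to megabytes per second,
    // and to gigabytes per hour of footage.
    function mbpsToMBps(mbps: number): number {
      return mbps / 8; // 8 bits per byte
    }

    function mbpsToGBPerHour(mbps: number): number {
      return (mbpsToMBps(mbps) * 3600) / 1000; // MB/s * seconds per hour -> GB
    }

    console.log(mbpsToMBps(128));      // 4K Blu-ray ceiling: 128 Mbps = 16 MB/s
    console.log(mbpsToGBPerHour(800)); // 100 MB/s ~= 800 Mbps ~= 360 GB per hour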
|
|
|
|
| ▲ | flashblaze a day ago | parent | prev | next [-] |
| I'm not really well versed in codecs, but is it up to the devices or the providers (where you're uploading them) to handle playback, or both? A couple of days ago, I tried to upload an Instagram Reel in the AV1 codec, and I was struggling to preview it on my Samsung S20 FE (Snapdragon version), both before uploading and during preview. I then resorted to H.264 and it worked w/o any issues. |
| |
| ▲ | sparrc a day ago | parent | next [-] | | Playback is 100% handled by the device. The primary (and essentially only) benefit of H.264 is that almost every device in the entire world has an H.264 hardware decoder built into the chip, even extremely cheap devices. AV1 hardware decoders are still rare, so your device was probably resorting to software decoding, which is not ideal. | |
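As a rough illustration of the hardware-vs-software decode point above: in a browser, you can probe this with the MediaCapabilities API. A minimal sketch, assuming a browser context; the codec string and numbers are example values, not anything from the thread:

    // Ask the browser whether it can decode an AV1 stream, and whether it expects
    // to do so power-efficiently (which usually implies a hardware decoder).
    async function probeAv1Decode(): Promise<void> {
      const info = await navigator.mediaCapabilities.decodingInfo({
        type: "file",
        video: {
          contentType: 'video/mp4; codecs="av01.0.08M.08"', // AV1 Main, level 4.0, 8-bit
          width: 1920,
          height: 1080,
          bitrate: 8_000_000, // bits per second
          framerate: 30,
        },
      });
      // supported: can it play at all; smooth: in real time;
      // powerEfficient: likely hardware-assisted rather than pure software decoding.
      console.log(info.supported, info.smooth, info.powerEfficient);
    }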
| ▲ | kevmo314 a day ago | parent | prev [-] | | Instagram (the provider) will transcode for compatibility, but the preview likely happens before transcoding, the assumption being that the device uploading the video is able to play it. | | |
| ▲ | ta1243 a day ago | parent [-] | | Yes, that sounds spot on. I don't know Instagram, but I would expect any provider to be able to handle almost any container/codec/resolution combination going (they likely use ffmpeg underneath) and generate their different output formats at different bitrates for different playback devices. Either Instagram won't accept AV1 (seems unlikely) or they just haven't processed it yet, as you suggest. I'd love to know why your comment is greyed out. |
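A very rough sketch of the "accept anything, transcode to a ladder of outputs" flow described above, assuming Node with an ffmpeg build (libx264 and libsvtav1) on the PATH; the filenames and rendition ladder are made up for illustration:

    import { spawnSync } from "node:child_process";

    // Each rendition: target height, video bitrate, encoder, output name.
    const renditions = [
      { height: 1080, bitrate: "4M",   codec: "libsvtav1", out: "out_1080_av1.mp4"  },
      { height: 1080, bitrate: "5M",   codec: "libx264",   out: "out_1080_h264.mp4" },
      { height: 720,  bitrate: "2.5M", codec: "libx264",   out: "out_720_h264.mp4"  },
    ];

    for (const r of renditions) {
      // Scale to the target height (keeping aspect ratio), encode video with the
      // chosen codec, and re-encode audio to AAC for broad compatibility.
      spawnSync("ffmpeg", [
        "-y", "-i", "upload.mkv",
        "-vf", `scale=-2:${r.height}`,
        "-c:v", r.codec, "-b:v", r.bitrate,
        "-c:a", "aac",
        r.out,
      ], { stdio: "inherit" });
    }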
|
|
|
| ▲ | adzm a day ago | parent | prev | next [-] |
| Isn't VP9 more comparable to h265? AV1 seems to be a ton better than both of them. |
| |
| ▲ | senfiaj a day ago | parent | next [-] | | I think VP9 is more comparable to h264. Also if I'm not mistaken it's not good for live streaming, only for storing data. | | |
| ▲ | toast0 a day ago | parent [-] | | VP9 works for live streaming/real time conferencing too. | | |
| ▲ | senfiaj a day ago | parent [-] | | Yeah, but I think it has much higher CPU usage, at least when there is no native hardware decoder/encoder. Maybe this has more to do with adoption, since H264 has been an industry standard. | | |
| ▲ | toast0 a day ago | parent [-] | | Codec selection is always a complex task. You've got to weigh quality/bitrate vs availability of hardware encode/decode, licensing, and overall resource usage. The ITU standards have had a lot better record of inclusion in devices that people actually have; and often using hardware encode/decode takes care of licensing. But hardware encode doesn't always have the same quality/bitrate as software and may not be able to do fancier things like simulcast or svc. Some of the hardware decoders are pretty picky about what kinds of streams they'll accept too. IMHO, if you're looking at software h.264 vs software vp9, I think vp9 is likely to give you better quality at a given bitrate, but will take more cpu to do it. So, as always, it depends. | | |
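As a small illustration of what "codec selection" can look like on the conferencing side, here is a hedged sketch using the standard WebRTC APIs: it prefers VP9 when the browser offers it and otherwise leaves the existing order (H.264 or whatever else) in place. Real selection logic would also weigh hardware support, SVC needs, and battery, as described above.

    // Reorder a video transceiver's codec list so one MIME type is tried first.
    function preferCodec(transceiver: RTCRtpTransceiver, mimeType: string): void {
      const codecs = RTCRtpSender.getCapabilities("video")?.codecs ?? [];
      const preferred = codecs.filter((c) => c.mimeType === mimeType);
      const rest = codecs.filter((c) => c.mimeType !== mimeType);
      if (preferred.length > 0) {
        transceiver.setCodecPreferences([...preferred, ...rest]);
      }
    }

    // Usage, e.g.: preferCodec(pc.addTransceiver("video"), "video/VP9");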
| ▲ | Dylan16807 16 hours ago | parent [-] | | > IMHO, if you're looking at software h.264 vs software vp9, I think vp9 is likely to give you better quality at a given bitrate, but will take more cpu to do it. So, as always, it depends. That's a pretty messy way to measure. h.264 with more CPU can also beat h.264 with less CPU. How does the quality compare if you hold both bitrate and CPU constant? How does the CPU compare if you hold both bitrate and quality constant? AV1 will do significantly better than h.264 on both of those tests. How does VP9 do? |
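One concrete (and heavily simplified) way to run the kind of controlled comparison asked for above: encode with each codec at the same target bitrate, record wall-clock time as a crude CPU proxy, and score quality with VMAF. This sketch assumes ffmpeg built with libx264, libvpx-vp9, and libvmaf; the bitrate, source file, and speed settings are placeholders, and in practice you would sweep the speed settings until CPU time matched.

    import { spawnSync } from "node:child_process";

    const candidates = [
      { name: "h264", out: "h264.mp4", args: ["-c:v", "libx264",    "-preset",   "medium", "-b:v", "3M"] },
      { name: "vp9",  out: "vp9.webm", args: ["-c:v", "libvpx-vp9", "-deadline", "good",   "-b:v", "3M"] },
    ];

    for (const c of candidates) {
      const started = Date.now();
      spawnSync("ffmpeg", ["-y", "-i", "source.y4m", ...c.args, c.out], { stdio: "inherit" });
      console.log(`${c.name}: ${(Date.now() - started) / 1000}s of encode time`);

      // Score the encode against the source; the libvmaf filter prints a pooled VMAF score.
      spawnSync("ffmpeg", ["-i", c.out, "-i", "source.y4m",
        "-lavfi", "libvmaf", "-f", "null", "-"], { stdio: "inherit" });
    }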
|
|
|
| |
| ▲ | dagmx a day ago | parent | prev [-] | | They’re all in the same ballpark of each other and have characteristics that don’t make one an outright winner. | | |
| ▲ | CharlesW a day ago | parent [-] | | AV1 is the outright winner in terms of compression efficiency (until you start comparing against VVC/H.266¹), with the advantage being even starker at high resolutions. The only current notable downside of AV1 is that client hardware support isn't yet universal. ¹ https://www.mdpi.com/2079-9292/13/5/953 |
|
|
|
| ▲ | aaron695 a day ago | parent | prev | next [-] |
| Get The Scene involved. They shifted to h.264 successfully, but I haven't heard of any more conferences to move forward in over a decade. Currently "The Last of Us S02E06" only has one AV1 release - https://thepiratebay.org/search.php?q=The+Last+of+Us+S02E06 - same for THMT - https://thepiratebay.org/search.php?q=The+Handmaids+Tale+S06... These are low quality at only ~600MB, not really early-adopter sizes. AV1 beats h.265 but not h.266 - https://www.preprints.org/manuscript/202402.0869/v1 - though people disagree with this paper over its default settings. Things like getting hardware to The Scene for encoding might help, but I'm not sure of the bottleneck; it might be bureaucratic, educational, or cultural. [edit] "Common Side Effects S01E04" AV1 is the strongest torrent, which is cool - https://thepiratebay.org/search.php?q=Common+Side+Effects+S0... |
| |
| ▲ | aidenn0 a day ago | parent | next [-] | | At higher qualities/bitrates the difference is much smaller, and device support is universal for AVC and quite good for HEVC. Anything over 1.5GB for a single episode would probably end up only fairly similar in size with AV1. There is one large exception, but I don't know the current scene well enough to know if it matters: sources that are grainy. I have some DVDs and Blu-rays with high grain content, and AV1 can work wonders with those thanks to the in-loop grain filter and synthesis -- we are talking half the size for a high-quality encode. If I were to encode them for AVC at any reasonable bitrate, I would probably run a grain-removal filter, which is very finicky if you don't want to end up with something that is overly blurry. | |
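For anyone curious what the grain-synthesis encode mentioned above looks like in practice, a hedged sketch assuming Node and an ffmpeg build with libsvtav1; the CRF, preset, and film-grain level are arbitrary example values, not recommendations from the comment:

    import { spawnSync } from "node:child_process";

    // Denoise the grain away before encoding (film-grain-denoise=1) and let the
    // decoder re-synthesize it (film-grain=8), so bits aren't spent on the grain itself.
    spawnSync("ffmpeg", [
      "-i", "grainy_source.mkv",
      "-c:v", "libsvtav1",
      "-crf", "24", "-preset", "6",
      "-svtav1-params", "film-grain=8:film-grain-denoise=1",
      "-c:a", "copy",
      "grainy_av1.mkv",
    ], { stdio: "inherit" });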
| ▲ | LtdJorge a day ago | parent | prev | next [-] | | This may be in part because people who automated their media servers are using hardware acceleration for transcoding (from 4K, for example), and hardware has only recently added decoding for AV1. In my case, I get both 4K (h265) and 1080p (h264) Blu-rays and let the client select. | |
| ▲ | fishgoesblub a day ago | parent | prev | next [-] | | There are plenty of AV1 releases in other, better places than the scam bay. | |
| ▲ | phendrenad2 a day ago | parent | prev | next [-] | | Holy shadowban Batman! All of your comments are [dead]. What did you do to anger the HN Gods? | |
| ▲ | wbl a day ago | parent | prev [-] | | There was a conference?! |
|
|
| ▲ | a day ago | parent | prev [-] |
| [deleted] |