jacobgkau 3 days ago
That would be easier if both GPU and display manufacturers weren't eschewing newer DisplayPort versions in favor of older versions with DSC (which is not lossless, despite being marketed with the subjective claim of "visually lossless"), while building in newer HDMI versions with greater performance.
jsheard 3 days ago | parent | next
To be fair, the DisplayPort 2.0/2.1 standardisation process was riddled with delays, and those specs ended up landing years after HDMI 2.1 did. It stands to reason that hardware manufacturers picked up the earlier spec first.
AshamedCaptain 3 days ago | parent | prev
What resolution can you drive with "newer HDMI versions" that you cannot drive with DisplayPort 1.4 without DSC? The bandwidth difference is not really that much in practice, and "newer HDMI versions" also rely on DSC, or worse, chroma subsampling (which is both objectively and subjectively worse). One has been able to drive 5K, 4K@120Hz, etc. for nearly a decade with DP 1.4, while for the same resolutions you need literally the latest version of HDMI (the "non-TMDS" one, i.e. HDMI 2.1 FRL). It's no wonder that displays _have_ to use the latest version of HDMI: otherwise they cannot be driven from a single HDMI port at all. Monitors that supported their native resolution over DP but not over HDMI were a common sight until very recently.
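The bandwidth claim above can be sanity-checked with rough arithmetic. A minimal sketch, assuming ~6% blanking overhead (approximating CVT-R2 reduced blanking; real timings vary per display) and the commonly cited effective rates of 25.92 Gbps for DP 1.4 HBR3 (32.4 Gbps raw, 8b/10b encoding) and 42.67 Gbps for HDMI 2.1 FRL (48 Gbps raw, 16b/18b encoding):

```python
def needed_gbps(w, h, hz, bpc, overhead=1.06):
    """Uncompressed RGB bandwidth in Gbit/s for a given mode.

    overhead approximates blanking intervals (assumed ~6%, CVT-R2-style).
    bpc is bits per color channel; x3 for the three RGB channels.
    """
    return w * h * hz * bpc * 3 * overhead / 1e9

DP14_HBR3  = 25.92   # effective payload rate, DP 1.4 HBR3
HDMI21_FRL = 42.67   # effective payload rate, HDMI 2.1 FRL

modes = [("4K@120Hz 8-bit",  3840, 2160, 120,  8),
         ("5K@60Hz 8-bit",   5120, 2880,  60,  8),
         ("4K@120Hz 10-bit", 3840, 2160, 120, 10)]

for name, w, h, hz, bpc in modes:
    need = needed_gbps(w, h, hz, bpc)
    print(f"{name}: {need:5.1f} Gbps  "
          f"fits DP1.4: {need <= DP14_HBR3}  "
          f"fits HDMI2.1: {need <= HDMI21_FRL}")
```

Under these assumptions, 8-bit 4K@120Hz and 5K@60Hz squeeze into DP 1.4 without DSC, while 10-bit 4K@120Hz does not, which matches the point that only the newest HDMI version catches up to what DP 1.4 could already do.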