mafuyu | 5 hours ago
The keywords you're missing are color spaces and gamma curves. For a given bandwidth, we want to allocate bits efficiently across both color and brightness (roughly logarithmically, to capture the huge dynamic range of light we can perceive). sRGB is one such standard we've all agreed on: output devices all ostensibly target sRGB, but each may interpret the signal however it likes. That's inevitable, since not all output devices are equally capable.

HDR is another set of standards that expands the dynamic range while also pinning encoded values to absolute real-world brightness levels. But again, TVs and such may interpret those signals in wildly different ways, as evidenced by the wide range of TVs that claim "HDR" support.

This is probably not the most accurate explanation, but hopefully it's enough to point you in the right direction.
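
To make the gamma-curve idea concrete, here's a minimal Python sketch of the sRGB transfer function (the piecewise encode/decode from the sRGB spec); the function names are just mine for illustration:

    # Sketch of the sRGB transfer function (IEC 61966-2-1).
    # Encoding spends more code values on dark tones, roughly matching
    # how we perceive brightness.

    def linear_to_srgb(c: float) -> float:
        """Encode a linear-light value (0..1) to an sRGB-encoded value (0..1)."""
        if c <= 0.0031308:
            return 12.92 * c
        return 1.055 * c ** (1 / 2.4) - 0.055

    def srgb_to_linear(v: float) -> float:
        """Decode an sRGB-encoded value (0..1) back to linear light (0..1)."""
        if v <= 0.04045:
            return v / 12.92
        return ((v + 0.055) / 1.055) ** 2.4

    # Half of the encoded range covers only about 21% of linear light:
    print(srgb_to_linear(0.5))  # ~0.214

The upshot: the encoded signal is nonlinear on purpose, and the display is expected to undo that curve. HDR formats do the same thing with a different (much steeper) curve over a much larger brightness range.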