kllrnohj 2 days ago

> You’re saying PQ is “downright wrong”? Why?

Yes, PQ is downright wrong because it encodes absolute luminance while users view content in wildly differing ambient environments. 50 nits in a movie theater and 50 nits on a sunny day are extremely different perceptually.

> How is the TV industry fixing it?

Who said anything about the TV industry fixing it? Mobile is what really highlights how broken PQ is in practice, to such an extent that none of the mobile OSes ever display PQ as the spec demands.

BT.2408 arguably redefines PQ as a relative colorspace by defining 203 nits as "graphics white", which OSes are starting to just treat as SDR white, turning PQ's absolute scale into a relative one. Wayland also adopted this.
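
Roughly, in code (standard SMPTE ST 2084 constants; treating 203 nits as 1.0 is just my illustration of the reinterpretation, not any particular OS's exact math):

    # PQ (SMPTE ST 2084) EOTF: maps an encoded signal in [0, 1] to absolute luminance in nits.
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32

    def pq_to_nits(signal):
        # Decode a PQ-encoded value (0..1) to absolute luminance in cd/m^2 (nits).
        p = signal ** (1 / m2)
        return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

    def pq_to_relative(signal, graphics_white_nits=203.0):
        # Reinterpret PQ as relative: BT.2408 graphics white (203 nits) becomes 1.0 ("SDR white").
        return pq_to_nits(signal) / graphics_white_nits

    # A PQ code value of ~0.58 decodes to roughly 200 nits, i.e. about 1.0 under this reinterpretation.
    print(pq_to_nits(0.58), pq_to_relative(0.58))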

But even the TV industry is starting to recognize that. The recently announced Dolby Vision 2.0 focuses heavily on retroactively redefining PQ as a relative colorspace in much the same way, as does SMPTE ST 2094-50.

> Absolutely! I was really referring to the LDR color spaces where 1.0 is relative to the max brightness, which is why I qualified my statement with “relative units”. Aside from PQ, nearly all the old color spaces that stop at 1.0 are relative, not absolute… all the non-physical, non-perceptual color spaces like RGB, CMYK, HSL, HSV… and even some perceptual spaces like CIE’s XYZ, xyY, and LAB are all relative with 1.0 meaning max brightness

Just because 1.0 is the max value in a relative colorspace doesn't mean it's necessarily LDR/SDR. Consider, for example, linear vs. non-linear sRGB: the value 0.5 in both means "half of max", yet the two represent wildly different perceptual values, right? In the same way, the 1.0 max of any given working colorspace does not have to map to an "LDR" value. Consider being able to write directly to the display: a 1.0 max value now maps to the max brightness of that display, and whether or not that's "HDR" depends on the display and ambient conditions more than anything else.
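
To make the sRGB example concrete, a quick sketch using the standard IEC 61966-2-1 transfer function:

    def srgb_encode(linear):
        # Linear light -> non-linear sRGB signal (IEC 61966-2-1 encoding).
        if linear <= 0.0031308:
            return 12.92 * linear
        return 1.055 * linear ** (1 / 2.4) - 0.055

    def srgb_decode(signal):
        # Non-linear sRGB signal -> linear light.
        if signal <= 0.04045:
            return signal / 12.92
        return ((signal + 0.055) / 1.055) ** 2.4

    print(srgb_encode(0.5))  # ~0.735: half of linear max sits well above mid-signal once encoded
    print(srgb_decode(0.5))  # ~0.214: mid-signal is only ~21% of linear max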

For internal passes you almost certainly want to work in something consistent, typically linear, and then yes 1.0 would be SDR white. But the output swapchain is ~never linear, and what 1.0 means depends entirely on what is communicated to the consumer of that swapchain. It has no inherent meaning, no inherent SDR/LDR/HDR-ness. Just like 1.0 of an sRGB swapchain and 1.0 of a Display P3 swapchain have different colors, you can also have different luminance ranges depending on what the specific platform supports and the specifics of the output mode.

dahart 2 days ago | parent [-]

I thought you were saying the TV industry is fixing it; maybe I misunderstood. PQ is a television standard; people don’t really use it in other scenarios. I thought you were defending PQ. If not, then I’m not sure why we’re talking about PQ. Game Maker doesn’t use PQ, does it?

I’m not sure what you mean about different ambient environments leading to the conclusion that PQ is wrong. No color space can fix that problem, that’s not a color space problem. You seemed to argue with the idea of keeping track of physical units. I guess I’m not understanding your point.

For the rest it sounds like we’re mostly in full agreement. Of course the output of HDR work isn’t linear, transfer functions and tonemapping guarantee that. I mentioned linear HDR workflows because you brought up PQ, which is primarily a delivery display color space for HDR displays. I agree that what 1.0 means depends on what the color space defines it to mean, especially when you don’t use a relative color space. We only have the 2 kinds, one that’s relative and where 1.0 is defined to mean the max brightness (or reflectivity) of the display medium, and one where you define an absolute reference point somewhere. My point was only, and still is, that for the relative color spaces, 1.0 always means the same thing. It’s true you can have values above 1 in a relative color space, if you want, but you can’t print or display them; that would only be useful for pre-output color processing.

kllrnohj 2 days ago | parent [-]

> We only have the 2 kinds, one that’s relative and where 1.0 is defined to mean the max brightness (or reflectivity) of the display medium,

This is just not true though. The most common relative colorspaces have 1.0 as an arbitrary user brightness setting, not the max brightness. This is where >1.0 values typically gain meaning and where the typical "0-1 is LDR, >1 is HDR" framing comes from; it requires a float format for output. (Linear) extended sRGB is a standard example, as is Apple's EDR. But you can also have relative colorspaces where 1.0 is a value brighter than SDR white, which is critical for unorm formats, which are typically much more efficient for display. HLG is a standard example of this, and Android's extended range brightness is another.
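
Rough sketch of the float case (the nit numbers are made up for illustration; this mirrors extended-linear-sRGB / EDR-style semantics rather than any specific API):

    def extended_linear_to_nits(value, sdr_white_nits, headroom):
        # 1.0 tracks the user's SDR brightness setting; values above 1.0 are HDR,
        # clamped to however much headroom the display currently has.
        return min(value, headroom) * sdr_white_nits

    # Same pixel values, two different user brightness settings (all numbers illustrative):
    for sdr_white in (100.0, 250.0):            # user slider: dim room vs. bright room
        headroom = 1000.0 / sdr_white           # a display that currently peaks around 1000 nits
        print(extended_linear_to_nits(1.0, sdr_white, headroom),   # SDR white
              extended_linear_to_nits(3.0, sdr_white, headroom))   # HDR highlight, >1.0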

> Game Maker doesn’t use PQ, does it?

I have no idea what its HDR support looks like. If it can produce an HDR output on screen, it's probably using PQ to do it, since that's the most broadly supported option. PQ being crap, though, is why any game you've played in HDR pushes you through a calibration wizard first.

dahart 2 days ago | parent [-]

> The most common relative colorspaces have 1.0 as an arbitrary user brightness setting, not the max brightness.

You’re using different words to say exactly the same thing I was trying to say. You’re not arguing with me; you’re agreeing. Your “arbitrary brightness” means the same thing I meant by “max brightness”, because I meant max brightness as the maximum brightness the device will display at its current arbitrary settings, not the absolute maximum brightness the device is capable of if you change the settings later.

It might help to take a moment to think in terms of print rather than only video standards. You can’t change the brightness of paper: 1.0 always means reflect all the light you can reflect, and there is no brightness setting. Because print is reflective and not emissive, print always takes relative colors. The analogy extends pretty naturally to TVs at a given brightness setting. The most common relative colorspaces (by which I mean the old crappy non-perceptual ones like RGB, HSV, CMYK) are still relative, meaning the brightest colors they represent (what we’re calling 1.0) map to the brightest the display will do at that moment in time.

SDR, like PQ, is a video standard and has an absolute reference point (100 nits), so of course any relative color space can have a 1.0 value greater than SDR white, because most TVs these days can exceed 100 nits. Is that what you mean? I don’t see why that’s relevant to what 1.0 means in other color spaces.

You still haven’t really explained what’s wrong with PQ. I’ve never used it directly, but do you have any links or any explanation to support your claim that it’s “wrong”? Why is it wrong? What color spaces are doing it right?

If people use ACES shaders in GameMaker, as the article discussed, doesn’t that automatically mean that GameMaker is not using PQ? It doesn’t make any sense to have PQ after tonemapping. Maybe as a compression technique / format that the video card and the display use transparently, without any interaction with the user or the application, but that’s not GameMaker, it’s the TV.
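
For example, a rough sketch of the usual shape of such an output path (using Narkowicz’s well-known ACES fit; illustrative only, not GameMaker’s actual code):

    def aces_tonemap(x):
        # Krzysztof Narkowicz's ACES filmic approximation: scene-linear HDR -> display-linear [0, 1].
        a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
        return max(0.0, min(1.0, (x * (a * x + b)) / (x * (c * x + d) + e)))

    def srgb_encode(linear):
        # A plain sRGB display encoding; after tonemapping this is all that's left to do, no PQ involved.
        if linear <= 0.0031308:
            return 12.92 * linear
        return 1.055 * linear ** (1 / 2.4) - 0.055

    # Scene-linear values, including HDR ones well above 1.0:
    for scene in (0.18, 1.0, 4.0, 16.0):
        display_linear = aces_tonemap(scene)      # already display-referred, in [0, 1]
        print(scene, display_linear, srgb_encode(display_linear))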

I still don’t understand why we’re talking about PQ. To circle back to my main point, I still believe that using physical units is the most important part of HDR conceptually. You seemed to disagree, but this entire discussion so far only seems to make that point even more clear and firm, and as far as I can tell you agree. IMO the canonical color space for HDR, and the best example, would be linear float channels where 1.0 is defined as 1.0 nits (or substitute another physical luminance unit). HDRI’s strengths and utility are in capture, storage, and intermediate workflows, not output. As soon as you target a specific type of device, as soon as you tack on a transfer function, tonemapping, or a lower bit depth, you’re limiting options and losing information.
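
A rough sketch of what that canonical space buys you (the 1-nit scaling and the example numbers are just illustrative):

    # Working space sketch: linear float channels, 1.0 == 1 nit (numbers below are illustrative).
    def reexpose(luminance_nits, stops):
        # Re-exposure is just a linear scale in physical units; nothing is clipped or baked in.
        return luminance_nits * (2.0 ** stops)

    sky_nits = 8000.0     # a bright sky, stored in physical units
    shadow_nits = 2.5     # deep shadow detail

    # Opening up two stops lifts the shadows while the sky stays representable;
    # clipping and tonemapping only happen later, at the output step.
    print(reexpose(sky_nits, 2), reexpose(shadow_nits, 2))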

kllrnohj a day ago | parent [-]

> You’re using different words to say exactly the same thing I was trying to say. You’re not arguing with me; you’re agreeing. Your “arbitrary brightness” means the same thing I meant by “max brightness”, because I meant max brightness as the maximum brightness the device will display at its current arbitrary settings, not the absolute maximum brightness the device is capable of if you change the settings later.

No, it's not! HDR exceeds that brightness; 1.0 is not what the display's current maximum is.

> SDR, like PQ, is a video standard and has an absolute reference point (100 nits),

SDR isn't a standard at all, it's a catch-all to mean anything not-HDR. But no, it has no absolute reference point.

> To circle back to my main point, I still believe that using physical units is the most important part of HDR conceptually.

The only actual feature of HDR is the concept that display brightness is decoupled from user brightness, allowing content to "punch through" that ceiling. In camera terms that means the user is controlling middle grey, not the point at which clipping occurs.
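
Rough sketch of what I mean (numbers purely illustrative, and display_nits is just a made-up helper):

    def display_nits(value, sdr_white_nits, peak_nits):
        # The user's brightness setting scales where SDR white (and thus middle grey) lands;
        # clipping is governed by the display's peak, which HDR content is allowed to reach.
        return min(value * sdr_white_nits, peak_nits)

    middle_grey = 0.18    # effectively what the brightness slider controls
    highlight = 6.0       # HDR highlight "punching through" SDR white (1.0)

    for sdr_white in (80.0, 200.0):                         # user turns brightness up
        print(display_nits(middle_grey, sdr_white, 1000.0),     # middle grey moves with the setting...
              display_nits(highlight, sdr_white, 1000.0))       # ...clipping stays pinned at the display's peak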

dahart a day ago | parent [-]

> SDR isn’t a standard at all, it’s a catch-all to mean anything not-HDR.

Maybe you’re thinking of LDR: Low Dynamic Range is the catch-all opposite of HDR. SDR (Standard Dynamic Range) means Rec. 709 to most people? That’s how I’ve been using those two acronyms all along in this thread, in case you want to revisit.

https://en.wikipedia.org/wiki/Standard-dynamic-range_video

https://support.apple.com/guide/motion/about-color-space-mot...

> The only actual feature of HDR is the concept that display brightness is decoupled from user brightness, allowing content to “punch through” that ceiling.

There are lots of features of HDR, depending on what your goals are, and many definitions over the years. It’s fair to say that using absolute physical units instead of relative colors does decouple the user’s color from the display so maybe you’re agreeing.

Use of physical units was one of the important motivations for the invention of HDR. The first HDR file format I know of and used, created for the Radiance renderer in ~1985, was designed for lighting simulation and physical validation, for applications like aerospace, forensics, architecture, etc. https://en.wikipedia.org/wiki/Radiance_(software)#HDR_image_...

In CG film & video game production (e.g., the article we’re commenting on), it’s important that the pre-output HDR workflow is linear, has a maximum brightness significantly higher than any possible display device, and uses a higher bit depth than the final output, to allow wide latitude in post-processing, compositing, re-exposure, and tone mapping.

In photography, where everyone could already always control middle grey, people use HDR because they care about avoiding hard clipping, they want to be able to control what happens to the sun and bright highlights, and because they want to be able to re-expose pictures that appear clipped in blacks or whites to reveal new detail.

> In camera terms that means the user is controlling middle grey, not the point at which clipping occurs.

I’m not sure what you mean, but that sounds like you’re referring to tonemapping specifically, or even just gamma, and not HDR generally. With an analog film camera, which I hope we can agree is not HDR imaging, I can control middle grey with my choice of exposure and with my choice of film and with lens filters (and a similar set of middle-grey controlling choices exists when printing). The same is true for a digital camera that captures only 8 bits/channel JPG files. Tonemapping certainly comes with a lot of HDR workflows, but does not define what HDR means nor is it necessary. You can do HDR imaging without any tonemapping, and without trying to control middle grey.

kllrnohj a day ago | parent [-]

Using physical units to describe your scene before it hits your camera makes perfect sense. Using physical units for the swapchain on the way to the display (which is where PQ enters the picture), however, does not make sense, and is a bug in HDR10(+)/Dolby Vision.

I think you've been focused entirely on the scene component, pre-camera/tone mapping, whereas I'm talking entirely about what comes after that: the part that's sent into the system or out to the display.

dahart a day ago | parent [-]

Yes! You’re talking about the space between the application’s output and the display, and I’m talking about capture, storage, and processing before tone mapping. In many HDR workflows there are a lot of things in between camera and tone mapping, and that is where I see most of the value of HDR workflows. The original point of HDR was to capture physical units, and that means storage after the scene hits the camera and before output. The original point of tone mapping was to squeeze the captured HDR images down to LDR output, to bake physical luminance down to relative colors. Until just a few years ago, tone mapping was the end of the HDR part of the pipeline; everything downstream was SDR/LDR. That’s getting muddy these days with “HDR” TVs that can do 3k nits, and all kinds of color spaces, but HDR conceptually and historically didn’t require any of that and still doesn’t. Plenty of workflows still exist where the input of tonemapping is HDR and the output is SDR/LDR, and there’s no swapchain or PQ or HDR TVs in the loop.