kllrnohj a day ago
> You’re using different words to say the exactly same thing I was trying to say. You’re not arguing with me, you’re agreeing. Your “arbitrary brightness” means the same thing I meant by “max brightness”, because I meant max brightness as the maximum brightness the device will display at its current arbitrary settings, not the absolute maximum brightness the device is capable of if you change the settings later.

No, it's not! HDR exceeds that brightness; it isn't limited to what the display's current maximum is.

> SDR, like PQ, is a video standard and has an absolute reference point (100 nits),

SDR isn't a standard at all; it's a catch-all to mean anything not-HDR. And no, it has no absolute reference point.

> To circle back to my main point, I still believe that using physical units is the most important part of HDR conceptually.

The only actual feature of HDR is the concept that display brightness is decoupled from user brightness, allowing content to "punch through" that ceiling. In camera terms that means the user is controlling middle grey, not the point at which clipping occurs.
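A minimal sketch of that decoupling, assuming a hypothetical panel with a 1000-nit peak and a user-chosen 200-nit SDR white level (the function and numbers are illustrative, not any particular display pipeline):

    # Sketch of "user brightness decoupled from display brightness", assuming a
    # hypothetical display with a 1000-nit peak and a 200-nit SDR white setting.

    def to_display_nits(scene_linear, sdr_white_nits=200.0, peak_nits=1000.0):
        """Map a scene-linear value (1.0 == SDR/diffuse white) to output nits.

        The user's brightness setting moves sdr_white_nits, so SDR content and
        middle grey scale with it, while HDR values above 1.0 are allowed to use
        the headroom between sdr_white_nits and the panel's physical peak.
        """
        return min(scene_linear * sdr_white_nits, peak_nits)

    print(to_display_nits(0.18))  # ~36 nits: middle grey tracks the user setting
    print(to_display_nits(1.0))   # 200 nits: SDR tops out at the user's white level
    print(to_display_nits(4.0))   # 800 nits: HDR punches through that ceiling
    print(to_display_nits(10.0))  # 1000 nits: hard clip only at the panel's peak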
dahart a day ago | parent
> SDR isn’t a standard at all, it’s a catch-all to mean anything not-HDR.

Maybe you’re thinking of LDR. Low Dynamic Range is the catch-all opposite of HDR. SDR (Standard Dynamic Range) means Rec. 709 to most people. That’s how I’ve been using those two acronyms all along in this thread, in case you want to revisit.

https://en.wikipedia.org/wiki/Standard-dynamic-range_video

https://support.apple.com/guide/motion/about-color-space-mot...

> The only actual feature of HDR is the concept that display brightness is decoupled from user brightness, allowing content to “punch through” that ceiling.

There are lots of features of HDR, depending on what your goals are, and there have been many definitions over the years. It’s fair to say that using absolute physical units instead of relative colors does decouple the user’s color from the display, so maybe you’re agreeing. Use of physical units was one of the important motivations for the invention of HDR. The first HDR file format I know of and used, created for the Radiance renderer in ~1985, was designed for lighting simulation and physical validation, for applications like aerospace, forensics, architecture, etc.

https://en.wikipedia.org/wiki/Radiance_(software)#HDR_image_...

In CG film & video game production (e.g., the article we’re commenting on), it’s important that the pre-output HDR workflow is linear, has a maximum brightness significantly higher than any possible display device, and uses a higher bit depth than the final output, to allow for wide latitude in post-processing, compositing, re-exposure, and tone mapping. In photography, where everyone could already always control middle grey, people use HDR because they care about avoiding hard clipping, because they want to control what happens to the sun and bright highlights, and because they want to be able to re-expose pictures that appear clipped in the blacks or whites to reveal new detail.

> In camera terms that means the user is controlling middle grey, not the point at which clipping occurs.

I’m not sure what you mean, but that sounds like you’re referring to tonemapping specifically, or even just gamma, and not HDR generally. With an analog film camera, which I hope we can agree is not HDR imaging, I can control middle grey with my choice of exposure, my choice of film, and lens filters (and a similar set of middle-grey-controlling choices exists when printing). The same is true for a digital camera that captures only 8-bit/channel JPEG files. Tonemapping certainly comes with a lot of HDR workflows, but it does not define what HDR means, nor is it necessary. You can do HDR imaging without any tonemapping, and without trying to control middle grey.
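To illustrate the re-exposure point, here is a rough Python/NumPy sketch with made-up scene-linear values (1.0 = diffuse white). It is not any particular tool’s pipeline; it just shows that unclipped linear data can be re-exposed to reveal detail while clipped data cannot, and that tonemapping is a separate, optional step:

    import numpy as np

    # Made-up scene-linear pixel values; the sun and bright highlights sit well
    # above diffuse white (1.0) instead of being clipped at capture time.
    hdr = np.array([0.02, 0.18, 1.0, 6.0, 50.0], dtype=np.float32)

    # A clipped SDR capture of the same scene throws the highlight detail away.
    sdr = np.clip(hdr, 0.0, 1.0)

    # Re-exposing by -3 stops reveals new highlight detail in the HDR data...
    hdr_reexposed = hdr * 2.0 ** -3   # [0.0025, 0.0225, 0.125, 0.75, 6.25]
    # ...while the clipped data just gets uniformly darker, crushed highlights.
    sdr_reexposed = sdr * 2.0 ** -3   # [0.0025, 0.0225, 0.125, 0.125, 0.125]

    # Tonemapping (a simple Reinhard curve here) is a separate, optional step
    # that squeezes the linear values back into a displayable 0..1 range.
    tonemapped = hdr_reexposed / (1.0 + hdr_reexposed)
    print(hdr_reexposed)
    print(sdr_reexposed)
    print(tonemapped)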