| ▲ | krackers a day ago |
| >if the linear data is displayed directly, it will appear much darker than it should be. This seems more like a limitation of monitors. If you had a very large bit depth, couldn't you just display images in linear light without gamma correction? |
|
| ▲ | Sharlin a day ago | parent | next [-] |
| No. It's about the shape of the curve. Human light intensity perception is not linear. You have to nonlinearize at some point in the pipeline, but yes, typically you should use high-resolution (>=16 bits per channel) linear color in calculations and apply the gamma curve just before display. The fact that traditionally this was not done, and linear operations like blending were applied to nonlinear RGB values, resulted in ugly dark, muddy bands of intermediate colors even in high-end applications like Photoshop. |
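A minimal sketch of that blending difference, assuming numpy and approximating the sRGB transfer function as a pure 2.2 power curve:

```python
import numpy as np

GAMMA = 2.2  # rough stand-in for the sRGB transfer curve

def to_linear(v):
    """Decode gamma-encoded values (0..1) into linear light."""
    return np.power(v, GAMMA)

def to_gamma(v):
    """Encode linear-light values (0..1) for display."""
    return np.power(v, 1.0 / GAMMA)

# 50/50 blend of full-intensity red and green.
red   = np.array([1.0, 0.0, 0.0])
green = np.array([0.0, 1.0, 0.0])

# Wrong: blend the gamma-encoded values directly.
naive = 0.5 * red + 0.5 * green                                    # [0.5, 0.5, 0.0]

# Right: decode, blend in linear light, re-encode for display.
correct = to_gamma(0.5 * to_linear(red) + 0.5 * to_linear(green))  # ~[0.73, 0.73, 0.0]

print(naive, correct)
```

The naive blend lands at the dark, muddy midpoint described above; blending in linear light gives the brighter yellow the eye expects.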
| |
| ▲ | Dylan16807 a day ago | parent | next [-] | | The shape of the curve doesn't matter at all. What matters is having a mismatch between the capture curve and the display curve. If you kept it linear all the way to the output pixels, it would look fine. You only have to go nonlinear because the screen expects nonlinear data. The screen expects this because it saves a few bits, which is nice but far from necessary. To put it another way, it appears so dark because it isn't being "displayed directly". It's going directly out to the monitor, and the chip inside the monitor is distorting it. | |
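A worked number for that mismatch, under the simplifying assumption of an idealized pure 2.2 power display curve:

```python
# Scene-referred mid-grey, stored linearly as 0.5.
linear_mid_grey = 0.5

# A gamma-2.2 monitor assumes its input is already gamma-encoded,
# so it raises whatever it receives to the 2.2 power.
displayed_raw = linear_mid_grey ** 2.2      # ~0.22: much darker than intended

# Pre-encoding with the inverse curve cancels the monitor's response.
encoded = linear_mid_grey ** (1 / 2.2)      # ~0.73
displayed_encoded = encoded ** 2.2          # back to ~0.5, as intended

print(displayed_raw, displayed_encoded)
```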
| ▲ | krackers a day ago | parent | prev [-] | | >Human light intensity perception is not linear... You have to nonlinearize at some point in the pipeline Why exactly? My understanding is that gamma correction is effectively an optimization scheme during encoding to allocate bits in a perceptually uniform way across the dynamic range. But if you just have enough bits to work with and are not concerned with file sizes (and assuming all hardware could support these higher bit depths), then this shouldn't matter? IIRC, unlike CRTs, LCDs don't have a power-curve response in terms of the hardware anyway, and emulate the overall 2.2 TRC via a LUT. So you could certainly get monitors to accept linear input (assuming you manage to crank up the bit depth enough that you're not losing perceptual fidelity), and just do everything in linear light. In fact, if you just encoded the linear values as floats, that would probably give you the best of both worlds, since floating point is basically log encoding, with the density of representable values lower at the higher end of the range. https://www.scantips.com/lights/gamma2.html (I don't agree with a lot of the claims there, but it has a nice calculator) |
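A back-of-the-envelope comparison of that precision argument (plain Python; a pure 2.2 power curve stands in for the exact sRGB curve, and the 0.01 shadow value is purely illustrative):

```python
# Relative quantization step near a dark linear value, for three encodings.
dark = 0.01  # a deep shadow in linear light

# 8-bit linear: uniform steps of 1/255 over the whole range.
step_linear8 = (1 / 255) / dark                                         # ~39% per step

# 8-bit with a 2.2 gamma: the curve spends most of its codes on the shadows.
code = round(dark ** (1 / 2.2) * 255)
step_gamma8 = (((code + 1) / 255) ** 2.2 - (code / 255) ** 2.2) / dark  # ~7% per step

# Half-float linear: 10 mantissa bits give roughly constant ~0.1% relative
# precision everywhere, which is the "log-like" behaviour mentioned above.
step_half = 2 ** -10

print(step_linear8, step_gamma8, step_half)
```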
|
|
| ▲ | AlotOfReading a day ago | parent | prev | next [-] |
| Correction is useful for a bunch of different reasons, not all of them related to monitors. Even ISP pipelines without displays involved will still usually do it to allocate more bits to the highlights/shadows than to the relatively distinguishable middle of the range. Old CRTs did it because the electron gun had a non-linear response and the gamma curve actually linearized the output. Film processing and logarithmic CMOS sensors do it because the sensing medium has a nonlinear sensitivity to the light level. |
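A quick count of that allocation effect, assuming an 8-bit output and a plain 2.2 power curve as a stand-in for a real ISP tone curve:

```python
# How many of the 256 output codes land in the darkest 10% of linear light?
codes = range(256)

linear_codes = sum(1 for c in codes if c / 255 <= 0.1)           # 26 codes
gamma_codes  = sum(1 for c in codes if (c / 255) ** 2.2 <= 0.1)  # 90 codes

print(linear_codes, gamma_codes)
```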
|
| ▲ | tobyhinloopen 15 hours ago | parent | prev | next [-] |
| The problem with their example is that you can display linear image data just fine, just not with JPEG. Mapping linear data onto 0-255 RGB that expects gamma-corrected values is just wrong. They could have used an image format that supports linear data, like JPEG XL, AVIF, or HEIC. No conversion to 0-255 required; just throw in the data as-is. |
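A rough sketch of the "throw in the data as-is" idea, using OpenEXR as a stand-in for the formats named above (EXR is scene-linear floating point by convention) and assuming imageio with an EXR-capable backend is installed:

```python
import numpy as np
import imageio.v3 as iio

# Scene-linear image data as 32-bit floats; no gamma encoding applied.
linear = np.random.rand(256, 256, 3).astype(np.float32)

# The file keeps the floating-point, scene-linear pixels exactly as given,
# with no mapping down to 0..255 code values.
iio.imwrite("linear_image.exr", linear)
```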
|
| ▲ | dheera a day ago | parent | prev [-] |
| If we're talking about a sunset, then we're talking about your monitor shooting out blinding, eye-hurting light wherever the sun is in the image. That wouldn't be very pleasant. |
| |
| ▲ | Dylan16807 15 hours ago | parent | next [-] | | Linear encoding doesn't change the max brightness of the monitor. More importantly, the camera isn't recording blinding brightness in the first place! It'll say those pixels are pure white, which is probably a few hundred or thousand nits depending on shutter settings. | |
| ▲ | krackers 21 hours ago | parent | prev | next [-] | | That's a matter of tone mapping, which is separate from gamma encoding? Even today, a linearized pixel value of 255 will be displayed at your defined SDR brightness no matter what. Changing your encoding gamma won't help that, because for correct output the transform necessarily needs to be undone during display. | |
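A toy illustration of that separation, with a hypothetical 300-nit SDR white and an idealized power-curve display:

```python
# Tone mapping decides how scene light maps to display light; the encoding
# gamma is just a transport curve that a correct display undoes again.
sdr_white_nits = 300                      # hypothetical SDR brightness setting
tone_mapped = 1.0                         # linear "full white" after tone mapping

for encoding_gamma in (1.0, 2.2, 2.4):    # 1.0 == send linear data directly
    encoded = tone_mapped ** (1 / encoding_gamma)
    displayed = encoded ** encoding_gamma               # display inverts the curve
    print(encoding_gamma, displayed * sdr_white_nits)   # 300 nits every time
```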
| ▲ | myself248 21 hours ago | parent | prev [-] | | Which is why I'm looking at replacing my car's rear-view mirror with a camera and a monitor. Because I can hard-cap the monitor brightness and curve the brightness below that, eliminating the problem of billion-lumen headlights behind me. |
|