SomeoneOnTheWeb 2 days ago
That's normal. For HDR to look good, you need a monitor that hits approximately 1000 nits of brightness. Your monitor only hits 250, which is completely insufficient for displaying HDR content. This is one of the stupid things about many monitors: showing HDR at 250 nits is worse than showing no HDR at all. So no matter what you do, 99% of HDR content will look bad on your screen.
Tade0 2 days ago
I agree that 250 nits is too low, but my monitor clocks in at 400 and HDR already looks better, if only thanks to the increased colour channel resolution - particularly visible in highlights, clouds, etc. Where there was previously just a single colour blob, I can now observe details that are impossible to display with just eight bits per channel.

Interestingly, my laptop's display reaches 500 nits, and that is already painfully high outside of midday hours. My phone goes to 875, and I find that to be useful only outside in the summer sun.
geraldwhen 2 days ago
I have a C3 OLED and everything also looks better with HDR off. Games are just truly awful about making scenes completely unviewable, even when the HDR areas - the blacks and the whites - contain interactive elements you need to see and know about.
simoncion a day ago
> For HDR to look good, you need a monitor that hits approximately 1000 nits in brightness.

I disagree. The wide color gamut is -for me- a huge thing about HDR. My VA monitor provides ~300 nits of brightness and I've been quite happy with the games that didn't phone in their HDR implementation.

Plus, any non-trash HDR monitor will tell the computer it's attached to what its maximum possible brightness is, so the software running on that computer can adjust its renderer accordingly.
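A minimal sketch of what that last sentence implies, assuming the renderer already knows the display's reported peak brightness (in practice that figure comes from the display's HDR metadata via the OS or graphics API; here it is just a plain parameter). The function name, the `peak_nits` parameter, and the extended-Reinhard curve are illustrative choices, not any particular game's implementation.

```python
import numpy as np

def tonemap_to_display(scene_nits, peak_nits):
    """
    Compress scene luminance (in nits) into the range the display reported.
    Low and mid luminances are only gently compressed; highlights roll off
    smoothly onto the display's peak instead of clipping to flat white.
    """
    scene = np.asarray(scene_nits, dtype=float)
    scene_max = float(scene.max())
    if scene_max <= peak_nits:
        return scene  # the scene already fits the display's range
    # Extended Reinhard curve, scaled so scene_max lands exactly on peak_nits.
    return scene * (1.0 + scene * peak_nits / scene_max ** 2) / (1.0 + scene / peak_nits)

# The same scene highlights mapped for a ~1000-nit panel vs. a ~300-nit panel.
highlights = np.array([100.0, 500.0, 1500.0, 4000.0])
print(tonemap_to_display(highlights, peak_nits=1000.0))
print(tonemap_to_display(highlights, peak_nits=300.0))
```

On a 300-nit panel everything above a few hundred nits gets squeezed into a narrow band, which is why a low peak costs highlight detail even while the wider gamut and extra bit depth still come through.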