SomeoneOnTheWeb 2 days ago

That's normal. For HDR to look good, you need a monitor that hits approximately 1000 nits in brightness. Your monitor only hits 250, which is completely insufficient to display HDR content.

This is one of the stupid things about many monitors: showing HDR at 250 nits is worse than showing no HDR at all. So no matter what you do, 99% of HDR content will look bad on your screen.

Tade0 2 days ago | parent | next [-]

I agree that 250 nits is too low, but my monitor clocks in at 400 and HDR already looks better, if only thanks to the increased colour channel resolution - particularly visible in highlights, clouds, etc. Where there previously was just a single colour blob, I can now observe details that are impossible to display with just eight bits per channel.
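To make the banding point concrete, here's a minimal Python sketch (purely illustrative, not tied to any particular monitor or game) of how many distinct steps an 8-bit versus a 10-bit signal can represent across a bright highlight gradient:

    import numpy as np

    # A smooth horizontal gradient across a bright highlight region,
    # expressed as a linear signal in the range 0.0..1.0.
    gradient = np.linspace(0.85, 1.0, 1920)

    def quantise(signal, bits):
        # Round to the nearest representable code value at this bit depth.
        levels = 2 ** bits - 1
        return np.round(signal * levels) / levels

    steps_8bit = len(np.unique(quantise(gradient, 8)))    # distinct bands at 8 bits
    steps_10bit = len(np.unique(quantise(gradient, 10)))  # distinct bands at 10 bits

    print(f"8-bit:  {steps_8bit} distinct steps across the highlight")
    print(f"10-bit: {steps_10bit} distinct steps across the highlight")

The same narrow highlight range gets roughly four times as many steps at 10 bits, which is why the single colour blob resolves into visible detail.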

Interestingly, my laptop's display reaches 500 nits, and that is already painfully bright outside of midday hours. My phone goes to 875, and I only find that useful outside in the summer sun.

SomeoneOnTheWeb 2 days ago | parent [-]

That's the difference between SDR and HDR. Going full blast with the whole image at 500 nits and having an image averaging 200 nits with only peaks at 500 are two vastly different things.
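To make that concrete: HDR10 uses the PQ transfer function (SMPTE ST 2084), which maps code values to absolute luminance, so most of the picture can sit around a couple of hundred nits while only small specular highlights reach the top of the range. A rough Python sketch of the standard PQ EOTF (the constants come from ST 2084; the sample code values are just illustrative):

    def pq_eotf(code: float) -> float:
        # SMPTE ST 2084 (PQ) EOTF: normalised code value (0..1) -> luminance in nits.
        m1 = 2610 / 16384
        m2 = 2523 / 4096 * 128
        c1 = 3424 / 4096
        c2 = 2413 / 4096 * 32
        c3 = 2392 / 4096 * 32
        e = code ** (1 / m2)
        return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

    # Mid-tones sit low; only the top of the signal range reaches highlight luminance.
    for code in (0.50, 0.58, 0.75, 0.90):
        print(f"code {code:.2f} -> {pq_eotf(code):7.1f} nits")

A code value of 0.5 is only about 92 nits and 0.58 about 200 nits, while roughly 0.75 is needed to reach 1000 nits - which is why an HDR frame can average around 200 nits yet still carry much brighter peaks.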

geraldwhen 2 days ago | parent | prev | next [-]

I have a C3 OLED and everything also looks better with HDR off.

Games are just truly awful about this, making scenes completely unviewable even when the HDR areas, the blacks and whites, contain interactive elements you need to see and know about.

zapzupnz 2 days ago | parent [-]

I have a C4 OLED and I thought what you said was also true for me until I figured out what settings I needed to change on my TV to match my console (Nintendo Switch 2). Had to turn on HGiG, manually adjust the peak brightness level on the console itself, and suddenly things looked great.
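For context, HGiG essentially asks the TV to do no tone mapping of its own, so the console or game is expected to compress or clip highlights at the peak you configured. A minimal sketch of the idea, assuming a hypothetical configured peak of 800 nits (the real value depends on the panel and the console's calibration screen):

    CONFIGURED_PEAK_NITS = 800  # hypothetical value set in the console's HDR calibration

    def game_side_clip(scene_nits: float) -> float:
        # With HGiG the display adds no extra tone mapping, so the game
        # simply clips anything above the configured peak.
        return min(scene_nits, CONFIGURED_PEAK_NITS)

    # Without HGiG the TV may re-map these values a second time on top of
    # whatever the game already did, which is what tends to crush detail.
    for nits in (100, 600, 1500, 4000):
        print(f"scene {nits:5d} nits -> displayed {game_side_clip(nits):5.0f} nits")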

Not that many games on the console that take advantage of it, mind you. More testing needed.

simoncion a day ago | parent | prev [-]

> For HDR to look good, you need a monitor that hits approximately 1000 nits in brightness.

I disagree. The wide color gamut is -for me- a huge thing about HDR. My VA monitor provides ~300 nits of brightness and I've been quite happy with the games that didn't phone in their HDR implementation.

Plus, any non-trash HDR monitor will tell the computer it's attached to what its maximum possible brightness is, so the software running on that computer can adjust its renderer accordingly.
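As a sketch of how the software side can use that: once the renderer knows the display's reported peak (a hypothetical 300 nits here, roughly matching the monitor above, with a hypothetical 4000-nit scene maximum), it can compress scene luminance so nothing exceeds what the panel can actually show. This uses an extended-Reinhard curve as one common choice, not whatever any particular game actually does:

    REPORTED_PEAK_NITS = 300.0   # hypothetical value the monitor reports to the OS
    SCENE_MAX_NITS = 4000.0      # hypothetical brightest highlight the game produces

    def tone_map(scene_nits: float) -> float:
        # Extended Reinhard: maps scene luminance so SCENE_MAX_NITS lands
        # exactly on the display's reported peak and nothing exceeds it.
        x = scene_nits / REPORTED_PEAK_NITS          # scene in units of display peak
        x_max = SCENE_MAX_NITS / REPORTED_PEAK_NITS  # white point in the same units
        mapped = x * (1 + x / (x_max * x_max)) / (1 + x)
        return mapped * REPORTED_PEAK_NITS

    for nits in (50, 300, 1000, 4000):
        print(f"scene {nits:5d} nits -> display {tone_map(nits):6.1f} nits")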

dartharva a day ago | parent [-]

> Plus, any non-trash HDR monitor will tell the computer it's attached to what its maximum possible brightness is, so the software running on that computer can adjust its renderer accordingly.

My monitor does do that, but alas the software itself (Windows 10) wasn't good enough to adjust things correctly. It did make the decision to switch to ArchLinux easier, though, by being one less thing I'll be missing.