| ▲ | dartharva 2 days ago |
| It's not just games, it's regular day-to-day UI too. I'm using an Acer 185Hz VRR HDR10 gaming monitor, on Eco mode with HDR disabled. Everything just looks better with HDR turned off, for some reason I can't explain. |
|
| ▲ | SomeoneOnTheWeb 2 days ago | parent | next [-] |
| That's normal. For HDR to look good, you need a monitor that hits approximately 1000 nits of peak brightness. Your monitor only hits 250, which is completely insufficient for displaying HDR content. This is one of the stupid things with many monitors: showing HDR at 250 nits is worse than showing no HDR at all. So no matter what you do, 99% of HDR content will look bad on your screen. |
| |
| ▲ | Tade0 2 days ago | parent | next [-] | | I agree that 250 nits is too low, but my monitor clocks in at 400 and HDR already looks better, if only thanks to the increased colour channel resolution - particularly visible in highlights, clouds, etc. Where there previously was just a single colour blob, I can now observe details impossible to display with just eight bits per channel. Interestingly, my laptop's display reaches 500 nits, and that is already painfully bright outside of midday hours. My phone goes to 875, and I find that useful only outside in the summer sun. | | |
| ▲ | SomeoneOnTheWeb 2 days ago | parent [-] | | That's the difference between SDR and HDR. Going full blast with the whole image at 500 nits and having an image averaging 200 nits with only peaks at 500 are two vastly different things. |
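Tade0's "single colour blob" is easy to put numbers on: count how many distinct code values each encoding can spend on a bright cloud. The sketch below is mine, not from the thread; it assumes a 400-nit SDR panel with a plain gamma-2.2 response, standard 10-bit SMPTE ST 2084 (PQ) for HDR, and an illustrative 350-1000 nit luminance span for the cloud.

  # Counts how many distinct code values each encoding has for a highlight
  # spanning 350-1000 nits. The 400-nit SDR peak matches the monitor
  # discussed above; the PQ constants are the published ST 2084 values.

  SDR_PEAK = 400.0  # nits; scene luminance above this clips to code 255

  # SMPTE ST 2084 (PQ) inverse-EOTF constants
  M1, M2 = 1305 / 8192, 2523 / 32
  C1, C2, C3 = 107 / 128, 2413 / 128, 2392 / 128

  def sdr_code(nits: float) -> int:
      """8-bit gamma-2.2 code for a luminance, clipping at the panel peak."""
      v = min(nits / SDR_PEAK, 1.0) ** (1 / 2.2)
      return round(v * 255)

  def pq_code(nits: float) -> int:
      """10-bit PQ code for an absolute luminance (0..10,000 nits)."""
      y = (nits / 10000.0) ** M1
      return round(((C1 + C2 * y) / (1 + C3 * y)) ** M2 * 1023)

  lo, hi = 350.0, 1000.0  # assumed luminance span for a sunlit cloud
  print("SDR codes:", sdr_code(hi) - sdr_code(lo) + 1)   # ~16
  print("PQ  codes:", pq_code(hi) - pq_code(lo) + 1)     # ~117

On those assumptions, SDR has roughly sixteen usable codes before everything past 400 nits collapses into the same white, while 10-bit PQ keeps over a hundred distinct steps across the same span - which is the cloud detail Tade0 describes.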
| |
| ▲ | geraldwhen 2 days ago | parent | prev | next [-] | | I have a C3 OLED and everything also looks better with HDR off. Games are just truly awful about making scenes completely unviewable, even when the HDR areas, the blacks and whites, have interactive elements in them that you need to see and know about. | | |
| ▲ | zapzupnz 2 days ago | parent [-] | | I have a C4 OLED and I thought what you said was also true for me until I figured out what settings I needed to change on my TV to match my console (Nintendo Switch 2). Had to turn on HGiG, manually adjust the peak brightness level on the console itself, and suddenly things looked great. Not that many games on the console that take advantage of it, mind you. More testing needed. |
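For anyone wondering what HGiG actually changes: the TV stops applying its own tone curve and trusts the source to compress scene luminance into whatever peak the user dialed in on the console. A toy sketch of that output stage follows; the Reinhard-style shoulder is my choice for illustration, not what any particular console or TV firmware actually uses.

  # Toy model of a game's HDR output stage under HGiG: the game, not the TV,
  # maps scene luminance into the display's peak brightness.

  def tonemap_to_display(scene_nits: float, display_peak: float,
                         knee: float = 0.75) -> float:
      """Pass luminance through below the knee; roll off smoothly above it."""
      k = knee * display_peak
      if scene_nits <= k:
          return scene_nits  # shadows and midtones are left untouched
      # Reinhard-style shoulder: compress (k, infinity) into (k, display_peak)
      over = scene_nits - k
      headroom = display_peak - k
      return k + headroom * over / (over + headroom)

  # A 4,000-nit specular highlight, with the console's peak set two ways:
  for peak in (400.0, 800.0):
      print(peak, round(tonemap_to_display(4000.0, peak), 1))

This is why manually adjusting the console's peak matters so much: with HGiG on, the TV won't rescue a wrong value, so the game's roll-off has to land highlights just under the panel's actual limit.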
| |
| ▲ | simoncion a day ago | parent | prev [-] | | > For HDR to look good, you need a monitor that hits approximately 1000 nits in brightness. I disagree. The wide color gamut is -for me- a huge thing about HDR. My VA monitor provides ~300 nits of brightness and I've been quite happy with the games that didn't phone in their HDR implementation. Plus, any non-trash HDR monitor will tell the computer it's attached to what its maximum possible brightness is, so the software running on that computer can adjust its renderer accordingly. | | |
| ▲ | dartharva a day ago | parent [-] | | > Plus, any non-trash HDR monitor will tell the computer it's attached to what its maximum possible brightness is, so the software running on that computer can adjust its renderer accordingly. My monitor does do that, but alas, the software itself (Windows 10) wasn't good enough to adjust things correctly. It did make the decision to switch to ArchLinux easier by being one less thing I'd be missing. |
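The reporting simoncion mentions rides along in the EDID: a CTA-861 "HDR Static Metadata" data block carries the panel's luminance limits, and the max-luminance byte decodes as 50 · 2^(cv/32) cd/m² per CTA-861.3. A sketch of reading it (the sysfs path is a Linux example; the block layout follows the public CTA-861 spec):

  # Reads a raw EDID dump (e.g. /sys/class/drm/<connector>/edid on Linux)
  # and decodes the advertised HDR peak luminance, if any.

  import sys

  def hdr_max_luminance(edid: bytes) -> float | None:
      """Return the desired content max luminance in nits, if advertised."""
      # Walk 128-byte extension blocks looking for a CTA-861 block (tag 0x02)
      for off in range(128, len(edid), 128):
          if edid[off] != 0x02:
              continue
          dtd_start = edid[off + 2]      # data blocks end where the DTDs begin
          i = off + 4
          while i < off + dtd_start:
              tag, length = edid[i] >> 5, edid[i] & 0x1F
              # Tag 7 = "use extended tag"; ext. tag 6 = HDR static metadata
              if tag == 7 and length >= 3 and edid[i + 1] == 0x06:
                  if length >= 4:        # the max-luminance byte is optional
                      cv = edid[i + 4]
                      return 50.0 * 2 ** (cv / 32.0)
                  return None
              i += length + 1
      return None

  if __name__ == "__main__":
      with open(sys.argv[1], "rb") as f:
          peak = hdr_max_luminance(f.read())
      print(f"advertised peak: {peak:.0f} nits" if peak else "no HDR metadata")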
|
|
|
| ▲ | tobyhinloopen 2 days ago | parent | prev | next [-] |
| Unless it's a mini-LED or OLED display, it simply doesn't have the contrast to properly render a lot of what makes HDR... HDR. Calibrate the display with HDR enabled for a better SDR response. |
| |
| ▲ | simoncion a day ago | parent [-] | | VA screens have pretty damn good contrast, and OLED monitors tend to have low peak (and sometimes even spot!) brightness. A while back, I tried an OLED gaming monitor that was widely reviewed as being very good. While it was somewhat better than the VA monitor that I've been using for years, it was nowhere near 1,500 USD good. I could see someone coming from an IPS or TN screen being very impressed with it, though. | | |
| ▲ | tobyhinloopen 16 hours ago | parent [-] | | VA screens have terrible black smearing though. I also bought an OLED display and returned it because it was just very dim. I own a mini-LED display that peaks at 1000 cd/m² full screen (it has a fan to handle the heat) and I'm still looking for an OLED replacement. | | |
| ▲ | simoncion 15 hours ago | parent [-] | | > VA screens have terrible black smearing though. It depends on the monitor and the colors involved in the transition. My VA monitor (a BenQ EW3270U) has limited-but-noticeable smearing between certain dark colors. Blacks and dark colors against mid-brightness and brighter are just fine. [0] It's my understanding that this monitor has quite-a-bit-less-bad color smearing than most VA panels, and has roughly the same -er- amount of slow transitions (just with a different set of colors) as my Asus PA246 IPS monitor. I play a variety of video games, so I see both muddily-dark and high-contrast areas. I'm fairly pleased with the performance of the panel they dropped into this monitor. Honestly, the off-axis color and contrast shifting is way more noticeable than the color smearing... and folks who sit down in front of this monitor don't tend to notice those shifts. (Plus, if you play 3D video games released within the last five years, crap like temporal anti-aliasing and its bastard children add so much smearing and so many rendering artifacts that it becomes quite challenging to determine which visual artifacts are actual pixels commanded to be on the screen by the renderer, and which might come from too-slow flipping of the pixels in the screen. Is this a good state of affairs? Definitely not. But it's the one we find ourselves in.) [0] It's entirely unlike the OLED screen in the Nexus 5a, which has incredible smearing between black and a huge array of dark-to-medium-brightness colors. This smearing reduces as you increase the brightness of the screen, but doesn't go away entirely until you get to like the top quarter of the screen's brightness. (If you have one of these phones, drop your screen brightness to the bottom quarter and browse through back issues of the Gunnerkrigg Court webcomic. There are PLENTY of black-on-X color combinations to get a really obvious demonstration of the problem.) |
|
|
|
|
| ▲ | whywhywhywhy a day ago | parent | prev [-] |
| Something is poorly implemented in the Windows UI when HDR is on. On MacBooks everything looks fine: HDR content just appears brighter, though I think the rest of the UI becomes duller at that point. On Windows, running HDR on the desktop means the whole screen looks dull, at least on my 5K HDR Dell. Not sure if I'm missing a setting, but I end up having to manually turn HDR on before playing a game and off after. |
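One plausible mechanism for the dull desktop: in HDR mode Windows composites everything in linear scRGB, where 1.0 corresponds to 80 nits, and SDR windows are scaled to whatever the "SDR content brightness" slider is set to. If that level sits well below what the panel pushes for white in SDR mode, every non-HDR window looks dim by comparison. A sketch of the arithmetic, with the two slider levels assumed purely for illustration:

  # Converts a gamma-encoded sRGB channel to the linear scRGB value Windows
  # composites in HDR mode (scRGB 1.0 = 80 nits), at a given SDR white level.

  def srgb_to_scrgb(srgb: float, sdr_white_nits: float) -> float:
      """Decode sRGB to linear light, then scale to the chosen SDR white."""
      # Standard piecewise sRGB EOTF
      if srgb <= 0.04045:
          linear = srgb / 12.92
      else:
          linear = ((srgb + 0.055) / 1.055) ** 2.4
      return linear * (sdr_white_nits / 80.0)

  # Pure white (1.0) composited at two assumed SDR-white settings:
  for white in (240.0, 400.0):
      print(f"SDR white at {white:.0f} nits -> scRGB {srgb_to_scrgb(1.0, white):.2f}")

Under those assumptions, an SDR desktop pinned at 240 nits on a panel that hits 400 in SDR mode will read as noticeably dull, which would also explain why toggling HDR off before and after games feels necessary.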