EnPissant 4 hours ago

I don't think this is true. I can go into my display settings in KDE Plasma, enable HDR, and configure the brightness. I have an Nvidia Blackwell card.

bitanarch 4 hours ago | parent [-]

You can enable it, yes. But (assuming you're on an LCD display and not an OLED) you're likely still on XRGB8888, i.e. 8 bits per channel. Check `drm_info`.

Also, go to YouTube and play this video: https://www.youtube.com/watch?v=onVhbeY7nLM

Play it once with "HDR" on Linux, and then on Windows. The "HDR" in nVidia/Linux is fake.

The brightness control you see in Plasma or Mutter is indeed related to the HDR support in the driver. But it's not really useful for the most common HDR tasks at the moment.

EnPissant 3 hours ago | parent | next [-]

I asked Claude to investigate:

  Your Display Configuration

  Both monitors are outputting 10-bit color using the ABGR2101010 pixel format.

  | Monitor                | Connector | Format      | Color Depth | HDR          | Colorspace |
  |------------------------|-----------|-------------|-------------|--------------|------------|
  | Dell U2725QE (XXXXXXX) | HDMI-A-1  | ABGR2101010 | 10-bit      | Enabled (PQ) | BT2020_RGB |
  | Dell U2725QE (XXXXXXX) | HDMI-A-2  | ABGR2101010 | 10-bit      | Disabled     | Default    |

* Changed the serial numbers to XXXXXXX

I am on Wayland and outputting via HDMI 2.1 if that helps.

EDIT: Claude explained how it determined this with drm_info, and I manually verified it:

> Planes 0 and 3 are the primary planes (type=1) for CRTCs 62 and 81 respectively - these are what actually display your desktop content. The Format: field shows the pixel format of the currently attached framebuffer.

EDIT: Also note that I am slowbanned on this site, so I may not be able to respond for a bit.

EDIT: You should try connecting with HDMI 2.1 (you will need an 8K HDMI cable, or it will fall back to older standards instead of FRL).

EDIT: HDR on YouTube appears to work for me. YouTube correctly identifies HDR on only one of my monitors, and I can see a big difference in the flames between them in this scene: https://www.youtube.com/watch?v=WjJWvAhNq34

bitanarch 3 hours ago | parent | next [-]

I don't have a Dell U2725QE, but on the InnoCN 27M2V and Cooler Master GP27U there's no ABGR2101010 support. These monitors would only work with ARGB2101010 or XRGB2101010, which the nVidia drivers do not provide.

Here's what I'm getting on both monitors, with HDR enabled on Gnome 49: https://imgur.com/a/SCyyZWt

Maybe you're lucky with the Dell. But as I understand it, HDR playback in Chrome is still broken.

bitanarch 3 hours ago | parent | prev [-]

OK. I'm using DisplayPort 1.4a with my 4090 at the moment. Maybe I'll try HDMI 2.1 and see what happens.

I'm actually surprised that YouTube HDR works on your side - perhaps it's tied to the ABGR2101010 output mode being available.

bitanarch 2 hours ago | parent [-]

No luck for me with HDMI 2.1 - still seeing XRGB8888 on my monitors after enabling HDR.

That's still pretty crappy. Monitors don't say whether they support BGR input signals as opposed to RGB.

EnPissant 2 hours ago | parent [-]

Was it an 8K cable? Are you on Wayland?

bitanarch an hour ago | parent [-]

I'm on Wayland, and the cable is HDMI 2.1 Ultra High Speed, which means 8K. Xorg is already gone on Ubuntu 25.10.

The same GPU and monitor combination does full 10-bit HDR in Windows. But in Linux it's stuck at 8 bits per channel because the nVidia driver doesn't offer 10-bit RGB output.

EnPissant an hour ago | parent [-]

I don't think your problem is RGB vs. BGR. That's just the compositor's working buffer, and your monitor never sees it (it even includes an alpha channel). Have you tried KDE Plasma? It sounds like KWin uses 10-bit planes by default when they're available. Maybe Ubuntu's compositor (Mutter?) doesn't support 30-bit color, or has to be configured for it? I actually think that's the most likely cause: Mutter having worse or more fragile 30-bit support.
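
For what it's worth, going by the comments in drm_fourcc.h, all of these formats are 32-bit little-endian words that differ only in channel order and in whether the top two bits are alpha or padding, so "ABGR vs. ARGB" is purely about the framebuffer memory layout, not the signal on the cable. A tiny sketch to make that concrete (the include path and build line are assumptions on my part):

  /* Sketch: print the DRM fourccs involved. Per drm_fourcc.h these are all
   * 32-bit little-endian words:
   *   XRGB8888:    [31:24 X][23:16 R][15:8 G][ 7:0 B]   8 bits per channel
   *   XRGB2101010: [31:30 X][29:20 R][19:10 G][ 9:0 B]  10 bits per channel
   *   ABGR2101010: [31:30 A][29:20 B][19:10 G][ 9:0 R]  10 bits per channel
   * Build (assumed): gcc fourccs.c $(pkg-config --cflags libdrm) */
  #include <stdio.h>
  #include <drm_fourcc.h>

  static void show(const char *name, unsigned int f)
  {
      /* Decode the fourcc into its four ASCII characters. */
      printf("%-24s '%c%c%c%c'\n", name,
             f & 0xff, (f >> 8) & 0xff, (f >> 16) & 0xff, (f >> 24) & 0xff);
  }

  int main(void)
  {
      show("DRM_FORMAT_XRGB8888",    DRM_FORMAT_XRGB8888);    /* 'XR24' */
      show("DRM_FORMAT_XRGB2101010", DRM_FORMAT_XRGB2101010); /* 'XR30' */
      show("DRM_FORMAT_ABGR2101010", DRM_FORMAT_ABGR2101010); /* 'AB30' */
      return 0;
  }

Those short codes ('XR24', 'XR30', 'AB30') are what drm_info prints as the plane/framebuffer format.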

EnPissant 3 hours ago | parent | prev [-]

It's not obvious how to interpret the output. I pasted it into ChatGPT, and it thinks I am using "Format: ABGR2101010" for both monitors (only one has HDR on), so I don't trust it.

EDIT: See my sibling comment.

bitanarch 3 hours ago | parent [-]

Under the Planes section, look for planes that have a non-zero "CRTC_ID". Those are the planes that actually get output to your monitor.
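
If you'd rather not trust an LLM to interpret it, here's a rough libdrm sketch that does the same check directly (untested; the device path, build line, and required privileges are assumptions): enumerate the planes, keep the ones with a non-zero CRTC_ID, and print the fourcc of the framebuffer attached to each.

  /* Sketch: list active planes and the pixel format of their framebuffers.
   * Build (assumed): gcc planes.c $(pkg-config --cflags --libs libdrm)
   * The GETFB2 ioctl may be restricted for non-master clients, so this may
   * need to run as root. */
  #include <fcntl.h>
  #include <stdint.h>
  #include <stdio.h>
  #include <unistd.h>
  #include <xf86drm.h>
  #include <xf86drmMode.h>

  int main(void)
  {
      int fd = open("/dev/dri/card0", O_RDWR | O_CLOEXEC); /* adjust card index */
      if (fd < 0) { perror("open"); return 1; }

      /* Without this cap, only the legacy planes are exposed. */
      drmSetClientCap(fd, DRM_CLIENT_CAP_UNIVERSAL_PLANES, 1);

      drmModePlaneRes *res = drmModeGetPlaneResources(fd);
      if (!res) { perror("drmModeGetPlaneResources"); return 1; }

      for (uint32_t i = 0; i < res->count_planes; i++) {
          drmModePlane *p = drmModeGetPlane(fd, res->planes[i]);
          if (!p)
              continue;
          /* crtc_id != 0 means the plane is currently feeding a CRTC. */
          if (p->crtc_id && p->fb_id) {
              drmModeFB2 *fb = drmModeGetFB2(fd, p->fb_id);
              if (fb) {
                  uint32_t f = fb->pixel_format; /* fourcc, e.g. 'XR24', 'AB30' */
                  printf("plane %u -> CRTC %u, format %c%c%c%c\n",
                         p->plane_id, p->crtc_id,
                         f & 0xff, (f >> 8) & 0xff,
                         (f >> 16) & 0xff, (f >> 24) & 0xff);
                  drmModeFreeFB2(fb);
              }
          }
          drmModeFreePlane(p);
      }
      drmModeFreePlaneResources(res);
      close(fd);
      return 0;
  }

"format XR24" on an active plane would mean you're on the 8-bit XRGB8888 path; "XR30"/"AB30" are the 10-bit ones.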

Here's what I'm getting on an RTX 4090 / InnoCN 27M2V and Cooler Master Tempest GP27U.

https://imgur.com/a/SCyyZWt