wmf 2 days ago

This is not a normal retina configuration. This is a highly unusual configuration where the framebuffer is much larger than the screen resolution and gets scaled down. Obviously it sucks if it used to work and now it doesn't, but almost no one wants this, which probably explains why Apple doesn't care.

smcleod 2 days ago | parent | next [-]

In my case it's a standard LG UltraFine 4K monitor plugged into a standard 16" M5 MacBook Pro via standard Thunderbolt (over USB-C) - not sure what's not normal about this? I've confirmed it with other monitors and M5 MacBook Pros as well.

petersellers 2 days ago | parent | next [-]

In macOS display settings, what scaling mode are you using? This bug appears to only affect 4K monitors that are configured to use the maximum amount of screen space (which makes text look uncomfortably tiny unless you have a very large monitor). Most people run at the default setting which gives you the real estate of a 1080p screen at 2x scale, hence the "not normal" part of this configuration.
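To put numbers on those scaling modes, here's a rough sketch (my own arithmetic; the function is made up for illustration, not a macOS API):

```python
# Rough sketch of macOS "scaled" HiDPI resolutions on a 4K (3840x2160) panel.
# macOS renders at 2x the "looks like" resolution, then scales to the panel.

def framebuffer_for(looks_like_w, looks_like_h, scale=2):
    """Backing framebuffer size for a HiDPI 'looks like' resolution."""
    return (looks_like_w * scale, looks_like_h * scale)

# Default on a 4K display: looks like 1920x1080, rendered at 2x.
# The framebuffer maps 1:1 to panel pixels, so no extra scaling pass is needed.
assert framebuffer_for(1920, 1080) == (3840, 2160)

# The configuration discussed here: HiDPI with "looks like" equal to the
# native 3840x2160. The OS would have to render an 8K framebuffer and then
# downscale it by 2x in each direction to fit the panel.
assert framebuffer_for(3840, 2160) == (7680, 4320)
```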

Actually, I don't even think it's possible to run HiDPI mode at the native resolution scale from within the macOS settings app - you'd need something like BetterDisplay to turn it on explicitly.

smcleod 2 days ago | parent [-]

If you use the middle screen scaling you're given absolutely huge UI elements. That's the case for the built-in 16" screen as well as external displays, but when you get up to 32" displays it's almost comical how large the UI is on the middle / default setting.

petersellers 2 days ago | parent [-]

Yeah, on larger monitors it's more common to run at the monitor's native resolution without scaling, but even so macOS will not turn on HiDPI mode - you'd still need to do this explicitly via another app. (I didn't even know it was possible to turn on HiDPI mode at native scaling until reading this article.)

big_toast 2 days ago | parent | prev | next [-]

I use a 43" 4K TV at the standard non-retina 4K with an M1 Pro. I tried your 8K supersampling but it doesn't seem to improve on the default 4:4:4 8-bit RGB non-retina for me. (Smoother, but not as crisp outside terminals?)

The TV is unusable without BetterDisplay because of Apple's default negotiation preferences. I hope waydabber can figure something out with you.

phonon 2 days ago | parent | prev | next [-]

Isn't that just 2x supersampling? If you want "perfect" antialiasing that's the minimum you need, no?

wmf 2 days ago | parent [-]

Yes, it is supersampling, but historically almost no one runs that way.

sgerenser 2 days ago | parent | prev | next [-]

I don’t know why this was downvoted; I agree that this is a highly unusual configuration. Why render to a frame buffer with 2x the pixels in each direction vs. the actual display, only to then scale the whole thing down by 2x in each direction?
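For a sense of the cost involved, a back-of-envelope sketch (my own numbers, assuming a single 4-bytes-per-pixel RGBA buffer; real compositors keep several):

```python
# Approximate framebuffer memory at 8 bits per channel RGBA.

def framebuffer_mb(w, h, bytes_per_px=4):
    """Size in MiB of one w x h framebuffer."""
    return w * h * bytes_per_px / (1024 * 1024)

native_4k = framebuffer_mb(3840, 2160)  # ~31.6 MiB
super_8k = framebuffer_mb(7680, 4320)   # ~126.6 MiB

# 4x the pixels to allocate, fill, and then downscale every frame.
assert super_8k == 4 * native_4k
```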

eptcyka 2 days ago | parent | next [-]

Because Apple no longer implements subpixel rendering for fonts?

Rohansi 2 days ago | parent [-]

Supersampling the entire framebuffer is a bad way to anti-alias fonts. Especially since your font rendering is almost certainly doing grayscale anti-aliasing already, which is going to look better than 2x supersampling alone. And supersampling will not do subpixel rendering.

mlyle 2 days ago | parent | prev [-]

Because it's a decent way to get oversampling.

NBJack 2 days ago | parent | prev | next [-]

To be frank, it's kind of embarrassing if an entry-level Windows laptop with a decent integrated GPU handles this without much effort.

Apple is free to make its own choices on priority, but I'm disappointed when something that's considered the pinnacle of creative platforms sporting one of the most advanced consumer processors available can't handle a slightly different resolution.

sgerenser a day ago | parent [-]

Nope, no Windows laptop will render to an 8K framebuffer and then downsample it by 2x in each direction to display at 4K. That's what the OP is complaining macOS won't let him do.

wpm 2 days ago | parent | prev [-]

This is what us proles on third-party monitors have to do to make text look halfway decent. My LG DualUps (~140ppi if I recall) run at 2x of a scaled resolution to arrive at roughly what would be pixel-doubled 109ppi, which is the only pixel density the UI looks halfway decent at. It renders an 18:16 2304 x something at 2x, scaled down by 2.

It's also why, when you put your Mac into "More Space" resolution on the built-in or first-party displays, it tells you this could hurt performance - because that's exactly what the OS is going to do to give you more space without making text unreadable aliased fuzz: it renders the "apparent" resolution pixel-doubled, then scales it down, which provides a modicum of sub-pixel anti-aliasing's effect. Apple removed subpixel antialiasing a while back and this is the norm now.
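A toy illustration of that downscaling effect (my own sketch of a plain 2x2 box filter; Apple's actual filter is presumably fancier):

```python
# Why rendering at 2x and scaling down smooths edges: each output pixel
# averages a 2x2 block of high-resolution samples.

def downsample_2x(img):
    """Average each 2x2 block of a grayscale image (list of rows)."""
    h, w = len(img), len(img[0])
    return [
        [(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

# A black/white edge that lands on an output-pixel boundary stays hard:
hi_res = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
]
assert downsample_2x(hi_res) == [[0.0, 255.0]]

# An edge falling mid-pixel at output resolution becomes a gray step,
# i.e. the glyph edge gets anti-aliased "for free":
hi_res_offset = [
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
assert downsample_2x(hi_res_offset) == [[127.5, 255.0]]
```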

I have a 4K portable display (stupidly high density, but still not quite the "retina" 218ppi) on a monitor arm that I run at, as you suggest, 1080p at 2x. Looks OK but everything is still a bit small. If you have a 4K display and want to use all 4K, you have the crappy choice between making everything look terrible, or wasting GPU cycles and memory on rendering an 8K framebuffer and scaling it down to 4K.

I'm actually dealing with this right now on my TV (1080p, which is where I'm writing this comment from). My normal Linux/Windows gaming PC that I have hooked up in my living room is DRAM-free pending an RMA, so I'm on a Mac Mini that won't let me independently scale text size and everything else like Windows and KDE let me do. I have to run it at 1600x900, and even then I have to scale every website I visit to make it readable. Text scaling is frankly fucked on macOS unless you are using the Mac as Tim Cook intended: on the built-in display or one of Apple's overpriced externals, sitting at a "retina appropriate" distance for 218ppi to work.

toxik 2 days ago | parent [-]

Pedantry: 18:16 is the same as 9:8 since it's a ratio.