4ggr0 a day ago

as a gamer 8k makes me sweat because i can't imagine what kind of hardware you'd need to run a game :O probably great for text-based work, though!

Aurornis a day ago | parent | next [-]

Once you get into high pixel densities, you stop running everything at native resolution. You have enough pixel density that scaling the output doesn’t produce significant visible artifacts.

With 8K’s small pixels you could pick any of a number of resolutions, up to 4K or higher, and you wouldn’t even notice that the final image was scaled on your monitor.

People with Retina-display Macs have been doing this for years. It’s really nice once you realize how flexible it is.
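
Roughly, the arithmetic behind a macOS-style scaled mode, as a Python sketch (numbers illustrative, not any real API):

    panel = (7680, 4320)        # 8K panel, native pixels
    looks_like = (5120, 2880)   # chosen UI size ("looks like 5K")
    backing = (looks_like[0] * 2, looks_like[1] * 2)
    print(backing)  # (10240, 5760): the OS renders here at 2x,
                    # then downsamples to the 7680x4320 panel; at this
                    # pixel density the resampling is effectively invisible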

4ggr0 a day ago | parent [-]

i'm actually going to do the reverse move: was gaming on a 4K display, but going to downgrade to 3440x1440 to get more performance. but of course the gaming displays i find apparently aren't ideal for working, because text looks worse. add to that that the internet seems split on whether ultrawide monitors are the best thing ever or actually horrible. why is it all so complicated, man.

zamadatix a day ago | parent | next [-]

My only gripe is that nearly all common "ultrawide" models should really be thought of as "ultrashort", in that they don't offer more width, just less height.

E.g. a 21:9 ultrawide variant of 4K should really be 5040x2160. Instead they are nearly all 3840x1600. That may well be cost/price optimal for certain folks, and I'm not saying it's bad for the product to exist, but nobody was looking at a 1600p monitor thinking "man, I wish they'd make a wider variant!" They started with 4K and decided it would be nice if it were permanently shortened.
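
The arithmetic, as a quick Python check:

    # a true 21:9 widening of 4K keeps the 2160 height:
    print(2160 * 21 // 9)   # 5040 -> 5040x2160
    # what's actually sold keeps the 3840 width and cuts the height:
    print(3840 / 1600)      # 2.4 (i.e. 21.6:9): full 4K width, ~26% less height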

4ggr0 19 hours ago | parent | next [-]

yeah, that really confused me as well. the whole 4K, 2K, 2.5K, ultrawide, ultrahigh, microwide, 8K shit just gets confusing, especially because it's neither accurate nor standardized.

tstrimple 16 hours ago | parent | prev [-]

I think they're calling those 5K2K monitors. I'm quite happy with my 45" LG 5K2K OLED monitor. After seeing both in person, I find it much more usable than the 32:9 monitors.

simoncion 21 hours ago | parent | prev [-]

If the game offers it [0], set the output resolution to 4K and the render resolution to something smaller. A multiplier of ~0.66 gives you roughly a 1440p render, and 0.5 gives you 1080p.
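
The multiplier arithmetic at 4K output, as a quick Python check:

    out_w, out_h = 3840, 2160           # output resolution
    for mult in (0.66, 0.5):
        print(round(out_w * mult), round(out_h * mult))
    # 0.66 -> 2534x1426, close to 2560x1440 ("1440p")
    # 0.5  -> 1920x1080, exactly 1080p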

If the game doesn't offer that, then I've found that the HUD/UI uglification isn't too bad when one sets the output resolution to 1440p.

If Windows is getting in the way of doing this, and most or all of your games have been purchased through Steam, give Linux a try. I've heard good things about the Fedora variant known as Bazzite, but have never, ever tried it myself. [1]

[0] And shockingly few do! There's all sorts of automagic "AI" upscaling shit with mystery-meat knobs to turn, but precious few bog-standard "render everything but the HUD and UI with this many fewer pixels" options.

[1] I've been using Gentoo for decades and (sadly) see no reason to switch. I strongly disrecommend Gentoo as a first Linux distro for most folks, and especially for folks who primarily want to try out Linux for video gaming.

4ggr0 19 hours ago | parent [-]

> set the output resolution to 4K, and the render resolution to something smaller

doesn't that make everything blurry? that's the gripe i have with post-2020 PC gaming: barely any pc can run a AAA or AA game at native resolution, and instead has to use artificial upscaling and whatnot. haven't specifically tried it, though.

also can't test it anymore, as my gaming monitor is now our TV (48 inch OLED gaming TV, what a blast it was). now using my "old" 32in 2560x1440 IPS display, and i really miss OLED :( which is why i want to buy a new monitor. but i can't decide if i should get a 27in one (seems to be the 16:9 standard right now, but feels so small to me) or an ultrawide one. i switch games very frequently and also sometimes like to play old(er) games, so i'm a bit scared of the "ultrawides are cool if your game supports it"-vibe...

> I've heard good things about the Fedora variant known as Bazzite

haha, this message was written on Bazzite, so i got that part covered :D switched about a month ago, funny to get the recommendation now.

simoncion 18 hours ago | parent [-]

> doesn't that make everything blurry?

My experience with the 3D parts of a great many games that my 5700 XT can't reasonably run at panel-native resolution is that the game's art style already blurs up the picture with all sorts of postprocessing (and sometimes, especially with UE5 games, with the ever-more-popular "it looks so bad it makes you wonder if the renderer is totally busted unless you leave TAA on" rendering technique). Sometimes this blurring ends up looking absolutely great, and other times it's just lazy, obnoxious, and awful.

So, not that I notice? For the games that permit it, the HUD and menus stay nice and sharp, and the 3D stuff that's going to be all smudged up no matter what you do just renders at a higher frame rate.

For games that don't have independent 2D and 3D render resolutions, I find 1440p to be quite tolerable, and (weirdly) 1080p to be much less tolerable... despite the fact that you'd expect it to fit nicely on the panel. I guess I'm expecting a much more crisp picture with integer scaling than I get? Or maybe it's like what some games did way back when, where they totally changed the font and render style of the UI once past some specific resolution breakpoint. [0] I haven't looked closely at what's going on, so I don't have any even vaguely-reasonable explanation for it.

> [ultrawide monitors]

I like the comment someone else made somewhere that described them as "ultrashort" monitors. Personally, even if I was willing to move my head around enough to scan the whole damn monitor, I'm unwilling to lose so much vertical resolution. But as always, one should definitely choose what one likes.

I find a 32" 3840-pixel-wide monitor to be good. It's really great for doing work on, and perfectly acceptable for playing video games.

> [Linux]

Gratz on moving to Linux for gaming. Hope you don't have much trouble with it, and that any trouble you do have is either caused by super-invasive kernel-level anticheat that will never, ever work on Linux, or is easy and/or fun to resolve.

[0] One such game that sticks out in my memory is the OG Deus Ex. At 1024x768, the in-game UI font switched from what seemed, at lower resolutions, a lot like a bitmapped font to what seemed a lot like a proper vector font. The difference was dramatic.

4ggr0 16 hours ago | parent [-]

> Sometimes this blurring ends up looking absolutely great, and other times it's just lazy, obnoxious, and awful

yeah, maybe i should give this way of setting the graphics a try. should try to find a game which looks great with it.

> I find a 32" 3840 pixel wide monitor to be good

just looked them up, surprised that they're quite a bit more expensive (800+ vs 600-750 for an ultrawide), but i guess the panels cost more due to the higher resolution. your comment now makes me rethink which path i want to go down, though. gotta read up on some opinions :D

> Hope you don't have much trouble with it

luckily i work on and with unix systems, so the new things are just those related to gaming. but bazzite really has been very nice so far :) and as you say, the only times i had to boot up windows on a separate disk were when i wanted to play games which don't run on linux at all, especially the kernel-level anticheat slopware.

but enough is enough. i've kept using windows at home just because of gaming, but i'm sick of M$. can't spend the whole day making fun of windows and then go home and game on it, feels dirty.

pornel a day ago | parent | prev [-]

You don't really need 8K for gaming, but upscaling and frame generation have made game rendering resolution and display resolution almost independent.

jsheard a day ago | parent [-]

And if all else fails, 8K means you can fall back to 4K, 1440p or 1080p with perfect integer scaling.
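
A quick Python check that the common resolutions divide 7680x4320 evenly:

    for w, h in ((3840, 2160), (2560, 1440), (1920, 1080)):
        assert 7680 % w == 0 and 4320 % h == 0
        print(f"{w}x{h}: {7680 // w}x")   # 2x, 3x, 4x -- each source pixel
                                          # maps to an exact square block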

layer8 a day ago | parent [-]

Except that the hardware doesn’t necessarily offer perfect integer scaling. Oftentimes, it only provides blurry interpolation that looks less sharp than a corresponding native-resolution display.

jsheard a day ago | parent | next [-]

The monitor may or may not offer perfect scaling, but at least on Windows the GPU drivers can do it on their side, so the monitor receives a native-resolution signal that's already been pixel-doubled correctly.

Aurornis a day ago | parent | prev [-]

Most modern games already have built-in scaling options. You can set the game to run at your screen’s native resolution but have it do the rendering at a different scale factor. Good games can even render the HUD at native resolution and the graphics at a scaled resolution.
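
Conceptually it works something like this (a toy NumPy sketch, not any engine's actual API):

    import numpy as np

    native_h, native_w = 2160, 3840   # native output resolution
    scale = 0.5                       # render scale for the 3D scene
    low_h, low_w = int(native_h * scale), int(native_w * scale)

    scene = np.random.rand(low_h, low_w)   # stand-in for the 3D render
    upscaled = np.repeat(np.repeat(scene, 2, axis=0), 2, axis=1)  # cheap 2x upscale

    hud = np.zeros((native_h, native_w))   # HUD drawn directly at native res
    hud[50:60, 50:300] = 1.0               # e.g. a crisp health bar

    frame = np.where(hud > 0, hud, upscaled)   # composite: HUD stays sharp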

Modern OSes also scale fine.

It’s really not an issue.

layer8 21 hours ago | parent [-]

Games are not what I had in mind. Last time I checked, most graphics drivers didn’t support true integer scaling (i.e. nearest-neighbor, no interpolation).

jsheard 21 hours ago | parent | next [-]

> most graphics drivers didn’t support true integer scaling

https://www.nvidia.com/content/Control-Panel-Help/vLatest/en...

https://www.amd.com/en/resources/support-articles/faqs/DH3-0...

https://www.intel.com/content/www/us/en/support/articles/000...

I don't know what the situation is on Mac and Linux, but all of the Windows drivers offer it.

Aurornis 20 hours ago | parent | prev [-]

With very high PPI displays, gamma-corrected interpolation scaling is far better than nearest-neighbor scaling.

The idea is to make the pixels so small that your eyes aren’t resolving individual pixels anyway. Interpolation appears correct because you’re viewing it through a low-pass filter (the physical limit of your eyes).

Reverting to nearest neighbor at high PPI would introduce new artifacts because the aliasing effects would create unpleasant and unnatural frequencies in the image.

Most modern GPU drivers (Nvidia in particular) will do fixed integer-multiple scaling if that’s what you want. Nearest neighbor is not good, though.
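
To make the gamma point concrete, a minimal sketch (NumPy, grayscale sRGB values in [0, 1]; the function names are mine):

    import numpy as np

    def srgb_to_linear(c):
        # piecewise sRGB decode ("remove" the gamma encoding)
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(c):
        return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

    def upscale_nearest(img, k):
        # integer "pixel doubling": each source pixel becomes a k x k block
        return np.repeat(np.repeat(img, k, axis=0), k, axis=1)

    def upscale_linear_gamma_aware(img, k):
        # bilinear interpolation done in linear light, so averaged pixels
        # keep the correct perceived brightness
        lin = srgb_to_linear(img)
        h, w = lin.shape
        ys = np.linspace(0, h - 1, h * k)
        xs = np.linspace(0, w - 1, w * k)
        y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
        y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
        fy, fx = (ys - y0)[:, None], (xs - x0)[None, :]
        top = lin[np.ix_(y0, x0)] * (1 - fx) + lin[np.ix_(y0, x1)] * fx
        bot = lin[np.ix_(y1, x0)] * (1 - fx) + lin[np.ix_(y1, x1)] * fx
        return linear_to_srgb(top * (1 - fy) + bot * fy)

Run both on a hard black/white stripe pattern: the nearest-neighbor result keeps aliased pixel edges, while the linear-light version averages to the perceptually correct brightness.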