| ▲ | martinald a day ago |
| A friend and I were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no resolution improvements since then apart from high-end pro monitors. Why is this? 5K/6K at 27" would be the sweet spot for me, and potentially 8K at 32". However, I'm not willing to drop $2k per monitor to go from a very nice 27" 4K to 27" 5K. You can get 8K TVs for <$1000 now. And a Quest 3 headset has two displays at far higher PPI for $600. |
|
| ▲ | throw0101d a day ago | parent | next [-] |
| > A friend and I were just chatting about how annoying it is that monitors stalled out at 4K. There's been a bit of a 'renaissance' of 5K@27" in the last ~year: > In just the past few months, we've taken a look at the ASUS ProArt Display 5K, the BenQ PD2730S, and the Alogic Clarity 5K Touch with its unique touchscreen capabilities, and most recently I've been testing out another new option, the $950 ViewSonic VP2788-5K, to see how it stacks up. * https://www.macrumors.com/review/viewsonic-vp2788-5k-display... There are 15 monitors discussed in this video: * https://www.youtube.com/watch?v=EINM4EysdbI The ASUS ProArt PA27JCV is USD 800 (a lot less than $2k): * https://www.youtube.com/watch?v=ojwowaY3Ccw |
|
| ▲ | Aurornis a day ago | parent | prev | next [-] |
| > You can get 8K TVs for <$1000 now. 8K at jumbo TV size has relatively large pixels compared to an 8K desktop monitor. It’s easier to manufacture. > And a Quest 3 headset has two displays at far higher PPI for $600 Those displays are physically tiny; it's easier to absorb lower yields when each panel only takes a few square inches. Ultra-high-resolution desktop monitors would sit in the middle: very small pixels but also relatively large panel area. However, the demand side is also not there. There are already a number of 5K, 6K, and 8K monitors on the market; they're just not selling well. Between difficult software support for scaling legacy apps, compatibility issues with different graphics cards and cables, and the fact that normal monitors are good enough, the really high-resolution monitors don't sell, and that doesn't incentivize making more. If we get to a place where you can reliably plug a 6K monitor into any medium-to-high-end laptop or desktop and it just works, there might be more. Until then, making a high-res monitor is just asking for an extremely high return rate. |
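To put numbers on "relatively large pixels", a minimal pixel-density sketch in Python (the diagonal sizes are representative examples, not specific models):

    import math

    def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
        """Pixels per inch from resolution and diagonal size."""
        return math.hypot(h_px, v_px) / diagonal_in

    print(f'65" 8K TV:      {ppi(7680, 4320, 65):.0f} PPI')  # ~136
    print(f'32" 4K monitor: {ppi(3840, 2160, 32):.0f} PPI')  # ~138
    print(f'27" 5K monitor: {ppi(5120, 2880, 27):.0f} PPI')  # ~218
    print(f'32" 8K monitor: {ppi(7680, 4320, 32):.0f} PPI')  # ~275

An 8K TV's pixels are about the pitch of an ordinary 4K monitor's; an 8K desktop panel needs pixels at twice that density over a still-large area.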
| |
| ▲ | Kon5ole 21 hours ago | parent | next [-] | | >> You can get 8K TVs for <$1000 now. >8K at jumbo TV size has relatively large pixels compared to an 8K desktop monitor. It’s easier to manufacture. I don't think that's true. I've been using an 8K 55" TV as my main monitor for years now. It was available for sub-$800 before all such TVs vanished from the market. Smaller pixels were not more expensive even then; the 55"s were the cheapest. 4K monitors can be had for sub-$200, and an 8K 55" panel is essentially four of those panels' worth of pixels and area, so it should cost at most 4x that price. And it did, years ago. So they were clearly not complicated or expensive to manufacture - but there was no compelling reason for having 8K on a TV, so they didn't sell. However, there IS a compelling reason to have 8K on a desktop monitor! That such monitors sell for $8000+ is IMO a very unfortunate situation caused by a weird incompetence in market segmentation by the monitor makers. I firmly believe that they could sell 100x as many if they cut the price to 1/10th, which they clearly could do. The market that never appeared for TVs is present among the world's knowledge workers, for sure. | |
| ▲ | dmayle 19 hours ago | parent [-] | | I've been using an 8k 65" TV as a monitor for four years now. When I bought it, you could buy the Samsung QN700B 55" 8k, but at the time it was 50% more than the 65" I bought (TCL). I wish the 55" 8k TVs still existed (or that the announced 55" 8k monitors were ever shipped). I make do with 65", but it's just a tad too large. I would never switch back to 4k, however. | | |
| ▲ | murkt 18 hours ago | parent | next [-] | | What standard reliably works to drive 8K at 60 Hz, and how expensive are the cables? How far away do you sit from it? Does it sit on top of your desk? What do you put on all this space, how do you handle it? I don’t think you’re maximizing one browser window over all 33 million pixels | |
| ▲ | Kon5ole 15 hours ago | parent [-] | | HDMI 2.1 is required, and the cables are not too expensive now. For newer GPUs (NVIDIA 3000-series or later, or equivalent) and high-end (or M4+) Macs, HDMI 2.1 works fine, but on Linux an HDMI Forum licensing restriction keeps open-source drivers from implementing HDMI 2.1, which makes it problematic. It works with certain NVIDIA drivers, but I ended up getting a DP-to-HDMI 8K cable, which was more reliable. I think it could work with AMD and Intel also, but I haven't tried. In my case I have a 55" and sit a normal monitor distance away. I made a "double floor" on my desk and a cutout for the monitor, so the monitor legs are some 10cm below the actual desk and the screen starts basically at the level of the actual desk surface. The gap between the desk panels is nice for keeping USB hubs, drives, headphone amps and such. And the Mac mini. I usually have reference-material windows upper left and right, coding project upper center, coding editor bottom center, and 2 or 4 terminals, Teams, Slack and mail on either side of the coding window. The center column is about twice as wide as the sides. I also have other layouts depending on the kind of work. I use layout arrangers like FancyZones (from PowerToys) in Windows and a similar mechanism in KDE, and manual window management on the Mac. I run double scaling, so I get basically 4K desktop area but at Retina(-ish) resolution. 55" is a bit too big, but since I run doubling I can read stuff also in the corners. 50" 8K would be ideal. Basically the biggest problem with this setup is it spoils you, and it was only available several years ago. :( |
| |
| ▲ | r0b05 19 hours ago | parent | prev | next [-] | | What is the model number, and how has the experience been? I've mostly read that TVs don't make great monitors. I have a TCL Mini LED TV which is great as a TV, though. | |
| ▲ | mrguyorama 18 hours ago | parent | prev [-] | | What do you watch on an 8K TV? There's no content. The average bitrate from anything that isn't a Blu-ray is not good even for HD, so you do not benefit from more pixels anyway. Sure, you are decompressing and displaying 8K worth of pixels, but the actual resolution of your content is more like 1080p, especially in the chroma channels. Normally, games are the place where arbitrarily high pixel counts could shine, because you could literally ensure that every pixel is calculated and make real use of it, but that's actually stupidly hard at 4K and above, so NVIDIA just told people to eat smeary AI-upscaled garbage instead, throwing away the entire point of having a beefy GPU. I was even skeptical of 1440p at higher refresh rates, but bought a nice monitor with those specs anyway and was pleasantly surprised by the improvement - but it's obviously diminishing returns. | |
| ▲ | Kon5ole 4 hours ago | parent [-] | | >There's no content This is exactly why 8K TVs failed in the market, but the point here is that your computer desktop is _great_ 8K content. The TVs that were sold for sub-$1000 just a few years ago should be sold as monitors instead. Replace the TV tuners, app support, network cards and such, and add a DisplayPort input. Having a high-resolution desktop that basically covers your usable FOV is great, and is a way more compelling use case than watching TV in 8K ever was. |
|
|
| |
| ▲ | nicoburns 19 hours ago | parent | prev [-] | | > There are already a number of 5K, 6K, and 8K monitors on the market. They’re just not selling well. Between difficult software support for scaling legacy apps, compatibility issues with different graphics cards and cables, and the fact that normal monitors are good enough, the really high resolution monitors don’t sell well. They're available, but they never seem to have become a mass-market product at mass-market prices. The cheapest 5k monitor is at least double the price of the cheapest 4k monitor. And it was more like 4x until recently. You're probably right that we're starting to hit the point where people don't care though. |
|
|
| ▲ | rickdeckard a day ago | parent | prev | next [-] |
| Because the vast majority of monitor sales volume is (public) tenders from companies buying huge volumes, and those companies still mostly look for monitors below 4K (without fancy specs and without, e.g., USB-C). If 4K reaches mass market for those, the specs will shift down and there will be room in the (much smaller) premium-tier monitor segment. Heck, even if you just want USB-C and an integrated webcam on an average display, the price hike compared to one without them is crazy, because everything except those basic office monitors is still niche production... |
|
| ▲ | 4ggr0 a day ago | parent | prev | next [-] |
| as a gamer 8k makes me sweat because i can't imagine what kind of hardware you'd need to run a game :O probably great for text-based work, though! |
| |
| ▲ | Aurornis a day ago | parent | next [-] | | Once you get into the high pixel densities you stop running everything at native resolution. You have enough pixel density that scaling the output doesn’t produce significant visible artifacts. With 8K's small pixels you could pick any number of resolutions up to 4K or higher and you wouldn't even notice that the final product was scaled on your monitor. Mac users with Retina displays have been doing this for years. It’s really nice once you realize how flexible it is. | |
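For a sense of that flexibility, a quick sketch of the logical desktop sizes an 8K panel could present at common scale factors (the factors are illustrative):

    PANEL = (7680, 4320)  # 8K physical pixels

    for scale in (1.0, 1.5, 2.0, 2.5, 3.0):
        w, h = (round(d / scale) for d in PANEL)
        print(f"{scale:.1f}x scaling -> looks like {w} x {h}")
    # 2.0x yields a 3840x2160 ("looks like 4K") desktop where each
    # logical pixel maps to a crisp 2x2 block of physical pixels.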
| ▲ | 4ggr0 a day ago | parent [-] | | i'm actually going to do the reverse move, was gaming on a 4K display, but going to downgrade to 3440x1440 to get more performance. but of course the gaming displays i find apparently aren't ideal for working, because text looks worse. add to that that the internet seems to be split on whether wide monitors are the best thing ever or actually horrible. why is it all so complicated, man. | |
| ▲ | zamadatix a day ago | parent | next [-] | | My only gripe is nearly all common "ultrawide" models should really be thought of as "ultrashort" in that they don't offer more width, just less height. E.g. a 21:9 ultrawide variant of 4k should really be 5040x2160. Instead they are nearly all 3840x1600. That may well be cost/price optimal for certain folks, I'm not saying it's bad for the product itself to exist, but nobody was looking at a 1600p monitor thinking "man, I wish they'd make a wider variant!" they started with 4k and decided it would be nice if it were permanently shortened. | | |
| ▲ | 4ggr0 19 hours ago | parent | next [-] | | yeah, that really confused me as well. the whole 4K, 2K, 2.5K, ultrawide, ultrahigh, microwide, 8K shit just gets confusing, especially because it's neither accurate nor standardized. | |
| ▲ | tstrimple 16 hours ago | parent | prev [-] | | I think they are calling those 5k2k monitors. I'm quite happy with my 45" LG 5k2k OLED monitor. Much more usable than the 32:9 monitors after seeing both in person. |
| |
| ▲ | simoncion 21 hours ago | parent | prev [-] | | If the game offers it [0], set the output resolution to 4K, and the render resolution to something smaller. A multiplier of ~0.66 is roughly 1440p output, and 0.5 is 1080p. If the game doesn't offer that, then I've found that the HUD/UI uglification isn't too bad when one sets the output resolution to 1440p. If Windows is getting in the way of doing this, and most or all of your games have been purchased through Steam, give Linux a try. I've heard good things about the Fedora variant known as Bazzite, but have never, ever tried it myself. [1] [0] And shockingly few do! There's all sorts of automagic "AI" upscaling shit with mystery-meat knobs to turn, but precious few bog-standard "render everything but the HUD and UI with this many fewer pixels" options. [1] I've been using Gentoo for decades and (sadly) see no reason to switch. I strongly disrecommend Gentoo as a first Linux distro for most folks, and especially for folks who primarily want to try out Linux for video gaming. | | |
| ▲ | 4ggr0 19 hours ago | parent [-] | | > set the output resolution to 4K, and the render resolution to something smaller doesn't that make everything blurry? that's the gripe i have with circa post-2020 PC gaming, barely any pc can run a AAA or AA game in native resolution and instead has to use artificial upscaling and whatnot. haven't specifically tried it. also can't test it anymore, as my gaming monitor is now our TV (48 inch OLED gaming TV, what a blast it was). now using my "old" 32in 2560x1440 IPS display, really miss OLED :( which is why i want to buy a new monitor. but i can't decide if i should take a 27in one (seems to be the 16:9 standard right now, but seems so small to me) or a ultrawide one. i switch games very frequently and also sometimes like to play old(er) games, so a bit scared of the "ultrawides are cool if your game supports it"-vibe... > I've heard good things about the Fedora variant known as Bazzite haha, this message was written on Bazzite, so i got that part covered :D switched about a month ago, funny to get the recommendation now. | | |
| ▲ | simoncion 18 hours ago | parent [-] | | > doesn't that make everything blurry? My experience for the 3D parts of a great many games that my 5700 XT can't reasonably run at panel-native resolution is that the game's art style is to blur up the picture with all sorts of postprocessing (and sometimes (especially with UE5 games) with the ever-more-popular "it looks so bad it makes you wonder if the renderer is totally busted unless you leave TAA on" rendering technique). Sometimes this blurring ends up looking absolutely great, and other times it's just lazy, obnoxious, and awful. So, not that I notice? For the games that permit it, the HUD and menus stay nice and sharp, and the 3D stuff that's going to be all smudged up no matter what you do just renders at a higher frame rate. For games that don't have independent 2D and 3D render resolutions, I find 1440p to be quite tolerable, and (weirdly) 1080p to be much less tolerable... despite the fact that you'd expect it to fit nicely on the panel. I guess I'm expecting a much more crisp picture with integer scaling than I get? Or maybe this is like what some games did way back when where they totally change the font and render style of the UI once they get past some specific breakpoint. [0] I haven't looked closely at what's going on, so I don't have any even vaguely-reasonable explanation for it. > [ultrawide monitors] I like the comment someone else made somewhere that described them as "ultrashort" monitors. Personally, even if I was willing to move my head around enough to scan the whole damn monitor, I'm unwilling to lose so much vertical resolution. But as always, one should definitely choose what one likes. Personally, I find a 32" 3840 pixel wide monitor to be good. It's really great for doing work on, and perfectly acceptable for playing video games. > [Linux] Gratz on moving to Linux for gaming. Hope you don't have much trouble with it, and any trouble you have is either caused by super-invasive kernel-level anticheat that will never, ever work on Linux, or is trouble that's easy and/or fun to resolve. [0] One such game that sticks out in my memory is the OG Deus Ex game. At 1024x768, the font for the in-game UI switched from what -at lower resolutions- seemed a lot like a bitmapped font to what seemed a lot like a proper vector font. The difference was dramatic. | | |
| ▲ | 4ggr0 16 hours ago | parent [-] | | > Sometimes this blurring ends up looking absolutely great, and other times it's just lazy, obnoxious, and awful yeah, maybe i should give this way of setting the graphics a try. should try to find a game which looks great with it. > I find a 32" 3840 pixel wide monitor to be good just looked them up, surprised that they're quite a bit more expensive (800+ vs 600-750 for an ultrawide), but i guess the panels are more expensive due to the higher resolution. but your comment now makes me think about which path i want to go down. gotta read up on some opinions :D > Hope you don't have much trouble with it luckily i work on and with unix systems, so the new things are just those related to gaming. but bazzite really has been very nice so far :) and as you say, the only times i had to boot up windows on a separate disk are when i wanted to play games which don't run on linux at all, especially the kernel-level anticheat slopware. but enough is enough. i've kept using windows at home just because of gaming, but i'm sick of M$. can't spend the whole day making fun of windows and then go home and game on it, feels dirty. |
|
|
|
|
| |
| ▲ | pornel a day ago | parent | prev [-] | | You don't really need 8K for gaming, but upscaling and frame generation have made game rendering resolution and display resolution almost independent. | | |
| ▲ | jsheard a day ago | parent [-] | | And if all else fails, 8K means you can fall back to 4K, 1440p or 1080p with perfect integer scaling. | | |
| ▲ | layer8 a day ago | parent [-] | | Except that the hardware doesn’t necessarily offer perfect integer scaling. Oftentimes, it only provides blurry interpolation that looks less sharp than a corresponding native-resolution display. | | |
| ▲ | jsheard a day ago | parent | next [-] | | The monitor may or may not offer perfect scaling, but at least on Windows the GPU drivers can do it on their side so the monitor receives a native resolution signal that's already pixel doubled correctly. | |
| ▲ | Aurornis a day ago | parent | prev [-] | | Most modern games already have built-in scaling options. You can set the game to run at your screen’s native resolution but have it do the rendering at a different scale factor. Good games can even render the HUD at native resolution and the graphics at a scaled resolution. Modern OSes also scale fine. It’s really not an issue. | | |
| ▲ | layer8 21 hours ago | parent [-] | | Games are not what I had in mind. Last time I checked, most graphics drivers didn’t support true integer scaling (i.e. nearest-neighbor, no interpolation). | | |
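For reference, "true" integer scaling is plain pixel replication with no interpolation; a minimal numpy sketch (assuming an H x W x 3 image array):

    import numpy as np

    def integer_scale(img: np.ndarray, factor: int) -> np.ndarray:
        """Nearest-neighbor upscale: each pixel becomes a factor x factor block."""
        return img.repeat(factor, axis=0).repeat(factor, axis=1)

    frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # a 1080p frame
    scaled = integer_scale(frame, 4)                   # 1080p -> 8K
    assert scaled.shape == (4320, 7680, 3)

Bilinear or bicubic interpolation instead blends neighboring pixels, which is exactly the blur being described here.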
|
|
|
|
|
|
| ▲ | swiftcoder a day ago | parent | prev | next [-] |
| > and potentially 8K at 32" What's your actual use-case for this? I run a 32" 4K, and I have to stick my nose within a foot (~30cm) of the display to actually spot individual pixels. Maybe my eyesight isn't what it used to be. I'd kill for a 40" 5K or 6K to be available - that's significantly more usable desktop real estate, and I still wouldn't be able to see the pixels. |
| |
| ▲ | ak217 a day ago | parent | next [-] | | Pixels are very noticeable at 32" 4K. Even if you don't consciously notice them, your eyes do - they try to focus on blurry lines, causing eye strain that adds up over the years. It's simple math. A 32" 4K monitor is about 140 PPI. Retina displays (where you could reasonably say the pixels are not noticeable, and the text is sharp enough to not strain the eyes) start at 210 PPI. Subjectively, the other problem with 32" 4K (a very popular and affordable size now) is that the optimal scaling is a fractional multiple of the underlying resolution (on macOS - bizarrely, I think Windows and Linux both know how to do this better than macOS). Which again causes blur and a small performance hit. I myself still use an old 43" 4K monitor as my main one, but I know it's not great for my eyes and I'd like to upgrade. My ideal would be a 40" or 42" 8K. A 6K at that size would not be enough. I am very excited about this 32" 6K Asus ProArt that came out earlier this year: https://www.asus.com/displays-desktops/monitors/proart/proar... - it finally gets Retina-grade resolution at a more reasonable price point. I will probably switch to two of these side-by-side once I can get them below $1K. | |
| ▲ | swiftcoder a day ago | parent [-] | | > It's simple math. A 32" 4K monitor is about 130 PPI. Retina displays (where you could reasonably say the pixels are not noticeable, and the text is sharp enough to not strain the eyes) start at 210 PPI. It's also incorrectly applied math. You need to take into account the viewing distance - the 210 PPI figure often quoted is for smartphone displays (at the distance one typically holds a smartphone). For a 32" monitor, if your eyeballs are 36" away from the monitor's surface, you are well beyond the limit of normal visual acuity (and the monitor still fills a massive 42 degrees of your field of view). | | |
| ▲ | ak217 21 hours ago | parent [-] | | Take a look at this article: https://www.nature.com/articles/s41467-025-64679-2 - the limits at "normal visual acuity" (18 observers ~25 years old) are far beyond what you imply. You need over 95 ppd to exhaust normal visual acuity. > For a 32" monitor, if your eyeballs are 36" away from the monitor's surface Why are you assuming 36"? Nobody I know uses 32" monitors at 36" away. Most people use less than half that distance for their laptops, and just over half for desktops. > the 210 PPI figure often quoted is for smartphone displays The 210 PPI figure is a minimum, it was used as marketing when Apple first started offering Retina displays. Apple's modern iPhone displays have far higher PPI. Apple's own marketing was challenged by critics who noted that visual acuity may top out closer to 200 ppd. Perhaps Retina doesn't matter to you - that's OK. But for most of us, 32" 4K is nowhere near the limit of our vision, and by staring at these monitors all day, we are slowly degrading it. | | |
| ▲ | eertami 20 hours ago | parent | next [-] | | > and by staring at these monitors all day, we are slowly degrading it Yes, but that is probably accelerated more by sitting too close to screens for too long than by the resolution of the screen. It's anecdata, so maybe truly everyone you know does sit 45cm away from a desktop monitor - but I can't say I've ever experienced that. Of course, if you do sit that close then higher resolution is resolvable. Perhaps what your statement actually should be is: "Perhaps Retina doesn't matter if you sit at a (perfectly comfortable and healthy) further distance away from the screen - that's OK", otherwise a reader may think you are trying to imply the OP is somehow inferior, when really the only thing that differs is your viewing distance. |
| ▲ | 20 hours ago | parent | prev | next [-] | | [deleted] | |
| ▲ | swiftcoder 21 hours ago | parent | prev | next [-] | | > You need over 95 ppd to exhaust normal visual acuity 32" 4K at 36" is 91 ppd. Which I guess is good enough, seeing as I'm well the far side of 25. > Why are you assuming 36"? Nobody I know uses 32" monitors at 36" away. 36" is the point where I can see all 4 corners of the monitor at the same time (any closer and I can't focus on one corner while keeping the other 3 in view). 40 degrees of FoV is massive for a single monitor! I'm sitting here wondering how much you have to turn your head to use this size monitor up close | |
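The ~42-degree and 91 ppd figures follow from basic trigonometry; a small sketch (16:9 panel, distances in inches, ppd averaged across the panel):

    import math

    def fov_and_ppd(diag_in, h_px, v_px, dist_in):
        width = diag_in * h_px / math.hypot(h_px, v_px)         # panel width
        fov = 2 * math.degrees(math.atan(width / 2 / dist_in))  # horizontal FOV
        return fov, h_px / fov                                  # average ppd

    fov, ppd = fov_and_ppd(32, 3840, 2160, 36)
    print(f"{fov:.0f} deg FOV, {ppd:.0f} ppd")  # ~42 deg, ~91 ppd
    # The same panel viewed from 18" is ~76 deg and only ~51 ppd.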
| ▲ | ak217 21 hours ago | parent | next [-] | | I actually have two more monitors, one on each side of my main one, in portrait mode :) And yes, I turn my head when I want to see them. I'm glad the low resolution monitors work for you. I just don't want people to proclaim that everything about displays is solved - it's not. There are meaningful, physiologically relevant improvements to be made. It's been over a decade since 4k60 became the standard. A lot of younger people would really benefit from mass produced 6k120 monitors. | |
| ▲ | Aurornis 20 hours ago | parent | prev [-] | | > 40 degrees of FoV is massive for a single monitor! I'm sitting here wondering how much you have to turn your head to use this size monitor up close You move your eyes, not your head. Plus or minus 20 degrees is a trivial amount of eye movement. Most people are fine with this. Your requirement to comfortably see everything with minimal eye/head movement is atypical. Even if you do have to move your head, that’s not a bad thing. A little head movement during long computing sessions is helpful. | | |
| ▲ | swiftcoder 19 hours ago | parent [-] | | > You move your eyes, not your head. Plus or minus 20 degrees is a trivial amount of eye movement. Maybe this varies a lot between humans, because I'm trying the experiment, and any closer than 24 inches requires physically moving my head to comfortably read text in the corner of the 32" display. Even at 36" it's fatiguing to focus on a corner of the display solely through eye movement for more than a few seconds. > Your requirement to comfortably see everything with minimal eye/head movement is atypical I don't think it's by any means an uncommon requirement. Movie-watchers want to be able to see the whole screen at once (with the exception of some intentionally-over-the-top IMAX theatres), gamers want to be able to see their radar/health/ammo/etc in the corners of the screen. I'd like to be able to notice notifications arriving in the corner of the screen. |
|
| |
| ▲ | simoncion 21 hours ago | parent | prev [-] | | > Nobody I know uses 32" monitors at 36" away. I suppose it's still true that nobody you know uses monitors of that size three feet away, but I'm very definitely one of those people. Why on earth would you put the monitor so close to your face that you have to turn your head to see all of it? That'd be obnoxious as all hell. > ...by staring at these monitors all day, we are slowly degrading it. No, that's age. As you age, the tissues that make up your eye and the muscles that control it fail more and more to get rebuilt correctly. I think the colloquial term for this is that they "wear out". It sucks shit, but we're currently too bad at bioengineering to really stop it. |
|
|
| |
| ▲ | FuriouslyAdrift a day ago | parent | prev | next [-] | | This is the only large true monitor I know of. It used to be branded by Acer, but now it is branded through Viewsonic. We have a bunch at work and everyone loves them. $570 for 43" 4K https://www.viewsonic.com/us/vx4381-4k-43-4k-uhd-monitor-wit... | |
| ▲ | Aurornis a day ago | parent | prev [-] | | > I'd kill for a 40" 5k or 6k to be available There are a number of 40” 5K wide monitors on the market. They have the same vertical resolution as a 4K but with more horizontal pixels. | | |
| ▲ | swiftcoder a day ago | parent [-] | | Yeah. I guess that's the way. I'm not wild about such a wide aspect ratio, and all the head-turning or chair-swivelling it implies. |
|
|
|
| ▲ | layer8 a day ago | parent | prev | next [-] |
| The likelihood of dead pixels grows with pixel count, which increases quadratically with linear resolution, so panel yield drops correspondingly. In addition, the target audience with hardware (GPUs) that can drive those resolutions is smaller. |
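A back-of-the-envelope yield sketch under a toy model (each subpixel fails independently; the defect rate p is purely illustrative):

    # P(panel is defect-free) = (1 - p) ** n for n subpixels
    p = 1e-8  # illustrative per-subpixel defect probability
    for name, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
        n = w * h * 3  # RGB subpixels
        print(f"{name}: {(1 - p) ** n:.0%} defect-free")
    # 4K: ~78% vs 8K: ~37% -- quadrupling the pixel count compounds
    # the per-pixel defect odds, so flawless-panel yield drops fast.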
|
| ▲ | ebbi 17 hours ago | parent | prev | next [-] |
| One of the best things I've done for my setup is convert old 5K iMacs to work as external displays. Only downside is the massive bezels by today's standards, but it still has the Apple aesthetics, the 5K resolution is beautiful for my use cases (spreadsheets, documents, photo editing), and it has HDMI inputs so I can play PS5 on it. |
|
| ▲ | prpl 15 hours ago | parent | prev | next [-] |
| 30 or 32" 5K is what I'd love - maybe 6K at 32". |
|
| ▲ | Paianni a day ago | parent | prev | next [-] |
| The Asus PA27JCV is rather less than $2k... |
|
| ▲ | mschuster91 a day ago | parent | prev | next [-] |
| > A friend and I were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no resolution improvements since then apart from high-end pro monitors. Multiple reasons. The first one being yield - yes, you can get 8K screens, but the larger they get, the more difficult it is to cut a panel with an acceptably low rate of dead/stuck pixels out of a giant piece of glass. Dead pixels are one thing and bad enough, but stuck-bright pixels ruin the entire panel because they will be noticeable in any dark-ish movie or game scene. That makes them really darn expensive. The second reason is the processing power required to render the video signal to the screen, aka display controllers. Even if you "just" take regular 8-bit RGB - each frame takes up 33 million pixels, so 796,262,400 bits per frame. Even at just 30 FPS, you're talking about 23,887,872,000 bits per second - roughly 24 gigabits/s. It takes an awful, awful lot of processing power just to shuffle that data from the link SerDes around to all the control lines and to make sure they all switch their individual pixels at the very same time. The third is transferring all the data. Even if you use compression and sub-sampling, you still need to compress and sub-sample the framebuffer on the GPU side, transfer up to 48 GBit/s (HDMI 2.1) or 77 GBit/s (DP 2.1) of data, and then uncompress it on the display side. If it's HDCP-encrypted, you need to account for that as well - encrypting and decrypting at such line speeds used to be unthinkable even two decades ago. The fact that the physical transfer layer is capable of delivering such data rates over many meters of copper cable of varying quality is nothing short of amazing anyway. And the fourth is generating all the data. You need absurdly high-definition textures, which requires lots of VRAM, lots of regular RAM, lots of disk I/O, lots of disk storage (your average AAA game is well beyond 100GB of data at rest for a reason!), and then render power to actually render the scene. 8K has 16x (!) the pixels of regular FullHD (1080p). What's stopping further progress? Other than yield and simple physics (similar to microchips, the finer the structures get, the more difficult and expensive it is to make them), the most pressing issue is human visual acuity - even a human with very good vision can only make useful sense of about 74 of the theoretical 576 megapixels [1]. As we already established, 8K is at 33-ish megapixels, so the next quadrupling (16K, at ~133 megapixels) would already be far too detailed for 99.999% of humans to perceive. Yes, you could go for intermediate sizes. 5K, 6K, weird aspect ratios, whatever - but as soon as you go there, you'll run into issues with video content, because it can't be up- or downscaled to such intermediates without a perceptible loss in quality and, again, a lot of processing power. [1] https://clarkvision.com/articles/eye-resolution.html |
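The arithmetic above checks out; a quick verification sketch:

    px_1080p = 1920 * 1080       # ~2.07 megapixels
    px_8k    = 7680 * 4320       # ~33.2 megapixels
    print(px_8k / px_1080p)      # 16.0 -- 8K is 16x FullHD

    bits_per_frame = px_8k * 24  # 8-bit RGB: 796,262,400 bits
    for fps in (30, 60):
        print(f"{fps} FPS: {bits_per_frame * fps / 1e9:.1f} Gbit/s uncompressed")
    # 30 FPS: ~23.9 Gbit/s; 60 FPS: ~47.8 Gbit/s -- right at HDMI 2.1's
    # 48 GBit/s, which is why DSC and chroma subsampling come into play.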
| |
| ▲ | Aurornis a day ago | parent | next [-] | | > And the fourth is generating all the data. You need absurdly high-definition textures, which requires lots of VRAM, lots of regular RAM, lots of disk I/O, lots of disk storage (your average AAA game is well beyond 100GB of data at rest for a reason!), and then render power to actually render the scene. 8K has 16x (!) the pixels of regular FullHD (1080p). You don't need to scale everything up to match the monitor. There are already benefits to higher resolution with the same textures for any object that isn't directly next to the player. This isn't a problem at all - we wouldn't have to run games at the monitor's native resolution anyway. | |
| ▲ | zamadatix a day ago | parent | prev [-] | | ~Half of these reasons imply sub-$2000 8K TVs shouldn't exist, but they do. | |
| ▲ | Aurornis a day ago | parent [-] | | The individual pixels on a 60 inch 8K TV are the same size as the pixels on a 30 inch 4K computer monitor. Most 8K TVs are even bigger than that, so their individual pixels are already easier to manufacture than your average 4K monitor or laptop screen. You can’t compare large TVs to medium size computer monitors. | | |
| ▲ | simoncion 21 hours ago | parent [-] | | > You can’t compare large TVs to medium size computer monitors. When half of those four reasons don't require having a PC attached to the display, and three fourths of the four have nothing to do with the panel manufacturing process, you totally can. |
|
|
|
|
| ▲ | littlestymaar a day ago | parent | prev | next [-] |
| > A friend and I were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no resolution improvements since then apart from high-end pro monitors. It's mostly because the improvement over 4K is marginal. In fact, even the step up from 1920x1080 is not that big of a deal, which is why people keep buying such monitors in 2025. And the worst part is that the highest-spending consumer segment of PC parts, the gamers, can't really use high-resolution displays at their full potential because it puts such a burden on the GPU (DLSS helps, but the result is even less of an improvement over 1920x1080 than regular 4K is) |
|
| ▲ | znpy a day ago | parent | prev [-] |
| Ah yes. It’s the same with memory… 8GB/16GB is incredibly common, even though 16GB of memory was a thing in like 2008 already. It’s only with high-end machines that you get 64/128GB of memory, which should be much more common in my opinion. |