Slightly off topic but I can’t wait until there’s a 120 Hz 8K monitor. It’s the only thing holding me back from upgrading from 4K. I wonder if the current limitation is on the panels, cable bandwidth or absurd price tag…
Absolutely. Mediocre resolution increases have been one of many disappointments for me in technology over the past 20 years. We had CRT monitors with better resolution than 1080p back in the late '90s and early 2000s, before LCD panels saddled us with 1080p for 15 years. 4K is the bare minimum that should be available right now. I can't wait until I can have a 16K monitor at about 3'x5-6' on my desk. Maybe it will happen in my lifetime, but I'm not holding my breath.
8K is four times the pixels, and therefore four times the bandwidth, of a 4K monitor.
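For scale, a back-of-the-envelope sketch in Python (assuming 24 bits per pixel and ignoring blanking intervals and Display Stream Compression, so treat the numbers as rough lower bounds):

```python
# Rough pixel-count and uncompressed-bandwidth comparison of 4K vs 8K.
# Assumes 24 bpp (8 per RGB channel); real link rates add blanking
# overhead and may use DSC, so actual figures differ.

def pixels(width, height):
    return width * height

def bandwidth_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

four_k = pixels(3840, 2160)   # 8,294,400 pixels
eight_k = pixels(7680, 4320)  # 33,177,600 pixels
print(eight_k / four_k)                 # 4.0 -- four times the pixels

print(bandwidth_gbps(3840, 2160, 120))  # ~23.9 Gbit/s
print(bandwidth_gbps(7680, 4320, 120))  # ~95.6 Gbit/s
```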
It took us a long time to go from 1080p to 4K. It has taken even longer for 4K at 120-144Hz to be practical.
It’s more likely that you’ll end up with intermediate steps at 5K and 6K than getting 8K 120Hz.
The other limitation is lack of demand. You need a gigantic monitor for 8K to be worth it, and you need a powerful video card to drive it. The number of people who would buy such a monitor is very, very small.
Which makes me wonder what the article's author's point is. 4k vs 6k at 32 inches is already far into diminishing returns; 8k at 32 inches is just numbers for numbers' sake.
I don’t know if that’s necessarily true. I use a 32” 4k 144hz monitor at 100% scaling just fine. I’d loooove to replace it with an 8k monitor with similar refresh rate to run at 200% scaling and keep the same amount of workspace I have now
I think you're going to be waiting a long time; even the GeForce 4080 and 4090 don't support DisplayPort 2.0.
Additionally, much of the demand for >60Hz is for gaming, and no GPU anyone can afford comes close to rendering games at high quality or extreme detail at 8K above 60 FPS. Right now a GPU that costs $1500 USD can maybe render a 4K game at extreme detail at framerates that vary between 55 and 75 fps.
120 Hz feels a lot smoother for scrolling or anything with movement on the screen.
8K is the point at which, at the monitor size I use, individual pixels would be too hard to see. They're already a bit hard to see at 4K, but at 8K it'd be perfect.
An interesting part of the recent book An Immense World was its coverage of how mantis shrimp likely don't use photoreceptors the way humans do.
Marshall now thinks that the mantis shrimp sees colors in a unique way. Rather than discriminating between millions of subtle shades, its eye actually does the opposite, collapsing all the varied hues of the spectrum into just 12.
From Science: "A Different Form of Color Vision in Mantis Shrimp"
The mantis shrimps (stomatopods) can have up to 12 photoreceptors, far more than needed for even extreme color acuity. Thoen et al. conducted paired color discrimination tests with stomatopods and found that their ability to discriminate among colors was surprisingly low. Instead, stomatopods appear to use a color identification approach that results from a temporal scan of an object across the 12 photoreceptor sensitivities. This entirely unique form of vision would allow for extremely rapid color recognition without the need to discriminate between wavelengths within a spectrum.
I can't speak for him, but for me? Straight integer scaling of 4k and 1440p. I loathe fractional scaling, and I cannot wait for the day that I can run an 8k display at > 90hz without compromise
I definitely can't afford an 8k monitor, but this is also the reason I'd want one for gaming. Realistically, you'll probably never run it at 4320p, but depending on how demanding the game is, you can choose between 2160p, 1440p or 1080p without weird scaling. With a 4k monitor your only option is to go all the way down to 1080p.
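A quick sketch of why an 8K panel leaves more clean fallback resolutions than a 4K one (Python; the candidate heights are just the common ones named above):

```python
# Which common rendering heights divide evenly into a panel's native
# vertical resolution? An integer ratio means each rendered pixel maps
# to an exact square of physical pixels, with no scaling blur.

def integer_scales(native_height, candidates=(2160, 1440, 1080)):
    return {h: native_height // h
            for h in candidates if native_height % h == 0}

print(integer_scales(4320))  # {2160: 2, 1440: 3, 1080: 4}
print(integer_scales(2160))  # {2160: 1, 1080: 2} -- 1440p would need 1.5x
```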
Manufacturers can also justify making pretty big screens with that many pixels. Samsung's 8k TVs are 80-something inches. Since it's that big, you can sit further back, making up for any loss in pixel density, and still have a huge picture.
Not the original commenter but, it... has a certain look to it. It's not something that can be totally solved by the OS or any scaling algorithm because fundamentally you have to deal with rendering fractional portions of a pixel to a physical screen where there is no such thing as a fractional pixel. It just always looks different, and IMO worse, than integer scaling.
You have to trade off distortion for blurriness, and you can't avoid both artifacts at the same time. The higher your ppi, though, the less noticeable any scaling artifacts will be.
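A toy 1D illustration of that trade-off (a hypothetical sketch, not any OS's actual scaler): with integer 2x scaling a 1-pixel line stays crisp, while at 1.5x you get either inconsistent widths (nearest-neighbour) or grey fringes (area averaging):

```python
# Scale a 1D "image" containing a 1-pixel-wide line.
# Integer factors keep it crisp; fractional factors must either
# distort (nearest) or blur (area resampling).

def scale_nearest(img, factor):
    n = int(len(img) * factor)
    return [img[int(i / factor)] for i in range(n)]

def scale_area(img, factor):
    # Each output pixel averages the source area it covers.
    n = int(len(img) * factor)
    out = []
    for i in range(n):
        lo, hi = i / factor, (i + 1) / factor
        acc = 0.0
        for j in range(len(img)):
            overlap = max(0.0, min(hi, j + 1) - max(lo, j))
            acc += img[j] * overlap
        out.append(acc * factor)  # normalize by output pixel width
    return out

line_a = [0, 1, 0, 0, 0]  # crisp 1-px line at position 1
line_b = [0, 0, 1, 0, 0]  # same line, one pixel over

print(scale_nearest(line_b, 2.0))  # [0, 0, 0, 0, 1, 1, 0, 0, 0, 0] -- crisp
print(scale_nearest(line_a, 1.5))  # stays 1 px wide...
print(scale_nearest(line_b, 1.5))  # ...but this one becomes 2 px: distortion
print(scale_area(line_b, 1.5))     # fractional grey values appear: blur
```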
I don't understand why 'distortion' (things being rendered ± 1 pixel over on some screens vs. on others) is a problem unless your GUI frameworks and/or apps count on pixel-exact layouts for some UI elements. But why would they? Isn't the entire web built on a reflowable format that works pretty well? Shouldn't those tiny 1-pixel differences be like the easiest possible variation for a GUI system's layout engine to cope with?
Do we have lots of scalable UI elements that expect to line up with raster images a certain way on most operating systems?
For me, a 1 pixel asymmetry in a button at medium ppi is noticeable and mildly distracting. I don't mind it much for UI, but I understand why others would want to avoid it. I tend to set my ui and toolbars to auto-hide. Personally, the reasons why I avoid it are text rendering and gaming usage.
It's a shame (for me, who can't afford HiDPI displays to replace his current ones, and would have difficulty pushing all those pixels even if he could) that Apple removed subpixel anti-aliasing. :(
In CAD/EDM tools, higher resolution means more productivity (to a point), as you can fit more useful information on the screen at a time - you can “zoom out” more and still keep a useful level of detail. This is especially useful in schematic and pcb design, where dense areas of interest can be spatially disparate. I don’t like large screens and currently use 24” 4k screens, which now seem to be either unavailable or expensive: they were ~$350 in 2015 and don’t seem to have an equivalent nowadays.
The 120 Hz I don’t understand, but then I am not a gamer.
I'd agree with you. I think people who use those sorts of displays frequently could tell the difference between 60 and 120hz but 120 vs 144hz seems way too close together.
It took me 2 seconds to notice that a friend's monitor was at 60hz just by moving the mouse. Just look at the distance between each cursor icon update as you move it.
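The arithmetic behind that effect: at an assumed flick speed of 2000 px/s, halving the frame interval halves the gap between successive cursor images:

```python
# Distance between successive cursor positions during a fast swipe.
# 2000 px/s is an assumed, fairly ordinary mouse flick speed.

def gap_px(speed_px_per_s, refresh_hz):
    return speed_px_per_s / refresh_hz

print(gap_px(2000, 60))   # ~33 px between cursor images
print(gap_px(2000, 120))  # ~17 px -- half the gap, visibly smoother trail
```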
Agreed. I have a number of 4K 144 Hertz monitors; I'd like a 6K or 8K monitor, but until they come in high refresh rates I'm not switching. I'm not much of a gamer, but when I do occasionally game it is significantly more fluid as well.
I have 60, 144 and 165hz displays. I have to say I don't really see much of a difference. Around 30 hz yes. But not over 60. It's probably some sort of genetic vision difference thing.
Previously, I actually often felt motion sickness when scrolling through code on small 60hz laptop screens. Had no problems with larger (> 23") desktop screens, though.
> It's venturing into cryptocurrency-space-heater levels of pointless number crunching to render at that level of detail for anyone who has human eyes.
Depends on your monitor size.
Might be a waste at 27", but if you want to use a 48" display, I can assure you that you'd notice the move from 4k -> 8k.
Yeah, even driving the display at 4K, you start to notice the higher pixel fill factor for 8K displays above 48in. I love my 65in 8K Q900 - even though it mostly lives at 4k (120hz!).
Can you explain this a bit more? I tried googling, but I can't quite understand what you mean here by pixel fill factor and how it would differ between the resolutions.
The gaps between the pixels tend to be smaller the higher the resolution, so even if you drive the display at a lower resolution, it can look better than the same lower-res display that has larger spacing between each subpixel.
User interfaces shouldn't load a GPU very much even at that resolution.
8K in a game will use tremendous amounts of power, but it's not pointless. It's like having antialiasing turned on. And high frame rates are important for motion because normal rendering only gives you a single point in time.
I’d argue that AI driven super resolution like DLSS should be more than sufficient to upscale 4k to 8k with minimal performance loss and acceptable image quality even for gaming.
Unless people are far more sensitive than I am, I don't see how >60Hz is needed for a desktop workstation environment. High frame rate is really only noticeable for very fast reaction time gaming.
4K 120Hz may be noticeable if editing ultra high frame rate video on a video editing workstation, but if you are a video production crew with a camera capable of recording at that framerate, you probably already know that.
60Hz is really noticeable when you’ve been using 120Hz even for a few minutes. 120Hz feels a lot less tiring, and work in a terminal, an editor or on websites is just a lot smoother.
It's immediately noticeable when scrolling in a browser, dragging stuff around or just moving the mouse. If you haven't seen it in person, go to an Apple store and do a quick comparison between the Macbook Pro (120hz) and the Air (60hz), or iPad vs iPad Pro. They're always next to each other.
ARM Macs can probably handle that; the M2 can do 10 simultaneous 8K video streams (22 on the Ultra), and people are running 4K 120Hz on the M1 with a couple of hacks.
For some reason all 40 inch monitors have disappeared off the market!
For me, 40 inch is the sweet spot for coding: any larger and it gets too pixelated at 4K (you can make out the pixels at 40", but that's OK for coding), and the UI scale can be set at 100% so everything is well proportioned. Entire classes/methods fit on a screen without scrolling.
I own two Philips 4K 40 inch monitors, they only cost ~$600 at the time, and I dread the day they stop working. I would be first in line for 6K or 8K, or any ≥4K really, at 40 inch.
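For reference, the pixel densities involved at 40 inches (simple Python sketch, assuming 16:9 geometry):

```python
import math

# PPI for a panel given its pixel resolution and diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(3840, 2160, 40))  # ~110 ppi -- 4K at 40", pixels just visible
print(ppi(7680, 4320, 40))  # ~220 ppi -- 8K at 40", "retina" territory
```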
It's been pretty amazing how stagnant the monitor space is. I too am really craving an 8k@120 monitor, although there's a decent chance I'll balk at the price.
It’s crazy how much of a regression there was in resolution and picture quality when we went from CRT to LCD displays. In the late 90’s you could get a CRT that did 2048x1536 no sweat with great color and decent refresh rate. Then suddenly LCD displays became the standard and they looked awful. Low resolutions, terrible colors and bad viewing angles. The only real advantage they had was size. It took a decade or so to get back to decent resolutions and color reproduction.
LCDs didn't replace CRTs because they offered better quality to consumers. They were worse for all the reasons you mentioned and then some. LCDs were cheaper to make, much lighter and less frail so they cost less to ship, and they took up much less space in transport, in warehouses, and on store shelves. We were sold an inferior product so that other people could save money. Gradually, some of those savings made it to consumers, especially when it became possible to generate profit continuously through TVs by collecting our data and pushing ads, but it was always a shitty deal for consumers who wanted a quality picture.
I imagine that in the future, people will look back at much of the media from recent decades and think that it looks a lot worse than we remember because it was produced on bad screens or made to look good on all of our crappy screens.
While I appreciate a bit of sarcasm, I'm not sure that's what actually happened. In the CRT era, you either had good monitors, which were expensive, or a bunch of actually crap monitors. I had the former, but most people had the latter, and using those monitors for any extended period of time would give you headaches and dry eyes because of poor refresh rates and terrible flicker.
As a personal anecdote: when I was choosing components for my first desktop computer (instead of using dad's work laptops), I selected components that were affordable. Coincidentally, a local IT magazine had just run a big test of desktop CRT monitors, so I'd chosen an inexpensive one that wasn't terrible and, like every kid, asked my parents for money. My mum, who was already working with computers at her job, had a look through that magazine and said she'd pay for the whole computer only on the condition that we buy the best monitor in that test. So we did (it was a Trinitron Nokia at 100Hz, which was a lot), and I think with that move she saved my eyes long term: I'm in my early 40s and the only healthy thing I still have are my eyes. In any case, what I soon realized when I got that monitor is that I'll never save money on the stuff I use all day long.
Back to the topic. CRT monitors also were space heaters, and had a large volume which was only fine when being permanently placed on a geek's desk.
When LCDs arrived they actually were considerably better than average CRTs. The picture was rock solid without flicker or refresh rate artifacts, perfectly rectangular (a big problem with an average CRT as a matter of fact) and very sharp and crisp. All for a little bit more money. After two or three years they were actually even cheaper than CRTs. And I forgot to mention, they took much less space so you could place it on a POS counter or wherever. It took much more time to replace the top end CRTs, but I guess this is always the case when talking about some tech product.
It's still not reached a point where you can just choose high resolutions with no drawbacks.
2048x1536 19" (135ppi) at up to 72Hz was common at reasonable prices in the late 90s if my memory is correct, although OS scaling sucked and text looked weird due to the shadow mask at that size. 1600x1200 (105ppi) was the sweet spot for me. And actually, in my first job in 2004 I had two 20" 1600x1200 (100ppi) LCDs that I recall were reasonably priced, and they were nicer overall. This was around the time LCDs became the default choice. Then "HD" became a thing a couple of years later and you are right, for the next ten years virtually all monitors were "widescreen HD", which was 1280x720 if you fell for the marketing of the lower-priced units, or 1920x1080 at best. Anything higher was very expensive.
In 2012 the retina macbooks came out and I got a 13(.3)" with 2560x1600 resolution (227ppi). This was the first time for me that LCDs were finally great. But you couldn't get a resolution like that in an external display. So at that time I mostly just didn't use external monitors until 2016 when suddenly 4K 27" (163ppi) became reasonably priced. So I used 2 of those for years and they were good enough but still left me wanting.
Now still to this day, 4K is the max for external monitors at reasonable prices at any size. About 2 years ago I got an M1 macbook and realized it only supported 1 external monitor. I felt like I needed to keep the real estate I was used to and anyway, with the pandemic and WFH, managing multiple monitors with multiple (work and personal) machines sucked. All I could really find at a reasonable price was 32"/4K and 49" ultrawide. I begrudgingly downgraded to a 49" 5120x1440 monitor (109ppi). I will admit that going from 60Hz to 120Hz was nicer than I expected.
So in 2023 my laptop screen is great and has been great for 10+ years but this was my story about how I am still using the same pixel density as I did 25 years ago.
You are way too optimistic about the weight. The ViewSonic P225f, with a 20" viewable display, reportedly capable of 2560x1920 at 63Hz, weighed 30.5 kg!
I am not sure with that dot pitch of 0.25mm it was worth it.
I'm struggling to remember how much they cost, but with a $2000 price tag for a top-of-the-line machine, the low-end monitors tended (well, AFAIR, don't take my word for it) to be less than $150, and high-end ones were around $700 outside the ultra-uber-special cases.
And Moore's Law. LCDs are semiconductors so their price goes down by a factor of 2 every 18 months.
However, even size would be enough. CRTs were ridiculously heavy. My GDM-FW900 was almost 100 pounds. And I used two side by side. I had to shop specifically for a desk that wouldn't collapse when I put them on it.
I agree, but we got LG's 16:18 DualUp monitors a year ago. Having a 43'' monitor in the middle and these two on the sides creates a better setup than what was previously possible.
That's basically what I do: 24" 4K in the middle and two 17" Eizos beside it. They're 1280x1024 though, so I have 200% scaling in the middle and 100% at the sides. This causes some OS issues in FreeBSD (I mitigate with xrandr) and in Windows, which is still screwy to this day. On Mac it works perfectly, but I don't use Mac much anymore.
I wish some monitors would have good hybrid uses like being able to do say 6K 60Hz and some much lower res (2K) at 120 or something.
I can definitely live with 60Hz on desktop if I have to, and I can't game at 6K anyway, so doing 2K@120Hz gaming and 6K@60Hz or 8K@60Hz desktop work would be ideal, and wouldn't get into the silly bandwidths of 8K@120.
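Rough uncompressed data rates for those hybrid modes (Python sketch; 6K geometry assumed to be 6144x3456, 24 bpp, blanking and DSC ignored):

```python
# Uncompressed data rate for a few display modes. Real links add
# blanking overhead and may use DSC, so treat these as lower bounds.

def gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

print(gbps(2560, 1440, 120))  # ~10.6  -- easy 2K gaming mode
print(gbps(6144, 3456, 60))   # ~30.6  -- 6K desktop (assumed geometry)
print(gbps(7680, 4320, 60))   # ~47.8  -- 8K desktop
print(gbps(7680, 4320, 120))  # ~95.6  -- the "silly bandwidth" mode
```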