
Slightly off topic, but I can’t wait until there’s a 120 Hz 8K monitor. It’s the only thing holding me back from upgrading from 4K. I wonder whether the current limitation is the panels, the cable bandwidth, or just an absurd price tag…


Absolutely. Mediocre resolution increases have been one of many disappointments for me when it comes to technology over the past 20 years. We had CRT monitors with better resolution than 1080p back in the late '90s and early 2000s, before LCD panels saddled us with 1080p for 15 years. 4K is the bare minimum that should be available right now. I can't wait until I can have a 16K monitor at about 3'x5-6' on my desk. Maybe it will happen in my lifetime, but I'm not holding my breath.


8K is four times the pixels and therefore four times the bandwidth as a 4K monitor.
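Back-of-the-envelope on the raw pixel counts (a quick sketch in Python, ignoring blanking and color depth):

    # pixels per frame for common resolutions
    res = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
    px = {name: w * h for name, (w, h) in res.items()}
    print(px)                      # {'1080p': 2073600, '4K': 8294400, '8K': 33177600}
    print(px["8K"] / px["4K"])     # 4.0 -- four times the pixels of 4K per frame
    print(px["8K"] / px["1080p"])  # 16.0 -- and sixteen times 1080p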

It took us a long time to go from 1080p to 4K. It has taken even longer for 4K at 120-144Hz to be practical.

It’s more likely that you’ll see intermediate steps to 5K or 6K before you get 8K 120Hz.

The other limitation is lack of demand. You need a gigantic monitor for 8K to be worth it, and you need a powerful video card to drive it. The number of people who would buy such a monitor is very, very small.


>You need a gigantic monitor for 8K to be worth it

I have a 4k 24" monitor that I can still see aliasing on with AA disabled.

8K 32" would give me more real estate and should, in theory, completely eliminate the need for AA.


Which makes me wonder what the article's author is getting at: 4K vs 6K on a 32-inch monitor is already far into diminishing returns, and 8K on 32 inches is just numbers for numbers' sake.


I don’t know if that’s necessarily true. I use a 32” 4k 144hz monitor at 100% scaling just fine. I’d loooove to replace it with an 8k monitor with similar refresh rate to run at 200% scaling and keep the same amount of workspace I have now


I think you're going to be waiting a long time, even the geforce 4080 and 4090 don't support displayport 2.0.

Additionally, much of the demand for >60Hz is for gaming, and no GPU anyone can afford comes anywhere close to rendering games at high or extreme detail at 8K above 60 FPS. Right now a GPU that costs $1500 USD can maybe render a 4K game at extreme detail at framerates that vary between 55 and 75 fps.
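For a sense of scale, a rough sketch of the raw data rates (uncompressed 8-bit RGB, no blanking overhead; the DisplayPort payload figures are the commonly quoted approximations):

    def raw_gbps(w, h, hz, bits_per_pixel=24):
        # uncompressed video data rate in Gbit/s (no blanking, no DSC)
        return w * h * hz * bits_per_pixel / 1e9

    print(raw_gbps(3840, 2160, 120))  # ~23.9 Gbit/s -- 4K 120Hz
    print(raw_gbps(7680, 4320, 60))   # ~47.8 Gbit/s -- 8K 60Hz
    print(raw_gbps(7680, 4320, 120))  # ~95.6 Gbit/s -- 8K 120Hz

    # Approximate usable link payloads:
    # DP 1.4 (HBR3 x4):   ~25.9 Gbit/s -> even 8K 60Hz needs DSC
    # DP 2.0 (UHBR20 x4): ~77.4 Gbit/s -> still short of uncompressed 8K 120Hz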


The 3090 does support displayport 2.0, though. The 40 series are geared towards productivity use cases where people are fine with 60hz.


What is your actual use case apart from technology fetishism?


120 Hz feels a lot smoother for scrolling or anything with movement on the screen.

8K is the point at which, at the monitor size I use, individual pixels would be too hard to see. They're already a bit hard to see at 4K, but at 8K it'd be perfect.
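A rough way to put numbers on "too hard to see" (a sketch assuming a 32" panel viewed at ~60 cm, and taking the conventional ~60 pixels-per-degree figure for 20/20 acuity):

    import math

    def pixels_per_degree(h_res, v_res, diag_in, dist_cm):
        # pixels subtended by one degree of visual angle, approximated at screen center
        ppi = math.hypot(h_res, v_res) / diag_in
        return ppi * (dist_cm / 2.54) * math.tan(math.radians(1))

    print(pixels_per_degree(3840, 2160, 32, 60))  # ~57 ppd -- right around the 20/20 threshold
    print(pixels_per_degree(7680, 4320, 32, 60))  # ~113 ppd -- individual pixels effectively invisible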


Just sit further away or get older :)


I've recently started noticing the latter has been happening to me without my consent ;)


I'm trying to avoid either solution


Tell HN when you solve the latter!


Perhaps parent is a mantis shrimp or pigeon.

I try not to be overly sapien-centric when making assumptions about my fellow HN readers.


An interesting part of the recent book An Immense World was its coverage of how mantis shrimp likely don't use photoreceptors the way humans do.

Marshall now thinks that the mantis shrimp sees colors in a unique way. Rather than discriminating between millions of subtle shades, its eye actually does the opposite, collapsing all the varied hues of the spectrum into just 12.

From Science: "A Different Form of Color Vision in Mantis Shrimp"

The mantis shrimps (stomatopods) can have up to 12 photoreceptors, far more than needed for even extreme color acuity. Thoen et al. conducted paired color discrimination tests with stomatopods and found that their ability to discriminate among colors was surprisingly low. Instead, stomatopods appear to use a color identification approach that results from a temporal scan of an object across the 12 photoreceptor sensitivities. This entirely unique form of vision would allow for extremely rapid color recognition without the need to discriminate between wavelengths within a spectrum.


I can't speak for him, but for me? Straight integer scaling of 4k and 1440p. I loathe fractional scaling, and I cannot wait for the day that I can run an 8k display at > 90hz without compromise


I definitely can't afford an 8k monitor, but this is also the reason I'd want one for gaming. Realistically, you'll probably never run it at 4320p, but depending on how demanding the game is, you can choose between 2160p, 1440p or 1080p without weird scaling. With a 4k monitor your only option is to go all the way down to 1080p.
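The integer-scaling arithmetic is what makes 8K attractive here (a quick sketch):

    # An 8K panel divides evenly into the common gaming resolutions,
    # so each rendered pixel maps to an exact NxN block of physical pixels.
    panel_w, panel_h = 7680, 4320
    for w, h in [(3840, 2160), (2560, 1440), (1920, 1080)]:
        exact = panel_w % w == 0 and panel_h % h == 0
        print(f"{w}x{h}: {panel_w // w}x scale, exact={exact}")
    # 3840x2160: 2x scale, exact=True
    # 2560x1440: 3x scale, exact=True
    # 1920x1080: 4x scale, exact=True
    # A 4K panel only divides evenly into 1080p (2x); 1440p lands on an awkward 1.5x.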

Manufacturers can also justify making pretty big screens with that many pixels. Samsung's 8k TVs are 80-something inches. Since it's that big, you can sit further back, making up for any loss in pixel density, and still have a huge picture.


> I loathe fractional scaling

Why? Is that an OS problem?


Not the original commenter, but it... has a certain look to it. It's not something that can be totally solved by the OS or any scaling algorithm, because fundamentally you have to render fractional portions of a pixel to a physical screen where there is no such thing as a fractional pixel. It just always looks different, and IMO worse, than integer scaling.

You have to trade off distortion against blurriness, and you can't avoid both of those artifacts at the same time. The higher your ppi, the less noticeable any scaling artifacts will be, though.
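A minimal sketch of the underlying problem: at 150% scaling, a 1-logical-pixel hairline has to become 1.5 device pixels, and there is no clean answer:

    # At integer scales every logical pixel boundary lands exactly on a device pixel
    # boundary. At 1.5x, every other boundary doesn't -- so the renderer must either
    # snap (uneven line widths) or anti-alias (blur).
    scale = 1.5
    for logical_x in range(6):
        device_x = logical_x * scale
        print(logical_x, device_x, "exact" if device_x.is_integer() else "fractional")
    # 0 0.0 exact, 1 1.5 fractional, 2 3.0 exact, 3 4.5 fractional, ...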


I don't understand why 'distortion' (things being rendered ± 1 pixel over on some screens vs. on others) is a problem unless your GUI frameworks and/or apps count on pixel-exact layouts for some UI elements. But why would they? Isn't the entire web built on a reflowable format that works pretty well? Shouldn't those tiny 1-pixel differences be like the easiest possible variation for a GUI system's layout engine to cope with?

Do we have lots of scalable UI elements that expect to line up with raster images a certain way on most operating systems?


For me, a 1 pixel asymmetry in a button at medium ppi is noticeable and mildly distracting. I don't mind it much for UI, but I understand why others would want to avoid it. I tend to set my ui and toolbars to auto-hide. Personally, the reasons why I avoid it are text rendering and gaming usage.


This post has more information, as well as tons of other related info: https://tonsky.me/blog/monitors/


This was really helpful, thanks!

It's a shame (for me, who can't afford HiDPI displays to replace his current ones, and would have difficulty pushing all those pixels even if he could) that Apple removed subpixel anti-aliasing. :(


This is the exact same reason I want it.

But with OLED.


In CAD/EDA tools, higher resolution means more productivity (to a point), as you can fit more useful information on the screen at a time: you can “zoom out” more and still keep a useful level of detail. That's especially useful in schematic and PCB design, where dense areas of interest can be spatially disparate. I don’t like large screens and currently use 24” 4K screens, which now seem to be either unavailable or expensive; they were ~$350 in 2015 and don’t seem to have any equivalent nowadays.

The 120 Hz I don’t understand, but then I am not a gamer.


Once you try a high refresh rate monitor, even for work, you just don't want to go back. Every movement and animation is buttery smooth.

Try setting your refresh rate to 30hz for an hour.


People say this, but I’ve had people fail double blind tests for 120hz vs 90hz vs 60hz. I’ve yet to find anyone that can reliably tell 144hz vs 120hz.

What people mostly notice is latency not refresh rate.


I'd agree with you. I think people who use those sorts of displays frequently could tell the difference between 60 and 120hz but 120 vs 144hz seems way too close together.


Took me 2 seconds to notice that a friend's monitor was at 60hz just by moving the mouse. Just look at the distance between each cursor icon update as you move it.
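The gap is easy to quantify (a rough sketch; the 1200 px in 0.1 s flick is just an assumed example):

    def px_between_updates(px_per_sec, hz):
        # distance the cursor jumps between consecutive refreshes
        return px_per_sec / hz

    sweep = 1200 / 0.1  # 12000 px/s during a quick flick across the screen
    for hz in (60, 120, 144):
        print(hz, "Hz ->", round(px_between_updates(sweep, hz)), "px per cursor update")
    # 60 Hz -> 200 px, 120 Hz -> 100 px, 144 Hz -> 83 px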


Some people use really slow mice (low resolution and/or configured for low sensitivity), which I suspect is a factor here.

It's noticeable for me on every single mouse movement— no special effort is required to move the cursor 'quickly'.


That’s a rendering artifact.

You can show multiple mouse images at 60 fps which shouldn’t trick people if they can actually see 90 vs 60 vs 120 fps.


Anecdotal, but my friends and I have passed such tests. YMMV


You mean double blind 144hz vs 120hz when latency isn’t an issue?

If you don’t mind me asking how old are your friends?


That's crazy to me, just move the mouse around really quickly and you'll quickly notice it's not 120hz (if you are used to it).

What blind test did you use?


They watched normal desktop use and a video game loop.

I added mouse trails to verify people were actually noticing the FPS not just artifacts from the rendering pipeline.


Oh, pretty sure the mouse trails will kill any ability, that’s by far the most noticeable thing about the higher refresh rate in my experience.


Agreed. I have a number of 4K 144 Hz monitors. I'd like a 6K or 8K monitor, but until they come at a high refresh rate I'm not switching. I'm not much of a gamer, but when I do occasionally game it is significantly more fluid as well.


There's a hill people are willing to die on. Resolution matters for me; I don't care about refresh rate.

I'm constantly switching between my M1 Air and work M2 16 Pro and refresh rate has never bothered me.


I have 60, 144 and 165hz displays. I have to say I don't really see much of a difference. Around 30 hz yes. But not over 60. It's probably some sort of genetic vision difference thing.


Even moving the mouse quickly should be a very different experience between 60 and 165


I appreciate that. I can't discern the difference like most people for some reason, unless I am really looking for it.


I guess you could look at the use-cases for 120Hz displays on MacBook Pros.

It's useful for smoother scrolling amongst other things.

I'd like a display that has parity to my laptop but is just bigger so I can fit more on it.


MBP's 120hz display is a lifesaver for me.

Previously, I actually often felt motion sickness when scrolling through code on small 60hz laptop screens. Had no problems with larger (> 23") desktop screens, though.


Smooth scrolling does that to me; scrolling where the screen just jumps a bunch of lines at once doesn't bother me.


Literally a 3'x5-6' monitor that can render text as crisply and cleanly as print. That's all I want.


8k@120Hz is going to need one heck of a video card


It's venturing into cryptocurrency-space-heater levels of pointless number crunching to render at that level of detail for anyone who has human eyes.


> It's venturing into cryptocurrency-space-heater levels of pointless number crunching to render at that level of detail for anyone who has human eyes.

Depends on your monitor size.

Might be a waste at 27", but if you want to use a 48" display, I can assure you that you'd notice the move from 4k -> 8k.


Yeah, even driving the display at 4K, you start to notice the higher pixel fill factor for 8K displays above 48in. I love my 65in 8K Q900 - even though it mostly lives at 4k (120hz!).


Can you explain this a bit more? I tried googling but I can't quite understand what you mean here by pixel fill factor and how it would differ between the resolutions.


The gaps between the pixels tend to be smaller the higher the resolution, so even if you drive the display at a lower resolution, it can look better than the same lower-res display that has larger spacing between each subpixel.


Interesting, does this give you a noticeable benefit for text, or does this mostly apply to images or video?


User interfaces shouldn't load a GPU very much even at that resolution.

8K in a game will use tremendous amounts of power, but it's not pointless. It's like having antialiasing turned on. And high frame rates are important for motion because normal rendering only gives you a single point in time.


I’d argue that AI driven super resolution like DLSS should be more than sufficient to upscale 4k to 8k with minimal performance loss and acceptable image quality even for gaming.


Not for tmux + firefox


Unless people are far more sensitive than I am, I don't see how >60Hz is needed for a desktop workstation environment. High frame rate is really only noticeable for very fast reaction time gaming.

4K 120Hz may be noticeable if editing ultra high frame rate video on a video editing workstation, but if you are a video production crew with a camera capable of recording at that framerate, you probably already know that.


I can immediately tell just from moving the mouse a little. I wouldn't say it's needed either though.


When I throw my iPhone into low power mode and it drops to a 60FPS cap, it is immediately noticeable.


Many people are more sensitive than you are. I can easily tell the difference between 60 and 120 on both my phone and my desktop.

Though response times also matter a lot for ghosting and such.


Yeah, I'm a gamer and I definitely notice the difference between 60 hz and 120+ hz.

But for desktop productivity? I don't feel I gain anything from it. 60 hz is fine.


60Hz is really noticeable when you’ve been using 120Hz even for a few minutes. 120Hz feels a lot less tiring, and working in a terminal, editor and websites is just a lot smoother.


It's immediately noticeable when scrolling in a browser, dragging stuff around or just moving the mouse. If you haven't seen it in person, go to an Apple store and do a quick comparison between the Macbook Pro (120hz) and the Air (60hz), or iPad vs iPad Pro. They're always next to each other.


Not in text mode. Hercules FTW


Not for productivity.


Not if you ignore the (IMO) overhyped RTX graphics and go back to gaming worlds instead of smoke-and-neon-lights-in-mirrors pseudorealism.


I'd likely not be gaming at 8K for a while. But for productivity tools it'd be amazing.


ARM macs can probably handle that, the M2 can do 10 8k video streams at once, 22 simultaneously on the Ultra, and people are running 4k 120hz on the M1 with a couple hacks.


I would love a 120Hz 8k 40 inch monitor. You could use 8k for productivity and then 4k for gaming.


For some reason all 40 inch monitors have disappeared off the market!

For me 40 inch is the sweet spot for coding: any larger and it gets too pixelated at 4K (you can make out the pixels, but that's OK for coding), and the UI scale can be set at 100% so everything is well proportioned. Entire classes/methods fit on the screen without scrolling.
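For reference, the densities involved (same arithmetic as any ppi calculator; the 8K-at-40-inch line is the hoped-for upgrade):

    import math

    def ppi(w, h, diag_in):
        return math.hypot(w, h) / diag_in

    print(ppi(3840, 2160, 40))  # ~110 ppi -- comfortable at 100% scaling
    print(ppi(3840, 2160, 32))  # ~138 ppi
    print(ppi(7680, 4320, 40))  # ~220 ppi -- a hypothetical 8K 40-inch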

I own two Philips 4K 40 inch monitors, they only cost ~$600 at the time, and I dread the day they stop working. I would be first in line for 6K or 8K, or any ≥4K really, at 40 inch.


It's been pretty amazing how stagnant the monitor space is. I too am really craving an 8k@120 monitor, although there's a decent chance I'll balk at the price.


It’s crazy how much of a regression there was in resolution and picture quality when we went from CRT to LCD displays. In the late 90’s you could get a CRT that did 2048x1536 no sweat with great color and decent refresh rate. Then suddenly LCD displays became the standard and they looked awful. Low resolutions, terrible colors and bad viewing angles. The only real advantage they had was size. It took a decade or so to get back to decent resolutions and color reproduction.


LCDs didn't replace CRTs because they offered better quality to consumers. They were worse for all the reasons you mentioned and then some. LCDs were cheaper to make, much lighter and less fragile so they cost less to ship, and they took up much less space in transport, in warehouses, and on store shelves. We were sold an inferior product so that other people could save money. Gradually, some of those savings made it to consumers, especially when it became possible to generate profit continuously through TVs by collecting our data and pushing ads, but it was always a shitty deal for consumers who wanted a quality picture.

I imagine that in the future, people will look back at much of the media from recent decades and think that it looks a lot worse than we remember because it was produced on bad screens or made to look good on all of our crappy screens.


While I appreciate a bit of sarcasm, I'm not sure this is what actually happened. In the CRT era you either had good monitors, which were expensive, or a bunch of actually crap monitors. I had the former, but most people had the latter, and using those monitors for any extended period of time would give you headaches and dry eyes because of poor refresh rates and terrible flicker.

As a personal anecdote: when I was choosing components for my first desktop computer (instead of using dad's work laptops), I selected components that were affordable. Coincidentally, a local IT magazine had just run a big test of desktop CRT monitors. So I chose an inexpensive one that wasn't terrible and, like every kid, asked my parents for money. My mum, who was already working on computers at her job, had a look through that magazine and said she'd pay for the whole computer only on the condition that we buy the best monitor in that test. So we did (it was a Trinitron Nokia @ 100Hz, which was a lot), and I think with that move she saved my eyes long term: I'm in my early 40s and the only healthy thing I still have is my eyes. In any case, what I realized soon after getting that monitor is that I should never cheap out on stuff I use all day long.

Back to the topic: CRT monitors were also space heaters, and their sheer bulk was only fine when permanently parked on a geek's desk.

When LCDs arrived they actually were considerably better than average CRTs. The picture was rock solid without flicker or refresh-rate artifacts, perfectly rectangular (geometry being a big problem with the average CRT, as a matter of fact), and very sharp and crisp. All for a little bit more money. After two or three years they were actually even cheaper than CRTs. And I forgot to mention, they took much less space, so you could place one on a POS counter or wherever. It took much longer to replace the top-end CRTs, but I guess that's always the case with any tech product.


It's still not reached a point where you can just choose high resolutions with no drawbacks.

2048x1536 19" (135ppi) at up to 72Hz was common at reasonable prices in the late 90s if my memory is correct, although OS scaling sucked and text looked weird due to the shadow mask at that size. 1600x1200 (105ppi) was the sweet spot for me. And actually in my first job in 2004 I had two 20" 1600x1200 (100ppi) LCDs that I recall were reasonably priced, and they were nicer overall. This was around the time LCDs became the default choice. Then "HD" became a thing a couple of years later and you are right, for the next ten years virtually all monitors were "widescreen HD", which was 1280x720 if you fell for the marketing of the lower-priced units, or 1920x1080 at best. Anything higher was very expensive.

In 2012 the retina macbooks came out and I got a 13(.3)" with 2560x1600 resolution (227ppi). This was the first time for me that LCDs were finally great. But you couldn't get a resolution like that in an external display. So at that time I mostly just didn't use external monitors until 2016 when suddenly 4K 27" (163ppi) became reasonably priced. So I used 2 of those for years and they were good enough but still left me wanting.

Now still to this day, 4K is the max for external monitors at reasonable prices at any size. About 2 years ago I got an M1 macbook and realized it only supported 1 external monitor. I felt like I needed to keep the real estate I was used to and anyway, with the pandemic and WFH, managing multiple monitors with multiple (work and personal) machines sucked. All I could really find at a reasonable price was 32"/4K and 49" ultrawide. I begrudgingly downgraded to a 49" 5120x1440 monitor (109ppi). I will admit that going from 60Hz to 120Hz was nicer than I expected.

So in 2023 my laptop screen is great and has been great for 10+ years but this was my story about how I am still using the same pixel density as I did 25 years ago.


The IBM T221 (2001, over 4K) was like something popped in from the future.


Very cool but not really relevant to what was/is available for reasonable prices.


A second-hand one was very cheap ($600?) in 2010 IIRC, and still futuristic at the time.


How much did a 2048x1536 CRT monitor cost though? That's unusually high, and I bet it was priced similarly to what a 6K or 8K monitor is today.


Also that CRT was probably 21" max, and weighed 20% of the human looking at it.


You are way too optimistic about the weight. The ViewSonic P225f, with a 20" visible display, reportedly capable of 2560x1920 at 63Hz, weighed 30.5 kg!

I am not sure it was worth it with that 0.25mm dot pitch, though.

[1] https://www.backoffice.be/prod_uk/ViewSonic/p225f_viewsonic_...


If anything - you can get it waaay cheaper now.

I'm struggling to remember how much they cost, but with a $2000 price tag for a top-of-the-line machine, the low-end monitors tended (well, AFAIR, don't take my word for it) to be less than $150, and high-end ones were around $700 outside the ultra-uber-special cases.


Not at all! Your common as milk Philips and ViewSonic 19-21” could do that easily!


> The only real advantage they had was size.

And Moore's Law. LCDs are semiconductors so their price goes down by a factor of 2 every 18 months.

However, even size would be enough. CRTs were ridiculously heavy. My GDM-FW900 was almost 100 pounds. And I used two side by side. I had to shop specifically for a desk that wouldn't collapse when I put them on it.


The power draw is also lower for LCD.


I agree, but we got LG's 16:18 DualUp monitors a year ago. Having a 43'' monitor in the middle and these two on the sides creates a better setup than what was previously possible.


That's basically what I do. 24" 4K in the middle and 2 17" eizos beside it. They're 1280x1024 though so I have 200% scaling in the middle and 100% at the sides. This causes some OS issues in FreeBSD (I mitigate with xrandr) and Windows which is still screwy to this day. On Mac it works perfectly but I don't use Mac much anymore.


I wish some monitors had good hybrid modes, like being able to do, say, 6K at 60Hz and some much lower res (2K) at 120Hz or something.

I can definitely live with 60Hz on desktop if I have to, and I can't game at 6K anyway, so doing 2K@120Hz gaming and 6K@60Hz or 8K@60Hz desktop work would be ideal, and wouldn't get into the silly bandwidths of 8K@120.
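Rough numbers on why that split stays sane (uncompressed 8-bit RGB, and assuming "2K" means 2560x1440 and "6K" means 6144x3456):

    def raw_gbps(w, h, hz, bits_per_pixel=24):
        # uncompressed video data rate in Gbit/s (no blanking, no DSC)
        return w * h * hz * bits_per_pixel / 1e9

    print(raw_gbps(2560, 1440, 120))  # ~10.6 Gbit/s -- trivial for any modern link
    print(raw_gbps(6144, 3456, 60))   # ~30.6 Gbit/s -- modest, DSC-friendly
    print(raw_gbps(7680, 4320, 120))  # ~95.6 Gbit/s -- the silly-bandwidth case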


I run the DELL G3223Q (144Hz 4K) and mine calibrated at ~98% DCI-P3 for reference. I'm quite happy with it.

https://www.rtings.com/monitor/reviews/dell/g3223q


I have the same monitor but would get an 8k equivalent in a heartbeat. I run it at 100% scaling, and would love that sweet sweet 200% scaling



