Reading these comments, it surprises me just how controversial something as simple as a very high refresh rate monitor can be.
Because very few monitors can achieve good results at the optimistic refresh rate they claim.
VA and IPS technology has reached a stage where 75Hz "normal" monitors typically finish most or all of their pixel transitions before the next frame arrives 13.3ms later. It is a 75Hz display not just because the signal is fed to it 75 times a second, but because it can draw the whole frame within 1/75 of a second. At even 'just' 144Hz, the next frame arrives after 6.9ms, and the problem with both VA and IPS is that many transitions take more than 10ms, meaning you never actually see all of the extra frame your expensive GPU just spent your money making. They claim 144Hz but, in reality, the pixels themselves are only capable of fully drawing a frame in 1/90th of a second. With aggressive overdrive, maybe some of these 'bad' 144Hz displays can draw 75% of the frame in 1/144th of a second, but the remaining 25% of the frame will either still be changing from the previous frame, or will have completely overshot the mark and be displaying something utterly unwanted and distracting.
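To put rough numbers on that, here's a back-of-the-envelope sketch (the flat 10ms response figure is just an illustrative assumption; real panels vary per transition and with overdrive):

```python
# Back-of-the-envelope: how much of each refresh interval a slow pixel
# transition eats up. The 10 ms response time is an illustrative figure,
# not a measurement of any specific panel.
pixel_response_ms = 10.0

for refresh_hz in (60, 75, 144, 240):
    frame_time_ms = 1000.0 / refresh_hz
    # Fraction of the frame interval still spent transitioning from the
    # previous frame; anything over 100% means the pixel never settles.
    busy_fraction = pixel_response_ms / frame_time_ms
    print(f"{refresh_hz:>3} Hz: frame time {frame_time_ms:5.1f} ms, "
          f"transition occupies {busy_fraction:4.0%} of it")
```

At 75Hz that fraction is about 75%, so the pixel settles with time to spare; at 144Hz it goes past 100%, i.e. the pixel is still mid-transition when the next frame arrives, which is the 1/90th-of-a-second problem described above.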
There is a very tangible benefit from moving beyond 60-75Hz non-gaming monitors. I know sensitivity varies from person to person, but for me the point at which individual frames turn into motion is about 50Hz, depending on how fast the content is moving, so I consider 60Hz smooth (and do a lot of gaming at 4K60). True fluidity - the point at which higher refresh rates stop mattering* to me - is at about 105Hz.
*All of that last paragraph was based on CRT testing and, more recently, OLED + black-frame insertion, which is the most CRT-like experience you can get these days. So for me (and I'm reasonably typical, I think), a good, strobing, 100%-refresh-compliant display at 105Hz is the point of diminishing returns. Yes, I can see some very minor gains from 120Hz to 144Hz to 165Hz, but on a 240Hz monitor I can barely perceive any difference at all between 240Hz and 144Hz unless I'm trying to read fast-scrolling labels/text.
The higher refresh rates serve one primary function beyond diminishing returns - and that's a reduction of sample-and-hold blur. My brain can barely process the extra information it's given at 240Hz, but if my eye is tracking an object moving across the screen, the reduction of the sample-and-hold "wiggle" is very noticeable at higher refresh rates. The caveat is that the monitor HAS to have completed its pixel transitions before the next frame arrives, otherwise all of that extra Hz and FPS is wasted.
Personally, I wish manufacturers would work on implementing better backlight strobing and black frame insertion. I would pick a 100Hz monitor with excellent strobing over the 240Hz monitor I currently have, any day of the week: 100fps gaming is great, it means I don't need an RTX 4090 to reach 240fps, and (provided the pulses are the typical 25% duration) I can track moving objects with the clarity I'd have on a hypothetical 100%-refresh-compliant 400Hz display. The BFI on my G75T is mediocre, but I still prefer fixed 120Hz with it to 240Hz VRR without it.
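For what it's worth, that 400Hz figure falls out of simple persistence arithmetic. A sketch, assuming an idealised 25% duty-cycle strobe and an assumed 1000 px/s eye-tracking speed, ignoring overdrive and strobe crosstalk:

```python
# Perceived motion blur while eye-tracking scales with how long each frame
# stays lit (persistence), not with refresh rate directly. A strobed display
# is only lit for a fraction of each frame. Idealised numbers, illustration only.
def persistence_ms(refresh_hz, duty_cycle=1.0):
    """Time each frame is visible: the full frame time for sample-and-hold,
    duty_cycle * frame time for a strobed backlight / BFI."""
    return 1000.0 / refresh_hz * duty_cycle

strobed_100 = persistence_ms(100, duty_cycle=0.25)   # 2.5 ms lit per frame
hold_400    = persistence_ms(400)                     # 2.5 ms
hold_240    = persistence_ms(240)                     # ~4.2 ms

speed_px_per_s = 1000  # assumed tracking speed, for illustration
for name, p in [("100 Hz + 25% strobe", strobed_100),
                ("400 Hz sample-and-hold", hold_400),
                ("240 Hz sample-and-hold", hold_240)]:
    print(f"{name}: {p:.1f} ms persistence, "
          f"~{speed_px_per_s * p / 1000:.1f} px of smear")
```

Same 2.5ms of persistence either way, so the strobed 100Hz panel tracks like that hypothetical 400Hz sample-and-hold display while only asking the GPU for 100fps; the unstrobed 240Hz panel is actually blurrier.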
So, 120 or 144Hz gaming is a big upgrade over 60Hz, but beyond that, people who want higher refresh rates probably only think they want higher refresh rates; in reality they're after smoother motion tracking, which would be better achieved with a good strobing/BFI implementation.