
Only some humans can see refresh rates faster than others; I am one of those humans.

I wouldn't class what's written on the wiki as the be-all and end-all, bud. Did you know our brains can stack images?
I verified what's written on Wikipedia first. If you don't believe that, then you're living in the world of alternative facts. It's a fact that flies see more fps with their eyes. It's a dream (placebo effect) to think that one sees the differences above 50-60 fps.

... Now it is a major selling point on monitors, pretty much over any other monitor function.
Also on GPUs: they have to deliver the FPS at the target resolution. ;) Btw, right now I use two 75 Hz 2560x1440 monitors. The Radeon software tells me that I run my games at 65-75 fps. I will change to two 40" UWQHD monitors. There I will need at least a Radeon 7900 XT to deliver the 165 Hz advertised on the monitor. ;)
 
I can distinguish between 75 Hz and 90 Hz
That is indeed true for CRTs because of the flickering, even if not as much as at sub-70 Hz, where it bugs me because of the waviness.
 
Everyone DOES see higher refresh rates; it just varies according to the person's perception.
I've used 60 Hz monitors and TVs my entire life, but I can still notice the 30 vs 60 Hz (FPS) difference in videos and video games (except for BeamNG.drive).
If I got a higher refresh rate monitor, even at 75 Hz, I would notice huge differences.
I think I would be okay with 144 Hz, but I could probably get more out of it.
 
I had about a dozen brain damage incidents (concussions, substance abuse, comas, you name it), so I had a period of being unable to see any difference between 40 and 60 Hz, let alone 60+. Now I've recovered enough to see a difference at least up to 130-ish Hz... Not sure. I can't test now; I don't have anything that clocks faster than 83 Hz.

You made me curious. Perhaps I can see more than 130 Hz now...
 
I can't see the difference between 60 and 75 Hz. When I upgraded from my 60 Hz standard FHD monitor to my current 75 Hz ultrawide, I thought it would be different, but nope, I did not notice the difference in any game.
So even if I could notice 144 Hz or higher, it's like whatever to me; I can game on a 60 Hz display with no issues or any discomfort. (It's the ultrawide aspect ratio that improved the immersion for me.)

Part of me is glad because this saves me a lot of money. :laugh:
 
I can't see the difference between 60 and 75 Hz. When I upgraded from my 60 Hz standard FHD monitor to my current 75 Hz ultrawide, I thought it would be different, but nope, I did not notice the difference in any game.
So even if I could notice 144 Hz or higher, it's like whatever to me; I can game on a 60 Hz display with no issues or any discomfort. (It's the ultrawide aspect ratio that improved the immersion for me.)

Part of me is glad because this saves me a lot of money. :laugh:
Exactly my thoughts. :)

I'm on a 144 Hz ultrawide, but I can't say I see much, if any, difference above 60 FPS. The refresh rate being variable (FreeSync) is much more beneficial than it being high.
 
I can't see the difference between 60 and 75 Hz
With LCD and OLED it's hard, compared to CRT!

With a CRT, you will probably notice a difference between 60 Hz and 75 Hz easily! Because of the flickering of CRTs, anything less than 70 Hz can cause headaches or fatigue!

Also, the standard ratings for the better CRTs were often 60 Hz, 70 Hz, 75 Hz, 85 Hz, 90 Hz, 100 Hz and 120 Hz. CRTs often don't support more than 120 Hz anyway. That's probably why Halo 1 Custom Edition doesn't support more than 120 Hz.
 
Exactly my thoughts. :)

I'm on a 144 Hz ultrawide, but I can't say I see much, if any, difference above 60 FPS. The refresh rate being variable (FreeSync) is much more beneficial than it being high.
Yeah, I miss having FreeSync. I used that with my old RX 570 and it was great; if there's one thing I'm sensitive to, it's screen tearing and the like.
I'd rather have a lower but stable frame rate with no tearing.
With LCD and OLED it's hard, compared to CRT!

With a CRT, you will probably notice a difference between 60 Hz and 75 Hz easily! Because of the flickering of CRTs, anything less than 70 Hz can cause headaches or fatigue!

Also, the standard ratings for the better CRTs were often 60 Hz, 70 Hz, 75 Hz, 85 Hz, 90 Hz, 100 Hz and 120 Hz. CRTs often don't support more than 120 Hz anyway. That's probably why Halo 1 Custom Edition doesn't support more than 120 Hz.

No idea about that, since the last time I used a CRT was in my high school years, and at the time I did not even care about any of that. :oops: (I used to play Unreal Tournament 2003/2004 at a semi-competitive level, but I have no idea at what refresh rate or fps; it was smooth, though.)
 
No idea about that, since the last time I used a CRT was in my high school years, and at the time I did not even care about any of that. :oops: (I used to play Unreal Tournament 2003/2004 at a semi-competitive level, but I have no idea at what refresh rate or fps; it was smooth, though.)
Heck, I imagine I could notice a difference between 60 Hz and 66 Hz on CRTs, LOL. Some CRTs had 60 Hz and 66 Hz options, I believe.
 
2024 and still "the human eye can only see 30-60 fps"?
Talk about nonsense science.
C'mooooooooooon...

That's right up there with monosodium glutamate and aspartame being "super bad / evil / poison / turns you into a zombie / brain melting / cancer giving / makes people hate cats & dogs", which is more or less equally silly nonsense science.

Yeah, I miss having FreeSync. I used that with my old RX 570 and it was great; if there's one thing I'm sensitive to, it's screen tearing and the like.
I'd rather have a lower but stable frame rate with no tearing.


No idea about that, since the last time I used a CRT was in my high school years, and at the time I did not even care about any of that. :oops: (I used to play Unreal Tournament 2003/2004 at a semi-competitive level, but I have no idea at what refresh rate or fps; it was smooth, though.)

What name did you use in UT2K4? :)
 
Last year I got a 144 Hz monitor, and it's pretty nice, but unnecessary, I think. I use VRR and can't really tell what the refresh rate is until it dips to the mid-50s. Above that, everything is smooth as butter.
 
Last year I got a 144 Hz monitor, and it's pretty nice
I got my first 144 Hz monitor in the late 2010s, in 2018 IIRC. I still have it; it's a TN LCD, an HP Omen 25. I was Halo'ing with that one. But you have to use 120 Hz in Halo CE 1.0.10, sadly.
 
There are diminishing returns for this though.

I've done rotorscope testing up to 480Hz and was basically guessing beyond about 250Hz.

On displays, I can accurately guess the refresh rate of a single display in isolation up to 150 Hz or so. I'm not perfect, and might confuse 144 Hz and 165 Hz, but I'm typically within 10-15% without any side-by-side comparison needed.

I can definitely see and feel the difference between 150 Hz and 200 Hz, and 240 Hz is the fastest display I've ever seen/owned, but it's wasted on me. I had my 180 Hz laptop display next to my G7 at 240 and I could tell the difference side by side when both were showing the same testufo.com content, but in the real world I think 180 Hz is close enough to my limit that I need to A/B compare two displays side by side to correctly identify the faster one.

Despite that, I generally game at 4K120 and 1440p120. As alluded to by others in this thread already, the higher your framerate, the more jarring and immersion-breaking stutters become. At 120Hz, I feel that's a happy medium between the fluidity and low-latency of high-refresh gaming, but it's also a realistic target for 0.1% lows. On the G7, I'd much rather enable strobing at 120Hz than try and push a game to run well at 240Hz - that gets me the CRT-like motion clarity that only a strobing display can provide, whilst also being within the realms of stutter-free gaming for my 5800X3D.
 
I think most people with good vision can tell a difference.
I can easily distinguish 30 versus 60, 60 versus 120, 120 versus 165.

I don't particularly care about spending money to attain anything greater than 120... I have diminishing returns on enjoyment relative to cost for frame rate higher than 120. My monitor goes to 165 but the graphics card I use is pretty underpowered.
 
2024 and still "the human eye can only see 30-60 fps"?
Talk about nonsense science.
C'mooooooooooon...

That's right up there with monosodium glutamate and aspartame being "super bad / evil / poison / turns you into a zombie / brain melting / cancer giving / makes people hate cats & dogs", which is more or less equally silly nonsense science.



What name did you use in UT2K4? :)
Totally agree. We can test this out literally with an LED: you get something like a Raspberry Pi and just make it flicker 60 times a second, 75 times a second.

I didn't write this code yet, but I've got to try it to see how many flicks per second I can see before the LED just looks off or fully on.
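Something like this would probably do it in Python on a Raspberry Pi (a rough sketch only: it assumes the RPi.GPIO library and an LED plus resistor on BCM pin 18, and the pin number and test rates are placeholders; plain sleep() timing also gets imprecise at higher rates, so a hardware PWM output would be more accurate there):

import time
import RPi.GPIO as GPIO

LED_PIN = 18  # hypothetical wiring: LED + resistor on BCM pin 18

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)

def flicker(rate_hz, duration_s=5):
    # Toggle the LED at the given rate: on for half the period, off for half.
    half_period = 1.0 / (2.0 * rate_hz)
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(half_period)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(half_period)

try:
    for rate in (30, 60, 75, 90, 120):
        print(f"Flickering at {rate} Hz -- does it still look like it's blinking?")
        flicker(rate)
finally:
    GPIO.cleanup()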
 
but after that it's hard to notice without looking at Blur Busters, and latency feels the same
The display is just one part of the click-to-pixel latency.

Mouse polling > DxInput passthrough > Game engine tick rate (not always tied to fps) > GPU frame draw time > double buffer for an additional half frame on average > display input lag > pixel response time of your display.

I said diminishing returns because the whole chain might be 60-80 ms on a 60 Hz display, and cutting the 17 ms refresh interval down to 7 ms on a 144 Hz display lops off ~20 ms in frame drawing/buffering alone, and then your pixel response/input lag are likely each 5 ms+ faster on a dedicated gaming display.

So you're going from ~70 ms to ~40 ms of lag moving from 60 Hz to 144 Hz, which is a 75% improvement.

If you make a huge (and difficult-to-obtain) jump from 144Hz to 360Hz gaming, you're only really cutting about 8ms more out of the complete chain, which is only a 25% further improvement at most - and you'll only get that full 25% improvement if your system can reliably peg the current game at 360fps - that's definitely not possible (even with unobtainium hardware!) in many games...
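To put rough numbers on that diminishing-returns point, here's a toy calculation along the same lines (the flat ~40 ms base covering mouse polling, engine tick, GPU frame time, display input lag and pixel response is an assumed figure for illustration, not a measurement):

def chain_latency_ms(refresh_hz, base_ms=40.0):
    # base_ms lumps together mouse polling, engine tick, GPU frame time,
    # display input lag and pixel response; only the buffering term here
    # scales with refresh rate.
    frame_ms = 1000.0 / refresh_hz
    buffering_ms = 1.5 * frame_ms  # ~1.5 frames of drawing + double-buffering on average
    return base_ms + buffering_ms

for hz in (60, 144, 360):
    print(f"{hz:>3} Hz: ~{chain_latency_ms(hz):.0f} ms click-to-pixel")

# Prints roughly 65 ms, 50 ms and 44 ms: the 60 -> 144 Hz step saves far more
# of the total chain than the 144 -> 360 Hz step does.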
 
The display is just one part of the click-to-pixel latency.

Mouse polling > DxInput passthrough > Game engine tick rate (not always tied to fps) > GPU frame draw time > double buffer for a second frame interval > pixel response time.

I said diminishing returns because the whole chain might be 50-70ms on a 60Hz display, and cutting 17ms refresh interval down to a 7ms on a 144Hz display lops off 20ms in buffered frames alone, and then your pixel response is likely 5ms+ faster on a 144Hz panel.

So, you're going from 60ms to 35ms of lag moving from 60Hz to 144Hz, which is a 70% improvement
If you make a huge (and difficult-to-obtain) jump from 144Hz to 360Hz gaming, you're only really cutting about 8ms more out of the complete chain, which is an "up to" 30% improvement, and that comes only if your system can reliably peg your FPS at 360, definitely not possible even with unobtainium hardware in many games.
Yeah total system and input latency are crucial.

Besides this there are xxx Hz displays, and "xxx Hz displays".

Not all panels are created equal. Actual pixel response times and other performance metrics vary incredibly between models.

In many ways the 13.3" 60 Hz 1440p OLED on my old Alienware was superior to the 120 Hz cheap panel on the laptop that replaced it because OLED pixel response is borderline instant.

My current 240 Hz panel is better than both, but it's not perfect.
 
I think most people with good vision can tell a difference.
I can easily distinguish 30 versus 60, 60 versus 120, 120 versus 165.

I don't particularly care about spending money to attain anything greater than 120... I have diminishing returns on enjoyment relative to cost for frame rate higher than 120. My monitor goes to 165 but the graphics card I use is pretty underpowered.
I have very good visual acuity, as in I see perfectly fine both far and close, even very small things, and I have quite good peripheral vision, yet I can't really tell the difference above ~50 FPS. So I don't think the two are related.
 
I have very good visual acuity, as in I see perfectly fine both far and close, even very small things, and I have quite good peripheral vision, yet I can't really tell the difference above ~50 FPS. So I don't think the two are related.
Yeah, it's a brain thing, not an eye thing.

I also know people who don't care about 30/60 fps; they're fine with it, whereas I find it hard to play at that, unless it's OLED at 60 Hz on my phone, but even then I don't really play on my phone anymore.
 
Yeah total system and input latency are crucial.

Besides this there are xxx Hz displays, and "xxx Hz displays".

Not all panels are created equal. Actual pixel response times and other performance metrics vary incredibly between models.

In many ways the 13.3" 60 Hz 1440p OLED on my old Alienware was superior to the 120 Hz cheap panel on the laptop that replaced it because OLED pixel response is borderline instant.

My current 240 Hz panel is better than both, but it's not perfect.
Ah yeah, I forgot to add display input lag to the latency chain above; I'll go back and edit. Display input lag is never zero, and while OLEDs typically have near-zero pixel response, they've not historically had input lag as low as LCDs. I guess all that brightness control and anti-burn-in stuff is an extra processing step that you can't afford to skip on expensive OLED displays.

We're seeing OLED gaming displays with mediocre input lag, and we're seeing IPS/VA displays that barely hit 50% of their pixel transitions within the refresh window for their highest refresh rates. What's the point in paying for 240 Hz if your pixels smear around at 80 Hz?!

I have a love/hate relationship with my G7. It's a stupid, overpriced gaming monitor with a huge number of pointless features I don't ever want to use - but it's a fast VA screen that can actually deliver pixel transitions flawlessly at 120 Hz and completes a respectable 90% of transitions within the 4.2 ms window at 240 Hz. I've bought and returned maybe a dozen monitors over the years that promised high framerates but were a smeary mess, incapable of delivering anywhere close to the outright, ridiculous lies of their "1ms response time" claims.
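Just to make the refresh-window point concrete, the arithmetic is trivial (the response times below are made-up examples, not measurements of any particular panel):

# How much time a panel has to finish a pixel transition at a given refresh
# rate, and whether a given response time actually fits inside that window.
def refresh_window_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz, response_ms in [(120, 4.0), (240, 4.0), (240, 8.0)]:
    window = refresh_window_ms(hz)
    verdict = "fits" if response_ms <= window else "smears into the next frame"
    print(f"{hz} Hz window = {window:.2f} ms, response {response_ms} ms -> {verdict}")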
 
I have very good visual acuity, as in I see perfectly fine both far and close, even very small things, and I have quite good peripheral vision, yet I can't really tell the difference above ~50 FPS. So I don't think the two are related.
Same here.

It also depends on the games you play, methinks. Multiplayer shooters are not the same as single player walking simulators.
 
Ah yeah, I forgot to add display input lag to the latency chain above; I'll go back and edit. Display input lag is never zero, and while OLEDs typically have near-zero pixel response, they've not historically had input lag as low as LCDs. I guess all that brightness control and anti-burn-in stuff is an extra processing step that you can't afford to skip on expensive OLED displays.

We're seeing OLED gaming displays with mediocre input lag, and we're seeing IPS/VA displays that barely hit 50% of their pixel transitions within the refresh window for their highest refresh rates. What's the point in paying for 240 Hz if your pixels smear around at 80 Hz?!

I have a love/hate relationship with my G7. It's a stupid, overpriced gaming monitor with a huge number of pointless features I don't ever want to use - but it's a fast VA screen that can actually deliver pixel transitions flawlessly at 120 Hz and completes a respectable 90% of transitions within the 4.2 ms window at 240 Hz. I've bought and returned maybe a dozen monitors over the years that promised high framerates but were a smeary mess, incapable of delivering anywhere close to the outright, ridiculous lies of their "1ms response time" claims.
Yeah, my G7 is a good one, the 32" model.

Very little smearing or blur.
 
I have very good visual acuity, as in I see perfectly fine both far and close, even very small things, and I have quite good peripheral vision, yet I can't really tell the difference above ~50 FPS. So I don't think the two are related.
Same here.
Wow, that's slow!
Count yourself lucky - you can afford to turn up the eye candy and resolution without having to spend silly money on high-refresh displays and overpriced GPUs to feed them :)
 
On the G7, I'd much rather enable strobing at 120Hz than try and push a game to run well at 240Hz - that gets me the CRT-like motion clarity that only a strobing display can provide, whilst also being within the realms of stutter-free gaming for my 5800X3D.
I think this is an important point a lot of people just gloss over. Yes, OLED is far closer theoretically to CRT motion clarity, but the reason why those were so smooth even at refreshes far below what is achievable today was the fact that they were strobed. Having instant (well, near-instant) transitions is all well and good, but OLEDs are still inherently sample-and-hold displays and they still have a measure of blur due to this that’s unavoidable.

It’s essentially a “pick your poison” situation - are you willing to accept some blurriness, but have a comfortable visual experience, or do you want absolute clarity, which requires breaking away from sample-and-hold, which means strobing, and that is something that can be quite unpleasant for some people. I know for a fact that in my case long term use of strobed displays (or ones with flickering backlight, though those are rare nowadays) does trigger my migraines something fierce.
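The sample-and-hold blur is easy to put rough numbers on: perceived blur for an eye-tracked object is roughly pixel speed multiplied by how long each frame persists on screen (the 960 px/s scroll speed and ~2 ms strobe pulse below are just illustrative figures):

# Perceived blur (in pixels) for an object the eye is tracking is roughly
# pixel speed x how long each frame persists on screen.
def blur_px(speed_px_per_s, persistence_ms):
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960  # px/s, e.g. a fast scroll or pan

for label, persistence_ms in [
    ("60 Hz sample-and-hold", 1000 / 60),
    ("120 Hz sample-and-hold", 1000 / 120),
    ("240 Hz sample-and-hold", 1000 / 240),
    ("120 Hz strobed, ~2 ms pulse", 2.0),
]:
    print(f"{label}: ~{blur_px(speed, persistence_ms):.1f} px of blur")

# Sample-and-hold persistence equals the full frame time, so even a panel with
# instant transitions blurs in motion; strobing cuts persistence to the pulse width.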
 