
What monitor refresh rate do you game at?


  • 60 Hz
    Votes: 4,354 (75.2%)
  • 120 Hz
    Votes: 727 (12.6%)
  • 144 Hz
    Votes: 308 (5.3%)
  • I don't even know
    Votes: 194 (3.4%)
  • Software timings above 60 Hz
    Votes: 208 (3.6%)

  • Total voters: 5,791
  • Poll closed.

He is probably using the LightBoost trick, so the motion blur is comparable to what people used to get on CRTs (remember those huge, heavy monitors?).

Not only that, but increasing the Hz and the FPS at the same time results in less input lag, so it's even closer to a CRT in that respect as well.

So many people have gotten used to the motion blur that's present on practically every single LCD display that they've forgotten how clean motion is supposed to look.
 
I hate that too, including those who say they can tell whether an LCD monitor is at 60, 75 or 125 Hz, etc. :)

But I remember years ago I always had to set up the CRTs at the office; everybody had them at the default minimum refresh rate and I had to set them to 75 Hz. THAT was flickering even Blind Lemon Jefferson would detect!!! ;)

My LCDs do not have any option to change from 60 Hz. And the technology is totally different anyway...

I just had to quote the wikipedia on this:
"Refresh rate or the temporal resolution of an LCD is the number of times per second in which the display draws the data it is being given. Since activated LCD pixels do not flash on/off between frames, LCD monitors exhibit no refresh-induced flicker, no matter how low the refresh rate. High-end LCD televisions now feature up to 240 Hz refresh rate, which requires advanced digital processing to insert additional interpolated frames between the real images to smooth the image motion...."

Couple of things.

You can tell the difference in refresh rate with motion. With a static picture, of course, you can't.

LCD monitors actually do flicker, although you can't normally see it, even at 60 Hz. The pixels are actually flipping their drive polarity back and forth, usually at the refresh rate, to stop them degrading and to increase the panel's life. However, because of the way it's done, you don't perceive flicker. I once found a website that showed special test patterns which revealed this; depending on how your monitor is designed, different patterns would flicker. Sorry, I can't remember which website it was, but a bit of googling should find it.
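If you want to play with this yourself, here's a minimal sketch (assuming Python with the Pillow library installed) that generates the kind of single-pixel checkerboard those inversion-test pages use. Viewed full-screen at native resolution, some panels will visibly shimmer on a pattern like this:

```python
# Hypothetical example: generate a 1-pixel checkerboard, one of the patterns
# used to make LCD pixel-inversion flicker visible. Requires Pillow.
from PIL import Image

W, H = 512, 512
img = Image.new("L", (W, H))  # 8-bit grayscale image
# Alternate black/white on every single pixel, row by row.
img.putdata([255 if (x + y) % 2 else 0 for y in range(H) for x in range(W)])
img.save("inversion_checkerboard.png")
```

Whether it flickers depends on the panel's inversion scheme, so don't be surprised if your particular monitor looks perfectly steady on it.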

He is probably using the LightBoost trick, so the motion blur is comparable to what people used to get on CRTs (remember those huge, heavy monitors?).

Not only that, but increasing the Hz and the FPS at the same time results in less input lag, so it's even closer to a CRT in that respect as well.

So many people have gotten used to the motion blur that's present on practically every single LCD display that they've forgotten how clean motion is supposed to look.

Oh, this old boy is familiar with CRTs all right and the smooth animation they could do!

Thing is, LightBoost only goes up to 120Hz, while he is running at 144Hz, so he's not using LB.

I've got LB on 100% of the time now with the ToastyX Strobelight beta utility and it's awesome.

For so long, people had indeed forgotten what smooth motion was like and thought that the smeared mess you get with a standard LCD was smooth. Trying to explain to them that it wasn't smooth motion but lots of motion blur instead would tend to start a flame war with these muppets. :rolleyes:

Here's a great site all about strobed backlights: www.blurbusters.com
 
I need to visit a shop with high-end monitors up and running NAUWWWWWWWW!!!
 
120 Hz is overrated. It's nice, but not as nice as higher resolution. And higher resolution is only available at 60 Hz (standard... I know some 1440p models are capable of custom higher refresh rates).

The difference in gameplay between 60 and 120 Hz is all placebo effect unless you are one of the top FPS players in the world.

I've used both, and while 120 Hz ruins 60 Hz and 1440p+ ruins 1080p, the effect of increased resolution is more significant than the effect of increased refresh rate to me. The only time I could see 120 Hz being better than higher resolution is if you are one of those folks who can get headaches/nausea/etc from 60 Hz gameplay.


This is just simply not true. First of all, if you want a real-world test, go search for the blind test they did on the Linus Tech Tips YouTube channel. They grabbed one of their buddies who had never used 120 Hz (or played BF3, the game they used) and tried to see if the average gamer could tell the difference. The results were interesting enough that they did a second test to see if Linus could tell the difference, as someone who games at 120 Hz and is familiar with the game.

The long and short is this: if you haven't ever used it, you are less likely to be able to tell the difference at a glance. That doesn't mean it isn't there; it means that you aren't prepared to notice the difference because you're ignorant of it. People said the same thing about 1080p vs 720p, or for older people, VHS vs DVD. The fact of the matter is that whenever you get something of higher quality, you don't often see the difference until you get used to the higher quality. Once you adjust to it, the difference when you see something worse is much more drastic. Anyone today could see a difference between DVD and VHS or Blu-ray and DVD because we have adjusted to the higher quality level. It's the same thing.

I used to work in sales and most customers have a hard time telling 1080i from 1080p until they get it in their homes for a month or two. Same with calibrated TVs: if you go into a store and look at the oversaturated, too-bright screens, it all looks great until you use a calibrated TV for a few months... then the ones in the store look pixelated and you can see the whites/blacks crushing. Audio is the same way. I'm a second-generation audiophile; I cringe when most people have me listen to their "killer new home theater system" or their "awesome new car stereo" because it sounds awful. However, to them it's better than the even worse sound they had before from their TV or their factory stereo. Just because you've "used it" doesn't mean you're qualified to say it's a placebo effect only noticeable by the best FPS gamers in the world. That's a preposterous statement; it's noticeable by any idiot with a mouse and a copy of ANY shooter ever, they just need to play on it long enough to adapt.

120 Hz is a HUGE difference. It's not subtle, it's not a "placebo" effect. It is a real and noticeable difference in motion blur, it is half the lag time, and especially with LightBoost it is the most accurate picture you can get without a CRT. I get that you might think that 1440p outweighs 120 Hz; I disagree. Most people can't run 1440p with max settings at a reasonable framerate; a lot of people can't even run 1080p at max settings. Given a choice of one or the other, I'll take 120 Hz any day. With that being said, I'm going to overclock a 1440p Korean panel myself, so I'll do both, but that isn't for everyone and I'm willing to open it up and fix any backlight bleed.

Edit: As an aside, I'll make the same argument about "input lag" or "lag" in general. If all you've ever played on was a shitty gaming setup, then that new IPS panel with 30 ms of processing lag probably seems fine to you; it doesn't mean it's awesome for gaming, it means you don't know any better. If you went to a house with fiber internet (I'm drooling over Google Fiber right now, wish it was available in my area), a high-speed gaming monitor, and a nice gaming rig with an appropriate mouse and surface, you would notice a major difference after using it for a couple of weeks. You'd never be able to go back; it's both a blessing and a curse to be able to afford nice things. All I'm saying is that everything adds up to your experience; if you're fine with how your things are... AWESOME. CONGRATS. But I'll thank you to not come into a tech forum where people look for advice and often make purchase decisions and make subjective claims about a technology when you haven't put the time in to really become an expert on the topic.
 

All great points you make there. :) As a 120Hz LightBoost monitor user I couldn't agree more.

I'd just like to add that my previous monitor (which I still have) is a good quality Iiyama 26" 1920x1200 monitor that only supports 60Hz properly. While I could always clearly see the motion blur (and hated it) it seemed quite playable on my favourite fast and twitchy FPS of all time, Unreal Tournament 2004, which I played on it a lot.

When I upgraded to my new Asus 120Hz monitor (see specs) the effect of the higher refresh was profound, with all the advantages you describe. After that, I could easily notice the lag on the old monitor just by moving the mouse around on the desktop, let alone play UT! On top of that, when I discovered how to activate LightBoost in 2D mode (ie no 3D Vision) and eliminate motion blur my jaw literally dropped!

Yeah, if you're FPS gaming, higher refresh is waay more important than higher resolution. People underestimate the importance of higher temporal resolution. It's great to see the likes of BenQ now making monitors with their own generic version of LightBoost that will work with anything.

Even with an undemanding game like The Stanley Parable where you just walk around a lot and push buttons, having it animated at 120Hz with no dropped frames and with LightBoost on is amazing. Nothing else compares, not even the old CRTs, finally.

Oh and I know what you mean about audio. ;)
 
Hey, thanks Aithos, half of that is dedicated to me, I know. But I cannot say for sure that I am gonna notice anything, just as I do not notice high-end audiophile stuff (I do appreciate the mid-range more than the dirt-cheap stuff, that is true). And we don't know how much of the demographic will notice. And we don't know how much of the demographic will appreciate it enough to buy it.
Many idiots have huge screens in cramped living rooms just because it is football season and every neighbor upgraded his screen... And loving technology is not the same as being a tech junkie that NEEEEEEEDS oh so badly everything that is announced.

The best option would be to create a tech review site and receive parts for free; that, no doubt, would be understood by lots of people. :)

Edit: so it is not like the ultra-high-DPI mouse hoax (totally counterproductive for me) or the "Fatal1ty-tweaked mobos with game-oriented PS/2 ports for ruling at FPS games"? No snake oil? :)
 
I use a 120 Hz monitor and prefer to play at 120+ FPS.
Yes, that does mean occasional screen tearing, but OH SNAP, since I'm paying attention to much more important in-game stuff, I hardly ever notice the tearing at all.
Yes, I do see the difference between 60 Hz / 60 FPS and 120 Hz / 120 FPS. The point where I stop noticing a significant difference is above 90 FPS or so.
I still play games, including twitch shooters, at 25-40 FPS if such a situation arises and can still play pretty well. Though I do not like it.
Serious Sam games are where I see the biggest impact. Having a 120 Hz refresh rate and playing at ~120 FPS (for SSHD) or 90+ FPS (for SS3) makes me perform a bit better compared to ~60 FPS. Let alone ~40...

Overall, the biggest difference I see:
"120 Hz with 90 FPS or more makes me tire much more slowly than running 60 Hz and/or 60 FPS or less."
Yes, the biggest difference is that 120 Hz tires me more slowly. And in my book, that is super worth it.
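To put rough numbers on that (just a back-of-the-envelope sketch in Python, nothing monitor-specific), the per-frame time is simply 1000 ms divided by the rate:

```python
# Frame time at the rates mentioned above: each step up shaves milliseconds
# off how long every frame sits on screen (and how stale your input feels).
for hz in (40, 60, 90, 120, 144):
    print(f"{hz:>3} Hz/FPS -> {1000 / hz:.1f} ms per frame")
# 60 -> 16.7 ms, 90 -> 11.1 ms, 120 -> 8.3 ms: the 60-to-90 jump alone cuts
# frame time by about a third, which lines up with "90+ FPS is where I stop
# noticing a significant difference" above.
```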
 
I agree with Aithos and Qubit.

When I changed my 1080p 60 Hz monitor for a 3D 1080p 120 Hz one in 2011, I was playing Borderlands 2 a lot and I simply couldn't believe how smooth the game got with the new 120 Hz monitor.

Racing games are also way better at 120 Hz.
 
But your graphics card will have to "feed" your monitor at twice the rate... how is that possible?
 
But your graphics card will have to "feed" your monitor at twice the rate... how is that possible?

I hope you're trolling.
 
Under the OS display settings I set it to 60, you or whoever else to 120 or whatever rate. If it were only an "internal monitor feature" I suppose the OS would not need to be configured at all... So what is the performance "cost" of going from 60 to 120 Hz?

I hope you enjoy yourself very much.
 
 
I hate people who try to make me unhappy when I'm perfectly happy. :)
 
This stops here for both of you. Move along if you do not have anything worthwhile to contribute to this thread.
 
I agree with Aithos and Qubit.

When I changed my 1080p 60 Hz monitor for a 3D 1080p 120 Hz one in 2011, I was playing Borderlands 2 a lot and I simply couldn't believe how smooth the game got with the new 120 Hz monitor.

Racing games are also way better at 120 Hz.

Oh yes indeed - TrackMania is already an insanely fast game and 120Hz just makes it even insaner! :D
 
But your graphics card will have to "feed" your monitor at twice the rate... how is that possible?

I'm not an expert here, but from what I understand, refresh rate does not affect your GPU resource utilization unless you are playing a game in 3D mode.

Like "NVIDIA 3D Vision": if you are using it, your frame rate will fall by around 50% because your graphics card has to render 2x 60 Hz to your monitor for you to get the 3D effect.
 
I'm not an expert here, but from what I understand, refresh rate does not affect your GPU resource utilization unless you are playing a game in 3D mode.

Like "NVIDIA 3D Vision": if you are using it, your frame rate will fall by around 50% because your graphics card has to render 2x 60 Hz to your monitor for you to get the 3D effect.

For "not an expert" you're quite correct here. :)

Assuming you're running with vsync off, the framerate the GPU runs at will be the same regardless of the monitor refresh rate.

And yes, 3D Vision drops your framerate, because the graphics card has to draw the scene twice, from different angles. I don't know the fine details of it, but I believe NVIDIA can take rendering shortcuts which stop twice the load being put on the GPU, preventing the framerate from halving outright.
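As a rough illustration of those relationships (a simplified sketch only; real drivers and any stereo rendering shortcuts are more complicated than this):

```python
# Simplified model: with vsync off the GPU-bound framerate is what you get
# regardless of refresh; with vsync on it's capped at the refresh rate; and
# naive stereo (3D Vision-style, no driver shortcuts) renders two views per
# frame, roughly halving throughput.
def effective_fps(gpu_fps: float, refresh_hz: float,
                  vsync: bool = False, stereo: bool = False) -> float:
    fps = gpu_fps / 2 if stereo else gpu_fps
    return min(fps, refresh_hz) if vsync else fps

print(effective_fps(140, 120))               # vsync off: GPU-bound, ~140
print(effective_fps(140, 120, vsync=True))   # vsync on: capped at 120
print(effective_fps(140, 120, stereo=True))  # stereo: roughly halved, ~70
```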
 
60 Hz, but I can overclock my monitor to 90 Hz. I don't really see a difference while I game, so I leave it at 60 with vsync.

How does one overclock their monitor?

And does this mean I can overclock the frequency, or does it apply to resolution as well?
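For what it's worth, "overclocking" a monitor generally means defining a custom video mode with a higher refresh rate (tools like ToastyX's Custom Resolution Utility do this); the resolution stays the same, and the practical limit is the pixel clock the panel and cable will accept. A rough sketch of the arithmetic, using the standard 1080p timing totals (2200 x 1125) purely as an illustrative assumption:

```python
# Pixel clock needed for a given refresh rate: horizontal total x vertical
# total x refresh. Totals include blanking; 2200 x 1125 is the standard
# 1080p timing, used here only as an example.
H_TOTAL, V_TOTAL = 2200, 1125
for hz in (60, 75, 90, 120):
    mhz = H_TOTAL * V_TOTAL * hz / 1e6
    print(f"{hz} Hz -> {mhz:.1f} MHz pixel clock")
# 60 Hz -> 148.5 MHz, 90 Hz -> 222.8 MHz; if the monitor won't lock onto the
# higher clock, the overclocked mode simply won't display.
```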
 
For "not an expert" you're quite correct here. :)

Thanks Sir!

I don't know the fine details of it, but I believe NVIDIA can take rendering shortcuts which stop twice the load being put on the GPU, preventing the framerate from halving outright.

Yes, I believe you are right here, unless the game is outside the 3D Vision list/profile of compatible games.
 
For those who do not plan to switch to 120 Hz: do it, because you're losing so much enjoyment...
 
I agree with Aithos and Qubit.

When I changed my 1080p 60 Hz monitor for a 3D 1080p 120 Hz one in 2011, I was playing Borderlands 2 a lot and I simply couldn't believe how smooth the game got with the new 120 Hz monitor.

Racing games are also way better at 120 Hz.

What does the refresh rate of a monitor have to do with the smoothness of a game? Were you playing with vsync on?
 
What does the refresh rate of a monitor have to do with the smoothness of a game? Were you playing with vsync on?

A lot.

StarCraft II, for example, is a lot smoother at 90 FPS than 60 FPS, even on a 60 Hz monitor.

With a higher refresh rate I'd just get less tearing.
 
What does the refresh rate of a monitor have to do with the smoothness of a game? Were you playing with vsync on?

I will have to say you need to experience it yourself to believe it. Try out a game you used to play at 60 Hz on a 120 Hz monitor.
 
I will have to say you need to experience it yourself to believe it. Try out a game you used to play at 60 Hz on a 120 Hz monitor.
Well, actually I tried, to be honest, and found no big significant impact. Just to be clear, the games I tested (UT3 and GRiD) were running at more than 100 FPS; they did look a little smoother, tbh, but not by much compared to my 60 Hz monitor. Nothing impressive. ;)
 