
Does V-Sync help reduce lag?

Ah... Lesson time. Strap in kids.

The G-Sync module (not G-Sync Compatible, which is just rebranded FreeSync) still commands a premium, despite FreeSync handily matching it, because of the whole suite of features and quality control it brings. Buy a G-Sync module monitor and you get G-Sync variable refresh rate, ULMB strobe backlight and variable overdrive. Basically the module changes the Trace Free setting (on ASUS monitors) on the fly according to fps/refresh rate.

But I don't recommend it. FreeSync is just variable refresh rate, with all the other features developed and tweaked by monitor makers, as it should be. That's why ASUS could develop ELMB-Sync, their simultaneous ELMB (strobe backlight) and FreeSync (variable refresh rate) feature. And the need for variable overdrive can be negated by a competent monitor maker tuning a single good overdrive setting, e.g. that BenQ, and usually LG.
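In spirit, variable overdrive is just a lookup from the current refresh rate to an overdrive strength. Here is a toy sketch of the idea in Python; the thresholds and level names are invented for illustration, not any vendor's actual firmware:

```python
# Toy model of variable overdrive: pick an overdrive strength from the
# current refresh rate. Thresholds and level names are invented for
# illustration; real monitor firmware tunes these per panel.
def overdrive_level(refresh_hz: float) -> str:
    if refresh_hz >= 144:
        return "strong"   # short frame times need fast pixel transitions
    elif refresh_hz >= 100:
        return "medium"
    elif refresh_hz >= 60:
        return "light"    # strong OD at slow refresh causes overshoot
    return "off"

for hz in (48, 60, 120, 165):
    print(hz, "->", overdrive_level(hz))
```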

As for the overdrive setting on that VG259QM, it's simply that you don't know how it performs. ASUS monitors usually have 3+ overdrive settings, but unless it's reviewed by Hardware Unboxed or TFTCentral you won't know if it performs similarly to the VG27AQ. I love RTINGS but their data is quite lacking.

And while I agree that the numbers provided by RTINGS are objective, what I said was that how you'll experience the monitor is subjective. Personally, I can't stand motion blur, but I can't notice tearing unless I'm actively looking for it; my periphery is on other things. So you can bet that when I get a high-end monitor (any day now :p) I won't be using FreeSync/G-Sync, only motion blur reduction.
 
Many games are designed to be synced at 60 Hz; the Dead Space trilogy is a good example. You couldn't set the in-game refresh rate above 60 Hz or the game would be bugged.

Many OLD games.
 
Ah... Lesson time. Strap in kids.

The G-Sync module (not G-Sync Compatible, which is just rebranded FreeSync) still commands a premium, despite FreeSync handily matching it, because of the whole suite of features and quality control it brings. Buy a G-Sync module monitor and you get G-Sync variable refresh rate, ULMB strobe backlight and variable overdrive. Basically the module changes the Trace Free setting (on ASUS monitors) on the fly according to fps/refresh rate.
Now I understand why it's so much more expensive than the others.

But I don't recommend it. FreeSync is just variable refresh rate, with all the other features developed and tweaked by monitor makers, as it should be. That's why ASUS could develop ELMB-Sync, their simultaneous ELMB (strobe backlight) and FreeSync (variable refresh rate) feature. And the need for variable overdrive can be negated by a competent monitor maker tuning a single good overdrive setting, e.g. that BenQ, and usually LG.
Which BenQ and LG would you recommend? My budget is around $400.


As for the overdrive setting on that VG259QM, it's simply that you don't know how it performs. ASUS monitors usually have 3+ overdrive settings, but unless it's reviewed by Hardware Unboxed or TFTCentral you won't know if it performs similarly to the VG27AQ. I love RTINGS but their data is quite lacking.
In my region, the VG27AQ is not available anymore. They say the VG27AQL1A is its successor and is better. But I have my doubts...

[QUOTE="Khonjel, post: 4404610, member: 154148
And while I agree that the numbers provided by RTINGS is objective but what I said was that how you'll experience the monitor is subjective. For me personally I can't stand motion blur. But I can't notice tearing unless I'm actively looking for it. My periphery is on other things. So you can bet that when I get a high end monitor (any day now :p) I won't be using Freesync/G-sync. Only motion blur reduction.
[/QUOTE]
The trouble with motion blur comes with the high refresh rate, doesn't it? Even ASUS can't overcome that: they have 6 different levels of overdrive but only 2 are practically useful, meaning without overshoot.
 

Sir,
I don't quite get you there. You said VA and TN offer you a different game when the refresh rate gets close to 240, and you also said IPS gives you a much better game at lower Hz...

Can you elaborate a bit more?

Sure - there are two groups of gamers: those who want more FPS and less input lag with okay image quality, and those who want okay input lag and FPS but more image quality. After experimenting with both I tend to favor image quality, as I vary the games I play. I get more enjoyment out of really being blown away by the visuals, so I prefer having 4K @ 120 Hz on an IPS rather than 1440p @ 240 Hz on a VA or 1440p @ 165 Hz on a TN.
 
That BenQ you checked on HWUB, the EX25-something. As for LG, they're pushing more 1440p at this point. The 27GL83A is the same panel as the 27GL850 with some features cut; the 850 is usually $500 while the 83A is usually $380 iirc. There's also the 27GN750 I think; don't remember the price but I think HWUB reviewed it. I think it was 1080p. Another contender is the MSI MAG251RX, but iirc its overdrive wasn't perfect either. I think it's $350? HWUB reviewed it too I think.
 
Many OLD games.
New ones too, especially on consoles. They v-sync to 60 Hz, which is why there's no tearing, or 50 Hz if you're from Europe.
 
Also, they seem to have ripped off their graphs & charts from YouTube's Hardware Unboxed channel. Either that, or it's the other way around.
TechSpot is essentially the written version of the HWUB YouTube channel - just like how GamersNexus does video reviews and provides a written article, TechSpot is the same for HardwareUnboxed. After a quick peek over there, it looks like the site encompasses more than just the HWUB content, but you can see Steve and Tim are both authors on the website.
The 27GL83A is the same panel as the 27GL850 with some features cut; the 850 is usually $500 while the 83A is usually $380 iirc.
I second the 27GL83A - it's essentially the 27GL850 without the 2-port USB hub on the back or wide gamut support, but you save 100-120 USD. Do note that the HWUB reviews of these monitors mention poor contrast ratios, lower than those of other IPS monitors. As was mentioned earlier in the thread, monitors are super subjective, and purchasing decisions for them are very preference-driven. This is exacerbated by the fact that all monitors have some sort of trade-off; there is no "best" monitor.
 
I second the 27GL83A - it's essentially the 27GL850 without the 2-port USB hub on the back or wide gamut support, but you save 100-120 USD. Do note that the HWUB reviews of these monitors mention poor contrast ratios, lower than those of other IPS monitors. As was mentioned earlier in the thread, monitors are super subjective, and purchasing decisions for them are very preference-driven. This is exacerbated by the fact that all monitors have some sort of trade-off; there is no "best" monitor.

This -- it's always best to see the monitor in real life to see if you like it & if you know what you're looking for. "Bad contrast ratios" if you're coming from a cheap TN mean you're going to see the deepest blacks and truest whites you've ever seen on a computer in your life lol.
 
My mind is very exhausted now.

Today, the 3060 Ti is released.
All the major reviewers, especially HWUB, reckon what I reckon: the new card is primarily for 1080p. Apart from Watch Dogs Legion, it cuts it very well in other games, with FPS well over the 60 mark.

Can I just go for a 1080p 24.5" 144 Hz? Integrating all the information from various sources, I'm beginning to understand that all those overdrive modes are more or less for very specific usages and, more importantly, not perfect! All these 240 Hz or 280 Hz figures printed on the box are for marketing rather than actual practical use.

Do I understand correctly? Theoretically the refresh rate can go as high as 240 Hz; it's just that the pixel response time can't keep up as the refresh rate approaches the monitor's max value?

I think I will not engage any of the overdrive modes. I will just leave it to the monitor to adjust by way of G-Sync. The slower response time probably won't be noticeable as the refresh rate goes up with the game's FPS.

As long as the lagging on my current BenQ GW2270 VA panel is resolved, I'm good. I trust something like the AOC 24G2 will cut it.
 
Can I just go for a 1080p 24.5" 144 Hz? Integrating all the information from various sources, I'm beginning to understand that all those overdrive modes are more or less for very specific usages and, more importantly, not perfect! All these 240 Hz or 280 Hz figures printed on the box are for marketing rather than actual practical use.

Do I understand correctly? Theoretically the refresh rate can go as high as 240 Hz; it's just that the pixel response time can't keep up as the refresh rate approaches the monitor's max value?
Now you're getting it... And that is pretty much spot on, barring a few exceptional configurations which I won't get into, as this would become a long & confusing post.
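To put numbers on it: each refresh at 240 Hz lasts only 1000/240 ≈ 4.2 ms, so pixels that need, say, 5 ms for a grey-to-grey transition can't finish within one frame and smear into the next. A quick check in Python (the 5 ms response time is an assumed example, not a measured spec):

```python
# Compare the refresh interval against an assumed pixel response time.
# A transition slower than the interval smears into the next frame.
ASSUMED_GTG_MS = 5.0  # example grey-to-grey response, not a measured spec

for hz in (60, 144, 240, 280):
    interval_ms = 1000.0 / hz  # how long one refresh cycle lasts
    verdict = "keeps up" if ASSUMED_GTG_MS <= interval_ms else "smears"
    print(f"{hz:>3} Hz: {interval_ms:5.2f} ms per frame -> {verdict}")
```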
I think I will not engage any of the overdrive modes. I will just leave it to the monitor to adjust by way of G-Sync. The slower response time probably won't be noticeable as the refresh rate goes up with the game's FPS.
That's about right. And as I said before, for your use case, it's best to leave overdrive at the default mode.
As long as the lagging on my current BenQ GW2270 VA panel is resolved, I'm good. I trust something like the AOC 24G2 will cut it.
If it were up to me, since I am neither a frame rate nor a resolution whore, with your rig I'd keep your current monitor, get an RTX 3080, blast every game at the highest settings at your monitor's native 1080p@60Hz along with supersampling & anti-aliasing, then enjoy all that graphical fidelity. At this resolution & frame rate, the RTX 3080 will last way longer than it would for most other RTX 3080 owners with superior monitors before needing an upgrade for newer games. Whether I'd turn V-Sync on or off depends on whether the screen tearing is distracting. But that's just me. You & other gamers may not have the low standards that I have.
 
I'd keep your current monitor, get an RTX 3080, blast every game at the highest settings at your monitor's native 1080p@60Hz along with supersampling & anti-aliasing, then enjoy all that graphical fidelity.

You mean I don't need to rush to buy a new monitor for the 3080?

Anyway, I'd like to thank all of the folks who have helped me in this thread; I really appreciate it.
 
You mean I don't need to rush to buy a new monitor for the 3080?
That is up to you. You decide if you want the higher refresh rate and/or resolution. My suggestion is to get the 3080 first, try it out on the old monitor, & if you think the performance & image quality of your old monitor is lacking, then you prolly should upgrade.
Anyway, I'd like to thank all of the folks who have helped me in this thread; I really appreciate it.
You're welcome. This is what this forum is for.
 
You mean I don't need to rush to buy a new monitor for the 3080?
I would. A 3080 is massive overkill for a 1080p 60Hz monitor, as it's a card targeted at 4K gaming. It makes little sense (in my opinion anyway) to spend $700 (much more than that at this point thanks to scalpers) on a card whose full power you wouldn't take advantage of. If 1080p 144Hz is all you're looking for, you could go for something like a 3060 Ti (or a future AMD equivalent) to save money there, as well as on the monitor. The 24G2 is a great choice, and with an MSRP of $180 if memory serves, it's much cheaper than the high-end 1440p 144Hz monitors we've been discussing.

Edit: To clarify, you certainly don't have to rush - you're welcome to use your 1080p 60Hz monitor, regardless of whichever graphics card you choose. I just think if you're buying a high-end card you should have an appropriately balanced monitor to take full advantage of all the money you spent on the GPU. Similarly, you wouldn't pair a 3080 with a 2200GE as you'd be extremely CPU limited and you'd be wasting all the potential performance of your $700+ graphics card. I'm arguing that the monitor is the same way. With a low-end CPU you bottleneck your GPU (and your experience) on the computer side of things - with a lower-end monitor you bottleneck your experience on your eyes' side of things.
 
I would. A 3080 is massive overkill for a 1080p 60Hz monitor, as it's a card targeted at 4K gaming.
I know, but the 3070 and 3060 Ti won't run Watch Dogs Legion gloriously with ray tracing ON + DLSS OFF!
Blaming Ubisoft for bad coding/porting is the only alternative I can think of.
I also love and have pre-ordered Cyberpunk, but I know Cyberpunk is okay even with a 3060 Ti.

It makes little sense (in my opinion anyway) to spend $700 (much more than that at this point thanks to scalpers) on a card whose full power you wouldn't take advantage of.
I have thought about the price of the 3080 being too high for mainly one favourite game. Well...

Edit: To clarify, you certainly don't have to rush - you're welcome to use your 1080p 60Hz monitor, regardless of whichever graphics card you choose. I just think if you're buying a high-end card you should have an appropriately balanced monitor to take full advantage of all the money you spent on the GPU. Similarly, you wouldn't pair a 3080 with a 2200GE as you'd be extremely CPU limited and you'd be wasting all the potential performance of your $700+ graphics card. I'm arguing that the monitor is the same way. With a low-end CPU you bottleneck your GPU (and your experience) on the computer side of things - with a lower-end monitor you bottleneck your experience on your eyes' side of things.
I don't quite know, because when I move the map in Civilization 6, it's like a wriggling carpet, if you see what I mean. I believe that phenomenon is the well-known screen tearing? V-Sync was off when that happens and the frame rate was around 64, give or take 2 to 3 FPS. That's what's pushing me to get a new 144 Hz 1080p monitor with Adaptive Sync. I don't know if that is a sensible decision though.
 
I know, but the 3070 and 3060 Ti won't run Watch Dogs Legion gloriously with ray tracing ON + DLSS OFF!
Blaming Ubisoft for bad coding/porting is the only alternative I can think of.
I also love and have pre-ordered Cyberpunk, but I know Cyberpunk is okay even with a 3060 Ti.
Yeah, from what I hear it's bad optimization. I definitely wouldn't make a purchasing decision based solely on that game, especially this early in its life cycle. According to this Tom's Hardware benchmark, a 3070 w/ RTX + DLSS averages 90 FPS or so at 1080p ultra: https://www.tomshardware.com/news/watch-dogs-legion-benchmark
I have thought about the price of the 3080 being too high for mainly one favourite game. Well...
As mentioned above, deciding to spend $200 more (the MSRP difference between the 3070 and 3080) on a single $60 game is a bit extreme if you ask me...
I don't quite know, because when I move the map in Civilization 6, it's like a wriggling carpet, if you see what I mean. I believe that phenomenon is the well-known screen tearing? V-Sync was off when that happens and the frame rate was around 64, give or take 2 to 3 FPS. That's what's pushing me to get a new 144 Hz 1080p monitor with Adaptive Sync. I don't know if that is a sensible decision though.
Sounds like tearing. Turning V-Sync on should clear that up, and it's not like Civ is a fast-paced game, so any input lag shouldn't matter. I recommended the AOC 24G2, but if you don't mind a shitty stand, the Asus VP249QGR is $20 cheaper at $160 MSRP...
 
Yeah, from what I hear it's bad optimization. I definitely wouldn't make a purchasing decision based solely on that game, especially this early in its life cycle. According to this Tom's Hardware benchmark, a 3070 w/ RTX + DLSS averages 90 FPS or so at 1080p ultra: https://www.tomshardware.com/news/watch-dogs-legion-benchmark

Hey, thanks for the pointer, I definitely missed this one.

As mentioned above, deciding to spend $200 more (the MSRP difference between the 3070 and 3080) on a single $60 game is a bit extreme if you ask me...

I'll probably give it up. But there are some other games which I wish to collect, like Total War: Three Kingdoms; that game is seriously demanding on both CPU and GPU. Have you watched Steve's (Gamers Nexus) benchmarks? He is the only one amongst Linus, Tech Deals, Tim (OC3DTV) and HWUB who has this game on his benchmark list. It can push the GPU to the edge of the envelope when everything is set to Ultra. Also, do you still recall Crysis?

I haven't purchased Total War: Three Kingdoms, but I have Crysis 3 in my collection. That is another monster game. A little while ago I asked on the Tom's Hardware forum and was told that the game redraws objects every frame even when they don't need updating, or something like that, and for this reason it puts a HUGE computational load on the CPU. Call it intentional or an oversight, but I am more inclined to believe the former. As to why the devs like to do that, I really don't know and don't care to. Knowing what I know now, I still fall for these titles. I also wish to run some flight sims like DCS F-18 and F-16 later, and those require even more GPU power. I'm also a casual Battlefield V player, so my interests are really a bit diverse.

If I really want to, I need to get a 3080, but the price is a bit insane these days. If you look at AMD's...

Sounds like tearing. Turning V-Sync on should clear that up, and it's not like Civ is a fast-paced game, so any input lag shouldn't matter. I recommended the AOC 24G2, but if you don't mind a shitty stand, the Asus VP249QGR is $20 cheaper at $160 MSRP...
Thanks for the confirmation. I will try it and see if it helps. And thanks for the heads-up on the VP249QGR. I'm not in a rush, so I will research this one more.
 
I don't quite know, because when I move the map in Civilization 6, it's like a wriggling carpet, if you see what I mean. I believe that phenomenon is the well-known screen tearing? V-Sync was off when that happens and the frame rate was around 64, give or take 2 to 3 FPS. That's what's pushing me to get a new 144 Hz 1080p monitor with Adaptive Sync. I don't know if that is a sensible decision though.
Sounds like tearing. Turning V-Sync on should clear that up, and it's not like Civ is a fast-paced game, so any input lag shouldn't matter.
You mentioned that Civilization 6 caps out around 60-70 fps with V-Sync off? I am actually a bit surprised that there is screen tearing at that frame rate; it usually happens at much higher frame rates. Best to turn V-Sync on. At that frame rate input lag should not be noticeable &, as milewski said, input lag doesn't matter for non-fast-paced games.
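For what it's worth, tearing needs only a mismatch, not a high frame rate: presenting 64 FPS on a 60 Hz panel means the buffer swap lands mid-scanout on nearly every refresh, and the tear line drifts across the screen. A rough model of where it lands, assuming idealized instant swaps and a 1080-line scanout:

```python
# Where does the tear line land each refresh when the GPU presents at
# 64 FPS on a 60 Hz display with V-Sync off? (Idealized model.)
REFRESH_HZ, FPS, LINES = 60, 64, 1080
refresh_t, frame_t = 1.0 / REFRESH_HZ, 1.0 / FPS

for n in range(1, 6):
    present_time = n * frame_t                       # when frame n swaps in
    phase = (present_time % refresh_t) / refresh_t   # fraction of scanout done
    print(f"frame {n}: tear at ~line {int(phase * LINES)}")
```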
but I have Crysis 3 in my collection. That is another monster game. A little while ago I asked on the Tom's Hardware forum and was told that the game redraws objects every frame even when they don't need updating, or something like that, and for this reason it puts a HUGE computational load on the CPU. Call it intentional or an oversight, but I am more inclined to believe the former. As to why the devs like to do that, I really don't know and don't care to. Knowing what I know now, I still fall for these titles.
To my understanding, it is to make the game look prettier or something like that. After all, Crytek is known to be the type of dev that pushes hardware to its absolute limit with their games at release.
If I really want to, I need to get a 3080, but the price is a bit insane these days. If you look at AMD's...
After going through various threads on this forum, it seems that AMD still has a few driver issues to fix for their recent GPUs, & they seem to take a bit too long to find solutions, so if you also hit those issues, it might take a while to get them fixed. Though some other users don't seem to notice the driver issues, so you might not either. Also, if you want to use ray tracing, you might wanna stick with GeForce. Nvidia does a better implementation of ray tracing, as it won't tax your GPU's performance the way AMD's Radeon will. This is due to the Ampere (& Turing) lines having dedicated ray-tracing cores to offload ray-tracing processing from the other cores within the GPU. That is also where your money is well spent.
I would. A 3080 is massive overkill for a 1080p 60Hz monitor, as it's a card targeted at 4K gaming. It makes little sense (in my opinion anyway) to spend $700 (much more than that at this point thanks to scalpers) on a card whose full power you wouldn't take advantage of. If 1080p 144Hz is all you're looking for, you could go for something like a 3060 Ti (or a future AMD equivalent) to save money there, as well as on the monitor. The 24G2 is a great choice, and with an MSRP of $180 if memory serves, it's much cheaper than the high-end 1440p 144Hz monitors we've been discussing.

Edit: To clarify, you certainly don't have to rush - you're welcome to use your 1080p 60Hz monitor, regardless of whichever graphics card you choose. I just think if you're buying a high-end card you should have an appropriately balanced monitor to take full advantage of all the money you spent on the GPU. Similarly, you wouldn't pair a 3080 with a 2200GE as you'd be extremely CPU limited and you'd be wasting all the potential performance of your $700+ graphics card. I'm arguing that the monitor is the same way. With a low-end CPU you bottleneck your GPU (and your experience) on the computer side of things - with a lower-end monitor you bottleneck your experience on your eyes' side of things.
I am partially in agreement with milewski here as well. A 3080 is rather overkill for your current monitor, but only for the first few years. A few years down the line, as newer AAA games get more graphically intensive, those games will finally stress your GPU even at your monitor's spec.
So this all depends on your financial willingness to spend more often for eye candy. Get a new monitor & GPU now & you get all that buttery smooth & fluid gameplay now, but to keep your visual addiction fed you will need to upgrade your GPU (& CPU) much sooner. Or get only a new GPU & not upgrade your monitor: visual quality will be capped at 1080p@60Hz, but your GPU will last you much longer before newer games tell you that you need to upgrade your GPU & CPU. I'm mentioning the CPU as well cuz by that time your Core i5 would constrain your GPU's performance.
As for input lag, unless you play fast-paced games & you are twitchy & anal about it, I wouldn't worry too much about it.
 
Let's just say there is a marked difference between what happens in Linus' (and others') reviews and what happens in-game. They run content on a top-of-the-bill system, so it is liable to produce less frame variability. They run it for a short while and give it a few minutes of attention.

I find it very hard to translate those conclusions to the places I use a monitor in and for, and they're the same use cases. I prefer getting my info from a real review like tftcentral.co.uk or the like. Not that Linus or other reviewers draw the wrong conclusions, but they do lack perspective. How does each feature affect your actual game, your immersion? They can't, won't say. And it's a personal thing too: what works for me may or may not work for you.

What I did find, every single time, is that added features over the standard, unfiltered experience almost never last indefinitely for me. That goes for audio and in a way it also goes for video. With Gsync, for example, I had the sense of feeling more latency impact than I did with uncapped frames, while the benefit (no tearing) wasn't really visible; Gsync off didn't tear either. If you already have a high refresh panel, you can just tweak a little bit and avoid tearing (a frame cap, as sketched below). I'm also noticing that engines and GPUs handle frame delivery better and differently. The good old half-way torn up images are history, and were prior to owning a Gsync capable panel already. And for the old content that doesn't have that, well... you just run it at 144 Hz constantly, and don't need Gsync to begin with.
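That "tweak a little bit" is typically just a frame cap, and at its core any limiter simply sleeps out the remainder of a fixed per-frame budget so delivery stays even. A minimal sketch in Python; the cap value and the render stand-in are illustrative:

```python
# Minimal fixed-budget frame limiter: do the frame's work, then sleep
# out the rest of the budget so frames are delivered evenly.
import time

TARGET_FPS = 141               # illustrative cap, a touch under 144 Hz
BUDGET = 1.0 / TARGET_FPS

def render():                  # stand-in for the real frame work
    time.sleep(0.003)

for _ in range(5):
    start = time.perf_counter()
    render()
    leftover = BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)   # pad the frame out to the full budget
    print(f"frame time: {time.perf_counter() - start:.4f} s")
```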

Gsync also doesn't mix well with a strobing backlight, which is arguably a much more useful feature to have for motion clarity - it's the other half of the experience really, and high FPS won't get you there on its own. Strobing compensates for the way our eyes move across the screen, and allows us to 'snapshot' the images rather than view constantly changing colors. This way you can actually read text passing by fast, instead of deciphering it within a line of blur. This also affects refresh rates of 240 Hz and up... you can just strobe the image at 120 Hz and achieve a better result for motion clarity. Blurbusters even puts 240 Hz on a lower tier than a 120 Hz + strobe panel - although native 240 Hz might hit 2 ms persistence just as well, it's much less likely to deliver, because you won't have 240 FPS at all times, and half the number is far easier to hold on to.
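Those persistence figures map directly onto blur width: perceived blur is roughly persistence multiplied by how fast your eyes track motion across the screen. A quick back-of-the-envelope in Python (the tracking speed is an assumed example):

```python
# Sample-and-hold blur: blur_px ≈ persistence_s * tracking_speed_px_per_s.
# Strobing shortens persistence to the strobe pulse width.
SPEED_PX_PER_S = 960.0  # assumed eye-tracking speed for a fast pan

configs = {
    "60 Hz sample-and-hold":  1000 / 60,   # ~16.7 ms persistence
    "240 Hz sample-and-hold": 1000 / 240,  # ~4.2 ms
    "120 Hz + 2 ms strobe":   2.0,         # persistence = pulse width
}
for name, persistence_ms in configs.items():
    blur_px = persistence_ms / 1000 * SPEED_PX_PER_S
    print(f"{name}: ~{blur_px:.0f} px of blur")
```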

See how those considerations defeat all necessity for variable refresh like Gsync? You really just want the most fixed and high FPS and a matching refresh rate for the best experience.



FWIW... don't pay for Gsync, but if the panel comes Gsync compatible, why not. That's my advice, best of both worlds really: you won't ever feel cheated out of your money and if it works, yay. But it is most certainly NOT a must-have in every possible situation. It's a bonus, but all other monitor qualities and price should come first. Gsync will still be shite on a crappy panel. :)

Note: I paid 450 EUR for this Gigabyte G34GWC, a 3440x1440 144 Hz Gsync compatible, strobe capable VA ultrawide. Paying similar for a high refresh IPS at 27 inches seems to be the norm, but I have to say, there is very little differentiation now between IPS and VA. Yes, viewing angles are smaller, but contrast is higher, blacks are much better and bleed is nonexistent (quality/panel lottery, mind). For gaming, VA is a dream really, and I much prefer it over IPS. Take note that your typical ambient lighting has a great impact: if you play in dimly lit rooms, definitely AVOID IPS. The glow will make you want to throw it out the window sooner rather than later - get VA instead. If your room is brightly lit, the lacking contrast of IPS doesn't matter and it's the better choice.

But... IPS is certainly more consistent, especially in darker hues. VA does smear a tiny bit, but it grows on you; I hardly notice it, having used a VA before, but the advantages still jump out at me - crisp image, super vibrant colors and blacks... it's really something when you can't see that your game is running in 16:9 on a 21:9 screen, because black is just... black.

Some thoughts - but the key point and TL;DR... focus on the primary monitor qualities first before staring blindly at whatever-Sync. It's not that relevant.


So what is important...
- uniformity (you really do see it when a panel is not 'equally lit')
- no bleed, low glow (in the case of IPS) and preferably NO glow at a normal view distance. I tried a Dell U2515HM (might have mixed up a letter) IPS and the glow was horrendous; it really stands out in games, which almost always have dark elements.
- diagonal/size/view distance, but with 27"/1440p you've got that locked down solid; it's a sweet spot.
- a good overdrive mode (you pointed that one out, a good topic to focus on!). You only need one. You won't change it per game; that only happens the first week, and then the novelty wears off.
- overall build quality. Can you hold the panel normally without bending stuff, is the stand workable, etc.

After those boxes are ticked... look at the budget, see what's left on the list and pick the best added feature set :)
I am loving my Gigabyte 32QC. I showed it to my friend who works at a computer store and he was blown away by the colors due to the contrast ratio.
 
I am loving my Gigabyte 32QC. I showed it to my friend who works at a computer store and he was blown away by the colors due to the contrast ratio.
Specs on that monitor?
 
Specs on that monitor?
It's a VA panel with a 1500R curve, 1440p, and 165 Hz over DP (144 Hz over HDMI). FreeSync Premium Pro and a true 5000:1 contrast ratio.
 
It's a VA panel with a 1500R curve, 1440p, and 165 Hz over DP (144 Hz over HDMI). FreeSync Premium Pro and a true 5000:1 contrast ratio.
Must be a high-quality and prolly expensive monitor if your VA panel can go up to a 5000:1 contrast ratio; they are typically in the 3000:1 to 4000:1 range for mid-range types. I suppose it's a 16:9 aspect ratio? Is the curve bad at that aspect ratio?
 
Must be a high-quality and prolly expensive monitor if your VA panel can go up to a 5000:1 contrast ratio; they are typically in the 3000:1 to 4000:1 range for mid-range types. I suppose it's a 16:9 aspect ratio? Is the curve bad at that aspect ratio?
Nope, it is actually an affordable monitor considering the specs. Mine cost $499 Canadian; an Asus, MSI or even Aorus monitor with those specs would be at least $200 more. It is 31.5 inches, 16:9, and the curve is something that (for me) you quickly come to appreciate. I use it strictly as a gaming monitor and my 5700 pretty much gives me 100+ FPS averages in some games. I even posted a thread on why I was getting 144+ FPS averages on this panel vs my 4K 60 Hz panel at 1440p.
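To put that ratio in perspective: contrast ratio is just white luminance divided by black luminance, so at the same brightness a 5000:1 VA sits at a much deeper black level than a typical 1000:1 IPS. A quick comparison (the luminance values are assumed examples):

```python
# Black level implied by a contrast ratio at a given white luminance.
WHITE_NITS = 350.0  # assumed full-screen white luminance

for name, ratio in [("typical IPS", 1000), ("mid-range VA", 3000), ("this VA", 5000)]:
    black = WHITE_NITS / ratio
    print(f"{name} ({ratio}:1): black level ≈ {black:.3f} nits")
```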
 
Nope, it is actually an affordable monitor considering the specs. Mine cost $499 Canadian; an Asus, MSI or even Aorus monitor with those specs would be at least $200 more. It is 31.5 inches, 16:9, and the curve is something that (for me) you quickly come to appreciate. I use it strictly as a gaming monitor and my 5700 pretty much gives me 100+ FPS averages in some games. I even posted a thread on why I was getting 144+ FPS averages on this panel vs my 4K 60 Hz panel at 1440p.
I've always been hesitant to mention the G27QC/G32QC after watching the HWUB review:
High response time error rates, dark level smearing, poor overdrive optimization - Tim feels it has too many issues compared to similarly priced competitors to recommend it.
 
You mentioned that Civilization 6 caps out around 60-70 fps with V-Sync off? I am actually a bit surprised that there is screen tearing at that frame rate; it usually happens at much higher frame rates. Best to turn V-Sync on. At that frame rate input lag should not be noticeable &, as milewski said, input lag doesn't matter for non-fast-paced games.

To my understanding, it is to make the game look prettier or something like that. After all, Crytek is known to be the type of dev that pushes hardware to its absolute limit with their games at release.

After going through various threads on this forum, it seems that AMD still has a few driver issues to fix for their recent GPUs, & they seem to take a bit too long to find solutions, so if you also hit those issues, it might take a while to get them fixed. Though some other users don't seem to notice the driver issues, so you might not either. Also, if you want to use ray tracing, you might wanna stick with GeForce. Nvidia does a better implementation of ray tracing, as it won't tax your GPU's performance the way AMD's Radeon will. This is due to the Ampere (& Turing) lines having dedicated ray-tracing cores to offload ray-tracing processing from the other cores within the GPU. That is also where your money is well spent.

I am partially in agreement with milewski here as well. A 3080 is rather overkill for your current monitor, but only for the first few years. A few years down the line, as newer AAA games get more graphically intensive, those games will finally stress your GPU even at your monitor's spec.
So this all depends on your financial willingness to spend more often for eye candy. Get a new monitor & GPU now & you get all that buttery smooth & fluid gameplay now, but to keep your visual addiction fed you will need to upgrade your GPU (& CPU) much sooner. Or get only a new GPU & not upgrade your monitor: visual quality will be capped at 1080p@60Hz, but your GPU will last you much longer before newer games tell you that you need to upgrade your GPU & CPU. I'm mentioning the CPU as well cuz by that time your Core i5 would constrain your GPU's performance.
As for input lag, unless you play fast-paced games & you are twitchy & anal about it, I wouldn't worry too much about it.
You're right, the 3080 is future-proof. For the games I play, I guess the card will last for 5 years or so. RTX 4000 will just be a mild upgrade, an Nvidia tradition I suppose.

Nope, it is actually an affordable monitor considering the specs. Mine cost $499 Canadian; an Asus, MSI or even Aorus monitor with those specs would be at least $200 more. It is 31.5 inches, 16:9, and the curve is something that (for me) you quickly come to appreciate. I use it strictly as a gaming monitor and my 5700 pretty much gives me 100+ FPS averages in some games. I even posted a thread on why I was getting 144+ FPS averages on this panel vs my 4K 60 Hz panel at 1440p.
I'm actually a bit surprised that nobody mentions LG or ViewSonic; those are big brands.
 
I'm actually a bit surprised that nobody mentions LG or ViewSonic; those are big brands.
The 27GL83A and 27GL850 that @phanbuey and I mentioned above are LG monitors. ViewSonic tends to be a little more budget-oriented than the likes of LG, from what I understand.
 