Friday, October 18th 2013

NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Solving the decades-old problem of onscreen tearing, stuttering and lag, NVIDIA today unveiled NVIDIA G-SYNC technology, which, for the first time, enables perfect synchronization between the GPU and the display. The result is consistently smooth frame rates and ultrafast response not possible with previous display technologies.

Several years in the making, G-SYNC technology synchronizes the monitor's refresh rate to the GPU's render rate, so images display the moment they are rendered. Scenes appear instantly. Objects are sharper. Game play is smoother.

"Our commitment to create a pure gaming experience led us to G-SYNC," said Jeff Fisher, senior vice president of the GeForce business unit at NVIDIA. "This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-SYNC monitor, you'll never want to go back."

Since their earliest days, displays have had fixed refresh rates -- typically 60 times a second (60 Hz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. Because the GPU's output is not synchronized with the monitor, persistent tearing occurs. Turning on V-SYNC (vertical sync) eliminates tearing but causes increased lag and stutter, because the GPU must wait for the monitor's fixed refresh.
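The tradeoff can be put in numbers. A minimal sketch (illustrative figures only, not NVIDIA's implementation): on a fixed-rate display a finished frame waits for the next refresh tick, while a display synced to the GPU shows it immediately.

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between fixed refresh ticks

def display_times_fixed(render_done_ms):
    """With V-SYNC on a fixed-rate display, each finished frame waits
    for the next refresh tick before it appears on screen."""
    return [math.ceil(t / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
            for t in render_done_ms]

def display_times_synced(render_done_ms):
    """With the display synced to the GPU, a frame is shown the moment
    it finishes rendering (transmission time ignored)."""
    return list(render_done_ms)

# The GPU finishes frames at irregular times as scene complexity varies.
done = [10.0, 22.0, 45.0, 52.0]

# Per-frame delay imposed by the fixed refresh grid -- the stutter/lag term.
waits = [f - d for f, d in zip(display_times_fixed(done), done)]
```

The uneven `waits` list is the stutter and lag the press release describes; with the synced variant every wait is zero.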

G-SYNC eliminates this tradeoff. It perfectly syncs the monitor to the GPU, regardless of frame rate, leading to uncompromised PC gaming experiences.

G-SYNC technology includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. (A full list of GPUs that support NVIDIA G-SYNC is available here.)

Leading Game Developers Blown Away
Game developers have quickly embraced the benefits of G-SYNC technology, which enables their games to be played seamlessly.

"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better." - Tim Sweeney, founder, EPIC Games

"NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!" - Johan Andersson, technical director, DICE

"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter." - John Carmack, co-founder, iD Software

Rollout Plans by Monitor Manufacturers
Many of the industry's leading monitor manufacturers have already included G-SYNC technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic.

"ASUS strives to provide the best gaming experience through leading innovations. We are excited about offering NVIDIA's new G-SYNC technology in a variety of ASUS gaming monitors. We are certain that it will impress gamers with its incredible step-up in smoothness and visual quality." - Vincent Chiou, associate vice president, Display Business Unit, ASUS

"We are extremely thrilled to build G-SYNC into our professional gaming monitors. The two together, offering gamers a significant competitive advantage, will certainly take PC gaming experience to a whole new level." - Peter Chen, general manager, BenQ Technology Product Center

"We can't wait to start offering Philips monitors with G-SYNC technology specifically for gamers. We believe that anyone who really cares about their gaming experience is going to want one." - Sean Shih, global product marketing director, TPV Technology, (TPV sells Philips brand monitors)

"Everyone here at ViewSonic is pumped about the great visual experience that G-SYNC delivers. We look forward to making the most of this technology with our award-winning line of gaming and professional displays." - Jeff Volpe, president, ViewSonic

Enthusiasm by System Builders and Integrators
A variety of the industry's leading system builders and integrators are planning to make G-SYNC technology available in the months ahead. Among them are Digital Storm, EVGA, Falcon Northwest, Overlord and Scan Computers.

"A look at G-SYNC is a look at the future of displays. Having to go back to a standard monitor after seeing it will start to annoy you. It's that good." -Kelt Reeves, founder and CEO, Falcon Northwest.

"G-SYNC is a ground-breaking technology that will deliver a flawless gaming experience. We're hugely excited that our customers will be able to enjoy incredible gameplay at any frame rate without tearing or stutter." - Elan Raja III, co-founder, Scan Computers.

147 Comments on NVIDIA Introduces G-SYNC Technology for Gaming Monitors

#1
Mistral
I'm surprised Carmack sounded so positive about this thing. I respect him as much as the next guy, but I can't see it that way at the moment.

I am curious though, how will g-sync do in say fighting games that require to-the-frame accuracy to pull out the best combos? It could be either a real boon or a curse for them.
#2
NeoXF
by: 15th Warlock
Yes! How can they offer more options to gamers?! What impertinence! Let's make sure that all people can only buy graphic cards and monitors that are capped at 30FPS so no one can have an unfair advantage, who needs more alternatives anyways?!!

Who do these people think they are by offering innovation in this field?? Let's boycott Nvidia and burn all their engineers at the stake! And then to have the audacity to sell this technology and make a profit! How dare them?!!

/S :rolleyes:
Yes, how dare a corporation care about anything other than making money!

/S :rolleyes:


Just because the world is how it is now doesn't mean it's any good. But what the hell, you guys can salute your new green, blue, red or WTF-ever overlords any way you want; it won't be me going dead inside, shielding myself further and further away from the things that should count more, behind consumerism gimmicks.

This makes me sick to my stomach, but what can I do...
#3
15th Warlock
by: haswrong

exactly.. if i decide to please someone, i do it for free..

remember when your momma asked you to take out the litter back then? did you ask like $270 for the service? the whole world will be soon eaten by the false religion that is the economy, you just wait.
by: NeoXF
[...]
Aww, how nice of you both, but let me ask you a few questions: Do you have a job? If you do, do you work for free? I mean, do you offer your services without expecting to be remunerated for them? And if that's the case, how do you support yourself and/or your family? Through charity/welfare?

I mean, you have to find a way to pay your bills somehow, am I right?

I would assume that people who work for these companies (and please note that this applies to any given company in our "evil economy") expect some sort of compensation for their work, wouldn't they?

Anyway, I'm not going to discuss the basics of how our society works, as this isn't the right forum to do so, but I just found your counterargument really amusing. Besides, no one is pointing a gun at your face forcing you to buy these new monitors, so there's no reason to get all worked up about this superfluous piece of technology when there are obviously way more important things to worry about and fix, like the state of the economy, world hunger, world peace and other serious matters... :rolleyes:

Back on topic, it sounds like nVidia is genuinely interested in fixing this V-Sync problem that has existed for so long. I for one am excited to see someone focusing research on addressing this issue; I just hope they offer the results of their findings to more customers than just owners of new monitors and/or Kepler-based cards.
#4
The Von Matrices
by: haswrong
nvidia said that gamers will love it.. this sounds like notebook gamers will love it.. are you ready to ditch your $1300 monitor for an even more expensive one with the variable refresh rate, which doesn't make your response any faster?

ok, i have to give you one thing, you can now watch your ass being fragged without stutter :D
By that logic there's no reason to buy new graphics cards to run at more detailed settings, because the increased detail doesn't make you respond any faster. Have fun running at "low" settings on your integrated graphics. There's a lot more to video gaming than competition.

by: 15th Warlock
Back on topic, it sounds like nVidia is genuinely interested in fixing this V-Sync problem that has existed for so long. I for one am excited to see someone focusing research on addressing this issue; I just hope they offer the results of their findings to more customers than just owners of new monitors and/or Kepler-based cards.
Thank you. It seems as if for some people NVidia producing any technology first makes it inherently evil. I don't care who invented it, I support the technology. Kudos to NVidia for doing it first.

For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:
Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.
Just because no one else supports it yet doesn't mean that no one else ever will. Someone has to be first. Most standards aren't designed by committee but by everyone adopting one manufacturer's implementation for simplicity.
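If NVIDIA really is just manipulating the v-blank timing as Anandtech describes, the source-side logic amounts to clamping the refresh interval to the panel's limits. A rough sketch; the 30 and 144 Hz bounds here are my own assumed figures, not published specs:

```python
PANEL_MIN_HZ = 30    # assumed: slowest refresh before the panel must redraw anyway
PANEL_MAX_HZ = 144   # assumed: fastest scan-out the panel supports

def next_vblank_interval_ms(frame_render_ms):
    """Hold off the v-blank until the next frame arrives, clamped to
    what the panel can physically do."""
    shortest = 1000.0 / PANEL_MAX_HZ  # ~6.9 ms
    longest = 1000.0 / PANEL_MIN_HZ   # ~33.3 ms
    return min(max(frame_render_ms, shortest), longest)
```

A 20 ms frame (50 fps) gives a 20 ms refresh; a 40 ms frame hits the assumed floor, so the previous image would have to be scanned out again.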
#5
15th Warlock
by: The Von Matrices
[...]
Exactly, it seems like they have taken addressing this problem to heart, with innovations like Adaptive V-Sync, FCAT and now G-Sync.
#6
Xzibit
by: The Von Matrices

For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:
I think people read Nvidia's own FAQ on G-Sync:
Q: Does NVIDIA G-SYNC work with other competitive products?

A: NVIDIA G-SYNC works only with NVIDIA GPUs and G-SYNC-enabled monitors.
You can draw a conclusion there
#7
Fluffmeister
Sounds like a GTX xx0 card combined with a G-SYNC enabled monitor will offer a pretty damn sweet BF4 experience.

Oh nVidia, you big meanies, no wonder peeps here are mad.
#8
The Von Matrices
by: Xzibit
I think people read Nvidia own FAQ on G-Sync

You can draw a conclusion there
Your comment does not disprove mine. As I quoted, the signaling technology should be possible to reverse engineer. At that point anyone can produce monitors or video outputs that comply with that standard. G-Sync will be NVidia exclusive for a few years just because no one has had time to dissect it. It doesn't mean that there will never be generic components that are compatible with it. The only difference is that third parties won't use the trademarked term "G-sync."

You also need to consider the source when you read the quote from the FAQ. All manufacturers advertise that their products only work with first-party accessories. It doesn't mean that third parties can't make compatible accessories.
#9
Xzibit
by: The Von Matrices

You also need to consider the source when you read the quote from the FAQ. All manufacturers advertise that their products only work with first-party accessories. It doesn't mean that third parties can't make compatible accessories.
Which is the accessory, the GPU or G-Sync?

Since G-Sync will be talking to the driver, when was the last time Nvidia let outsiders tinker with that?
#10
The Von Matrices
by: Xzibit
Which is the accessory, the GPU or G-Sync?

Since G-Sync will be talking to the driver, when was the last time Nvidia let outsiders tinker with that?
Once you know what commands are being sent over the cable then you can implement them into your own drivers or hardware. For example, if you create a monitor that can read all the signals sent via the G-sync protocol and respond to them just like a genuine G-sync monitor, then why would this matter to the drivers? A properly reverse engineered product should be no different than the genuine device. I doubt NVidia wants manufacturers to do this, but I see no reason, engineering or legal, that third party manufacturers cannot, and the driver shouldn't be able to tell otherwise.

The only hurdle would be the investment required to reverse engineer the protocol, and if genuine G-sync doesn't catch on, then there will be no financial incentive and no third party will bother to do it.
#11
Assimilator
Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.

When was the last time AMD released ANYTHING game-changing? No, Mantle doesn't count, because the world doesn't need another API; we have DirectX and it works just great. No, TrueAudio doesn't count, because no-one gives a shit.
#12
qubit
Overclocked quantum bit
So, all nvidia have done is reverse the sync direction, making the monitor sync with the card's varying frame rate output instead. A simple enough change technically, but it looks like the visual impact is big, judging by the PR and articles I've read.

Couple of things that might be worse though are motion blur and the shape distortion* of moving objects, both of which are currently fixed by nvidia's LightBoost strobing backlight feature which my monitor has. The PR doesn't mention LightBoost anywhere, so I expect both of these motion artifacts to be present. The motion blur in particular is horrible and I'd rather have a bit of lag and occasional stutter than put up with this. I'd have to see G-Sync in action to properly judge it, though.

Also, it would be interesting to see this varying video signal on an oscilloscope.

*To check out the shape distortion, just open a window on the desktop, make it stretch from top to bottom but be rather thin, then move it from side to side with the mouse. The shape will change, with the top leading the bottom; moving the mouse faster makes the effect stronger. This is due to the scanning nature of the video signal, where the bottom part of the window (the whole picture, in fact) is quite literally drawn later than the top part. Note that the slower the monitor refresh, the worse the effect. Note also that it's separate from the tearing artifact that you're also likely to see.

LightBoost strobing blanks the display and only shows the completed picture, eliminating this effect. Of course, this comes at the expense of maximized lag, since nothing is shown until the frame is complete. At least the lag is very short at 120Hz. Sometimes you just can't win, lol.
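The size of the lean is easy to estimate: scan-out takes roughly one refresh interval from top to bottom, so a sideways-moving object shifts by its speed divided by the refresh rate. A quick sketch with made-up numbers:

```python
def scanout_skew_px(speed_px_per_s, refresh_hz):
    """Horizontal offset between the top and bottom of a moving object,
    assuming scan-out takes about one full refresh interval."""
    return speed_px_per_s / refresh_hz

# Dragging a window at 1000 px/s leans ~16.7 px at 60 Hz,
# and half that at 120 Hz -- slower refresh, worse effect.
```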
#13
NC37
This honestly isn't worth it, nVidia. VSYNC is not such a terrible problem that it needs a special dedicated chip which you aren't opening to the entire industry, which will increase production costs, and which will likely only make the situation worse later when someone comes out with an alternative that does it without all the negatives.

You should have just made the tech and licensed it for everyone to use, then enjoyed the royalties for years. I seriously doubt it requires a Kepler GPU. We already know PhysX will work on non-NV cards. This isn't something special either. You're just setting yourself up for a fall later when someone, maybe even AMD, does it better and opens it up for everyone to use.
#15
Lionheart


:banghead::banghead:

This site has gone to shit with all the fanboys & trolls, Jesus Christ lolz
#16
NeoXF
by: 15th Warlock
[...]
Yeah, 'kay. You've obviously got everything figured out, and the only real problem in your life seems to be that you need a better-paying job. Righahahahahat... :roll:


Again, open or bust.

It might have taken a lot of years, but Microsoft is finally on the way out as the de facto PC gaming platform... and why? Yeah, I'll let someone else figure this one out.


But I digress, we shall see. I'm far away from getting a gaming monitor anytime soon either way.
#17
Frick
Fishfaced Nincompoop
by: NeoXF

It might have taken a lot of years, but Microsoft is finally on the way out as the de facto PC gaming platform... and why? Yeah, I'll let someone else figure this one out.
Just want to point out that no. Not in the least. Alternatives are incoming and to an extent already here, but right now? No way, no how.

EDIT: And I just can't fathom the depths to which this place has plunged. All this rage... for something they have not seen IRL. And if this does what it says it does, you do have to see it IRL before you can pass judgement.
#18
Am*
by: Solidstate89

Are you like, intentionally playing dumb or are you just not getting this? The refresh rate of the monitor will always be 60Hz, there is no way right now to modulate it, dynamically, on the fly. Period. End of story. V-Sync attempts to lock your frames to 60Hz so it doesn't induce screen tearing, because the monitor operates at 60Hz. You can change it, but not dynamically. Doesn't work that way. But you can still get some screen tearing, and most importantly, you can get input lag because the GPU is rendering frames faster than the monitor can handle.

Adaptive V-Sync, simply removes the restriction on frame rates when it drops below 60. So instead of hard locking your game into either 60FPS or 30FPS, if it drops below, it simply reacts like V-sync isn't enabled, so it can run at 45FPS instead of being locked to intervals of 30. Again, this has nothing to do with the monitor and can not control it. Why do you even think Adaptive V-sync can change the monitor's refresh rate? How do you expect Adaptive V-sync to change the monitor's refresh rate - on the fly - to 45Hz? It doesn't and it physically can't. There is no standard that allows that to happen. There is no protocol that allows that to happen.

That is what G-Sync is. Maybe one of the consortiums can come up with a hardware agnostic standard...at some point. We don't know when, or if any of the consortiums even care. So please, for the love of god, stop making baseless (and wholly inaccurate and ignorant) assumptions. There isn't a single freaking monitor that works the way you describe these days and no firmware update is going to change that. Ever.
Well quite clearly, you're the one playing dumb, seeing how you have yet to explain to me why someone running a 120Hz or even a 75Hz monitor, would benefit from DROPPING the refresh rate to the frame rate of the game. Can you point out to me, a single person suffering from their monitor's higher refresh rate in any games that never even exceed it? It has never been a problem and Nvidia are yet again trying to fix a problem that never even existed in the first place, and quite clearly you are being ignorant by ignoring simple facts and have not had any experience with what causes tearing.

If a monitor is running at a refresh rate above the framerate of the GPU, unless the monitor does some image post-processing, scaling or duplicates frames (like those 240Hz TVs), the monitor will only draw the frames it has. End of. That renders this G-SYNC gimmick worthless because it is trying to show a problem that was never there in the first place. Whether your monitor runs at 30Hz or 120Hz it will only draw the frames that it has -- if it is less than the refresh rate, it won't affect the monitor either way.

The ONLY problem that currently exists that has anything remotely to do with monitors is that when an old game runs too fast, you have to choose between A. running at a higher framerate and experiencing tearing or B. Capping the framerate with vsync. The biggest problem with Vsync is that it drops GPU utilization to the point where a GPU can barely distinguish between idle and 3D load and this is a problem that occurs only with Nvidia cards, even on single GPU setups (because they have too many clock profiles to switch between), which causes stuttering. AMD doesn't have this problem because they have idle, 2D (Blu-ray) and 3D load clock profiles, nothing more. This gimmick does nothing whatsoever to fix that, and every Nvidia GPU up to this point, from 200-700 series has had this problem, and Maxwell will continue to have it until they address every affected game individually in the drivers 1-2 years after initial release. My GTX 285 had this problem, my 460 had this problem until they fixed most of them 2 years back, and my 660 had this problem, until I returned it. When they can work out a way to scale their GPU cores/clusters to imitate old cards, they will solve the problem instantly. This G-SYNC crap does not affect this problem in either a positive or a negative way, therefore it is worthless (even more so to 120Hz/144Hz fast gaming monitor users). By all means feel free to explain any benefits from this tech that I am not seeing.

by: Assimilator
Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.
Please give it a rest, bud. This G-Sync and Shield (the joke of an Android tablet slapped onto a 360 controller, with about 30 games on its support list, which almost nobody outside of North America even knows or gives a single shit about) are not innovations in the slightest, and they are exactly why Nvidia is slowly losing its consumer GPU market share to AMD, as well as the reason why hardcore PC gamers buying into this crap will continue to get ridiculed by our casual PC and console gaming brethren. Instead of investing in features that matter, they continue churning out more pricey gimmicks. If that is what you're into, more power to you; continue buying Nvidia. I, for one, see these "innovations" as gimmicks that add no value whatsoever to their GPUs or to anything else employing this sort of tech, which will come at a premium because of it in comparison to AMD.

Nvidia (and AMD) deserve praise for a lot of things -- G-Sync and Shield are not among them.
#19
1d10t
by: The Von Matrices
Thank you. It seems as if for some people NVidia producing any technology first makes it inherently evil. I don't care who invented it, I support the technology. Kudos to NVidia for doing it first.
For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:
Just because no one else supports it yet doesn't mean that no one else ever will. Someone has to be first. Most standards aren't designed by committee but by everyone adopting one manufacturer's implementation for simplicity.
If G-Sync tries to manipulate the TMDS and DDC clocks over DisplayPort, there's a high probability this will leave DPCP (DisplayPort Content Protection) exposed. Now that will be unpleasant for some. Unless G-Sync communicates over the two return channels, TMDS and DDC, this will cripple frame-sequence rendering across the two channels. And let me guess: this will not work with SLI and/or stereoscopic 3D.

by: Assimilator
Innovations like SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.
Ah yes... that's great. Let me ask a simple question: do you own an nVidia Shield, or have you at least tried one? Do you use Android phones? Have you ever tried running games on the much cheaper Google Nexus 4?
#20
qubit
Overclocked quantum bit
Thinking about it a little more: while this reduces lag, you can't eliminate it as nvidia claims, and here's why.

At the moment, whether vsync is on or off, once the GPU has rendered the frame, it doesn't get displayed until the next monitor refresh (this is independent of the refresh frequency, of course) so you get lag. The amount of lag will vary too, depending when in that cycle the GPU has rendered the frame. If vsync is off, then you'll get tearing and stutters, regardless of whether the GPU is rendering faster or slower than the refresh rate. Remember there's still lag in the system due to the rendering time of the GPU and general propagation delay through the controller and also the rest of the computer.

Now you turn G-Sync on, what happens? The monitor waits for the GPU, not refreshing until it gets the command from the GPU to do so. This results in the frame being displayed the instant it's ready. Lag may be reduced, but not eliminated. Tearing and stuttering will be eliminated, however, because the monitor will be displaying every frame and, crucially, only ever doing so once.

Lag isn't eliminated, because the GPU still requires time to build the frame. Imagine an extreme case where the GPU is slowed down to around 15-20fps (can easily happen). You'll still have lag corresponding to this variable frame rate and therefore the game will feel horribly laggy. Responsiveness to your controls will be improved however since the monitor displays the frame as soon as it's ready, but more importantly perhaps there will be no tearing or stutters, which is a significant benefit, because both of these effects look bloody awful. NVIDIA obviously gets this.

The only thing I wonder is if the player will notice dynamic artifacts with the game responsiveness, since the frame rate and synced refresh rate vary continuously and significantly? It might lead to some weird effect where the player feels disoriented perhaps? Maybe even inducing nausea in some people? I don't know, but this is something to look out for and I will when I get my hands on some demo hardware. NVIDIA are obviously not gonna tell you about this in a press release, lol.

So, in short, while I can see this technology improving the game play experience, there's still no substitute for putting out frames as fast as possible. I'm currently gaming at 120Hz with very few dropped frames which makes a world of difference over 60Hz. (Adding LightBoost strobing to the mix with its blur elimination then takes this experience to another level altogether). That doesn't change with G-Sync. It would be really interesting to get my hands on demo hardware and see G-Sync for myself.
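Putting rough numbers on the argument above (my own toy model, not NVIDIA's figures): total lag is render time plus however long the finished frame waits for a refresh, and G-Sync removes only the waiting term.

```python
def lag_fixed_ms(render_ms, refresh_hz=60):
    """Fixed refresh: render time plus the expected wait for the next
    tick (half an interval on average, for uniformly arriving frames)."""
    return render_ms + (1000.0 / refresh_hz) / 2

def lag_gsync_ms(render_ms):
    """Synced refresh: the waiting term vanishes; render time remains."""
    return render_ms

# At 15 fps the render term alone is ~66.7 ms -- still horribly laggy,
# which is why syncing reduces lag but cannot eliminate it.
```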

by: Am*
Well quite clearly, you're the one playing dumb, seeing how you have yet to explain to me why someone running a 120Hz or even a 75Hz monitor, would benefit from DROPPING the refresh rate to the frame rate of the game. Can you point out to me, a single person suffering from their monitor's higher refresh rate in any games that never even exceed it? It has never been a problem and Nvidia are yet again trying to fix a problem that never even existed in the first place, and quite clearly you are being ignorant by ignoring simple facts and have not had any experience with what causes tearing.

If a monitor is running at a refresh rate above the framerate of the GPU, unless the monitor does some image post-processing, scaling or duplicates frames (like those 240Hz TVs), the monitor will only draw the frames it has. End of. That renders this G-SYNC gimmick worthless because it is trying to show a problem that was never there in the first place. Whether your monitor runs at 30Hz or 120Hz it will only draw the frames that it has -- if it is less than the refresh rate, it won't affect the monitor either way.
I don't think you properly understand what G-Sync does on a technical level, or what causes tearing and stutters in the first place. Remember, the only way* to remove stutters is to have the GPU and monitor synced; it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor with the GPU, i.e. the opposite way round to how it's done now.

Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.

*Not strictly true, sort of. If I run an old game which is rendering at several hundred fps and leave vsync off, then it looks perfectly smooth, with very little judder or tearing - and noticeably less lag. It seems that brute forcing the problem with a large excess number of frames per second rendered by the GPU can really help, even though those frames are variable.
#21
shb-
by: RejZoR
Clearly neither do you then. Adaptive V-Sync is there to:
a) remove image tearing
b) avoid halving the framerate when FPS drops below a certain level, like normal V-Sync does
Nvidia page:
NVIDIA's Adaptive VSync fixes both problems by unlocking the frame rate when below the VSync cap, which reduces stuttering, and by locking the frame rate when performance improves once more, thereby minimizing tearing.
Below 60 (120) fps you get no stuttering (basically vsync turns off), but see tearing. Above 60 (120) fps vsync is on as usual, resulting in no tearing and no visible stutter.
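The rule itself is tiny. A sketch of the behaviour described above (not NVIDIA's actual driver logic):

```python
def adaptive_vsync(fps, refresh_hz=60):
    """Adaptive VSync: sync (cap) when the GPU can keep up with the
    refresh rate, otherwise run unsynced to avoid the 30 fps lock."""
    if fps >= refresh_hz:
        return ("vsync_on", refresh_hz)  # no tearing, frame rate capped
    return ("vsync_off", fps)            # tearing possible, but 45 stays 45
```

So 45 fps stays 45 fps instead of being hard-locked down to 30.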

Really, why would nvidia develop something that's already solved, and a company like ASUS join them? Nobody's that stupid. Also, this tech is praised by various tech journalists and devs like John Carmack who have seen it in action.

The only sh1tty thing about this is vendor lock-in. We need this for AMD and Intel too, on every TV and monitor. Let's hope nvidia won't be stupid and will open it up.
#22
tigger
I'm the only one
by: qubit
Thinking about it a little more, while this reduces lag, you can't eliminate it like nvidia claims and here's why.

At the moment, whether vsync is on or off, once the GPU has rendered the frame, it doesn't get displayed until the next monitor refresh (this is independent of the refresh frequency, of course) so you get lag. The amount of lag will vary too, depending when in that cycle the GPU has rendered the frame. If vsync is off, then you'll get tearing and stutters, regardless of whether the GPU is rendering faster or slower than the refresh rate. Remember there's still lag in the system due to the rendering time of the GPU and general propagation delay through the controller and also the rest of the computer.

Now you turn G-Sync on. What happens? The monitor waits for the GPU, not refreshing until it gets the command from the GPU to do so. This results in the frame being displayed the instant it's ready. Lag may be reduced, but not eliminated. Tearing and stuttering will be eliminated, however, because the monitor will be displaying every frame, and crucially only ever doing so once.

Lag isn't eliminated, because the GPU still requires time to build the frame. Imagine an extreme case where the GPU slows down to around 15-20fps (which can easily happen). You'll still have lag corresponding to this variable frame rate, and therefore the game will feel horribly laggy. Responsiveness to your controls will be improved, however, since the monitor displays the frame as soon as it's ready; but perhaps more importantly, there will be no tearing or stutters, which is a significant benefit, because both of these effects look bloody awful. NVIDIA obviously gets this.
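
The lag component being discussed — waiting for the next fixed refresh versus refreshing the instant a frame is ready — can be put in numbers. A minimal sketch of my own (illustrative only, not NVIDIA's implementation), assuming a 60 Hz panel:

```python
import math

REFRESH_HZ = 60.0
SCANOUT = 1.0 / REFRESH_HZ  # ~16.7 ms between fixed refreshes

def display_lag_fixed(render_done: float) -> float:
    """Extra wait (seconds) until the next fixed refresh, vsync-style."""
    next_refresh = math.ceil(render_done / SCANOUT) * SCANOUT
    return next_refresh - render_done

def display_lag_gsync(render_done: float) -> float:
    """With G-Sync the panel refreshes when the frame is ready."""
    return 0.0  # render time itself still remains, as noted above

# A frame finishing 20 ms into the timeline waits ~13.3 ms for refresh:
print(round(display_lag_fixed(0.020) * 1000, 1))  # 13.3
print(display_lag_gsync(0.020))                   # 0.0
```

The render time (the 15-20fps worst case above) is untouched by either path, which is why lag shrinks but never disappears.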

The only thing I wonder is whether the player will notice dynamic artifacts in the game's responsiveness, since the frame rate and synced refresh rate vary continuously and significantly. It might lead to some weird effect where the player feels disoriented, perhaps even inducing nausea in some people? I don't know, but this is something to look out for, and I will when I get my hands on some demo hardware. NVIDIA are obviously not gonna tell you about this in a press release, lol.

So, in short, while I can see this technology improving the game play experience, there's still no substitute for putting out frames as fast as possible. I'm currently gaming at 120Hz with very few dropped frames which makes a world of difference over 60Hz. (Adding LightBoost strobing to the mix with its blur elimination then takes this experience to another level altogether). That doesn't change with G-Sync. It would be really interesting to get my hands on demo hardware and see G-Sync for myself.



I don't think you properly understand what G-Sync does on a technical level, or what causes tearing and stutters in the first place. Remember, the only way* to remove stutters is to have the GPU and monitor synced; it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor with the GPU, i.e. the opposite way round to how it's done now.

Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.

*Not strictly true, sort of. If I run an old game which is rendering at several hundred fps and leave vsync off, then it looks perfectly smooth, with very little judder or tearing - and noticeably less lag. It seems that brute forcing the problem with a large excess number of frames per second rendered by the GPU can really help, even though those frames are variable.
Couple of interesting posts mate, much more so than the rest of the fan boy crap and uninformed arguing from others.
Posted on Reply
#23
qubit
Overclocked quantum bit
by: Frick
EDIT: And I just can't fathom the depths to which this place has plunged. All this rage.. For something they have not seen irl. And if this does what it says it does, you do have to see it irl before you can pass judgement.
+1 there. Why do things like this induce such foaming at the mouth? It's fucking ridiculous. If someone doesn't want it, then just don't buy it. No one's forcing them.

by: tigger
Couple of interesting posts mate, much more so than the rest of the fan boy crap and uninformed arguing from others.
+1 again.

Another technical thing I've just thought of about G-Sync.

Regardless of how moving pictures are being displayed, they are still sampled, just like audio. This means that the Nyquist limit or Nyquist frequency applies.

Hence, for fast-moving objects, e.g. during frenetic FPS gaming, you want that limit to be as high as possible, since an object moving fast enough will not just be rendered with only a few frames, but will display sampling artefacts similar to the "reverse spokes" effect in cowboy movies of old. In gaming, you may not even see the object, or it may appear in completely the wrong place and, of course, be heavily lagged. If the GPU drops to its minimum of 30fps, then you can bet you'll see this effect, and in a twitchy FPS that can easily mean the difference between fragging or being fragged.
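
The "reverse spokes" effect described above is frequency folding: anything rotating faster than half the sample rate aliases to a lower (possibly negative, i.e. backwards) apparent frequency. A quick sketch of my own to illustrate, using the standard folding rule:

```python
def apparent_rotation_hz(true_hz: float, sample_hz: float) -> float:
    """Aliased frequency seen when sampling a rotation at sample_hz fps."""
    folded = true_hz % sample_hz
    # Frequencies above half the sample rate fold back and reverse sign.
    return folded if folded <= sample_hz / 2 else folded - sample_hz

print(apparent_rotation_hz(50.0, 60.0))  # -10.0: appears to spin backwards
print(apparent_rotation_hz(25.0, 60.0))  # 25.0: below Nyquist, faithful
```

At 30fps the foldover point is only 15 Hz, so fast on-screen motion aliases much more readily than at 120fps — which is the point being made here.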

So again, while G-Sync looks like a great innovation to me, there remains no substitute for a high framerate as well.
Posted on Reply
#24
jagd
Which innovations? A company acting like a headless chicken because it's losing its main market (GPUs)? NVIDIA is having a hard time between AMD APUs, rising tablets and smartphones, dropping PC sales, etc., and is trying to find new markets, nothing more, nothing less.

Btw, who are you to decide about Mantle and TrueAudio on behalf of all people on earth? I don't remember making you spokesperson :slap: What you don't get is that I want Microsoft-free gaming = no DirectX, FYI

by: Assimilator
Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.
Posted on Reply
#25
remixedcat
It's all frothing rabid fandogs that see the "enemy" release something innovative their faction doesn't have, and they attack and hate it even though they haven't seen it, used it, or understood it.
Posted on Reply
Add your own comment