Friday, October 18th 2013

NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Solving the decades-old problem of onscreen tearing, stuttering and lag, NVIDIA today unveiled NVIDIA G-SYNC technology which, for the first time, enables perfect synchronization between the GPU and the display. The result is consistently smooth frame rates and ultrafast response not possible with previous display technologies.

Several years in the making, G-SYNC technology synchronizes the monitor's refresh rate to the GPU's render rate, so images display the moment they are rendered. Scenes appear instantly. Objects are sharper. Game play is smoother.

"Our commitment to create a pure gaming experience led us to G-SYNC," said Jeff Fisher, senior vice president of the GeForce business unit at NVIDIA. "This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-SYNC monitor, you'll never want to go back."

Since their earliest days, displays have had fixed refresh rates -- typically 60 times a second (60 Hz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. Because the GPU's varying output is out of step with the monitor's fixed refresh, new frames arrive mid-scan and persistent tearing occurs. Turning on V-SYNC (or Vertical-SYNC) eliminates tearing but causes increased lag and stutter as the GPU and monitor refresh at different rates.

G-SYNC eliminates this tradeoff. It perfectly syncs the monitor to the GPU, regardless of frame rate, leading to uncompromised PC gaming experiences.
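To make the timing trade-off concrete, here is a rough, illustrative Python sketch (not NVIDIA code; the frame times and layout are invented for the example) of when frames become visible on a fixed 60 Hz display with V-SYNC versus a display whose refresh follows the GPU, as G-SYNC is described as doing:

```python
# Illustrative only: when does each frame appear on screen?
# V-SYNC on a fixed 60 Hz display vs. a refresh that follows the GPU.
import math

REFRESH = 1 / 60  # fixed refresh interval in seconds (16.7 ms)

# Hypothetical, varying GPU frame times in seconds
frame_times = [0.012, 0.020, 0.015, 0.030, 0.014]

t = 0.0
print("frame  ready (ms)  shown w/ V-SYNC (ms)  shown w/ variable refresh (ms)")
for i, ft in enumerate(frame_times):
    t += ft                                            # GPU finishes the frame
    vsync_shown = math.ceil(t / REFRESH) * REFRESH     # wait for the next fixed tick
    variable_shown = t                                 # display updates immediately
    print(f"{i:5d}  {t*1000:10.1f}  {vsync_shown*1000:20.1f}  {variable_shown*1000:29.1f}")
# Every frame that misses a 16.7 ms boundary is held back under V-SYNC;
# with a variable refresh there is no boundary to miss.
```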

G-SYNC technology includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. (A full list of GPUs that support NVIDIA G-SYNC is available here.)

Leading Game Developers Blown Away
Game developers have quickly embraced the benefits of G-SYNC technology, which enables their games to be played seamlessly.

"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better." - Tim Sweeney, founder, EPIC Games

"NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!" - Johan Andersson, technical director, DICE

"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter." - John Carmack, co-founder, iD Software

Rollout Plans by Monitor Manufacturers
Many of the industry's leading monitor manufacturers have already included G-SYNC technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic.

"ASUS strives to provide the best gaming experience through leading innovations. We are excited about offering NVIDIA's new G-SYNC technology in a variety of ASUS gaming monitors. We are certain that it will impress gamers with its incredible step-up in smoothness and visual quality." - Vincent Chiou, associate vice president, Display Business Unit, ASUS

"We are extremely thrilled to build G-SYNC into our professional gaming monitors. The two together, offering gamers a significant competitive advantage, will certainly take PC gaming experience to a whole new level." - Peter Chen, general manager, BenQ Technology Product Center

"We can't wait to start offering Philips monitors with G-SYNC technology specifically for gamers. We believe that anyone who really cares about their gaming experience is going to want one." - Sean Shih, global product marketing director, TPV Technology, (TPV sells Philips brand monitors)

"Everyone here at ViewSonic is pumped about the great visual experience that G-SYNC delivers. We look forward to making the most of this technology with our award-winning line of gaming and professional displays." - Jeff Volpe, president, ViewSonic

Enthusiasm by System Builders and Integrators
A variety of the industry's leading system builders and integrators are planning to make G-SYNC technology available in the months ahead. Among them are Digital Storm, EVGA, Falcon Northwest, Overlord and Scan Computers.

"A look at G-SYNC is a look at the future of displays. Having to go back to a standard monitor after seeing it will start to annoy you. It's that good." -Kelt Reeves, founder and CEO, Falcon Northwest.

"G-SYNC is a ground-breaking technology that will deliver a flawless gaming experience. We're hugely excited that our customers will be able to enjoy incredible gameplay at any frame rate without tearing or stutter." - Elan Raja III, co-founder, Scan Computers.

147 Comments on NVIDIA Introduces G-SYNC Technology for Gaming Monitors

#1
PopcornMachine
This may be a bit better than AMD droning on about sound, but not much.

I remain disappointed by the lack of innovation relevant to the consumer's current sound system and monitor.

In the end, all we really have received from these two esteemed institutions is re-branded cards.

...
#2
MikeMurphy
Neat idea, but I'd rather spend that money on a video card that will do 60fps at my desired resolution.
#3
Wile E
Power User
by: NeoXF
Yeah, 'kay. You've obviously got everything figured out. And the only real problem in your life seems to be that you need a better-paying job. Righahahahahat... :roll:


Again, open or bust.

It might have taken a lot of years, but Microsoft is finally on the way out as the de facto PC gaming platform... and why? Yeah, I'll let someone else figure this one out.


But I digress, we shall see. I'm a long way from getting a gaming monitor anytime soon either way.
Almost everything that's open started as a response to a proprietary tech. For example, OpenCL wouldn't be nearly as far along as it is if it weren't for CUDA.

Innovation is innovation, and should be respected as such. This little piece of tech, if it proves worthwhile, will spawn further ideas and refinements, and maybe even an open standard.
#4
Solaris17
Creator Solaris Utility DVD
So, as an NVIDIA supporter, this is the most ridiculous thing ever. Why pay the $$? What's the failure rate? Does it work on the good panels? What's the color repro like? Why not get a 120Hz or 240Hz monitor and not give a f@#$?
#5
Wile E
Power User
by: Solaris17
So, as an NVIDIA supporter, this is the most ridiculous thing ever. Why pay the $$? What's the failure rate? Does it work on the good panels? What's the color repro like? Why not get a 120Hz or 240Hz monitor and not give a f@#$?
This is to eliminate stutter AND tearing. 120Hz or 240Hz does you no good if you can't run those framerates in your games. Run below that, and you get stuttering and tearing.
#6
Steevo
My TV already does frame scaling and interpolation to reduce tearing. Even at 13FPS when I have overloaded my GPU I barely see it.


This doesn't fix stutter or GPU stalls, it covers them up like bad lag by redrawing old frames.
#7
qubit
Overclocked quantum bit
by: Steevo
This doesn't fix stutter or GPU stalls, it covers them up like bad lag by redrawing old frames.
It doesn't cover things up, it fixes stutter and lag. You misunderstand how it works.

I made three previous posts in this thread explaining the technical aspects of how G-Sync works which should clear that up for you if you'd like to check them out.

If you get hitching without it, then you'll still get it with this too. It's not made for fixing that, which is caused by lots of things, quite often by other background software running on the PC, such as a virus scanner kicking in at the wrong moment, Windows paging, etc.
#8
1d10t
by: qubit
It doesn't cover things up, it fixes stutter and lag. You misunderstand how it works.

I made three previous posts in this thread explaining the technical aspects of how G-Sync works which should clear that up for you if you'd like to check them out.

If you get hitching without it, then you'll still get it with this too. It's not made for fixing that, which is caused by lots of things, quite often by other background software running on the PC, such as a virus scanner kicking in at the wrong moment, Windows paging, etc.
As in my previous post, such a thing already exists in consumer electronics. LG introduced the Dynamic Motion Clarity Index and Samsung had Clear Motion Rate in late 2010. Both are capable of adjusting basic timing to the nearest standard (NTSC, PAL or SECAM) regardless of the source. They also have a Motion Estimation Motion Compensation (MEMC) mechanism, "similar" to nVidia's LightBoost. Other features like Motion Interpolation (MI) can double or quadruple any source's refresh rate to match the characteristics of the Film Patterned Retarder (FPR) 3D effect.

Fundamentally speaking, there is a major difference between desktop and consumer electronics. There's no GPU in consumer electronics, so the source stays locked to NTSC 29.97 fps or PAL 25 fps. There are only two factors to keep up with, rather than the "virtually infinite" range on the desktop. Most HDTVs have to deal with various inputs such as A/V, Composite, S-Video, SCART, RGB D-Sub and HDMI, which have different clocks. AFAIK, desktop only has three clocks -- 30Hz, 60Hz and the recently introduced 4K's 33Hz -- spread across DVI, HDMI and DisplayPort.

I stated before that I have an LG panel that does the 240Hz FPR 3D effect. Yet I admit there is severe judder if I crank motion interpolation to the max. That's only natural for an HDTV, which can't handle the massive task of rendering a full-white or pitch-black frame in RGB mode with MEMC and MI switched on. In addition, my panel doesn't carry EDID, DDC or desktop-like timings and clocks, so it only does 120Hz or 240Hz interpolation mode.

Bottom line, it's good to see nVidia is aware of these matters. G-Sync could be a game changer for the desktop to get advanced MEMC and MI. But then again, don't expect LG and Samsung not to bring their tech to the desktop at a reasonable price.
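For readers unfamiliar with the TV-side features mentioned above, here is a deliberately naive Python sketch of motion interpolation (the MI part): it manufactures in-between frames so a lower-rate source can feed a 120Hz or 240Hz panel. The helper name and the frames are made up for illustration; real MEMC engines estimate per-block motion vectors rather than blending, and this is not how G-SYNC works, since G-SYNC changes the refresh timing instead of inventing frames.

```python
# Naive stand-in for TV-style motion interpolation (MI): generate extra
# frames by blending neighbours. Real MEMC warps pixels along estimated
# motion vectors; this simple cross-fade only shows where the inserted
# frames come from, not how good they look.
import numpy as np

def naive_interpolate(frame_a: np.ndarray, frame_b: np.ndarray, n_inserted: int = 1):
    frames = []
    for k in range(1, n_inserted + 1):
        w = k / (n_inserted + 1)
        frames.append(((1 - w) * frame_a + w * frame_b).astype(frame_a.dtype))
    return frames

# Example: one inserted frame per source pair turns a 60 Hz feed into 120 Hz.
a = np.zeros((4, 4), dtype=np.float32)   # hypothetical dark frame
b = np.ones((4, 4), dtype=np.float32)    # hypothetical bright frame
print(naive_interpolate(a, b)[0])        # every pixel 0.5: halfway between the two
```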
#9
qubit
Overclocked quantum bit
by: 1d10t
As in my previous post, such a thing already exists in consumer electronics. LG introduced the Dynamic Motion Clarity Index and Samsung had Clear Motion Rate in late 2010. Both are capable of adjusting basic timing to the nearest standard (NTSC, PAL or SECAM) regardless of the source. They also have a Motion Estimation Motion Compensation (MEMC) mechanism, "similar" to nVidia's LightBoost. Other features like Motion Interpolation (MI) can double or quadruple any source's refresh rate to match the characteristics of the Film Patterned Retarder (FPR) 3D effect.

Fundamentally speaking, there is a major difference between desktop and consumer electronics. There's no GPU in consumer electronics, so the source stays locked to NTSC 29.97 fps or PAL 25 fps. There are only two factors to keep up with, rather than the "virtually infinite" range on the desktop. Most HDTVs have to deal with various inputs such as A/V, Composite, S-Video, SCART, RGB D-Sub and HDMI, which have different clocks. AFAIK, desktop only has three clocks -- 30Hz, 60Hz and the recently introduced 4K's 33Hz -- spread across DVI, HDMI and DisplayPort.

I stated before that I have an LG panel that does the 240Hz FPR 3D effect. Yet I admit there is severe judder if I crank motion interpolation to the max. That's only natural for an HDTV, which can't handle the massive task of rendering a full-white or pitch-black frame in RGB mode with MEMC and MI switched on. In addition, my panel doesn't carry EDID, DDC or desktop-like timings and clocks, so it only does 120Hz or 240Hz interpolation mode.

Bottom line, it's good to see nVidia is aware of these matters. G-Sync could be a game changer for the desktop to get advanced MEMC and MI. But then again, don't expect LG and Samsung not to bring their tech to the desktop at a reasonable price.
I'm not sure why you think this already exists? This has not been done before in any commercial product.

G-Sync synchronizes the monitor dynamically to the variable GPU framerate which changes from instant to instant, or from one frame to another. This feature is irrelevant for TVs, which are just reproducing video, hence just need to run at a fixed rate to match the video source.

It's very different with video cards, because here you have interaction between what the user does and what appears on the screen. In this case, things like lag, refresh rate, stutter and screen tearing are very important, and G-Sync addresses all these things simply by syncing the monitor to the varying framerate of the graphics card rather than the other way around, syncing the graphics card to the fixed framerate of the monitor, as happens now.
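A minimal sketch of the inversion being described here, with sleeps standing in for GPU work and scan-out. Everything below is simulated (the loop names and timings are made up for the example, not driver or monitor code): with V-SYNC the render loop waits for the display's fixed tick, while with a variable refresh the display effectively waits for the renderer.

```python
# Simulated only: who waits for whom?
import random
import time

def render_frame():
    # Stand-in for GPU work with a variable frame time (10-30 ms here).
    time.sleep(random.uniform(0.010, 0.030))

def vsync_loop(refresh_hz=60, frames=5):
    """Traditional V-SYNC: the renderer waits for the display's fixed tick."""
    period = 1 / refresh_hz
    next_tick = time.perf_counter()
    for _ in range(frames):
        render_frame()
        next_tick += period
        wait = next_tick - time.perf_counter()
        if wait > 0:
            time.sleep(wait)   # GPU idles until the monitor's next refresh
        # if wait <= 0 the frame missed a refresh entirely: visible stutter

def variable_refresh_loop(frames=5):
    """Variable refresh: the display updates whenever a frame is ready."""
    for _ in range(frames):
        render_frame()
        # scan-out happens now; there is no fixed tick to wait for

vsync_loop()
variable_refresh_loop()
```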
#10
The Von Matrices
by: qubit
I'm not sure why you think this already exists? This has not been done before in any commercial product.
I think he's confusing the refresh rate with the TMDS clock rate, which will vary depending on resolution.
#11
Steevo
by: qubit
It doesn't cover things up, it fixes stutter and lag. You misunderstand how it works.

I made three previous posts in this thread explaining the technical aspects of how G-Sync works which should clear that up for you if you'd like to check them out.

If you got hitching without it, then you'll get this with it too. It's not made for fixing that, which is caused by lots of things, quite often by other background software running on the PC such as a virus scanner kicking in at the wrong moment, Windows paging etc.
So how does it fix stutter and lag?

A) By redrawing old frames on the screen.
B) Magic and pixie dust.



Don't know about you, but I am going with A here.


So let's do some math.

60FPS = .01667 rounded seconds per frame.

If a gamer notices the lack of updated frame(s) being displayed because the "frame buffer" doesn't get updated with the next frame, it's "lag". http://www.hdtvtest.co.uk/news/input-lag All TVs and monitors have some native lag; this doesn't remove that, so don't be confused on that point.

The lag or stutter here is http://www.mvps.org/directx/articles/fps_versus_frame_time.htm -- an increase in the number of milliseconds it takes to render one or more of the frames to the output frame buffer.

The G-Spot ruiner or whatever they want to call it can only work one of two ways to do what they are claiming.

A) Hold frames in a secondary buffer and display them at a set frequency that matches Vsync or whatever display rate it chooses. They can do things like what most TVs do, interpolating the two frames with motion-adaptive/vector blending to create a new frame.

B) Or they can keep rerendering the frame and keep the pixels driven for more ms with the same data, which by definition is lag.....so.....yeah


There is this constant that we experience, called time, and unless they have started using quantum technology and keep us frozen in time while we wait for the GPU to render more fresh frames all we are going to see is lag or blur. Your choice, but now apparently you can pay for it.

http://en.wikipedia.org/wiki/Multiple_buffering

Triple buffer with Vsync.......been there done that, it was crap and caused lag.
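The disagreement in this thread turns on the fixed-refresh arithmetic above. Reusing the same wait-for-the-next-tick logic as the earlier sketch, here is what a steady 45 fps (a rate that doesn't divide evenly into 60 Hz ticks) looks like in terms of how long each frame stays on screen; the rate and figures are illustrative only:

```python
# Illustrative: frame cadence at a steady 45 fps on a fixed 60 Hz display
# vs. a display whose refresh follows the GPU. Figures are hypothetical.
import math

REFRESH = 1 / 60   # 16.7 ms fixed refresh tick
FRAME   = 1 / 45   # 22.2 ms per rendered frame

shown_prev = 0.0
print("frame  time on screen @ 60 Hz (ms)  @ variable refresh (ms)")
for i in range(1, 7):
    ready = i * FRAME
    # small epsilon guards against floating-point landing just past a tick
    shown = math.ceil(ready / REFRESH - 1e-9) * REFRESH
    print(f"{i:5d}  {(shown - shown_prev)*1000:27.1f}  {FRAME*1000:22.1f}")
    shown_prev = shown
# The 60 Hz column mixes 16.7 ms and 33.3 ms presentations (the visible judder);
# the variable-refresh column is a uniform 22.2 ms.
```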
#12
Wile E
Power User
by: Steevo
So how does it fix stutter and lag?

A) By redrawing old frames on the screen.
B) Magic and pixie dust.



Don't know about you, but I am going with A here.


So let's do some math.

60FPS = .01667 rounded seconds per frame.

If a gamer notices the lack of updated frame(s) being displayed because the "frame buffer" doesn't get updated with the next frame, it's "lag". http://www.hdtvtest.co.uk/news/input-lag All TVs and monitors have some native lag; this doesn't remove that, so don't be confused on that point.

The lag or stutter here is http://www.mvps.org/directx/articles/fps_versus_frame_time.htm -- an increase in the number of milliseconds it takes to render one or more of the frames to the output frame buffer.

The G-Spot ruiner or whatever they want to call it can only work one of two ways to do what they are claiming.

A) Hold frames in a secondary buffer and display them at a set frequency that matches Vsync or whatever display rate it chooses. They can do things like what most TVs do, interpolating the two frames with motion-adaptive/vector blending to create a new frame.

B) Or they can keep rerendering the frame and keep the pixels driven for more ms with the same data, which by definition is lag.....so.....yeah


There is this constant that we experience, called time, and unless they have started using quantum technology and keep us frozen in time while we wait for the GPU to render more fresh frames all we are going to see is lag or blur. Your choice, but now apparently you can pay for it.
Or, it can alter the refresh rate of the monitor to match the output it's receiving from the video card, eliminating the need to hold back or duplicate frames, which is exactly what it's claiming to do.
#13
Steevo
by: Wile E
Or, it can alter the refresh rate of the monitor to match the output it's receiving from the video card, eliminating the need to hold back or duplicate frames, which is exactly what it's claiming to do.
So you would like to pay for lag? Seriously, I must be in the wrong business.
#14
Wile E
Power User
by: Steevo
So you would like to pay for lag? Seriously, I must be in the wrong business.
Supposedly it comes with little to no lag hit. I don't know, but I do know I tend to actually see a product in action before I go about making judgments and accusations on it.
#15
Steevo
by: Wile E
Supposedly it comes with little to no lag hit. I don't know, but I do know I tend to actually see a product in action before I go about making judgments and accusations on it.
I know of the time constant, and buffering, and other attempts, and all still have that time issue.


I will also wait to see it in action.
#16
Xzibit
The only thing that's been shown is Nvidia's demo and a slow camera rotation in Tomb Raider with Lara Croft by herself.

We don't even know the specs of the systems they were run on.

Nvidia's Tom Petersen also said it is game-dependent. Some games will not work with it, and 2D is not a focus.

Until someone tests this in several game scenarios, I'll remain skeptical, much like with the 3D Vision Surround craze.
#17
1d10t
by: qubit
I'm not sure why you think this already exists? This has not been done before in any commercial product.

G-Sync synchronizes the monitor dynamically to the variable GPU framerate which changes from instant to instant, or from one frame to another. This feature is irrelevant for TVs, which are just reproducing video, hence just need to run at a fixed rate to match the video source.

It's very different with video cards, because here you have interaction between what the user does and their output on the screen. In this case, things like lag, refresh rate, stutter and screen tearing are very important and G-Sync addresses all these things simply by syncing the monitor to the varying framerate of the graphics card rather than the other way, syncing the graphics card to the fixed framerate of the monitor, as happens now.
Please read my explanation above. I am aware of all the desktop stuff, in a manner of speaking.
If you're trying to distinguish between desktop and consumer electronics, let me ask you a question... have you tried FPR 3D TV, passive 3D TV and nVidia 3D Vision, and can you tell the difference between them?
TV-wise, they dynamically upsample the source to give a 120Hz or 240Hz display, the opposite of G-Sync dynamically downsampling the display to match GPU output.
As for this already having a long history in consumer electronics, have you noticed that no major panel maker such as LG, Samsung or Sharp is supporting this? Only desktop-specific vendors are listed, such as Asus, BenQ, Philips and ViewSonic.
It's only a matter of time before LG and Samsung make a monitor supporting this dynamic downsampling feature; on the other hand, nVidia couldn't make a better dishwasher.

by: The Von Matrices
I think he's confusing the refresh rate with the TMDS clock rate, which will vary depending on resolution.
Yeah... I don't even know the difference between 60Hz and 60fps... silly me :)
#18
qubit
Overclocked quantum bit
by: Steevo
So how does it fix stutter and lag?

A) By redrawing old frames on the screen.
B) Magic and pixie dust.



Don't know about you, but I am going with A here.


So let's do some math.

60FPS = .01667 rounded seconds per frame.

If a gamer notices the lack of updated frame(s) being displayed because the "frame buffer" doesn't get updated with the next frame, it's "lag". http://www.hdtvtest.co.uk/news/input-lag All TVs and monitors have some native lag; this doesn't remove that, so don't be confused on that point.

The lag or stutter here is http://www.mvps.org/directx/articles/fps_versus_frame_time.htm -- an increase in the number of milliseconds it takes to render one or more of the frames to the output frame buffer.

The G-Spot ruiner or whatever they want to call it can only work one of two ways to do what they are claiming.

A) Hold frames in a secondary buffer and display them at a set frequency that matches Vsync or whatever display rate it chooses. They can do things like what most TVs do, interpolating the two frames with motion-adaptive/vector blending to create a new frame.

B) Or they can keep rerendering the frame and keep the pixels driven for more ms with the same data, which by definition is lag.....so.....yeah


There is this constant that we experience, called time, and unless they have started using quantum technology and keep us frozen in time while we wait for the GPU to render more fresh frames all we are going to see is lag or blur. Your choice, but now apparently you can pay for it.

http://en.wikipedia.org/wiki/Multiple_buffering

Triple buffer with Vsync.......been there done that, it was crap and caused lag.
I'm sorry, but you're wrong on all counts. I've already written an explanation of how this works, so I'm not gonna go round in circles with you trying to explain it again, especially after having seen your further replies since the one to me. Click the link below, where I explained it in three posts (just scroll down to see the other two).

http://www.techpowerup.com/forums/showthread.php?p=3000195#post3000195


Mind you, I like the idea of the pixie dust. ;)


by: 1d10t
Please read my explanation above. I am aware of all the desktop stuff, in a manner of speaking.
If you're trying to distinguish between desktop and consumer electronics, let me ask you a question... have you tried FPR 3D TV, passive 3D TV and nVidia 3D Vision, and can you tell the difference between them?
TV-wise, they dynamically upsample the source to give a 120Hz or 240Hz display, the opposite of G-Sync dynamically downsampling the display to match GPU output.
As for this already having a long history in consumer electronics, have you noticed that no major panel maker such as LG, Samsung or Sharp is supporting this? Only desktop-specific vendors are listed, such as Asus, BenQ, Philips and ViewSonic.
It's only a matter of time before LG and Samsung make a monitor supporting this dynamic downsampling feature; on the other hand, nVidia couldn't make a better dishwasher.
Yes, not only have I tried 3D Vision, but I have it and it works very well too. I think active shutter glasses were around before NVIDIA brought out their version. How is this relevant?

I think the link I posted above for Steevo will help you understand what I'm saying, as well. Again though, G-Sync is a first by NVIDIA, as it's irrelevant for watching video. It's the interaction caused by gaming that makes all the difference.
#19
markybox
by: qubit
Couple of things that might be worse though are motion blur and the shape distortion* of moving objects, both of which are currently fixed by nvidia's LightBoost strobing backlight feature which my monitor has. The PR doesn't mention LightBoost anywhere
Are you aware that G-SYNC has a superior, low-persistence strobe backlight mode? Google for it -- pcper report, Blur Busters article, John Carmack youtube, etc.

So G-SYNC includes an improved LightBoost sequel, officially sanctioned by nVIDIA (but not yet announced). Probably with better colors (and sharper motion than LightBoost=10%), and very easily enabled via OSD menus.

You can only choose G-SYNC mode, or strobed mode, though.
But I'm happy it will be even better than LightBoost, officially for 2D mode.
This is NVIDIA's secret weapon. Probably has an unannounced brand name.
#20
qubit
Overclocked quantum bit
by: markybox
Are you aware that G-SYNC has a superior, low-persistence strobe backlight mode? Google for it -- pcper report, Blur Busters article, John Carmack youtube, etc.

So G-SYNC includes an improved LightBoost sequel, officially sanctioned by nVIDIA (but not yet announced). Probably with better colors (and sharper motion than LightBoost=10%), and very easily enabled via OSD menus.

You can only choose G-SYNC mode, or strobed mode, though.
But I'm happy it will be even better than LightBoost.
Yes, it's obvious that at the lower frame rates, a strobed backlight would be highly visible and intensely annoying.

It'll be interesting to see exactly how NVIDIA addresses this. Got a link?
#21
markybox
by: qubit
Yes, it's obvious that at the lower frame rates, a strobed backlight would be highly visible and intensely annoying.
It'll be interesting to see exactly how NVIDIA addresses this. Got a link?
It's an "either-or" proposition, according to John Carmack:
CONFIRMED: nVidia G-SYNC includes a strobe backlight upgrade!

It is currently a selectable choice:
G-SYNC Mode: Better for variable framerates (eliminate stutters/tearing, more blur)
Strobe Mode: Better for constant max framerates (e.g. 120fps @ 120Hz, eliminates blur)
Also, 85Hz and 144Hz strobing is mentioned on the main page.
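To put rough numbers on the blur trade-off between the two modes, here is a back-of-envelope Python calculation of eye-tracking motion blur. The helper name, the pixel speed and the strobe length are assumptions for illustration, not figures from NVIDIA or from this thread.

```python
# Back-of-envelope only: why a strobed backlight reduces motion blur.
# The object speed and strobe duration below are assumed values.

def blur_px(speed_px_per_s: float, persistence_s: float) -> float:
    """Approximate smear width when the eye tracks an object on a display
    whose pixels stay lit for `persistence_s` out of each frame."""
    return speed_px_per_s * persistence_s

speed = 960.0                 # object crossing the screen at 960 px/s
sample_and_hold = 1 / 120     # full-persistence 120 Hz: lit ~8.3 ms per frame
strobe = 0.0015               # hypothetical 1.5 ms backlight flash per frame

print(f"sample-and-hold @ 120 Hz: ~{blur_px(speed, sample_and_hold):.1f} px of smear")
print(f"strobed backlight       : ~{blur_px(speed, strobe):.1f} px of smear")
```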
#22
qubit
Overclocked quantum bit
by: markybox
It's an "either-or" proposition, according to John Carmack:
CONFIRMED: nVidia G-SYNC includes a strobe backlight upgrade!

It is currently a selectable choice:
G-SYNC Mode: Better for variable framerates (eliminate stutters/tearing, more blur)
Strobe Mode: Better for constant max framerates (e.g. 120fps @ 120Hz, eliminates blur)
Also, 85Hz and 144Hz strobing is mentioned on the main page.
Cheers matey. I'm busy right now, but I'll read it later and get back to you.

I found these two links in the meantime:

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Death-Refresh-Rate

http://www.pcper.com/news/Graphics-Cards/PCPer-Live-NVIDIA-G-Sync-Discussion-Tom-Petersen-QA
#23
erocker
I would love to see Nvidia have these things set up at retailers. The only way I could make a purchasing decision on something like this is to see it running in person.
#24
MxPhenom 216
Corsair Fanboy
by: erocker
I would love to see Nvidia have these things set up at retailers. The only way I could make a purchasing decision on something like this is to see it running in person.
You looking to get the 780Ti now if you are interested in G-SYNC?
#25
erocker
by: MxPhenom 216
You looking to get the 780Ti now if you are interested in G-SYNC?
Maybe. I also want to try out LightBoost. It all depends on price/performance, though; I've been very happy with my 7970, but I'm not partial to any brand. Thing is, it wouldn't make sense for me to buy a new monitor.