Friday, October 18th 2013
NVIDIA Introduces G-SYNC Technology for Gaming Monitors
Solving the decades-old problem of onscreen tearing, stuttering and lag, NVIDIA today unveiled NVIDIA G-SYNC technology which, for the first time, enables perfect synchronization between the GPU and the display. The result is consistently smooth frame rates and ultrafast response not possible with previous display technologies.
Several years in the making, G-SYNC technology synchronizes the monitor's refresh rate to the GPU's render rate, so images display the moment they are rendered. Scenes appear instantly. Objects are sharper. Game play is smoother. "Our commitment to create a pure gaming experience led us to G-SYNC," said Jeff Fisher, senior vice president of the GeForce business unit at NVIDIA. "This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-SYNC monitor, you'll never want to go back."
Since their earliest days, displays have had fixed refresh rates -- typically 60 times a second (hertz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. Because the GPU's render rate rarely matches the monitor's fixed refresh rate, persistent tearing occurs. Turning on V-SYNC (Vertical SYNC) eliminates tearing but causes increased lag and stutter as the GPU and monitor refresh at different rates.
G-SYNC eliminates this tradeoff. It perfectly syncs the monitor to the GPU, regardless of frame rate, leading to uncompromised PC gaming experiences.
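The tradeoff described above can be illustrated with a small simulation (my own sketch with hypothetical frame times, not anything from the release): with fixed-refresh V-Sync on a 60 Hz panel, a finished frame waits for the next refresh, so display times snap to a 16.7 ms grid; with a variable-refresh scheme like G-SYNC, the panel refreshes the moment the frame is ready.

```python
import math

REFRESH_HZ = 60
VBLANK_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between fixed refreshes

def vsync_display_times(frame_times_ms):
    """Fixed-refresh V-Sync: a finished frame waits for the next
    vblank, so display times are quantized to the refresh grid."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        t += ft  # GPU finishes the frame at time t
        shown.append(math.ceil(t / VBLANK_MS) * VBLANK_MS)
    return shown

def variable_refresh_display_times(frame_times_ms):
    """Variable refresh (the G-SYNC idea): the panel refreshes the
    moment the frame is ready, so display times track render times."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        shown.append(t)
    return shown

# A GPU rendering at a steady 50 FPS (20 ms per frame):
frames = [20.0] * 6
vs = vsync_display_times(frames)
vr = variable_refresh_display_times(frames)
vs_gaps = [round(b - a, 2) for a, b in zip(vs, vs[1:])]
vr_gaps = [round(b - a, 2) for a, b in zip(vr, vr[1:])]
print(vs_gaps)  # uneven display intervals -> visible stutter
print(vr_gaps)  # steady 20 ms intervals -> smooth motion
```

Even at a perfectly steady render rate, the fixed-refresh path produces uneven frame-to-frame display intervals (a mix of one- and two-vblank waits), which is exactly the stutter the release describes.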
G-SYNC technology includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. (A full list of GPUs that support NVIDIA G-SYNC is available here.)
Leading Game Developers Blown Away
Game developers have quickly embraced the benefits of G-SYNC technology, which enables their games to be played seamlessly.
"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better." - Tim Sweeney, founder, Epic Games
"NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!" - Johan Andersson, technical director, DICE
"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter." - John Carmack, co-founder, id Software
Rollout Plans by Monitor Manufacturers
Many of the industry's leading monitor manufacturers have already included G-SYNC technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic.
"ASUS strives to provide the best gaming experience through leading innovations. We are excited about offering NVIDIA's new G-SYNC technology in a variety of ASUS gaming monitors. We are certain that it will impress gamers with its incredible step-up in smoothness and visual quality." - Vincent Chiou, associate vice president, Display Business Unit, ASUS
"We are extremely thrilled to build G-SYNC into our professional gaming monitors. The two together, offering gamers a significant competitive advantage, will certainly take PC gaming experience to a whole new level." - Peter Chen, general manager, BenQ Technology Product Center
"We can't wait to start offering Philips monitors with G-SYNC technology specifically for gamers. We believe that anyone who really cares about their gaming experience is going to want one." - Sean Shih, global product marketing director, TPV Technology, (TPV sells Philips brand monitors)
"Everyone here at ViewSonic is pumped about the great visual experience that G-SYNC delivers. We look forward to making the most of this technology with our award-winning line of gaming and professional displays." - Jeff Volpe, president, ViewSonic
Enthusiasm by System Builders and Integrators
A variety of the industry's leading system builders and integrators are planning to make G-SYNC technology available in the months ahead. Among them are Digital Storm, EVGA, Falcon Northwest, Overlord and Scan Computers.
"A look at G-SYNC is a look at the future of displays. Having to go back to a standard monitor after seeing it will start to annoy you. It's that good." - Kelt Reeves, founder and CEO, Falcon Northwest.
"G-SYNC is a ground-breaking technology that will deliver a flawless gaming experience. We're hugely excited that our customers will be able to enjoy incredible gameplay at any frame rate without tearing or stutter." - Elan Raja III, co-founder, Scan Computers.
147 Comments on NVIDIA Introduces G-SYNC Technology for Gaming Monitors
I support technological advancement, whether it is NVidia, AMD, Intel, or some other company, and G-Sync looks promising. This is the type of disruptive technology, like AMD's Eyefinity, that no one saw coming, and it will take a few years before everyone else catches up.
Yeah, nv was first with SLI, but not the first to do it "right". FCAT was just an anti-AMD tool to bottleneck AMD sales. PhysX wasn't even NVIDIA's technology; they just bought it to out-market ATI. Better Linux support? Read this and cry. OpenGL and OpenCL are pretty much AMD territory at this moment.
Then what do you mean by NVIDIA fighting two AMD generations with only one? GF10* (gen 1), GF11* (gen 2). Maxwell is far, FAR away (late 2014), and by then AMD will be hitting its next generation as well.
NVIDIA GeForce GT 630M
it's at the lower end for sure, size-wise
a) remove image tearing
b) doesn't pretty much halve the framerate if FPS drops below a certain level, like normal V-Sync does
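The halving in (b) can be sketched with a toy calculation (my own illustration, assuming classic double-buffered V-Sync, not triple buffering): each frame must be held for a whole number of refresh intervals, so the displayed rate is always an integer divisor of the refresh rate, and rendering just under 60 FPS collapses to 30 FPS.

```python
import math

def effective_vsync_fps(render_fps, refresh_hz=60):
    """Double-buffered V-Sync: a frame occupies a whole number of
    refresh intervals, so the displayed rate is an integer divisor
    of the refresh rate (60, 30, 20, 15 ... FPS on a 60 Hz panel)."""
    if render_fps >= refresh_hz:
        return float(refresh_hz)
    # number of refresh intervals each frame ends up occupying
    intervals = math.ceil(refresh_hz / render_fps)
    return refresh_hz / intervals

# Rendering just under the refresh rate collapses to half of it:
print(effective_vsync_fps(59))  # 30.0
print(effective_vsync_fps(45))  # 30.0
print(effective_vsync_fps(25))  # 20.0
```

With a variable-refresh display there is no such staircase: the displayed rate simply follows the render rate within the panel's supported range.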
So, why do we need G-Sync to remove image tearing again? Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.
High-level APIs work that way (but are slower, e.g. Direct3D); low-level ones don't (and are faster as a result, e.g. Mantle or Glide).
Back then it was possible, but nowadays it would be a lawsuit frenzy. That's why there are no CUDA/PhysX wrappers. Slower/buggy, but at least you weren't locked out of the API entirely.
Nvidia G-Sync FAQ Boo!!! :rolleyes:
and it looks like I might need a new system too, with a Kepler card :banghead:
A Glide wrapper means emulation. Emulation means slow. How exactly does that solve anything? Glide wrappers exist because you could play games that would then look better. Besides, being able to emulate Voodoo-era features doesn't mean you can emulate modern stuff today. Stuff was rather simple back then... Vertex shaders had software emulation, but it was slower. Pixel shaders were not even possible to emulate without getting 1 frame per minute... and neither was available on any Voodoo card...
Mantle is a performance-only thing, and if you negate the boost with emulation, wouldn't it be easier to just stay with Direct3D 11 and remain at square one?
Anyway wasn't qubit going on about something like this some time ago?
Say this like you are teaching it to people at a class and not like you are rabid fandogs or sales people.
No matter what this company is releasing it is always going to get this same hatred. Doesn't matter what the greatest minds of game development think.
G-SYNC board will support Lightboost too.
www.neogaf.com/forum/showpost.php?p=86572603&postcount=539
edit:
30 Hz is the minimum; variable between 30 Hz and 144 Hz (the obvious max limit)
twitter.com/ID_AA_Carmack/status/391300283672051713
Wait, what...$175 for more useless bullshit with a support list of 10-20 games that will never even exceed the refresh rate of a 120Hz monitor, unless they run at lower than garbage settings of a console, and 95% of the time needing to turn off the "exclusive" shit for which you've paid out the ass. :banghead: And people are supporting this shit...if Nvidia (not directed at you BTW) doesn't have the most braindead and loyal fanboys, I don't know what other company does (other than Apple). I would take those days over the price fixing bullshit Nvidia are playing now any day -- at least competition was fierce back then, and you still had various ways to run proprietary tech yourself (with Glide wrappers etc).
Personally, I don't hate nVidia for their "innovation" or "breakthrough" technology. They push gaming to a new level, creating a better experience and better enjoyment. I just hate their prices and the so-called "proprietary" tech. A better DSP doesn't translate to better audio; it's only a small fragment. You need a better amplifier that can decode the DSP's digital signal, and better speakers to translate the analog signal from the amplifier.
FCAT was surely anti-AMD, because AMD uses AFR rather than nVidia's SFR. Look at the bright side: AMD is now working on better drivers to address the frame pacing issues.
I know how hard I try to shut up about things I don't know much about yet. I know what you're saying but:
1. Kepler might not support it, but Maxwell or Volta at the very least could. GPUs are way more complex and adaptable creatures than they used to be back then... and even then:
2. You could use wrappers to run Glide on non-3dfx hardware...
Ah, I still remember how I finally got Carmageddon 2 to work on a glide wrapper on nV hardware, a world of difference in graphics...
Most PC gamers have invested a lot of moola in their monitors (myself included), and if nVidia really wants to see this tech take off, so to speak, they must make it available to as many customers as they can, not just people who invest in new monitors starting next year...
Adaptive V-Sync simply removes the restriction on frame rates when it drops below 60. So instead of hard locking your game into either 60 FPS or 30 FPS, if it drops below, it simply reacts like V-Sync isn't enabled, so it can run at 45 FPS instead of being locked to intervals of 30. Again, this has nothing to do with the monitor and cannot control it. Why do you even think Adaptive V-Sync can change the monitor's refresh rate? How do you expect Adaptive V-Sync to change the monitor's refresh rate - on the fly - to 45 Hz? It doesn't and it physically can't. There is no standard that allows that to happen. There is no protocol that allows that to happen.
That is what G-Sync is. Maybe one of the consortiums can come up with a hardware agnostic standard...at some point. We don't know when, or if any of the consortiums even care. So please, for the love of god, stop making baseless (and wholly inaccurate and ignorant) assumptions. There isn't a single freaking monitor that works the way you describe these days and no firmware update is going to change that. Ever.
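The distinction drawn above can be sketched in a few lines (my own illustration of the described behavior, not NVIDIA's driver logic): Adaptive V-Sync is a driver-side toggle on top of a fixed 60 Hz refresh, while G-Sync varies the panel's refresh interval itself, within the 30-144 Hz range quoted earlier in the thread.

```python
REFRESH_HZ = 60  # the fixed panel refresh assumed in the post above

def adaptive_vsync_mode(render_fps):
    """Adaptive V-Sync as described above: a driver-side toggle.
    The monitor keeps refreshing at a fixed 60 Hz no matter what."""
    if render_fps >= REFRESH_HZ:
        return "vsync on"   # synced, no tearing
    return "vsync off"      # tearing possible, but no 30 FPS lock

def gsync_refresh_interval_ms(render_fps, lo_hz=30, hi_hz=144):
    """G-Sync per the figures quoted in this thread: the panel's
    refresh interval follows the render rate within 30-144 Hz."""
    clamped = max(lo_hz, min(hi_hz, render_fps))
    return 1000.0 / clamped

print(adaptive_vsync_mode(45))                  # panel still refreshes at 60 Hz
print(round(gsync_refresh_interval_ms(45), 2))  # panel refreshes at 45 Hz
```

At 45 FPS, Adaptive V-Sync can only drop the sync and accept tearing; a variable-refresh panel instead shows each frame at a matching 45 Hz interval, which is the capability no fixed-refresh monitor or firmware update can provide.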
I've never set up a WC loop, so you don't see me going into sub-forums and threads about setting up WC loops and throwing around BS beliefs on something I clearly know nothing about.
Who do these people think they are by offering innovation in this field?? Let's boycott Nvidia and burn all their engineers at the stake! And then to have the audacity to sell this technology and make a profit! How dare they?!!
/S :rolleyes:
ok, i have to give you one thing, you can now watch your ass being fragged without stutter :D exactly.. if i decide to please someone, i do it for free..
remember when your momma asked you to take out the litter back then? did you ask like $270 for the service? the whole world will be soon eaten by the false religion that is the economy, you just wait.
Care to explain where G-Sync takes part in this?