NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Discussion in 'News' started by Cristian_25H, Oct 18, 2013.

  1. Cristian_25H

    Cristian_25H News Poster

    Joined:
    Dec 6, 2011
    Messages:
    4,624 (4.36/day)
    Thanks Received:
    1,164
    Location:
    Still on the East Side
    Solving the decades-old problem of onscreen tearing, stuttering and lag, NVIDIA today unveiled NVIDIA G-SYNC technology which, for the first time, enables perfect synchronization between the GPU and the display. The result is consistently smooth frame rates and ultrafast response not possible with previous display technologies.

    Several years in the making, G-SYNC technology synchronizes the monitor's refresh rate to the GPU's render rate, so images display the moment they are rendered. Scenes appear instantly. Objects are sharper. Game play is smoother.

    "Our commitment to create a pure gaming experience led us to G-SYNC," said Jeff Fisher, senior vice president of the GeForce business unit at NVIDIA. "This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-SYNC monitor, you'll never want to go back."

    Since their earliest days, displays have had fixed refresh rates -- typically 60 times a second (60 Hz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. Because the GPU's render rate rarely matches the monitor's fixed refresh, frames reach the display mid-scan and persistent tearing occurs. Turning on V-SYNC (vertical sync) eliminates tearing, but it adds lag and stutter because the GPU must wait for the monitor's next refresh before a new frame can be shown.

    G-SYNC eliminates this tradeoff. It perfectly syncs the monitor to the GPU, regardless of frame rate, leading to uncompromised PC gaming experiences.
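
    To make the timing tradeoff above concrete, here is a minimal sketch (illustrative only, in Python, with made-up frame times -- not NVIDIA's implementation) of when a finished frame actually reaches the screen under fixed-refresh V-SYNC versus a display that refreshes whenever the GPU is done:

    # Illustrative sketch only, not NVIDIA's implementation.
    import math

    REFRESH_MS = 1000.0 / 60.0   # fixed 60 Hz refresh interval (~16.7 ms)

    def present_vsync(done_ms):
        # The frame waits for the next fixed refresh tick -> added latency, uneven pacing.
        return math.ceil(done_ms / REFRESH_MS) * REFRESH_MS

    def present_variable_refresh(done_ms):
        # The display scans out the moment the frame is ready -> no waiting.
        return done_ms

    t = 0.0
    for frame_ms in [14.0, 19.5, 33.0, 16.7, 24.2]:   # hypothetical per-frame render times
        t += frame_ms
        print(f"rendered {t:6.1f} ms | vsync shows it {present_vsync(t):6.1f} ms"
              f" | variable refresh shows it {present_variable_refresh(t):6.1f} ms")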

    G-SYNC technology includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. (A full list of GPUs that support NVIDIA G-SYNC is available here.)

    Leading Game Developers Blown Away
    Game developers have quickly embraced the benefits of G-SYNC technology, which enables their games to be played seamlessly.

    "The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better." - Tim Sweeney, founder, Epic Games

    "NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!" - Johan Andersson, technical director, DICE

    "With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter." - John Carmack, co-founder, id Software

    Rollout Plans by Monitor Manufacturers
    Many of the industry's leading monitor manufacturers have already included G-SYNC technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic.

    "ASUS strives to provide the best gaming experience through leading innovations. We are excited about offering NVIDIA's new G-SYNC technology in a variety of ASUS gaming monitors. We are certain that it will impress gamers with its incredible step-up in smoothness and visual quality." - Vincent Chiou, associate vice president, Display Business Unit, ASUS

    "We are extremely thrilled to build G-SYNC into our professional gaming monitors. The two together, offering gamers a significant competitive advantage, will certainly take PC gaming experience to a whole new level." - Peter Chen, general manager, BenQ Technology Product Center

    "We can't wait to start offering Philips monitors with G-SYNC technology specifically for gamers. We believe that anyone who really cares about their gaming experience is going to want one." - Sean Shih, global product marketing director, TPV Technology, (TPV sells Philips brand monitors)

    "Everyone here at ViewSonic is pumped about the great visual experience that G-SYNC delivers. We look forward to making the most of this technology with our award-winning line of gaming and professional displays." - Jeff Volpe, president, ViewSonic

    Enthusiasm by System Builders and Integrators
    A variety of the industry's leading system builders and integrators are planning to make G-SYNC technology available in the months ahead. Among them are Digital Storm, EVGA, Falcon Northwest, Overlord and Scan Computers.

    "A look at G-SYNC is a look at the future of displays. Having to go back to a standard monitor after seeing it will start to annoy you. It's that good." - Kelt Reeves, founder and CEO, Falcon Northwest.

    "G-SYNC is a ground-breaking technology that will deliver a flawless gaming experience. We're hugely excited that our customers will be able to enjoy incredible gameplay at any frame rate without tearing or stutter." - Elan Raja III, co-founder, Scan Computers.
     
  2. de.das.dude

    de.das.dude Pro Indian Modder

    Joined:
    Jun 13, 2010
    Messages:
    7,826 (4.89/day)
    Thanks Received:
    2,089
    just saw this on facebook!
     
  3. tigger

    tigger I'm the only one

    Joined:
    Mar 20, 2006
    Messages:
    10,183 (3.24/day)
    Thanks Received:
    1,399
    It's going to put a £100+ premium on the price of the monitor though I reckon.
     
  4. Am*

    Joined:
    Nov 1, 2011
    Messages:
    253 (0.23/day)
    Thanks Received:
    53
    Yet again, more useless, proprietary and gimmicky crap from Nvidia. This is nothing that can't be implemented as a firmware upgrade for literally every monitor available by forcing it to drop excessive frames, or simply by stopping the GPU from rendering more than the refresh rate allows in the drivers. I'm pretty certain several monitors, including mine, already implement something like this unofficially, as I see no tearing even in ancient games like COD4 and Q3A that run well over 250 FPS.
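
    For reference, the driver-side cap described above amounts to roughly this (a rough Python sketch under an assumed 60 Hz refresh, not any vendor's actual driver code; note it only limits how fast frames are produced, it doesn't change when the panel scans out):

    # Rough sketch of a render-loop frame cap (illustrative only).
    import time

    REFRESH_HZ = 60
    FRAME_BUDGET = 1.0 / REFRESH_HZ          # ~16.7 ms per frame at 60 Hz

    def render_frame():
        pass                                 # stand-in for real game rendering

    last = time.perf_counter()
    for _ in range(10):                      # a handful of frames for illustration
        render_frame()
        elapsed = time.perf_counter() - last
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)   # wait out the rest of the frame budget
        last = time.perf_counter()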

    Also, have these guys ever heard of Dxtory?

    Please GTFO with more of this proprietary, overpriced and useless bullshit to further fragment PC gaming, Nvidia.

    EDIT: and good lord, that stupid name...G-SYNC? Sounds like the name of an Nsync tribute band.
     
    Last edited: Oct 18, 2013
    Roph and NeoXF say thanks.
  5. rokazs1

    rokazs1

    Joined:
    Nov 21, 2011
    Messages:
    24 (0.02/day)
    Thanks Received:
    5
    Location:
    Herning, Denmark
    Don't forget the 780 Ti! :cool:
     
  6. SIGSEGV

    SIGSEGV

    Joined:
    Mar 31, 2012
    Messages:
    508 (0.54/day)
    Thanks Received:
    107
    sigh, more proprietary crap from nvidia :wtf:
     
    NeoXF says thanks.
  7. Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,121 (1.23/day)
    Thanks Received:
    252
    I was wondering what Nvidia was going to do with all the unsold Tegra 4 chips.

    Besides, Nvidia users don't have stuttering or tearing... right guys???
     
  8. Renald New Member

    Joined:
    Apr 1, 2013
    Messages:
    15 (0.03/day)
    Thanks Received:
    0
    I get 60+ FPS in most games with a €200 card.

    Why would this be useful? It won't solve multi-GPU problems, and it's useless for anything else.

    I give up, it's too stupid, even from them. :respect:
     
  9. Prima.Vera

    Prima.Vera

    Joined:
    Sep 15, 2011
    Messages:
    2,256 (1.98/day)
    Thanks Received:
    294
    How much did nVidia pay all those guys for their North Korea-style declarations of adulation for this new crap??
     
  10. RejZoR

    RejZoR

    Joined:
    Oct 2, 2004
    Messages:
    4,747 (1.29/day)
    Thanks Received:
    990
    Location:
    Europe/Slovenia
    Does it even work with AMD GPUs? If not, it's as useless as it can get.
     
    NeoXF and FordGT90Concept say thanks.
  11. SIGSEGV

    SIGSEGV

    Joined:
    Mar 31, 2012
    Messages:
    508 (0.54/day)
    Thanks Received:
    107
    hope it helps..
     
  12. Recus

    Recus

    Joined:
    Jul 10, 2011
    Messages:
    529 (0.44/day)
    Thanks Received:
    180
    Desperate AMD fanboys like AMD's "exclusive" features: onscreen tearing, stuttering and lag. :rolleyes:

    The local pharmacy closed early today. Did you attack them because they sell useless, proprietary and gimmicky drugs from pharma companies?

    Mantle won't work on Nvidia. Why isn't that useless?
     
    Doc41, Crap Daddy and MxPhenom 216 say thanks.
  13. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    21,148 (7.81/day)
    Thanks Received:
    7,675
    Cadaveca and I were talking about this about a year ago. Sometimes (no matter the mfg of the GPU) the "flow" of the animations and movement on the screen would be so smooth that it gave the same feeling as watching a movie on a 120 Hz HD monitor. However, the "sensation", for lack of a better word, was always short-lived. He tried to figure out what was causing it and so did I. He also asked W1zz about it, but it's a very hard thing to explain. You either "know" the feeling or you don't. After reading this I think NVIDIA might have narrowed it down to a hardware level, judging by what the developers were saying in the PR. If so, then this is gonna be awesome.
     
    Doc41, james888, remixedcat and 2 others say thanks.
  14. Am*

    Joined:
    Nov 1, 2011
    Messages:
    253 (0.23/day)
    Thanks Received:
    53
    Butthurt Nvidiot strikes again, what a surprise...might want to check my system specs before you embarrass yourself any further.
     
    Last edited: Oct 18, 2013
  15. MxPhenom 216

    MxPhenom 216 Corsair Fanboy

    Joined:
    Aug 31, 2010
    Messages:
    10,065 (6.61/day)
    Thanks Received:
    2,277
    Location:
    Seattle, WA
    I think this could be pretty cool. I'd be interested to try out a monitor with a G-Sync module, that's for sure.

    Asus 27" 1440p monitor with G-Sync... anyone?
     
  16. Recus

    Recus

    Joined:
    Jul 10, 2011
    Messages:
    529 (0.44/day)
    Thanks Received:
    180
    And who can confirm your specs? You'd better go write a petition to AMD asking them to stop driver updates, because game-related problems aren't the GPU maker's problem, they're the game developers' problem. :eek:
     
  17. wickerman

    wickerman

    Joined:
    Mar 12, 2006
    Messages:
    289 (0.09/day)
    Thanks Received:
    49
    Location:
    Austin, TX
    I really like this idea, but if it requires me to replace my u2711 I see that as a bit of a problem. Sure 2560x1440 panels have come down in price significantly since I bought mine, but it seems like a bit of a waste to replace my current panel with something that is the same resolution. I'd rather jump on the 4k train when the prices become more reasonable.

    I suppose if I had a friend/family member willing to buy mine and the replacement monitor offered benefits in other areas (color accuracy, response time, lower power, etc) then I could be talked into replacing my u2711 with another 1440p/1600p panel that supported this tech.

    *edit*
    Also if this is Nvidia exclusive tech, that is also a bit annoying. I tend to keep my monitors for a while, but I could switch back and forth between AMD and Nvidia graphics quite frequently. It would be kind of annoying if we came across a generation down the line where AMD has the superior performance but I have to wait for Nvidia to catch up just to take advantage of the reason I upgraded my monitor.
     
  18. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,353 (0.96/day)
    Thanks Received:
    450
    I look forward to learning more about this technology and seeing it implemented. It's basically dynamic refresh rate for monitors. I do see this making a huge difference for people like me who like to turn up the details at the expense of having the frame rate frequently drop below the monitor's refresh rate.
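
    As a rough worked example of the below-refresh-rate case (assumed numbers, for illustration only): a steady 45 FPS on a fixed 60 Hz panel cannot line up with the refresh grid, so with V-SYNC frames end up shown for alternating one and two refreshes, which reads as judder; a variable-refresh panel simply holds each frame for its natural duration.

    fps, hz = 45, 60
    frame_ms = 1000 / fps      # ~22.2 ms of new content per frame
    refresh_ms = 1000 / hz     # ~16.7 ms per refresh slot
    # With V-SYNC the on-screen times alternate ~16.7 ms / ~33.3 ms; variable refresh holds ~22.2 ms.
    print(f"frame {frame_ms:.1f} ms vs refresh slot {refresh_ms:.1f} ms")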
     
  19. Am*

    Joined:
    Nov 1, 2011
    Messages:
    253 (0.23/day)
    Thanks Received:
    53
    My PC is barely mid range compared to the systems some people here have, and you want me to prove my specs? Daaymn, you must be pretty broke to even consider saying that, no wonder you're mindlessly trolling the forums.

    P.S. will attach a CPU-Z/whatever system validation is quickest, when I can be arsed to do it.
     
  20. the54thvoid

    the54thvoid

    Joined:
    Dec 14, 2009
    Messages:
    3,386 (1.90/day)
    Thanks Received:
    1,618
    Location:
    Glasgow - home of formal profanity
    It looks to be a good thing but for God's sake don't be proprietary with it. And wtf with the tone of this thread?
     
  21. Crap Daddy

    Crap Daddy

    Joined:
    Oct 29, 2010
    Messages:
    2,744 (1.88/day)
    Thanks Received:
    1,046
    It is proprietary and will cost money, compared to Mantle, which is proprietary but comes for free, albeit with just one game. Guess we will soon have to own two systems for gaming depending on what games we like, one NV and one AMD.
     
  22. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,353 (0.96/day)
    Thanks Received:
    450
    My hope is that their pricing estimates are not too far off. I would be willing to pay $50 extra per monitor for this. If NVidia has this working with their 3D Surround implementation, then I would seriously consider replacing my graphics cards and monitors for an upgrade.
     
  23. 1d10t

    1d10t

    Joined:
    Sep 28, 2012
    Messages:
    193 (0.25/day)
    Thanks Received:
    42
    I remember nVidia's previous hype: "3D games are the future." And where is all of that now? :p

    Is this nVidia's response to AMD Mantle? :p

    OR... obviously nVidia can't reach above 60 FPS at the highest detail, but instead of admitting it, they just restricted it :p

    OR... they aren't good at 4K, so instead of making another competitive card they make their own monitor that only does 1080p :p

    -= edited=-

    I bet it was organized by Origin PC :p
     
    Last edited: Oct 18, 2013
  24. Hilux SSRG

    Hilux SSRG

    Joined:
    May 1, 2012
    Messages:
    990 (1.09/day)
    Thanks Received:
    161
    Location:
    New Jersey, USA
    If Nvidia can eliminate stuttering and provide ultrafast response, I'm very interested. I hope we get to see some videos soon.
     
  25. erocker

    erocker Super Moderator Staff Member

    Joined:
    Jul 19, 2006
    Messages:
    39,821 (13.16/day)
    Thanks Received:
    14,196
    *Waits for demonstration by a 3rd party*
     
