
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Discussion in 'News' started by Cristian_25H, Oct 18, 2013.

  1. remixedcat

    remixedcat

    Joined:
    May 13, 2010
    Messages:
    3,160 (1.80/day)
    Thanks Received:
    754
    It's all frothing rabid fandogs: they see the "enemy" release something innovative that their faction doesn't have, so they attack and hate it even though they haven't seen it, used it, or understood it.
     
  2. PopcornMachine

    PopcornMachine

    Joined:
    Aug 17, 2009
    Messages:
    1,563 (0.77/day)
    Thanks Received:
    459
    Location:
    Los Angeles/Orange County CA
    This may be a bit better than AMD droning on about sound, but not much.

    I remain disappointed by the lack of innovation relevant to the consumer's existing sound system and monitor.

    In the end, all we really have received from these two esteemed institutions is re-branded cards.

    ...
     
    Last edited: Oct 21, 2013
  3. MikeMurphy

    Joined:
    Feb 3, 2005
    Messages:
    403 (0.11/day)
    Thanks Received:
    57
    Neat idea, but I'd rather spend that money on a video card that will do 60fps at my desired resolution.
     
  4. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (7.90/day)
    Thanks Received:
    3,778
    Almost everything that's open started as a response to a proprietary tech. For example, OpenCL wouldn't be nearly as far along as it is if it weren't for CUDA.

    Innovation is innovation, and should be respected as such. This little piece of tech, if it performs well, will spawn further ideas and refinements, and maybe even an open standard.
     
    The Von Matrices and remixedcat say thanks.
  5. Solaris17

    Solaris17 Creator Solaris Utility DVD

    Joined:
    Aug 16, 2005
    Messages:
    17,480 (5.01/day)
    Thanks Received:
    3,783
    Location:
    Florida
    So, as an NVIDIA supporter, this is the most ridiculous thing ever. Why pay the $$? What's the failure rate? Does it work on the good panels? What's the color repro like? Why not get a 120 Hz or 240 Hz monitor and not give a f@#$?
     
  6. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (7.90/day)
    Thanks Received:
    3,778
    This is to eliminate stutter AND tearing. 120 or 240 Hz does you no good if you can't run those framerates in your games. Run below that, and you get stuttering and tearing.
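    A quick way to see the point is to work out how long each frame actually stays on screen when a game runs below the refresh rate with vsync on. A minimal sketch follows (the 45 fps and 120 Hz figures are assumed purely for illustration, this isn't anything NVIDIA published): frames end up alternating between two and three refresh cycles, which is the uneven cadence people perceive as stutter.

    refresh_hz = 120.0   # assumed monitor refresh rate
    game_fps = 45.0      # assumed in-game framerate, below the refresh rate

    refresh = 1000.0 / refresh_hz   # ms per monitor refresh
    frame = 1000.0 / game_fps       # ms per rendered frame

    def shown_at(t_done):
        # Under vsync a finished frame waits for the next refresh boundary.
        return (int(t_done // refresh) + 1) * refresh

    for i in range(6):
        on_screen = shown_at((i + 1) * frame) - shown_at(i * frame)
        print(f"frame {i}: displayed for {on_screen:.2f} ms")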
     
  7. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,725 (2.56/day)
    Thanks Received:
    1,445
    My TV already does frame scaling and interpolation to reduce tearing. Even at 13 FPS, when I have overloaded my GPU, I barely see it.


    This doesn't fix stutter or GPU stalls; it covers them up, like bad lag, by redrawing old frames.
     
    Solaris17 says thanks.
    10 Million points folded for TPU
  8. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    10,179 (3.85/day)
    Thanks Received:
    3,720
    Location:
    Quantum well (UK)
    It doesn't cover things up; it fixes stutter and lag. You misunderstand how it works.

    I made three previous posts in this thread explaining the technical aspects of how G-Sync works, which should clear that up for you if you'd like to check them out.

    If you got hitching without it, then you'll still get hitching with it. It's not made to fix that, which is caused by lots of things, quite often by other background software running on the PC, such as a virus scanner kicking in at the wrong moment, Windows paging, etc.
     
    The Von Matrices says thanks.
  9. 1d10t

    1d10t

    Joined:
    Sep 28, 2012
    Messages:
    193 (0.22/day)
    Thanks Received:
    42
    As in my previous post, such a thing already exists in consumer electronics. LG introduced the Dynamic Motion Clarity Index and Samsung had Clear Motion Rate in late 2010. Both are capable of adjusting the basic timing to the nearest standard, NTSC, PAL or SECAM, regardless of the source. They also have a Motion Estimation Motion Compensation (MEMC) mechanism, "similar" to NVIDIA's LightBoost. Other features, like Motion Interpolation (MI), can double or quadruple the refresh rate of any source to match the characteristics of Frame Patterned Retarder (FPR) 3D (a rough sketch of the idea follows at the end of this post).

    Fundamentally speaking, there is a major difference between desktop and consumer electronics. There's no GPU in consumer electronics, so the source stays locked to NTSC 29.97 fps or PAL 25 fps. There are only two factors to keep up with, rather than the "virtually infinite" range on the desktop. Most HDTVs have to deal with various inputs such as A/V, composite, S-Video, SCART, RGB D-Sub and HDMI, which all have different clocks. AFAIK, the desktop only has three clocks, 30 Hz, 60 Hz and the recently introduced 4K 33 Hz, spread across DVI, HDMI and DisplayPort.

    I stated before that I have an LG panel that does the 240 Hz FPR 3D effect. Yet I admit there is severe judder if I crank motion interpolation to the max. That's only natural for an HDTV, which can't handle the massive task of rendering a full-white or pitch-black frame in RGB mode with MEMC and MI switched on. In addition, my panel doesn't carry EDID, DDC or desktop-style timings and clocks, so it only does 120 Hz or 240 Hz interpolation mode.

    Bottom line, it's good to see NVIDIA aware of these matters. G-Sync could be a game changer for the desktop, bringing advanced MEMC and MI. But then again, don't expect LG and Samsung not to bring their tech to the desktop at a reasonable price.
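    For anyone unfamiliar with MI, here is a deliberately naive sketch of that TV-side idea: it just blends two frames to synthesize an in-between one and double the cadence. Real MEMC engines estimate motion vectors rather than averaging pixels, so treat this purely as a toy example (NumPy assumed).

    import numpy as np

    # Two stand-in source frames (4x4 grayscale); a real TV works on full video frames.
    frame_a = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
    frame_b = np.random.randint(0, 256, (4, 4), dtype=np.uint8)

    # Synthesize one frame halfway between A and B by simple averaging.
    interpolated = ((frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2).astype(np.uint8)

    # Inserting it between the originals doubles the output cadence (2 frames in, 3 out).
    output_sequence = [frame_a, interpolated, frame_b]
    print(len(output_sequence), "frames out for every 2 frames in")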
     
    Solaris17 says thanks.
  10. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    10,179 (3.85/day)
    Thanks Received:
    3,720
    Location:
    Quantum well (UK)
    I'm not sure why you think this already exists? This has not been done before in any commercial product.

    G-Sync synchronizes the monitor dynamically to the variable GPU framerate which changes from instant to instant, or from one frame to another. This feature is irrelevant for TVs, which are just reproducing video, hence just need to run at a fixed rate to match the video source.

    It's very different with video cards, because here you have interaction between what the user does and the output on the screen. In this case, things like lag, refresh rate, stutter and screen tearing are very important, and G-Sync addresses all of them simply by syncing the monitor to the varying framerate of the graphics card, rather than the other way around (syncing the graphics card to the fixed refresh rate of the monitor, as happens now).
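    A minimal sketch of that difference (illustrative only; the render times are invented and this is not NVIDIA's code): with a fixed 60 Hz refresh, each finished frame waits for the next refresh boundary, while a variable-refresh display would update the instant the frame is ready.

    import random

    random.seed(1)
    refresh = 1000.0 / 60.0   # fixed refresh interval in ms

    t = 0.0
    for i in range(5):
        t += random.uniform(14.0, 24.0)               # varying per-frame render time
        fixed = (int(t // refresh) + 1) * refresh     # wait for the next 60 Hz tick
        variable = t                                  # refresh driven by the GPU
        print(f"frame {i} ready at {t:6.2f} ms | fixed 60 Hz shows it at {fixed:6.2f} ms"
              f" | variable refresh shows it at {variable:6.2f} ms")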
     
    remixedcat and The Von Matrices say thanks.
  11. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,409 (0.91/day)
    Thanks Received:
    489
    I think he's confused between the refresh rate and the TMDS clock rate, which will vary depending on resolution.
     
    qubit says thanks.
  12. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,725 (2.56/day)
    Thanks Received:
    1,445
    So how does it fix stutter and lag?

    A) By redrawing old frames on the screen.
    B) Magic and pixie dust.



    Don't know about you, but I am going with A here.


    So let's do some math.

    60 FPS ≈ 0.01667 seconds per frame (16.67 ms).

    If a gamer notices that the displayed frame(s) aren't being updated, because the "frame buffer" doesn't get refreshed with the next frame, that's "lag". http://www.hdtvtest.co.uk/news/input-lag All TVs and monitors have some native lag; this doesn't remove that, so don't be confused on that point.

    The lag or stutter here is http://www.mvps.org/directx/articles/fps_versus_frame_time.htm an increase in the number of ms it takes to render one or more of the frames to the output frame buffer.

    The G-Spot ruiner, or whatever they want to call it, can only work one of two ways to do what they are claiming:

    A) Hold frames in a secondary buffer and display them at a set rate that matches Vsync or whatever display rate it chooses. They can do things like what most TVs do, interpolating the two frames with motion-adaptive/vector blending to create a new frame.

    B) Or they can keep rerendering the same frame and keep the pixels driven for more ms with the same data, which by definition is lag..... so..... yeah.


    There is this constant that we experience, called time, and unless they have started using quantum technology to keep us frozen in time while we wait for the GPU to render more fresh frames, all we are going to see is lag or blur. Your choice, but now apparently you can pay for it.

    http://en.wikipedia.org/wiki/Multiple_buffering

    Triple buffering with Vsync... been there, done that; it was crap and caused lag.
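    For what it's worth, the arithmetic above is easy to check, and the same numbers show how a single slow frame reads as stutter against a fixed 60 Hz budget. A minimal sketch (the per-frame render times below are made up purely for illustration):

    # Frame time in milliseconds for a few frame rates.
    for fps in (30, 60, 120, 144):
        print(f"{fps:>3} fps -> {1000.0 / fps:6.2f} ms per frame")

    # One hypothetical slow frame against the 16.67 ms budget of a 60 Hz display.
    budget = 1000.0 / 60.0
    frame_times = [16.0, 16.5, 40.0, 16.2]   # invented per-frame render times in ms
    for i, ft in enumerate(frame_times):
        note = "misses the budget -> perceived stutter" if ft > budget else "ok"
        print(f"frame {i}: {ft:5.1f} ms ({note})")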
     
    10 Million points folded for TPU
  13. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (7.90/day)
    Thanks Received:
    3,778
    Or, it can alter the refresh rate of the monitor to match the output it's receiving from the video card, eliminating the need to hold back or duplicate frames, which is exactly what it's claiming to do.
     
    qubit says thanks.
  14. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,725 (2.56/day)
    Thanks Received:
    1,445
    So you would like to pay for lag? Seriously, I must be in the wrong business.
     
    10 Million points folded for TPU
  15. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (7.90/day)
    Thanks Received:
    3,778
    Supposedly it comes with little to no lag hit. I don't know, but I do know I tend to actually see a product in action before I go about making judgments and accusations on it.
     
    qubit, remixedcat and The Von Matrices say thanks.
  16. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,725 (2.56/day)
    Thanks Received:
    1,445
    I know of the time constant, and buffering, and other attempts, and all still have that time issue.


    I will also wait to see it in action.
     
    10 Million points folded for TPU
  17. Xzibit

    Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,302 (1.25/day)
    Thanks Received:
    390
    The only thing that's been shown is Nvidia's demo and a slow camera rotation in Tomb Raider with Lara Croft by herself.

    We don't even know the specs of the systems they were run on.

    Nvidia's Tom Petersen also said it was game dependent. Some games will not work with it, and 2D is not a focus.

    Until someone tests this in several game scenarios I'll remain skeptical, much like with the 3D Vision Surround craze.
     
    Last edited: Oct 23, 2013
  18. 1d10t

    1d10t

    Joined:
    Sep 28, 2012
    Messages:
    193 (0.22/day)
    Thanks Received:
    42
    Please read my explanation above. I am aware of the desktop side of things, in a manner of speaking.
    If you're trying to distinguish between desktop and consumer electronics, let me ask you a question: have you tried FPR 3D TV, passive 3D TV and NVIDIA 3D Vision, and can you tell the difference between them?
    TV-wise, they dynamically upsample the source to produce a 120 Hz or 240 Hz display, the opposite of G-Sync, which dynamically downsamples the display to match the GPU output.
    As for whether this already has a long history in consumer electronics, have you noticed that no major panel maker such as LG, Samsung or Sharp is supporting this? Only desktop-specific vendors are listed, such as Asus, BenQ, Philips and ViewSonic.
    It's only a matter of time before LG and Samsung make a monitor supporting this dynamic downsampling feature; on the other hand, NVIDIA couldn't make a better dishwasher.

    Yeah... I don't even know the difference between 60 Hz and 60 fps... silly me :)
     
  19. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    10,179 (3.85/day)
    Thanks Received:
    3,720
    Location:
    Quantum well (UK)
    I'm sorry, but you're wrong on all counts. I've already written an explanation of how this works, so I'm not gonna go round in circles with you trying to explain it again, especially after having seen your further replies since the one to me. Click the link below, where I explained it in three posts (just scroll down to see the other two).

    http://www.techpowerup.com/forums/showthread.php?p=3000195#post3000195


    Mind you, I like the idea of the pixie dust. ;)


    Yes, not only have I tried 3D Vision, but I have it and it works very well too. I think active shutter glasses were around before NVIDIA brought out their version. How is this relevant?

    I think the link I posted above for Steevo will help you understand what I'm saying, as well. Again though, G-Sync is a first by NVIDIA, as it's irrelevant for watching video. It's the interaction caused by gaming that makes all the difference.
     
  20. markybox New Member

    Joined:
    Sep 6, 2013
    Messages:
    4 (0.01/day)
    Thanks Received:
    1
    Are you aware that G-SYNC has a superior, low-persistence strobe backlight mode? Google for it -- pcper report, Blur Busters article, John Carmack youtube, etc.

    So G-SYNC includes an improved LightBoost sequel, officially sanctioned by nVIDIA (but not yet announced). Probably with better colors (and sharper motion than LightBoost=10%), and very easily enabled via OSD menus.

    You can only choose G-SYNC mode, or strobed mode, though.
    But I'm happy it will be even better than LightBoost, officially for 2D mode.
    This is NVIDIA's secret weapon. Probably has an unannounced brand name.
     
  21. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    10,179 (3.85/day)
    Thanks Received:
    3,720
    Location:
    Quantum well (UK)
    Yes, it's obvious that at the lower frame rates, a strobed backlight would be highly visible and intensely annoying.

    It'll be interesting to see exactly how NVIDIA addresses this. Got a link?
     
  22. markybox New Member

    Joined:
    Sep 6, 2013
    Messages:
    4 (0.01/day)
    Thanks Received:
    1
    It's an "either-or" proposition, according to John Carmack:
    CONFIRMED: nVidia G-SYNC includes a strobe backlight upgrade!

    It is currently a selectable choice:
    G-SYNC Mode: Better for variable framerates (eliminates stutter/tearing, more blur)
    Strobe Mode: Better for constant max framerates (e.g. 120fps @ 120Hz, eliminates blur)
    Also, 85Hz and 144Hz strobing is mentioned on the main page.
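    A rough back-of-the-envelope for why the strobe mode matters (this uses the common persistence approximation, not an NVIDIA figure, and the object speed and 2 ms pulse width are assumptions): perceived motion blur while eye-tracking is roughly display persistence multiplied by on-screen motion speed.

    speed_px_per_s = 960.0   # assumed object speed: 960 pixels per second

    def blur_px(persistence_ms):
        # Approximate blur trail length = persistence * tracking speed.
        return persistence_ms / 1000.0 * speed_px_per_s

    sample_and_hold_120hz = 1000.0 / 120.0   # pixel stays lit for the whole 8.33 ms refresh
    strobe_pulse_ms = 2.0                    # assumed short backlight flash

    print(f"sample-and-hold @ 120 Hz: ~{blur_px(sample_and_hold_120hz):.1f} px of blur")
    print(f"strobed backlight (2 ms): ~{blur_px(strobe_pulse_ms):.1f} px of blur")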
     
    qubit says thanks.
  23. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    10,179 (3.85/day)
    Thanks Received:
    3,720
    Location:
    Quantum well (UK)
    Cheers matey. I'm busy right now, but I'll read it later and get back to you.

    I found these two links in the meantime:

    http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Death-Refresh-Rate

    http://www.pcper.com/news/Graphics-Cards/PCPer-Live-NVIDIA-G-Sync-Discussion-Tom-Petersen-QA
     
  24. erocker

    erocker Super Moderator Staff Member

    Joined:
    Jul 19, 2006
    Messages:
    40,179 (12.75/day)
    Thanks Received:
    14,732
    I would love to see Nvidia have these things setup at retailers. The only way I could make a purchasing decision on something like this is to see it running in person.
     
  25. MxPhenom 216

    MxPhenom 216 Corsair Fanboy

    Joined:
    Aug 31, 2010
    Messages:
    10,430 (6.33/day)
    Thanks Received:
    2,544
    Location:
    Seattle, WA
    Are you looking to get the 780 Ti now, if you're interested in G-SYNC?
     
    Crunching for Team TPU
