
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Discussion in 'News' started by Cristian_25H, Oct 18, 2013.

  1. 1d10t

    1d10t

    Joined:
    Sep 28, 2012
    Messages:
    193 (0.24/day)
    Thanks Received:
    42
    I think somebody forgot that fps are not Hz, so 60 fps doesn't mean 60 Hz. The displayed image consists of however many frames are rendered in one second, while a monitor's refresh rate depends on three factors: horizontal frequency, resolution, and response time.
    Care to explain where G-Sync fits in?
     
  2. Mistral

    Mistral

    Joined:
    Feb 23, 2008
    Messages:
    411 (0.17/day)
    Thanks Received:
    58
    Location:
    Montreal
    I'm surprised Carmack sounded so positive about this thing. I respect him as much as the next guy, but I can't see it that way at the moment.

    I am curious, though: how will G-Sync do in, say, fighting games that require to-the-frame accuracy to pull off the best combos? It could be either a real boon or a curse for them.
     
  3. NeoXF

    Joined:
    Sep 19, 2012
    Messages:
    615 (0.77/day)
    Thanks Received:
    80
    Yes, how dare a corporation care about anything other than making money!

    /S :rolleyes:


    Just because the world is the way it is now doesn't mean it's any good. But what the Hell, you guys can salute your new green, blue, red or whatever overlords any way you want; it's not me who's dead inside, shielding myself further and further from the things that should count more, with consumerism gimmicks.

    This makes me sick to my stomach, but what can I do...
     
  4. 15th Warlock

    15th Warlock

    Joined:
    Aug 16, 2004
    Messages:
    2,808 (0.75/day)
    Thanks Received:
    1,246
    Location:
    Visalia, CA
    Aww, how nice of you both, but let me ask you a few questions: Do you have a job? If you do, do you work for free? I mean, do you offer your services without expecting to be remunerated for them? And if that's the case, how do you support yourself and/or your family? Through charity/welfare?

    I mean, you have to find a way to pay your bills somehow, am I right?

    I would assume that people who work for these companies (and please note that this applies to any given company in our "evil economy") expect some sort of compensation for their work, wouldn't they?

    Anyway, I'm not going to discuss the basics of how our society works, as this isn't the right forum to do so, but I just found your counter-argument really amusing. Besides, no one is pointing a gun at your face forcing you to buy these new monitors, so there's no reason to get all worked up about this superfluous piece of technology when there are obviously way more important things to worry about and fix, like the state of the economy, world hunger, world peace and other serious matters... :rolleyes:

    Back on topic: it sounds like nVidia is genuinely interested in fixing this V-Sync problem that has existed for so long. I, for one, am excited to see someone focusing research on addressing this issue; I just hope they offer the results of their findings to more customers than just owners of new monitors and/or Kepler-based cards.
     
    Last edited: Oct 19, 2013
    The Von Matrices says thanks.
    Crunching for Team TPU
  5. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,371 (0.95/day)
    Thanks Received:
    463
    By that logic there's no reason to buy new graphics cards to run at more detailed settings, because the increased detail doesn't make you respond any faster. Have fun running at "low" settings on your integrated graphics. There's a lot more to video gaming than competition.

    Thank you. It seems as if for some people NVidia producing any technology first makes it inherently evil. I don't care who invented it, I support the technology. Kudos to NVidia for doing it first.

    For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:

    Just because no one else supports it yet doesn't mean that no one else ever will. Someone has to be first. Most standards aren't designed by committee but by everyone adopting one manufacturer's implementation for simplicity.
     
    Last edited: Oct 19, 2013
    15th Warlock says thanks.
  6. 15th Warlock

    15th Warlock

    Joined:
    Aug 16, 2004
    Messages:
    2,808 (0.75/day)
    Thanks Received:
    1,246
    Location:
    Visalia, CA
    Exactly, it seems like they have taken addressing this problem to heart, with innovations like Adaptive V-Sync, FCAT and now G-Sync.
     
    Crunching for Team TPU
  7. Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,127 (1.20/day)
    Thanks Received:
    254
    I think people should read Nvidia's own FAQ on G-Sync.

    You can draw a conclusion there.
     
  8. Fluffmeister

    Fluffmeister

    Joined:
    Dec 22, 2011
    Messages:
    716 (0.67/day)
    Thanks Received:
    209
    Sounds like a GTX xx0 card combined with a G-SYNC enabled monitor will offer a pretty damn sweet BF4 experience.

    Oh nVidia, you big meanies, no wonder peeps here are mad.
     
  9. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,371 (0.95/day)
    Thanks Received:
    463
    Your comment does not disprove mine. As I quoted, it should be possible to reverse engineer the signaling technology. At that point anyone can produce monitors or video outputs that comply with that standard. G-Sync will be NVidia-exclusive for a few years simply because no one has had time to dissect it; that doesn't mean there will never be generic components compatible with it. The only difference is that third parties won't be able to use the trademarked term "G-Sync."

    You also need to consider the source when you read the quote from the FAQ. All manufacturers advertise that their products only work with first-party accessories. It doesn't mean that third parties can't make compatible accessories.
     
    Last edited: Oct 19, 2013
  10. Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,127 (1.20/day)
    Thanks Received:
    254
    Which is the accessory, the GPU or the G-Sync module?

    Since G-Sync will be talking to the driver, when was the last time Nvidia let outsiders tinker with that?
     
  11. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,371 (0.95/day)
    Thanks Received:
    463
    Once you know what commands are being sent over the cable, you can implement them in your own drivers or hardware. For example, if you create a monitor that can read all the signals sent via the G-Sync protocol and respond to them just like a genuine G-Sync monitor, why would this matter to the drivers? A properly reverse-engineered product should be no different from the genuine device. I doubt NVidia wants manufacturers to do this, but I see no reason, engineering or legal, why third-party manufacturers couldn't, and the driver shouldn't be able to tell the difference.

    The only hurdle would be the investment required to reverse engineer the protocol; if genuine G-Sync doesn't catch on, there will be no financial incentive and no third party will bother to do it.
     
  12. Assimilator

    Assimilator

    Joined:
    Feb 18, 2005
    Messages:
    623 (0.17/day)
    Thanks Received:
    105
    Location:
    South Africa
    Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA calls themselves the "World Leader in Visual Computing Technologies". Because they are.

    When was the last time AMD released ANYTHING game-changing? No, Mantle doesn't count, because the world doesn't need another API; we have DirectX and it works just great. No, TrueAudio doesn't count, because no one gives a shit.
     
    Fluffmeister says thanks.
  13. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,872 (3.88/day)
    Thanks Received:
    3,502
    Location:
    Quantum well (UK)
    So, all nvidia have done is reverse the sync direction, making the monitor sync with the card's varying frame-rate output instead. A simple enough change technically, but it looks like the visual impact is big, judging by the PR and the articles I've read.

    A couple of things that might be worse, though, are motion blur and the shape distortion* of moving objects, both of which are currently fixed by nvidia's LightBoost strobing backlight feature, which my monitor has. The PR doesn't mention LightBoost anywhere, so I expect both of these motion artifacts to be present. The motion blur in particular is horrible, and I'd rather have a bit of lag and the occasional stutter than put up with it. I'd have to see G-Sync in action to judge it properly, though.

    Also, it would be interesting to see this varying video signal on an oscilloscope.

    *To check out the shape distortion, just open a window on the desktop, stretch it from top to bottom but keep it rather thin, then move it from side to side with the mouse. The shape will change, with the top leading the bottom; moving the mouse faster makes the effect stronger. This is due to the scanning nature of the video signal, where the bottom part of the window (the whole picture, in fact) is quite literally drawn later than the top part. Note that the slower the monitor's refresh, the worse the effect. Note also that it's separate from the tearing artifact you're likely to see as well.
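    A quick sketch of the arithmetic (my own illustrative numbers, assuming the frame scans out top to bottom over one full refresh period):

```python
# Rough sketch of scanout skew: how far the bottom of a full-height
# window lags the top while you drag it sideways. Assumes the frame
# scans out top-to-bottom over one refresh period; numbers illustrative.

def scanout_skew_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Horizontal offset between the top and bottom rows of the window."""
    scan_time_s = 1.0 / refresh_hz         # bottom row drawn this much later
    return speed_px_per_s * scan_time_s    # distance moved in that time

for hz in (60, 120):
    # a window flicked across the screen at 2000 px/s
    print(f"{hz:>3} Hz: ~{scanout_skew_px(2000, hz):.0f} px of lean")
# -> 60 Hz: ~33 px, 120 Hz: ~17 px; halve the refresh period, halve the lean
```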

    LightBoost strobing blanks the display and only shows the completed picture, eliminating this effect. Of course, this comes at the expense of maxed out lag. At least the lag is very short at 120Hz. Sometimes you just can't win, lol.
     
    tigger says thanks.
  14. NC37

    NC37

    Joined:
    Oct 30, 2008
    Messages:
    1,203 (0.54/day)
    Thanks Received:
    268
    This honestly isn't worth it, nVidia. VSYNC is not such a terrible problem that it needs a special dedicated chip which... you aren't opening to the entire industry, which will increase production costs, and which will likely only make the situation worse later when someone comes out with an alternative that does it without all the negatives.

    You should have just made the tech, licensed it for everyone to use, and enjoyed the royalties for years. I seriously doubt it requires a Kepler GPU. We already know PhysX will work on non-NV cards, and this isn't anything more special. You're just setting yourself up for the fall later when someone, maybe even AMD, does it better and for everyone to use.
     
    NeoXF says thanks.
  15. Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,127 (1.20/day)
    Thanks Received:
    254
    Last edited: Oct 20, 2013
  16. Lionheart

    Lionheart

    Joined:
    Apr 30, 2008
    Messages:
    4,071 (1.70/day)
    Thanks Received:
    824
    Location:
    Milky Way Galaxy

    :banghead::banghead:

    This site has gone to shit with all the fanboys & trolls, Jesus Christ lolz
     
    Last edited: Oct 20, 2013
  17. NeoXF

    Joined:
    Sep 19, 2012
    Messages:
    615 (0.77/day)
    Thanks Received:
    80
    Yeah, 'kay. You've obviously got everything figured out, and the only real problem in your life seems to be that you need a better-paying job. Righahahahahat... :roll:


    Again, open or bust.

    It might have taken a lot of years, but Microsoft is finally on the way out as the de facto PC gaming platform... and why? Yeah, I'll let someone else figure that one out.


    But I digress, we shall see. I'm a long way from getting a gaming monitor anytime soon either way.
     
  18. Frick

    Frick Fishfaced Nincompoop

    Joined:
    Feb 27, 2006
    Messages:
    10,893 (3.41/day)
    Thanks Received:
    2,411
    Just want to point out: no. Not in the least. Alternatives are incoming, and to an extent already here, but right now? No way, no how.

    EDIT: And I just can't fathom the depths to which this place has plunged. All this rage... for something they have not seen irl. And if this does what it says it does, you do have to see it irl before you can pass judgement.
     
    qubit says thanks.
  19. Am*

    Joined:
    Nov 1, 2011
    Messages:
    253 (0.23/day)
    Thanks Received:
    53
    Well, quite clearly you're the one playing dumb, seeing how you have yet to explain why someone running a 120Hz or even a 75Hz monitor would benefit from DROPPING the refresh rate to the frame rate of the game. Can you point out a single person suffering from their monitor's higher refresh rate in games that never even exceed it? It has never been a problem, and Nvidia are yet again trying to fix a problem that never existed in the first place; quite clearly you are being ignorant by ignoring simple facts, and you have no experience with what causes tearing.

    If a monitor is running at a refresh rate above the frame rate of the GPU, then unless the monitor does some image post-processing, scaling, or frame duplication (like those 240Hz TVs), the monitor will only draw the frames it has. End of. That renders this G-SYNC gimmick worthless, because it is trying to solve a problem that was never there in the first place. Whether your monitor runs at 30Hz or 120Hz, it will only draw the frames it has; if the frame rate is less than the refresh rate, it won't affect the monitor either way.
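    To put numbers on what I mean, here's a simplified model (my own illustration, assuming a plain double-buffered setup at 60 Hz with the GPU rendering a steady 45 fps):

```python
# Simplified model of a fixed-refresh monitor fed by a slower GPU:
# at every refresh tick the panel shows the newest completed frame.
# Illustrative assumption only: 60 Hz refresh, steady 45 fps rendering.

REFRESH_HZ, FPS = 60, 45
frame_done = [i / FPS for i in range(FPS)]   # completion times over 1 second

shown_last = None
for r in range(REFRESH_HZ):
    t = r / REFRESH_HZ                       # time of this refresh tick
    newest = max(i for i, d in enumerate(frame_done) if d <= t)
    if newest == shown_last:
        # monitor simply redraws the frame it already has
        print(f"refresh {r:2d}: frame {newest} shown again")
    shown_last = newest
```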

    The ONLY problem that currently exists that has anything remotely to do with monitors is that when an old game runs too fast, you have to choose between A) running at a higher frame rate and experiencing tearing, or B) capping the frame rate with vsync. The biggest problem with vsync is that it drops GPU utilization to the point where a GPU can barely distinguish between idle and 3D load, which causes stuttering; this is a problem that occurs only with Nvidia cards, even on single-GPU setups (because they have too many clock profiles to switch between). AMD doesn't have this problem because they have idle, 2D (Blu-ray) and 3D load clock profiles, nothing more. This gimmick does nothing whatsoever to fix that; every Nvidia GPU from the 200 to the 700 series has had this problem, and Maxwell will continue to have it until they address every affected game individually in the drivers 1-2 years after release. My GTX 285 had this problem, my 460 had it until they fixed most games two years back, and my 660 had it until I returned it. When they work out a way to scale their GPU cores/clusters to imitate old cards, they will solve the problem instantly. This G-SYNC crap does not affect this problem in either a positive or a negative way, therefore it is worthless (even more so to 120Hz/144Hz fast gaming monitor users). By all means, feel free to explain any benefits of this tech that I am not seeing.

    Please give it a rest, bud. This G-Sync and Shield (a joke of an Android tablet slapped onto a 360 controller, with about 30 games on its support list, which almost nobody outside of North America even knows or gives a single shit about) are not innovations in the slightest, and they are exactly why Nvidia is slowly losing its consumer GPU market share to AMD, as well as the reason why hardcore PC gamers buying into this crap will continue to get ridiculed by our casual PC and console gaming brethren. Instead of investing in features that matter, they continue churning out pricey gimmicks. If that is what you're into, more power to you; keep buying Nvidia. I, for one, see these "innovations" as gimmicks that add no value whatsoever to their GPUs or to anything else employing this sort of tech, which will come at a premium because of it in comparison to AMD.

    Nvidia (and AMD) deserve praise for a lot of things -- G-Sync and Shield are not among them.
     
    Last edited: Oct 20, 2013
  20. 1d10t

    1d10t

    Joined:
    Sep 28, 2012
    Messages:
    193 (0.24/day)
    Thanks Received:
    42
    If G-Sync tries to manipulate the TMDS and DDC clocks over DisplayPort, there's a high probability it will leave DPCP (DisplayPort Content Protection) exposed. Now that will be unpleasant for some. And unless G-Sync communicates over the two return channels, TMDS and DDC, it will cripple frame-sequence rendering across the two channels. And let me guess: this will not work with SLI and/or stereoscopic 3D.

    Ah yes... that's great. Let me ask a simple question: do you own an nVidia Shield, or have you at least tried one? Do you use Android phones? Have you ever tried running games on the much cheaper Google Nexus 4?
     
  21. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,872 (3.88/day)
    Thanks Received:
    3,502
    Location:
    Quantum well (UK)
    Thinking about it a little more: while this reduces lag, you can't eliminate it as nvidia claims, and here's why.

    At the moment, whether vsync is on or off, once the GPU has rendered a frame it doesn't get displayed until the next monitor refresh (this is true regardless of the refresh frequency, of course), so you get lag. The amount of lag varies too, depending on when in that cycle the GPU finished the frame. If vsync is off, you get tearing and stutter, regardless of whether the GPU is rendering faster or slower than the refresh rate. Remember there's still lag in the system due to the GPU's rendering time and the general propagation delay through the controller and the rest of the computer.

    Now you turn G-Sync on; what happens? The monitor waits for the GPU, not refreshing until it gets the command from the GPU to do so. This results in the frame being displayed the instant it's ready. Lag may be reduced, but not eliminated. Tearing and stuttering will be eliminated, however, because the monitor will display every frame and, crucially, only ever display it once.

    Lag isn't eliminated because the GPU still needs time to build the frame. Imagine an extreme case where the GPU slows down to around 15-20fps (which can easily happen): you'll still have lag corresponding to that variable frame rate, and the game will feel horribly laggy. Responsiveness to your controls will be improved, since the monitor displays each frame as soon as it's ready, but perhaps more importantly there will be no tearing or stutter, which is a significant benefit, because both of those effects look bloody awful. NVIDIA obviously gets this.
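    To make that concrete, here's a toy model I knocked up (illustrative only: it assumes each frame finishes at a random point in the refresh cycle and deliberately ignores the render time itself):

```python
import random

# Toy model of the "wait for the next refresh" lag. With a fixed refresh
# a finished frame is held until the next tick; with G-Sync the monitor
# refreshes the moment the frame is ready. Render time is NOT modelled,
# which is exactly why lag is reduced rather than eliminated.

random.seed(1)
REFRESH_S = 1 / 60                        # 60 Hz fixed-refresh period

waits = []
for _ in range(100_000):
    done = random.uniform(0, REFRESH_S)   # when in the cycle the frame completes
    waits.append(REFRESH_S - done)        # held until the next refresh tick

print(f"fixed 60 Hz: average hold {1000 * sum(waits) / len(waits):.1f} ms")
print("G-Sync     : average hold ~0 ms (displayed on completion)")
```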

    The only thing I wonder is whether the player will notice dynamic artifacts in the game's responsiveness, since the frame rate and the synced refresh rate vary continuously and significantly. It might lead to some weird effect where the player feels disoriented, perhaps even inducing nausea in some people. I don't know, but it's something to look out for, and I will when I get my hands on some demo hardware. NVIDIA are obviously not gonna tell you about this in a press release, lol.

    So, in short, while I can see this technology improving the gameplay experience, there's still no substitute for putting out frames as fast as possible. I'm currently gaming at 120Hz with very few dropped frames, which makes a world of difference over 60Hz (and adding LightBoost strobing to the mix, with its blur elimination, takes the experience to another level altogether). That doesn't change with G-Sync. It would be really interesting to get my hands on demo hardware and see G-Sync for myself.

    I don't think you properly understand what G-Sync does on a technical level, or what causes tearing and stutter in the first place. Remember, the only way* to remove stutter is to have the GPU and monitor synced; it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor to the GPU, i.e. the opposite way round to how it's done now.

    Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.

    *Not strictly true, sort of. If I run an old game rendering at several hundred fps and leave vsync off, it looks perfectly smooth, with very little judder or tearing, and noticeably less lag. It seems that brute-forcing the problem with a large excess of frames per second rendered by the GPU can really help, even though those frames are variable.
     
  22. shb-

    shb-

    Joined:
    Sep 24, 2010
    Messages:
    56 (0.04/day)
    Thanks Received:
    5
    From Nvidia's page:
    Below 60 (120) fps you get no stuttering (basically vsync turns off), but you see tearing. Above 60 (120) fps vsync is on as usual, resulting in no tearing and no visible stutter.
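    In other words, roughly this rule (a sketch of the behaviour as described above, not Nvidia's actual driver code):

```python
# Sketch of the threshold rule quoted above: vsync on at or above the
# monitor's refresh rate, off below it. Not Nvidia's real implementation,
# just the decision as I read it from the page.

def vsync_on(fps: float, refresh_hz: float) -> bool:
    return fps >= refresh_hz

for fps in (45, 59, 60, 144):
    state = ("on: no tearing, no visible stutter" if vsync_on(fps, 60)
             else "off: no stutter, but tearing")
    print(f"{fps:>3} fps @ 60 Hz -> vsync {state}")
```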

    Really, why would nvidia develop something that's already solved, and why would a company like Asus join them? Nobody is that stupid. Also, this tech is praised by various tech journalists and devs like John Carmack who have seen it in action.

    The only sh1tty thing about this is vendor lock-in. We need this for AMD and Intel too, on every TV and monitor. Let's hope nvidia won't be stupid and opens it up.
     
  23. tigger

    tigger I'm the only one

    Joined:
    Mar 20, 2006
    Messages:
    10,183 (3.21/day)
    Thanks Received:
    1,399
    A couple of interesting posts, mate, much more so than the rest of the fanboy crap and uninformed arguing from others.
     
  24. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,872 (3.88/day)
    Thanks Received:
    3,502
    Location:
    Quantum well (UK)
    +1 there. Why do things like this induce such foaming at the mouth? It's fucking ridiculous. If someone doesn't want it, they can just not buy it. No one's forcing them.

    +1 again.

    Another technical thing I've just thought of about G-Sync.

    Regardless of how moving pictures are being displayed, they are still sampled, just like audio. This means that the Nyquist limit or Nyquist frequency applies.

    Hence, for fast-moving objects, e.g. during frenetic FPS gaming, you want that limit to be as high as possible, since an object moving fast enough will not just be rendered with only a few frames, but will show sampling artefacts similar to the "reverse spokes" effect in old cowboy movies. In gaming, you may not even see the object, or it may appear in completely the wrong place and, of course, be heavily lagged. If the GPU drops to its minimum of 30fps, you can bet you'll see this effect, and in a twitchy FPS that can easily mean the difference between fragging and being fragged.
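    Here's the aliasing arithmetic in a nutshell (a toy illustration with made-up numbers, nothing measured):

```python
# Toy wagon-wheel ("reverse spokes") aliasing: a wheel spinning at
# spin_hz, sampled at fps frames per second, appears to rotate at the
# folded rate below. Made-up numbers purely to show the Nyquist effect.

def apparent_spin_hz(spin_hz: float, fps: float) -> float:
    """Fold the true rotation rate into the range [-fps/2, fps/2)."""
    a = spin_hz % fps
    return a - fps if a >= fps / 2 else a

for fps in (30, 60, 120):
    print(f"28 Hz wheel sampled at {fps:>3} fps looks like "
          f"{apparent_spin_hz(28, fps):+.0f} Hz")
# -> at 30 fps the wheel appears to crawl *backwards* at 2 Hz
```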

    So again, while G-Sync looks like a great innovation to me, there's still no substitute for a high frame rate as well.
     
    remixedcat says thanks.
  25. jagd

    Joined:
    Jul 10, 2009
    Messages:
    460 (0.23/day)
    Thanks Received:
    89
    Location:
    TR
    Which innovations? A company acting like a headless chicken because it's losing its main market (GPUs)? Nvidia is having a hard time between AMD APUs, the rise of tablets and smartphones, dropping PC sales, etc., and is trying to find new markets, nothing more, nothing less.

    Btw, who are you to decide about Mantle and TrueAudio on behalf of everyone on earth? I don't remember making you the spokesperson :slap: . What you don't get is that I want Microsoft-free gaming = no DirectX, FYI.

     
