
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Discussion in 'News' started by Cristian_25H, Oct 18, 2013.

  1. haswrong

    Joined:
    Jun 11, 2013
    Messages:
    354 (0.29/day)
    Thanks Received:
    63
    If NV did this in the range of 85-140Hz, I'd call that a gamer's benefit :toast:
    But dropping frames to what? 5fps@5Hz, and calling it the holy grail? :confused::twitch:
    Gimme a f*cking break! :banghead:

    I'm too old to jump at this bait, so I'm kinda happy I can't afford an NVIDIA g-card :laugh:
     
  2. Xzibit

    Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,900 (1.18/day)
    Thanks Received:
    866
    Can I upgrade it to a 780M?

    [IMG]
     
    haswrong says thanks.
  3. 1d10t

    1d10t

    Joined:
    Sep 28, 2012
    Messages:
    206 (0.14/day)
    Thanks Received:
    44
    Interesting... so G-Sync could be NVIDIA's mobile version of a lowest-end SKU graphics card that doesn't sell, or it might be a defective Tegra 4 from the NVIDIA Shield /sarcasm
     
  4. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,455 (0.69/day)
    Thanks Received:
    529
    Thank you for this post. It seems like for every one neutral post on any AMD, NVIDIA, or Intel press release, there are two from people who read only the first sentence of the article and immediately claim the technology is evil based on some argument that would be disproved if they had read the entire article. If only more people approached these topics from a neutral point of view and took the time to understand what they were criticizing, this forum would be full of intelligent discussion. I guess this is the internet and you can't expect more... (I always wished there were a "No Thanks" button on TPU to flag irrelevant posts, but I can imagine how it would be abused.)

    I support technological advancement, whether it comes from NVIDIA, AMD, Intel, or some other company, and G-Sync looks promising. This is the type of disruptive technology, like AMD's Eyefinity, that no one saw coming, and it will take a few years before everyone else catches up.
     
  5. ItsKlausMF New Member

    Joined:
    Oct 19, 2013
    Messages:
    1 (0.00/day)
    Thanks Received:
    1
    Audio does matter; better audio = better experience.

    Yeah, NV was first with SLI, but not the first to do it "right". FCAT was just an anti-AMD tool to bottleneck AMD sales. PhysX wasn't even NVIDIA's technology; they just bought it to market it better than ATI. Better Linux support? Read this and cry. OpenGL and OpenCL are pretty much AMD territory at this moment.

    And what do you mean by NVIDIA fighting two AMD generations with only one generation? GF10* (gen 1), GF11* (gen 2). Maxwell is far, FAR away (late 2014), and by then AMD will be hitting their next generation as well.
     
    NeoXF says thanks.
  6. The Von Matrices

    The Von Matrices

    Joined:
    Dec 16, 2010
    Messages:
    1,455 (0.69/day)
    Thanks Received:
    529
    No argument there. How much TrueAudio actually improves audio is still up for debate until AMD releases drivers for it.

    What in the world does this even mean?

    So you're arguing that we shouldn't have any way to measure frame pacing just because AMD couldn't do it properly?

    PhysX was doomed to fail even before NVIDIA took it over. There was no way that dedicated physics accelerator cards were going to take off, just as dedicated sound cards had died out in the years prior.

    I don't know enough about these to comment.

    Maxwell is speculated to launch on 28nm in early 2014 and then move to 20nm in late 2014, according to reports. And what does it matter how many generations each manufacturer puts out? All that matters is performance per price, which is why I don't care one bit about the rebrands both sides do, as long as the rebrands push pricing down.
     
  7. Xzibit

    Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,900 (1.18/day)
    Thanks Received:
    866
    I actually wasn't kidding when I said

    It reminded me of the Tegra 3 short board modules. If they were full boards they'd be almost twins.

    [IMG]

    [IMG]

    NVIDIA GeForce GT 630M
    [IMG]

    It's at the lower end for sure, size-wise.
     
    Last edited: Oct 19, 2013
  8. RejZoR

    RejZoR

    Joined:
    Oct 2, 2004
    Messages:
    9,096 (2.08/day)
    Thanks Received:
    3,473
    Location:
    Europe/Slovenia
    Clearly neither do you, then. Adaptive V-Sync is there to:
    a) remove image tearing
    b) avoid halving the framerate when FPS drops below a certain level, like normal V-Sync does

    So, why do we need G-Sync to remove image tearing again?

    Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware specific. That would be like expecting ATI, S3, and NVIDIA to support Glide natively back in its day.

    High-level APIs work that way (but are slower, e.g. Direct3D); low-level ones don't (and are faster as a result, e.g. Mantle or Glide).
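
    A minimal sketch of point (b), assuming a fixed 60Hz panel; the step values and function names are mine, for illustration only:

    Code:
    import math

    PANEL_HZ = 60  # fixed-refresh 60 Hz panel

    def classic_vsync_fps(render_fps):
        # Classic V-Sync: a frame that misses a vblank waits for the next one,
        # so the effective rate snaps down to 60, 30, 20, 15, ... fps.
        frames_per_vblank = math.ceil(PANEL_HZ / render_fps)
        return PANEL_HZ / frames_per_vblank

    def adaptive_vsync_fps(render_fps):
        # Adaptive V-Sync: capped at 60 fps above refresh, unsynced below it.
        return min(render_fps, PANEL_HZ)

    for fps in (75, 59, 45, 35):
        print(f"GPU at {fps} fps -> classic V-Sync {classic_vsync_fps(fps):.0f} fps, "
              f"adaptive {adaptive_vsync_fps(fps):.0f} fps")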
     
    10 Year Member at TPU
  9. Xzibit

    Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,900 (1.18/day)
    Thanks Received:
    866
    Anyone remember Glide wrappers?

    Back then it was possible, but nowadays it'd be a lawsuit frenzy. That's why there are no CUDA/PhysX wrappers. Slower and buggier, but at least you weren't locked out of the API entirely.

    Nvidia G-Sync FAQ

    Boo!!!


    :rolleyes:

     
    Last edited: Oct 19, 2013
  10. Doc41

    Doc41

    Joined:
    Sep 25, 2010
    Messages:
    514 (0.23/day)
    Thanks Received:
    462
    Location:
    Bahrain
    This looked like an interesting thing to try, as I just bought the VG248QE a while ago, but it looks like it'll cost a kidney and might not be available for me :ohwell: Will see when it's released,
    aand it looks like I might need a new system too, with a Kepler card :banghead:
     
  11. RejZoR

    RejZoR

    Joined:
    Oct 2, 2004
    Messages:
    9,096 (2.08/day)
    Thanks Received:
    3,473
    Location:
    Europe/Slovenia
    @Xzibit
    A Glide wrapper means emulation. Emulation means slow. How exactly does that solve anything? Glide wrappers exist so you could play old games that would then look better. Besides, being able to emulate Voodoo cards doesn't mean you can emulate modern stuff today. Stuff was rather simple back then... Vertex shaders had software emulation, but it was slower. Pixel shaders were not even possible to emulate without getting 1 frame per minute... and neither was available on any Voodoo card...

    Mantle is a performance-only thing, and if you negate the boost with emulation, wouldn't it be easier to just stay with Direct3D 11 and remain at square one?
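
    For anyone who never used one, a wrapper is just a translation layer. A rough sketch of the idea, with hypothetical call names standing in for the proprietary and open APIs (the per-call translation is where the slowdown comes from):

    Code:
    class OpenBackend:
        # Stand-in for the open API the wrapper targets.
        def draw_triangles(self, vertices):
            print(f"open backend: drawing {len(vertices) // 3} triangle(s)")

    class GlideLikeWrapper:
        # Translates proprietary-style calls into OpenBackend calls.
        # Every call pays a translation/conversion cost, which is why
        # wrappers are slower (and often buggier) than native drivers.
        def __init__(self, backend):
            self.backend = backend
            self.pending = []

        def grDrawTriangle(self, v0, v1, v2):  # hypothetical proprietary entry point
            self.pending.extend([v0, v1, v2])

        def grBufferSwap(self):
            self.backend.draw_triangles(self.pending)
            self.pending.clear()

    wrapper = GlideLikeWrapper(OpenBackend())
    wrapper.grDrawTriangle((0, 0), (1, 0), (0, 1))
    wrapper.grBufferSwap()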
     
    haswrong says thanks.
    10 Year Member at TPU
  12. Frick

    Frick Fishfaced Nincompoop

    Joined:
    Feb 27, 2006
    Messages:
    13,494 (3.49/day)
    Thanks Received:
    4,401
    You guys make my hatred of booth babes and Bethsoft seem rational and level-headed (which it totally is, BTW). :laugh:

    Anyway, wasn't qubit going on about something like this some time ago?
     
    Last edited: Oct 19, 2013
    10 Year Member at TPU
  13. remixedcat

    remixedcat

    Joined:
    May 13, 2010
    Messages:
    4,067 (1.75/day)
    Thanks Received:
    1,283
    OK, what the hell does this have to do with Glide wrappers?

    Explain it like you're teaching a class, not like rabid fandogs or salespeople.
     
  14. GC_PaNzerFIN

    GC_PaNzerFIN

    Joined:
    Oct 9, 2009
    Messages:
    638 (0.25/day)
    Thanks Received:
    636
    Location:
    Finland
    This is coming to 144Hz monitors too, meaning they aim to sync everything between 30 and 144fps on a 144Hz panel (the refresh rate varies with fps). You obviously didn't read a thing about this; instead you came in raging like any typical NVIDIA hater would.

    No matter what this company releases, it always gets this same hatred. Doesn't matter what the greatest minds of game development think.

    The G-SYNC board will support LightBoost too.
    http://www.neogaf.com/forum/showpost.php?p=86572603&postcount=539

    edit:
    30Hz is the minimum; variable between 30Hz and 144Hz (the obvious max limit).

    https://twitter.com/ID_AA_Carmack/status/391300283672051713
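
    Put as a sketch, the behaviour the Carmack tweet describes: the panel refreshes when a frame arrives, bounded by the 30-144Hz window, and below 30fps a frame simply gets scanned out more than once. Numbers and names are mine, illustrative only:

    Code:
    import math

    MIN_HZ, MAX_HZ = 30.0, 144.0  # variable-refresh window from the posts above

    def refresh_interval(frame_time_s):
        # Scan out when the frame is ready, but never faster than 1/144 s
        # and never wait longer than 1/30 s.
        return min(max(frame_time_s, 1.0 / MAX_HZ), 1.0 / MIN_HZ)

    def scanouts_per_frame(frame_time_s):
        # Below 30 fps, the previous frame is repeated until a new one arrives.
        return max(1, math.ceil(frame_time_s * MIN_HZ))

    for fps in (144, 90, 45, 30, 15):
        ft = 1.0 / fps
        print(f"{fps:>3} fps -> refresh every {refresh_interval(ft) * 1000:5.1f} ms, "
              f"{scanouts_per_frame(ft)} scanout(s) per frame")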
     
    Last edited: Oct 19, 2013
  15. john_

    john_

    Joined:
    Sep 6, 2013
    Messages:
    931 (0.83/day)
    Thanks Received:
    421
    Location:
    Athens, Greece
    I am all AMD and I hate NVIDIA and their lock-ins, but this really looks interesting, and I believe AMD could follow in the future with something similar (who doesn't like to sell more hardware?).
     
  16. Am*

    Am*

    Joined:
    Nov 1, 2011
    Messages:
    279 (0.16/day)
    Thanks Received:
    60
    No, it doesn't, especially since they're in NVIDIA's pockets, being paid to spread testimonials about useless "tech" like this, if you can even call it that.

    Give me one reason why it can't work the way I stated. The only thing NVIDIA would be required to do is modify Adaptive V-Sync to keep in step with the monitor's refresh rate, and there would be nothing stopping AMD from developing a similar method. Fifty bucks says DisplayPort or one of the other display standards bodies comes up with a non-proprietary way of doing this in the next 5 years.

    [IMG]

    Wait, what... $175 for more useless bullshit with a support list of 10-20 games that will never even exceed the refresh rate of a 120Hz monitor, unless they run at lower-than-console garbage settings, and 95% of the time you need to turn off the "exclusive" stuff you've paid out the ass for. :banghead: And people are supporting this... if NVIDIA (not directed at you, BTW) doesn't have the most braindead and loyal fanboys, I don't know what other company does (other than Apple).

    I would take those days over the price-fixing bullshit NVIDIA is playing now any day -- at least competition was fierce back then, and you still had various ways to run proprietary tech yourself (with Glide wrappers, etc.).
     
    Last edited: Oct 19, 2013
  17. 1d10t

    1d10t

    Joined:
    Sep 28, 2012
    Messages:
    206 (0.14/day)
    Thanks Received:
    44
    Remember the monitor-related technology nVidia introduced around 2010? Yep... 3D Vision, which required very expensive 3D shutter glasses, an emitter, and a monitor approved by nVidia. I got one pair of glasses, one emitter, and a Samsung 2233RZ 22-inch LCD for $800, not to mention another $350 for a GTX 470. As far as I remember, 3D only "worked" on specific nVidia cards and specific monitors. Later I learned nVidia had just adopted 3D FPR from LG and brought it to the desktop. For the same $1200 I switched to an HD 5850 + a 42-inch LG 240Hz and got the same effect. Meanwhile, a 3D Vision kit plus an Asus VG236H will cost you $650 and only works with high-end GTX cards, or you can grab a $250 LG D2343P-BN and pair it with any graphics card out there. Where is that "3D gaming is the future" now?

    Personally, I don't hate nVidia for their "innovation" or "breakthrough" technology. They push gaming to a new level, creating a better experience and better enjoyment. I just hate their prices and the so-called "proprietary" tech.

    A better DSP doesn't translate to better audio; it's only a small fragment. You need a better amplifier that can decode the DSP's digital signal, and better speakers to translate the analog signal from the amplifier.
    FCAT was surely anti-AMD, because AMD uses AFR rather than nVidia's SFR. Look at the bright side: AMD is now working on better drivers to address the frame pacing issues.
     
    NeoXF says thanks.
  18. NeoXF

    Joined:
    Sep 19, 2012
    Messages:
    615 (0.42/day)
    Thanks Received:
    80
    Ha! Probably the smartest comment in this thread so far. You deserve a medal.
    I know how hard I try to shut up about things I don't know much about yet.


    I know what you're saying, but:

    1. Kepler might not support it, but Maxwell or Volta at the very least could. GPUs are way more complex and adaptable creatures than they used to be back then... and even then:

    2. You could use wrappers to run Glide on non-3dfx hardware...

    Ah, I still remember how I finally got Carmageddon 2 to work with a Glide wrapper on nV hardware; a world of difference in graphics...
     
  19. 15th Warlock

    15th Warlock

    Joined:
    Aug 16, 2004
    Messages:
    3,002 (0.68/day)
    Thanks Received:
    1,580
    Location:
    Visalia, CA
    I really hope nVidia releases a version of this tech that works with existing monitors, something like an HDMI dongle between the monitor and the graphics card, and makes it hardware agnostic, please (though I don't see that last part happening...)

    Most PC gamers have invested a lot of moola in their monitors (myself included), and if nVidia really wants to see this tech take off, so to speak, they must make it available to as many customers as they can, not just people who invest in new monitors starting next year...
     
    10 Year Member at TPU
  20. Solidstate89

    Solidstate89

    Joined:
    May 29, 2012
    Messages:
    304 (0.19/day)
    Thanks Received:
    90
    Location:
    Western New York
    It won't drop below 30, you dunce. You could have at least spent 5 minutes reading about what it does. Clearly, none of you have.

    Are you intentionally playing dumb, or are you just not getting this? The refresh rate of the monitor is always 60Hz; there is no way right now to modulate it dynamically, on the fly. Period. End of story. V-Sync attempts to lock your frames to 60Hz so it doesn't induce screen tearing, because the monitor operates at 60Hz. You can change the refresh rate, but not dynamically. It doesn't work that way. But you can still get some screen tearing, and most importantly, you can get input lag, because the GPU is rendering frames faster than the monitor can handle.

    Adaptive V-Sync simply removes the restriction on frame rate when it drops below 60. So instead of hard-locking your game to either 60FPS or 30FPS, if it drops below 60 it simply reacts as if V-Sync weren't enabled, so it can run at 45FPS instead of being locked to intervals of 30. Again, this has nothing to do with the monitor and cannot control it. Why do you even think Adaptive V-Sync can change the monitor's refresh rate? How do you expect Adaptive V-Sync to change the monitor's refresh rate -- on the fly -- to 45Hz? It doesn't, and it physically can't. There is no standard that allows that to happen. There is no protocol that allows that to happen.

    Dynamically controlling the monitor's refresh rate is what G-Sync is. Maybe one of the consortiums can come up with a hardware-agnostic standard... at some point. We don't know when, or if any of the consortiums even care. So please, for the love of god, stop making baseless (and wholly inaccurate and ignorant) assumptions. There isn't a single monitor that works the way you describe these days, and no firmware update is going to change that. Ever.
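
    To make the distinction concrete, a rough sketch of when a finished frame actually reaches the screen under each scheme, assuming a 60Hz panel for the V-Sync modes and a 30-144Hz window for G-Sync (my numbers, illustrative only):

    Code:
    import math

    VBLANK = 1 / 60  # fixed 60 Hz panel refresh interval, in seconds

    def vsync(frame_time):
        # Classic V-Sync: the finished frame waits for the next vblank boundary.
        return math.ceil(frame_time / VBLANK) * VBLANK

    def adaptive_vsync(frame_time):
        # Adaptive V-Sync: synced to 60 Hz when fast enough, tears (shows
        # immediately) when the frame takes longer than one refresh.
        return max(frame_time, VBLANK)

    def gsync(frame_time, lo=1 / 144, hi=1 / 30):
        # Variable refresh: the panel waits for the frame, within its window.
        return min(max(frame_time, lo), hi)

    for ms in (10, 17, 22, 40):
        ft = ms / 1000
        print(f"{ms:>2} ms frame -> V-Sync {vsync(ft) * 1000:5.1f} ms, "
              f"adaptive {adaptive_vsync(ft) * 1000:5.1f} ms, "
              f"G-Sync {gsync(ft) * 1000:5.1f} ms")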
     
    Last edited: Oct 19, 2013
    qubit and The Von Matrices say thanks.
  21. haswrong

    Joined:
    Jun 11, 2013
    Messages:
    354 (0.29/day)
    Thanks Received:
    63
    Imagine a group of players playing at 120fps and a group struggling at 30-50fps... who's going to get more frags? The first group can react after 1/120th of a second, whereas the other only after 1/30th-1/50th... their reactions will be more than twice as slow. Where's the benefit for gamers? And let me repeat the crucial thing: NVIDIA wouldn't dare present this on a CRT monitor. And as for the Carmack link you posted: if you move your view so that the pixels need to refresh faster than once per 1/30th of a second, it sends a duplicate frame which doesn't contain updated scene information, such as a player leaning out of a corner. This can lead to up to four identical frames sent to you, which makes for up to a 4*1/30 ~ 133 millisecond later response on your side versus the 120fps guy. Is that clearer now? This technology is NVIDIA's sorry excuse for being too lazy to make graphics cards that can render 60+ fps at 1600p+ resolution. Nothing more, nothing less. So now you see why I'm upset. And they are able to sell this crap even to you, so smoothly. Unbelievable.



    Thanks. As soon as I realized what this technology is (and I realized it as soon as they started talking about lowering the refresh frequency of the monitor), I wasn't exactly in the mood to start searching for where the lowest acceptable limit is for those guys... I'm inclined to think they really have no limit if it boils down to getting money from you, lol!
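
    For what it's worth, the arithmetic in the post above as a quick sketch, taking its own worst-case assumption of four duplicated 1/30s frames at face value:

    Code:
    # Reaction windows at different frame rates, plus the post's worst case
    # of four duplicated frames at the 30 Hz floor. Illustrative only.
    for fps in (120, 50, 30):
        print(f"{fps:>3} fps -> new information every {1000 / fps:.1f} ms")

    duplicates = 4
    print(f"{duplicates} duplicated frames at 30 Hz -> "
          f"{duplicates * 1000 / 30:.0f} ms before the scene updates")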
     
    Last edited: Oct 19, 2013
  22. Solidstate89

    Solidstate89

    Joined:
    May 29, 2012
    Messages:
    304 (0.19/day)
    Thanks Received:
    90
    Location:
    Western New York
    Then maybe you shouldn't have made a hyperbolic, inaccurate statement about something you knew nothing about. You know, like most people would.

    I've never set up a water-cooling loop, so you don't see me going into sub-forums and threads about setting up WC loops and throwing around BS beliefs about something I clearly know nothing about.
     
    15th Warlock says thanks.
  23. 15th Warlock

    15th Warlock

    Joined:
    Aug 16, 2004
    Messages:
    3,002 (0.68/day)
    Thanks Received:
    1,580
    Location:
    Visalia, CA
    Yes! How can they offer more options to gamers?! What impertinence! Let's make sure people can only buy graphics cards and monitors capped at 30FPS so no one can have an unfair advantage; who needs more alternatives anyway?!

    Who do these people think they are, offering innovation in this field?? Let's boycott NVIDIA and burn all their engineers at the stake! And then they have the audacity to sell this technology and make a profit! How dare they?!

    /S :rolleyes:
     
    qubit says thanks.
    10 Year Member at TPU
  24. Frick

    Frick Fishfaced Nincompoop

    Joined:
    Feb 27, 2006
    Messages:
    13,494 (3.49/day)
    Thanks Received:
    4,401
    @haswrong: Isn't that how it is now though?
     
    10 Year Member at TPU
  25. haswrong

    Joined:
    Jun 11, 2013
    Messages:
    354 (0.29/day)
    Thanks Received:
    63
    NVIDIA said that gamers will love it... this sounds like notebook gamers will love it. Are you ready to ditch your $1300 monitor for an even more expensive one with a variable refresh rate, which doesn't make your response any faster?

    OK, I have to give you one thing: you can now watch your ass getting fragged without stutter :D



    Exactly... if I decide to please someone, I do it for free...

    Remember when your momma asked you to take out the trash back then? Did you ask for, like, $270 for the service? The whole world will soon be eaten by the false religion that is the economy; you just wait.
     
    NeoXF says thanks.
