
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Fair enough.
Not useless, but a walled garden... it's basically just a pile of bones.

If NV did this in the range of 140-85Hz, I'd call that a gamer's benefit :toast:
but dropping frames to what? 5fps@5Hz and calling it the holy grail????? :confused::twitch:
Gimme a f*cking break! :banghead:

I'm too old to jump at this bait, so I'm kinda happy I can't afford an NVIDIA G-card :laugh:
 
Can I upgrade it to a 780M?

 



Interesting... so G-Sync could be NVIDIA's mobile version of a lowest-end SKU graphics card that doesn't sell, or it might be a defective Tegra 4 from the NVIDIA Shield /sarcasm
 
As a red team guy, the hatred in this thread is ridiculous (on a related note, AMD setting up a 290X demo across the street from NV's event made me uneasy).

Name non-proprietary NV tech? Someone was blind on a VideoCardz thread as well. PhysX on the CPU is multiplatform and in many games + consoles! The new FCAT method runs anywhere, FXAA (although I want to say that's one guy), and how about helping devs add DX10 or 11 support? Not all NV-sponsored titles have locks on them, and AMD has done the same in helping Codemasters games be good-looking and efficient.

Sure, GPU PhysX and CUDA are very annoying, not doubting that, and it's not 'good guy NVIDIA', but many things start out proprietary to show that they work.

We should be pressuring VESA/DisplayPort/HDMI for an adaptive refresh technique like this :respect:

I don't get why NV doesn't license things out so that everyone can enjoy them instead of locking them down to their own platform (look at Blu-ray: it's a standard, just a royalty is paid, and it's not locked to Sony products).

Just 'cuz we didn't hit a perfect or open world doesn't mean we should destroy it; this is still better than deciding between Matrox + Rendition + 3dfx + whoever else all at once if you wanted to play various games in the '90s :banghead:

Thank you for this post. It seems like for every 1 neutral post on any AMD, NVIDIA, or Intel press release there are 2 from people who read only the first sentence of the article and immediately claim the technology is evil based on some argument that would be disproved if they had read the entire article. If only more people approached these topics from a neutral point of view and took the time to understand what they were criticizing, this forum would be full of intelligent discussion. I guess this is the internet and you can't expect more... (I always wished there was a "No Thanks" button on TPU to note irrelevant posts, but I can imagine how it would be abused.)

I support technological advancement, whether it is NVIDIA, AMD, Intel, or some other company, and G-Sync looks promising. This is the type of disruptive technology, like AMD's Eyefinity, where no one saw it coming, and it will take a few years before everyone else catches up.
 
Seriously, AMD is the one suffering from a colossal crapload of dropped frames, tearing, frame interleaving, and micro-stuttering in their CF systems; that's why they are trying to mitigate the issue with CF over PCIe.

TrueAudio doesn't mean squat, and the R9 290X is not really the answer to Titan at all; you will see that when the review comes out. The only real advantage for AMD now is Mantle, but it remains to be seen whether they will really get it to shine or not.

On the other hand, NVIDIA has always been the brute force of advancement in the PC space; AMD just plays catch-up, and today's announcements just cement that idea.

NVIDIA was the first to introduce the GPU, SLI, frame pacing (FCAT), GPU Boost, and Adaptive V-Sync; first to reach a unified shader architecture on PCs; first with GPGPU, CUDA, PhysX, and OptiX (for real-time ray tracing); first with 3D gaming (3D Vision); first with Optimus for mobile flexibility; first with the GeForce Experience program, SLI profiles, ShadowPlay (for recording), and game streaming.

They had better support with CSAA, FXAA, TXAA, driver-side ambient occlusion and HBAO+, the TWIMTBP program, better Linux and OpenGL support... and now G-Sync! All of these innovations are sustained and built upon to this day.

And even when AMD beat them to a certain invention, like Eyefinity, NVIDIA didn't stop until they topped it with more features, so they answered with Surround, then 3D Surround, and now 4K Surround.

NVIDIA was at the forefront in developing all of these technologies and continues to sustain and expand them to this day. AMD just follows suit; they fight back with stuff they don't really sustain, so it ends up forgotten and abandoned, even generating more trouble than it's worth. Just look at the pathetic state of their own Eyefinity and all of its CF frame problems.

In short, AMD is the one feeling the heat, NOT NVIDIA. Heck, NVIDIA now fights two generations of AMD cards with only one generation of their own: Fermi held off the HD 5870 and 6970, Kepler held off the 7970 and the 290X! And who knows about Maxwell!

Audio does matter, better audio = better experience.

Yeah, NV was first with SLI, but not the first to do it "right". FCAT was just an anti-AMD tool to bottleneck AMD sales. PhysX wasn't even NVIDIA's technology; they just bought it for better marketing against ATI. Better Linux support? Read this and cry. OpenGL and OpenCL are pretty much AMD territory at the moment.

Then what do you mean by NVIDIA fighting two AMD generations with only one generation? GF10x (gen 1), GF11x (gen 2). Maxwell is far, FAR away (late 2014), and by then AMD will be hitting their next generation as well.
 
Audio does matter, better audio = better experience.

No argument there. How much TrueAudio actually improves audio is still up for debate until AMD releases drivers for it.

Yeah, NV was first with SLI, but not the first to do it "right".

What in the world does this even mean?

FCAT was just an anti-AMD tool to bottleneck AMD sales.

So you're arguing that we shouldn't have any way to measure frame pacing just because AMD couldn't do it properly?

PhysX wasn't even NVIDIA's technology; they just bought it for better marketing against ATI.

PhysX was doomed to fail even before NVIDIA took it over. There was no way dedicated physics accelerator cards were going to take off when dedicated sound cards had already died out in the years prior.

Better Linux support? Read this and cry. OpenGL and OpenCL are pretty much AMD territory at the moment.

I don't know enough about these to make a comment.

Then what do you mean by NVIDIA fighting two AMD generations with only one generation? GF10x (gen 1), GF11x (gen 2). Maxwell is far, FAR away (late 2014), and by then AMD will be hitting their next generation as well.

Maxwell is speculated to launch on 28nm in early 2014 and then move to 20nm in late 2014, according to reports. And what does it matter how many generations each manufacturer puts out? All that matters is performance/price, which is why I don't care one bit about the rebrands that both sides do, as long as the rebrands push pricing down.
 
Interesting... so G-Sync could be NVIDIA's mobile version of a lowest-end SKU graphics card that doesn't sell, or it might be a defective Tegra 4 from the NVIDIA Shield /sarcasm

I actually wasn't kidding when I said

I was wondering what NVIDIA was going to do with all the unsold Tegra 4 chips.

Besides, NVIDIA users don't have stuttering or tearing... right, guys???


It reminded me of Tegra 3 short board modules. If they were full boards, they'd be almost twins.

[image: Kontron ULP COM module]


[image: Apalis ARM SoM T30 board]


NVIDIA GeForce GT 630M
[image: GeForce GT 630M module]


It's at the lower end for sure, size-wise.
 
You clearly have no idea what the point of Adaptive V-Sync is. It's to keep your frame rate from dropping to 30 if it ever falls below 60 (which is what regular V-Sync does, in intervals of 30). Basically, it was a way to get the best of both worlds.

Clearly neither do you, then. Adaptive V-Sync is there to:
a) remove image tearing
b) avoid roughly halving the framerate when FPS drops below a certain level, like normal V-Sync does

So, why do we need G-Sync to remove image tearing again?

Please link me to where it says it won't, 'cause I'm reaaaaaly curious just where the BS springs from. And even if it wouldn't, AMD is in a position to slipstream a DX11 HLSL port into pretty much every multi-platform game coming out in the next half-decade or so, so having a 75% chance that, after a certain point, multi-platform next-generation games can and will be written in Mantle pretty much cements them into API relevancy, w/ or w/o a leading share in the market.

Name one NVIDIA-branded tech that ISN'T proprietary.

Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3, and NVIDIA to support Glide back in its day. Natively.

High-level APIs work that way (but are slower, e.g. Direct3D); low-level ones don't (and are faster as a result, e.g. Mantle or Glide).
 
Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3, and NVIDIA to support Glide back in its day. Natively.

High-level APIs work that way (but are slower, e.g. Direct3D); low-level ones don't (and are faster as a result, e.g. Mantle or Glide).

Anyone remember Glide wrappers?

Back then it was possible, but nowadays it'd be a lawsuit frenzy. That's why there are no CUDA/PhysX wrappers. Slower and buggier, but at least you weren't locked out of the API entirely.

Nvidia G-Sync FAQ

Q: Does NVIDIA G-SYNC work for all games?

A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.

Boo!!!


Q: Does G-SYNC work with FCAT?

A: FCAT requires a video capture card to catch the output graphics stream going to a monitor. Since G-SYNC is DP only and G-SYNC manipulates DP in new ways, it is very unlikely that existing capture cards will work. Fortunately, FRAPS is now an accurate reflection of the performance of a G-SYNC enabled system, and FRAPS output can be directly read by FCAT for comparative processing.

:rolleyes:

Q: How much more does G-SYNC add to the cost of a monitor?

A: The NVIDIA G-SYNC Do-it-yourself kit will cost approximately $175.
 
This looked like an interesting thing to try, as I just bought the VG248QE a while ago, but it looks like it'll cost a kidney and might not be available for me :ohwell: will see when it's released,
and it looks like I might need a new system too, with a Kepler card :banghead:
 
@Xzibit
A Glide wrapper means emulation. Emulation means slow. How exactly does that solve anything? Glide wrappers exist because you could play games that would then look better. Besides, being able to emulate Voodoo cards doesn't mean you can emulate modern stuff today. Stuff was rather simple back then... Vertex shaders had software emulation, but it was slower. Pixel shaders were not even possible to emulate without getting 1 frame per minute... and neither was available on any Voodoo card...

Mantle is a performance-only thing, and if you negate the boost with emulation, wouldn't it be easier to just stay with Direct3D 11 and remain at square one?
 
You guys make my hatred for booth babes and Bethsoft seem rational and levelheaded (which it totally is, btw). :laugh:

Anyway, wasn't qubit going on about something like this some time ago?
 
OK, what the hell does this have to do with Glide wrappers?

Say this like you are teaching it to people in a class, not like you are rabid fandogs or salespeople.
 
Instead of braving 60+ fps, they aim for 30fps@30Hz hahahahahaha :roll: :slap: :nutkick:

Well, don't try that with a CRT monitor or you'll end up in an epileptic convulsion in no time :rockout:

This is coming to 144Hz monitors too, meaning they aim to sync everything between 30 and 144fps on a 144Hz panel (the refresh rate varies with fps). You obviously didn't read a thing about this; instead you came in raging like any typical NVIDIA hater would.

No matter what this company releases, it is always going to get this same hatred. It doesn't matter what the greatest minds of game development think.

The G-SYNC board will support LightBoost too.
http://www.neogaf.com/forum/showpost.php?p=86572603&postcount=539

edit:
30Hz is the minimum; refresh is variable between 30Hz and 144Hz (the obvious max limit)

https://twitter.com/ID_AA_Carmack/status/391300283672051713
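To make that 30-144Hz range concrete, here's a rough sketch of how a variable-refresh policy like this could behave. This is my own simplified model, not NVIDIA's actual implementation; the only figures taken from the announcement are the 30Hz floor and the 144Hz panel maximum.

```python
# Simplified model of a variable-refresh policy (not NVIDIA's actual algorithm):
# the panel refreshes whenever the GPU finishes a frame, with the interval
# clamped to the panel's supported range (30-144 Hz per the figures above).

MIN_HZ, MAX_HZ = 30.0, 144.0
MIN_INTERVAL = 1.0 / MAX_HZ   # ~6.9 ms, fastest the panel can scan out
MAX_INTERVAL = 1.0 / MIN_HZ   # ~33.3 ms, beyond this the last frame repeats

def refresh_interval(frame_time_s: float) -> float:
    """Map the GPU's frame time to the interval the panel actually uses."""
    return min(max(frame_time_s, MIN_INTERVAL), MAX_INTERVAL)

for fps in (144, 100, 60, 45, 20):
    interval = refresh_interval(1.0 / fps)
    print(f"{fps:>3} fps rendered -> refresh every {interval * 1000:5.1f} ms "
          f"({1.0 / interval:.0f} Hz)")
```

At 45fps the panel would simply run at 45Hz instead of being forced onto the 60/30 steps of V-Sync, and below 30fps the previous frame gets repeated, which is where the 30Hz floor in Carmack's tweet comes from.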
 
I am all AMD and I hate NVIDIA and their lock-ins, but this really looks interesting, and I believe AMD could follow in the future with something similar (who doesn't like to sell more hardware?).
 
This is coming to 144Hz monitors too, meaning they aim to sync everything between 30 and 144fps on a 144Hz panel (the refresh rate varies with fps). You obviously didn't read a thing about this; instead you came in raging like any typical NVIDIA hater would.

No matter what this company releases, it is always going to get this same hatred. It doesn't matter what the greatest minds of game development think.

https://twitter.com/ID_AA_Carmack/status/391300283672051713

No, it doesn't, especially since they're in NVIDIA's pockets, being paid to spread testimonials about useless "tech" like this, if you can even call it that.

Really? You think bilateral communication, with the monitor able to control the frame rate delivery of the GPU so it's completely in sync with the monitor, can just be easily implemented via a firmware update?

The official white paper hasn't even been released yet and you have the gall to make an inaccurate and ballsy statement like that. What standard do you think it can already use? There isn't a display protocol that is capable of doing that. From what I've read on AnandTech, it's a modification of the DisplayPort standard - meaning non-standard - meaning it can't be implemented on ANY of today's monitors without that specific hardware bundle.

Give me one reason why it can't work in the way I stated. The only thing NVIDIA would be required to do is modify Adaptive V-Sync to keep in check with the monitor's refresh rate, and there would be nothing stopping AMD from developing a similar method. Fifty bucks says DisplayPort or one of the other display standards bodies comes up with a non-proprietary way of doing this in the next 5 years.

Nvidia G-Sync FAQ

Q: How much more does G-SYNC add to the cost of a monitor?

A: The NVIDIA G-SYNC Do-it-yourself kit will cost approximately $175.

Q: Does NVIDIA G-SYNC work for all games?

A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.



Wait, what... $175 for more useless bullshit with a support list of 10-20 games that will never even exceed the refresh rate of a 120Hz monitor unless they run at settings lower than a console's garbage, and 95% of the time needing to turn off the "exclusive" shit you've paid out the ass for. :banghead: And people are supporting this shit... if NVIDIA (not directed at you, BTW) doesn't have the most braindead and loyal fanboys, I don't know what other company does (other than Apple).

Just 'cuz we didn't hit a perfect or open world doesn't mean we should destroy it; this is still better than deciding between Matrox + Rendition + 3dfx + whoever else all at once if you wanted to play various games in the '90s :banghead:

I would take those days over the price-fixing bullshit NVIDIA is playing now any day -- at least competition was fierce back then, and you still had various ways to run proprietary tech yourself (with Glide wrappers etc.).
 
I support technological advancement, whether it is NVIDIA, AMD, Intel, or some other company, and G-Sync looks promising. This is the type of disruptive technology, like AMD's Eyefinity, where no one saw it coming, and it will take a few years before everyone else catches up.

Do you remember the monitor-related technology NVIDIA introduced around 2010? Yep... 3D Vision, which required very expensive 3D shutter glasses, an emitter, and a monitor approved by NVIDIA. I got one pair of glasses, one emitter, and a Samsung 223RZ 22-inch LCD for the price of $800, not to mention another $350 for a GTX 470. As far as I remember, 3D only "worked" on specific NVIDIA cards and specific monitors. Later I learned NVIDIA just adopted 3D FPR from LG and brought it to the desktop. For the same $1200 I switched to an HD 5850 + a 42-inch LG 240Hz set and had the same effect. Meanwhile, a 3D Vision kit plus an Asus VG236H will cost you $650 and only works with a high-end GTX, or you can grab a $250 LG D2343P-BN and pair it with every graphics card out there. Where is that "3D gaming is the future" now?

Personally, I don't hate NVIDIA for their "innovation" or "breakthrough" technology. They push gaming to a new level, creating a better experience and better enjoyment. I just hate their prices and the so-called "proprietary" tech.

Audio does matter, better audio = better experience.
Yeah, NV was first with SLI, but not the first to do it "right". FCAT was just an anti-AMD tool to bottleneck AMD sales.

A better DSP doesn't automatically translate to better audio; it's only a small part of the chain. You need a better amplifier to decode the DSP's digital signal, and better speakers to reproduce the analog signal from the amplifier.
FCAT was surely anti-AMD, because AMD uses AFR rather than NVIDIA's SFR. Look at the bright side: AMD is now working on better drivers to address the frame pacing issues.
 
I don't know enough about these to make a comment

Ha! Probably the smartest comment in this thread so far. You deserve a medal.
I know how hard I try to shut up about things I don't know much about yet.


Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3, and NVIDIA to support Glide back in its day. Natively.

I know what you're saying, but:

1. Kepler might not support it, but Maxwell or Volta at the very least could. GPUs are way more complex and adaptable creatures than they used to be back then... and even then:

2. You could use wrappers to run Glide on non-3dfx hardware...

Ah, I still remember how I finally got Carmageddon 2 to work with a Glide wrapper on NV hardware; a world of difference in graphics...
 
I really hope NVIDIA releases a version of this tech that works with existing monitors, something like an HDMI dongle between the monitor and the graphics card, and makes it hardware-agnostic please (though I don't see that last part happening...)

Most PC gamers have invested a lot of moola in their monitors (myself included), and if NVIDIA really wants to see this tech take off, so to speak, they must make it available to as many customers as they can, not just people who invest in new monitors starting next year...
 
If NV did this in the range of 140-85Hz, I'd call that a gamer's benefit :toast:
but dropping frames to what? 5fps@5Hz and calling it the holy grail????? :confused::twitch:
Gimme a f*cking break! :banghead:

I'm too old to jump at this bait, so I'm kinda happy I can't afford an NVIDIA G-card :laugh:

It won't drop below 30, you dunce. You could have at least spent 5 minutes reading what it does. Clearly, none of you have.

Give me one reason why it can't work in the way I stated. The only thing NVIDIA would be required to do is modify Adaptive V-Sync to keep in check with the monitor's refresh rate, and there would be nothing stopping AMD from developing a similar method. Fifty bucks says DisplayPort or one of the other display standards bodies comes up with a non-proprietary way of doing this in the next 5 years.

Are you, like, intentionally playing dumb or are you just not getting this? The refresh rate of the monitor will always be 60Hz; there is no way right now to modulate it dynamically, on the fly. Period. End of story. V-Sync attempts to lock your frames to 60Hz so it doesn't induce screen tearing, because the monitor operates at 60Hz. You can change it, but not dynamically. It doesn't work that way. But you can still get some screen tearing, and most importantly, you can get input lag because the GPU is rendering frames faster than the monitor can handle.

Adaptive V-Sync simply removes the restriction on frame rates when they drop below 60. So instead of hard-locking your game to either 60FPS or 30FPS, if it drops below 60 it simply reacts as if V-Sync isn't enabled, so it can run at 45FPS instead of being locked to intervals of 30. Again, this has nothing to do with the monitor and cannot control it. Why do you even think Adaptive V-Sync can change the monitor's refresh rate? How do you expect Adaptive V-Sync to change the monitor's refresh rate - on the fly - to 45Hz? It doesn't, and it physically can't. There is no standard that allows that to happen. There is no protocol that allows that to happen.

That is what G-Sync is. Maybe one of the consortiums can come up with a hardware-agnostic standard... at some point. We don't know when, or if any of the consortiums even care. So please, for the love of god, stop making baseless (and wholly inaccurate and ignorant) assumptions. There isn't a single freaking monitor that works the way you describe these days, and no firmware update is going to change that. Ever.
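If it helps, here's a toy comparison of the three behaviours described above on a 60Hz panel. This is a deliberately simplified sketch of the explanation in this post, not the drivers' real logic:

```python
# Toy models of the three sync modes discussed above, for a 60 Hz panel.
# Deliberately simplified to illustrate the explanation, not real driver logic.

REFRESH_HZ = 60

def vsync_fps(render_fps: int) -> float:
    """Classic double-buffered V-Sync: every frame waits for a refresh, so the
    effective rate snaps to 60, 30, 20, ... (60 / whole number of refreshes)."""
    refreshes_per_frame = -(-REFRESH_HZ // render_fps)  # ceiling division
    return REFRESH_HZ / refreshes_per_frame

def adaptive_vsync_fps(render_fps: int) -> float:
    """Adaptive V-Sync: sync (and the 60 cap) only at or above 60 fps; below
    that it behaves as if V-Sync were off, so 45 fps stays 45 fps (with tearing)."""
    return REFRESH_HZ if render_fps >= REFRESH_HZ else render_fps

def gsync_style_fps(render_fps: int) -> float:
    """G-Sync-style: the panel refreshes when the frame is ready, within limits."""
    return min(max(render_fps, 30), REFRESH_HZ)

for fps in (75, 60, 45, 35):
    print(f"render {fps:>2} fps | v-sync {vsync_fps(fps):>4.0f} | "
          f"adaptive {adaptive_vsync_fps(fps):>2.0f} | "
          f"g-sync-style {gsync_style_fps(fps):>2.0f}")
```

The point of the comparison: at 45fps, classic V-Sync forces you down to 30, Adaptive V-Sync lets you keep 45 but the tearing comes back, and a G-Sync-style display keeps 45 with no tearing because the panel itself follows the GPU.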
 
edit: 30Hz is the minimum; refresh is variable between 30Hz and 144Hz (the obvious max limit)
https://twitter.com/ID_AA_Carmack/status/391300283672051713

Imagine a group of players playing at 120fps and a group struggling at 30-50fps... who's going to get more frags? The first group can react after 1/120th of a second, whereas the others only after 1/30th to 1/50th; their reactions will be twice as slow. Where's the benefit for gamers? And let me repeat the crucial thing: NVIDIA wouldn't dare to present this on a CRT monitor. And since you posted the Carmack link: if you move your view so that the pixels need to be refreshed faster than every 1/30th of a second, it sends a duplicate frame which doesn't contain updated information in the scene, aka a player leaning out of the corner. This can lead to up to four identical frames sent to you, which makes for up to 4 × 1/30 s ≈ 133 milliseconds later response on your side versus the 120fps guy. Is that clearer now? This technology is NVIDIA's sorry excuse for being too lazy to make graphics cards that are able to render 60+ fps at 1600p+ resolution. Nothing more, nothing less. So now you see why I'm upset. And they are able to sell this crap even to you so smoothly. Unbelievable.
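(For reference, here is the arithmetic behind that worst-case figure spelled out, taking the post's own assumptions of four duplicate frames at the 30Hz floor versus a steady 120fps player; it is an illustration of the claim, not a measurement.)

```python
# The worst-case numbers from the paragraph above, spelled out.
# Assumes four consecutive duplicate frames at the 30 Hz floor versus a steady
# 120 fps player -- these are the post's assumptions, not measured figures.

floor_hz = 30
duplicate_frames = 4
slow_delay_ms = duplicate_frames * (1000 / floor_hz)  # 4 * 33.3 ms
fast_delay_ms = 1000 / 120                            # one 120 Hz refresh

print(f"worst case at {floor_hz} Hz floor: {slow_delay_ms:.0f} ms")  # ~133 ms
print(f"single frame at 120 Hz: {fast_delay_ms:.1f} ms")             # ~8.3 ms
```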



It won't drop below 30, you dunce. You could have at least spent 5 minutes reading what it does. Clearly, none of you have.

Thanks. As soon as I realized what this technology is, and I realized it as soon as they started talking about lowering the refresh frequency of the monitor, I wasn't exactly in the mood to start searching for where the lowest acceptable limit is for those guys... I'm inclined to think they really have no limit if it boils down to getting money from you, lol!
 
Then maybe you shouldn't have made a hyperbolic inaccurate statement on something you knew nothing about. You know, like most people would.

I've never set up a WC loop, so you don't see me going into sub-forums and threads about setting up WC loops and throwing around BS beliefs on something I clearly know nothing about.
 
Imagine a group of players playing at 120fps and a group struggling at 30-50fps... who's going to get more frags? The first group can react after 1/120th of a second, whereas the others only after 1/30th to 1/50th; their reactions will be twice as slow. Where's the benefit for gamers? And let me repeat the crucial thing: NVIDIA wouldn't dare to present this on a CRT monitor. And since you posted the Carmack link: if you move your view so that the pixels need to be refreshed faster than every 1/30th of a second, it sends a duplicate frame which doesn't contain updated information in the scene, aka a player leaning out of the corner. This can lead to up to four identical frames sent to you, which makes for up to 4 × 1/30 s ≈ 133 milliseconds later response on your side versus the 120fps guy. Is that clearer now? This technology is NVIDIA's sorry excuse for being too lazy to make graphics cards that are able to render 60+ fps at 1600p+ resolution. Nothing more, nothing less. So now you see why I'm upset. And they are able to sell this crap even to you so smoothly. Unbelievable.

Yes! How can they offer more options to gamers?! What impertinence! Let's make sure that people can only buy graphics cards and monitors that are capped at 30FPS so no one can have an unfair advantage; who needs more alternatives anyway?!!

Who do these people think they are, offering innovation in this field?? Let's boycott NVIDIA and burn all their engineers at the stake! And then to have the audacity to sell this technology and make a profit! How dare they?!!

/S :rolleyes:
 
@haswrong: Isn't that how it is now though?
 
@haswrong: Isn't that how it is now though?

NVIDIA said that gamers will love it... this sounds like notebook gamers will love it... are you ready to ditch your $1300 monitor for an even more expensive one with a variable refresh rate, which doesn't make your response any faster?

OK, I have to give you one thing: you can now watch your ass being fragged without stutter :D



...the audacity to sell this technology and make a profit! How dare they?!!
/S :rolleyes:

Exactly... if I decide to please someone, I do it for free...

Remember when your momma asked you to take out the trash back then? Did you ask for like $270 for the service? The whole world will soon be eaten by the false religion that is the economy, you just wait.
 