Friday, October 18th 2013

NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Solving the decades-old problem of onscreen tearing, stuttering and lag, NVIDIA today unveiled NVIDIA G-SYNC technology which, for the first time, enables perfect synchronization between the GPU and the display. The result is consistently smooth frame rates and ultrafast response not possible with previous display technologies.

Several years in the making, G-SYNC technology synchronizes the monitor's refresh rate to the GPU's render rate, so images display the moment they are rendered. Scenes appear instantly. Objects are sharper. Game play is smoother.
"Our commitment to create a pure gaming experience led us to G-SYNC," said Jeff Fisher, senior vice president of the GeForce business unit at NVIDIA. "This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-SYNC monitor, you'll never want to go back."

Since their earliest days, displays have had fixed refresh rates -- typically 60 times a second (60 Hz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. Because the GPU's output is not synchronized with the display's fixed refresh, persistent tearing occurs. Turning on V-SYNC (vertical synchronization) eliminates tearing, but at the cost of increased lag and stutter, because the GPU is forced to wait for the monitor's fixed refresh cycle.

G-SYNC eliminates this tradeoff. It perfectly syncs the monitor to the GPU, regardless of frame rate, leading to uncompromised PC gaming experiences.
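As a rough illustration of why a fixed refresh produces tearing, here is a small Python sketch (a toy model with made-up timings, not NVIDIA's implementation): on a fixed 60 Hz scanout, a buffer flip that lands mid-refresh splits the displayed image between an old and a new frame, while a variable-refresh display simply starts its scanout when the frame is ready.

```python
# Toy model: where does a buffer flip land relative to a fixed 60 Hz scanout?
# A flip that arrives mid-scanout produces a visible tear line; G-SYNC-style
# variable refresh instead starts the scanout when the frame is ready.
# All timings below are illustrative assumptions.

SCANOUT_MS = 1000.0 / 60        # a fixed 60 Hz panel redraws every ~16.7 ms

def tear_positions(flip_times_ms):
    """For each buffer flip (no V-SYNC, fixed refresh), report how far down
    the screen the scanout already was when the flip arrived, i.e. roughly
    where the tear line appears (0.0 = top of screen, near 1.0 = bottom)."""
    return [round((t % SCANOUT_MS) / SCANOUT_MS, 2) for t in flip_times_ms]

if __name__ == "__main__":
    # Irregular frame completion times (ms), as a game might produce them.
    flips = [10.0, 22.0, 41.0, 55.0, 75.0, 83.0]
    print("fixed 60 Hz, no V-SYNC -> tear line at fraction of screen height:")
    print(dict(zip(flips, tear_positions(flips))))
    # With V-SYNC the flip waits for the next scanout boundary (no tear, but
    # added lag/stutter); with a variable-refresh display the scanout itself
    # waits for the flip, so the image is always drawn from a single frame.
```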

G-SYNC technology includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. (A full list of GPUs that support NVIDIA G-SYNC is available here.)

Leading Game Developers Blown Away
Game developers have quickly embraced the benefits of G-SYNC technology, which enables their games to be played seamlessly.

"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better." - Tim Sweeney, founder, EPIC Games

"NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!" - Johan Andersson, technical director, DICE

"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter." - John Carmack, co-founder, iD Software

Rollout Plans by Monitor Manufacturers
Many of the industry's leading monitor manufacturers have already included G-SYNC technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic.

"ASUS strives to provide the best gaming experience through leading innovations. We are excited about offering NVIDIA's new G-SYNC technology in a variety of ASUS gaming monitors. We are certain that it will impress gamers with its incredible step-up in smoothness and visual quality." - Vincent Chiou, associate vice president, Display Business Unit, ASUS

"We are extremely thrilled to build G-SYNC into our professional gaming monitors. The two together, offering gamers a significant competitive advantage, will certainly take PC gaming experience to a whole new level." - Peter Chen, general manager, BenQ Technology Product Center

"We can't wait to start offering Philips monitors with G-SYNC technology specifically for gamers. We believe that anyone who really cares about their gaming experience is going to want one." - Sean Shih, global product marketing director, TPV Technology, (TPV sells Philips brand monitors)

"Everyone here at ViewSonic is pumped about the great visual experience that G-SYNC delivers. We look forward to making the most of this technology with our award-winning line of gaming and professional displays." - Jeff Volpe, president, ViewSonic

Enthusiasm by System Builders and Integrators
A variety of the industry's leading system builders and integrators are planning to make G-SYNC technology available in the months ahead. Among them are Digital Storm, EVGA, Falcon Northwest, Overlord and Scan Computers.

"A look at G-SYNC is a look at the future of displays. Having to go back to a standard monitor after seeing it will start to annoy you. It's that good." -Kelt Reeves, founder and CEO, Falcon Northwest.

"G-SYNC is a ground-breaking technology that will deliver a flawless gaming experience. We're hugely excited that our customers will be able to enjoy incredible gameplay at any frame rate without tearing or stutter." - Elan Raja III, co-founder, Scan Computers.

147 Comments on NVIDIA Introduces G-SYNC Technology for Gaming Monitors

#53
The Von Matrices
kn00tcn: as a red team guy, the hatred in this thread is ridiculous (on a related note, amd setting up a 290x demo across the street from nv's event made me uneasy)

name non proprietary nv tech? someone was blind on a videocardz thread as well, physx on the cpu is multiplatform & in many games+consoles! the new FCAT method runs anywhere, FXAA (although i want to say that's one guy), how about helping devs add DX10 or 11 support? not all nv sponsored titles have locks on them, amd has done the same in helping codemasters games be good looking & efficient

sure GPU physx & cuda are very annoying, not doubting that, it's not 'good guy nvidia', but many things start out proprietary to show that they work

we should be pressuring VESA/displayport/hdmi for an adaptive refresh technique like this :respect:

i dont get why nv doesnt license things out so that everyone can enjoy instead of locking down to only their platform (look at bluray, it's standard, just a royalty is paid, it's not locked to sony products)

just cuz we didnt hit a perfect or open world doesnt mean we should destroy it, this is still better than deciding between matrox+rendition+3dfx+whoever else all at once if you want to play various games in the 90s :banghead:
Thank you for this post. It seems like for every 1 neutral post on any AMD, NVidia or Intel press release there are 2 from people who read only the first sentence of the article and immediately claim the technology is evil based on some argument that would be disproved if they had read the entire article. If only more people approached these topics from a neutral point of view and took the time to understand what they were criticizing, this forum would be full of intelligent discussion. I guess this is the internet and you can't expect more... (I always wished there was a "No Thanks" button on TPU to note irrelevant posts, but I can imagine how it would be abused).

I support technological advancement, whether it is NVidia, AMD, Intel, or some other company, and G-sync looks promising. This is the type of disruptive technology, like AMD's Eyefinity, that no one saw coming, and it will take a few years before everyone else catches up.
#54
ItsKlausMF
MuhammedAbdo: Seriously, AMD is the one suffering from a colossal crap load of dropped frames, tearing, frame interleaving and micro-stuttering in their CF systems; that's why they are trying to mitigate the issue with CF over PCIe.

TrueAudio doesn't mean squat, and the R9 290X is not really the answer to Titan at all; you will see that when the review comes out. The only real advantage for AMD now is Mantle, but it remains to be seen whether they will really get it to shine or not.

On the other hand, NVIDIA has always been the brute force of advancement in the PC space, AMD just plays catch up, and today's announcements just cement that idea.

NVIDIA was the first to introduce the GPU, SLI, frame pacing (FCAT), GPU Boost, Adaptive V-Sync, first to reach a unified shader architecture on PCs, first with GPGPU, CUDA, PhysX, OptiX (for real-time ray tracing), first with 3D gaming (3D Vision), first with Optimus for mobile flexibility, first with the GeForce Experience program, SLI profiles, ShadowPlay (for recording) and game streaming.

They had better support with CSAA, FXAA, TXAA, driver-side Ambient Occlusion and HBAO+, the TWIMTBP program, better Linux and OpenGL support... and now G-SYNC! All of these innovations are sustained and built upon to this day.

And even when AMD beat them to a certain invention, like Eyefinity, NVIDIA didn't stop until they topped it with more features, so they answered with Surround, then did 3D Surround and now 4K Surround.

NVIDIA was at the forefront in developing all of these technologies and they continue to sustain and expand them to this day; AMD just follows suit. They fight back with stuff that they don't really sustain, so it ends up forgotten and abandoned, even generating more trouble than it's worth. Just look at the pathetic state of their own Eyefinity and all of its CF frame problems.

In short, AMD is the one feeling the heat, NOT NVIDIA. Heck, NVIDIA now fights two generations of AMD cards with only one generation of their own: Fermi held off the HD 5870 and 6970, Kepler held off the 7970 and 290X! And who knows about Maxwell!
Audio does matter, better audio = better experience.

Yeah, nv was first with SLI, but not the first to do it "right". FCAT was just an anti-AMD tool to bottleneck AMD sales. PhysX wasn't even nvidia's technology, they just bought it to do better marketing than ATI. Better Linux support? Read this and cry. OpenGL and OpenCL are pretty much AMD territory at this moment.

Then what do you mean by nvidia fighting two generations of AMD cards with only one generation of their own? GF10* (gen 1), GF11* (gen 2). Maxwell is far, FAR away (late 2014) and by then AMD will be hitting their next generation as well.
#55
The Von Matrices
ItsKlausMF: Audio does matter, better audio = better experience.
No argument there. How much TrueAudio actually improves audio is still up for debate until AMD releases drivers for it.
ItsKlausMF: Yeah, nv was first with SLI, but not the first to do it "right".
What in the world does this even mean?
ItsKlausMF: FCAT was just an anti-AMD tool to bottleneck AMD sales.
So you're arguing that we shouldn't have any way to measure frame pacing just because AMD couldn't do it properly?
ItsKlausMF: PhysX wasn't even nvidia's technology, they just bought it to do better marketing than ATI.
PhysX was doomed to fail even before NVidia took it over. There was no way dedicated physics accelerator cards were going to take off, given that dedicated sound cards had died out in the years prior.
ItsKlausMF: Better Linux support? Read this and cry. OpenGL and OpenCL are pretty much AMD territory at this moment.
I don't know enough about these to make a comment.
ItsKlausMF: Then what do you mean by nvidia fighting two generations of AMD cards with only one generation of their own? GF10* (gen 1), GF11* (gen 2). Maxwell is far, FAR away (late 2014) and by then AMD will be hitting their next generation as well.
Maxwell is speculated to launch on 28nm in early 2014 and then move to 20nm in late 2014, according to reports. And what does it matter how many generations each manufacturer puts out? All that matters is performance/price, which is why I don't care one bit about the rebrands that both sides do, as long as the rebrands push pricing down.
#56
Xzibit
1d10t: interesting... so G-Sync could be nVidia's mobile version of a lowest-end SKU graphics card that doesn't sell, or it might be a defective Tegra 4 from the nVidia Shield /sarcasm
I actually wasn't kidding when I said
Xzibit: I was wondering what Nvidia was going to do with all the unsold Tegra 4 chips

Besides, Nvidia users don't have stuttering or tearing... right guys???
It reminded me of the Tegra 3 short board modules. If they were full boards they'd be almost twins.

[Image: NVIDIA GeForce GT 630M]

It's at the lower end for sure, size-wise.
#57
RejZoR
Solidstate89: You clearly have no idea what the point of Adaptive V-sync is. It's to keep your frame rate from reducing to 30 if it ever drops below 60 (which is what regular V-Sync does, in intervals of 30). Basically it was a way to get the best of both worlds.
Clearly neither do you then. Adaptive V-Sync is there to:
a) remove image tearing
b) avoid pretty much halving the framerate when FPS drops below a certain level, like normal V-Sync does

So, why do we need G-Sync to remove image tearing again?
NeoXF: Please link me to where it says it won't, cause I'm reaaaaaly curious just where the BS spring flows from. And even if it wouldn't, AMD is in the position to slipstream a DX11 HLSL port into pretty much every multi-platform game coming out in the next half-decade or so, so having a 75% chance that, after a certain point, multi-platform next-generation games can and will be written in Mantle pretty much cements them into API relevancy, w/ or w/o a leading share in the market.

Name one nVidia branded tech that ISN'T proprietary.
Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.

High-level APIs work that way (but are slower, aka Direct3D), low-level ones don't (and are faster as a result, aka Mantle or Glide).
#58
Xzibit
RejZoR: Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.

High-level APIs work that way (but are slower, aka Direct3D), low-level ones don't (and are faster as a result, aka Mantle or Glide).
Anyone remember Glide wrappers?

Back then it was possible, but nowadays it would be a lawsuit frenzy. That's why there are no CUDA/PhysX wrappers. Slower and buggier, but at least you weren't locked out of the API entirely.

Nvidia G-Sync FAQ
Q: Does NVIDIA G-SYNC work for all games?

A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.
Boo!!!
Q: Does G-SYNC work with FCAT?

A: FCAT requires a video capture card to catch the output graphics stream going to a monitor. Since G-SYNC is DP only and G-SYNC manipulates DP in new ways, it is very unlikely that existing capture cards will work. Fortunately, FRAPS is now an accurate reflection of the performance of a G-SYNC enabled system, and FRAPS output can be directly read by FCAT for comparative processing.
:rolleyes:
Q: How much more does G-SYNC add to the cost of a monitor?

A: The NVIDIA G-SYNC Do-it-yourself kit will cost approximately $175.
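For anyone curious what "FRAPS output can be directly read by FCAT" amounts to in practice, here is a rough Python sketch of that kind of frame-pacing analysis. It is not FCAT itself; the CSV layout (frame number plus cumulative time in milliseconds) and the frametimes.csv file name are assumptions based on FRAPS-style benchmark logs.

```python
import csv
import statistics

def frame_intervals(path):
    """Read a FRAPS-style frametimes CSV (assumed columns: frame number,
    cumulative time in ms) and return the per-frame intervals in ms."""
    times = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                          # skip the header row
        for row in reader:
            if len(row) >= 2:
                times.append(float(row[1]))   # cumulative timestamp, ms
    return [b - a for a, b in zip(times, times[1:])]

def pacing_summary(intervals):
    """Basic frame-pacing statistics: average fps plus interval spread."""
    avg = statistics.mean(intervals)
    worst = sorted(intervals)[int(0.99 * (len(intervals) - 1))]  # ~99th percentile
    return {
        "avg_fps": round(1000.0 / avg, 1),
        "avg_frame_ms": round(avg, 2),
        "p99_frame_ms": round(worst, 2),
        "stdev_ms": round(statistics.stdev(intervals), 2),
    }

if __name__ == "__main__":
    intervals = frame_intervals("frametimes.csv")   # hypothetical log file
    print(pacing_summary(intervals))
```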
#59
Doc41
This looked like an interesting thing to try as I just bought the VG248QE a while ago, but it looks like it'll cost a kidney and might not be available for me :ohwell: will see when it's released,
and it looks like I might need a new system too, with a Kepler card :banghead:
#60
RejZoR
@Xzibit
Glide wrapper means emulation. Emulation means slow. How exactly does that solve anything? Glide wrappers existed because you could play games that would then look better. Besides, being able to emulate old Voodoo cards doesn't mean you can emulate modern stuff today. Stuff was rather simple back then... Vertex shaders had software emulation, but it was slower. Pixel shaders were not even possible to emulate without getting 1 frame per minute... and neither was available on any Voodoo card...

Mantle is a performance-only thing, and if you negate the boost with emulation, wouldn't it be easier to just stay with Direct3D 11 and remain at point zero?
#61
Frick
Fishfaced Nincompoop
You guys make my hatred for booth babes and Bethsoft seem rational and levelheaded (which it totally is, btw). :laugh:

Anyway wasn't qubit going on about something like this some time ago?
#62
remixedcat
Ok, what the hell does this have to do with Glide wrappers?

Say this like you are teaching it to people in a class and not like you are rabid fandogs or salespeople.
#63
GC_PaNzerFIN
haswrong: instead of braving the 60+ fps, they aim for 30fps@30Hz hahahahahaha :roll: :slap: :nutkick:

well, dont try that with a crt monitor or you end up in epileptic convulsions in no time :rockout:
This is coming to 144Hz monitors too, meaning they aim to sync everything from 30 to 144 fps on a 144Hz panel (the refresh rate varies with fps). You obviously didn't read a thing about this, and instead came in raging like any typical nvidia hater would.

No matter what this company releases, it is always going to get this same hatred. It doesn't matter what the greatest minds of game development think.

The G-SYNC board will support LightBoost too.
www.neogaf.com/forum/showpost.php?p=86572603&postcount=539

edit:
30Hz is the minimum, variable between 30Hz and 144Hz (the obvious max limit)

twitter.com/ID_AA_Carmack/status/391300283672051713
#64
john_
I'm all in with AMD and I hate Nvidia and their lock-ins, but this really looks interesting, and I believe AMD could follow in the future with something similar (who doesn't like to sell more hardware?).
#65
Am*
GC_PaNzerFIN: This is coming to 144Hz monitors too, meaning they aim to sync everything from 30 to 144 fps on a 144Hz panel (the refresh rate varies with fps). You obviously didn't read a thing about this, and instead came in raging like any typical nvidia hater would.

No matter what this company releases, it is always going to get this same hatred. It doesn't matter what the greatest minds of game development think.

twitter.com/ID_AA_Carmack/status/391300283672051713
No, it doesn't, especially since they're in Nvidia's pockets, being paid to spread testimonials about useless "tech" like this, if you can even call it that.
Solidstate89: Really? You think bi-lateral communication with the monitor's ability to control the frame rate delivery of the GPU so it's completely in sync with the monitor can just be easily implemented via a firmware update?

The official white paper hasn't even been released yet and you have the gall to make an inaccurate and ballsy statement such as that. What standard do you think it can already use? There isn't a display protocol that is capable of doing that. From what I've read on Anandtech, it's a modification of the DisplayPort standard - meaning non-standard - meaning it can't be implemented on ANY of today's monitors without that specific hardware bundle.
Give me one reason why it can't work in the way I stated. The only thing Nvidia would be required to do is modify Adaptive V-sync to keep in step with the monitor's refresh rate, and there would be nothing stopping AMD from developing a similar method. Fifty bucks says DisplayPort or one of the other display standard bodies comes up with a non-proprietary way of doing this in the next 5 years.
Xzibit: Nvidia G-Sync FAQ

Q: How much more does G-SYNC add to the cost of a monitor?

A: The NVIDIA G-SYNC Do-it-yourself kit will cost approximately $175.

Q: Does NVIDIA G-SYNC work for all games?

A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.


Wait, what... $175 for more useless bullshit, with a support list of 10-20 games that will never even exceed the refresh rate of a 120Hz monitor unless they run at lower-than-garbage console settings, and 95% of the time needing to turn off the "exclusive" shit you've paid out the ass for. :banghead: And people are supporting this shit... if Nvidia (not directed at you BTW) doesn't have the most braindead and loyal fanboys, I don't know what other company does (other than Apple).
kn00tcn: just cuz we didnt hit a perfect or open world doesnt mean we should destroy it, this is still better than deciding between matrox+rendition+3dfx+whoever else all at once if you want to play various games in the 90s :banghead:
I would take those days over the price fixing bullshit Nvidia are playing now any day -- at least competition was fierce back then, and you still had various ways to run proprietary tech yourself (with Glide wrappers etc).
#66
1d10t
The Von Matrices: I support technological advancement, whether it is NVidia, AMD, Intel, or some other company, and G-sync looks promising. This is the type of disruptive technology, like AMD's Eyefinity, that no one saw coming, and it will take a few years before everyone else catches up.
Do you remember the monitor-related technology nVidia introduced back around 2010? Yep... 3D Vision, which required very expensive 3D shutter glasses, an emitter and a monitor approved by nVidia. I got one pair of glasses, one emitter and a Samsung 223RZ 22-inch LCD for the price of $800, not to mention another $350 for a GTX 470. As far as I remember, 3D only "worked" on specific nVidia cards and specific monitors. Later I learned nVidia was just adopting 3D FPR from LG and bringing it to the desktop. For the same $1200 I switched to an HD 5850 + 42-inch LG 240Hz and got the same effect. Meanwhile, a 3D Vision kit plus an Asus VG236H will cost you $650 and only works with high-end GTX cards, or you can grab a $250 LG D2343P-BN and pair it with any graphics card out there. Where is that "3D gaming is the future" now?

Personally, I don't hate nVidia for their "innovation" or "breakthrough" technology. They push gaming to a new level, creating a better experience and better enjoyment. I just hate their prices and the so-called "proprietary" tech.
ItsKlausMF: Audio does matter, better audio = better experience.
Yeah, nv was first with SLI, but not the first to do it "right". FCAT was just an anti-AMD tool to bottleneck AMD sales.
A better DSP doesn't translate to better audio; it's only a small part of the chain. You also need a better amplifier to decode the DSP's digital signal and better speakers to reproduce the analog signal from the amplifier.
FCAT was surely anti-AMD, because AMD uses AFR rather than nVidia's SFR. Look at the bright side: AMD is now working on better drivers to address the frame pacing issues.
#67
NeoXF
The Von Matrices: I don't know enough about these to make a comment
Ha! Probably the smartest comment in this thread so far. You deserve a medal.
I know how hard I try to shut up about things I don't know much about yet.
RejZoR: Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.
I know what you're saying but:

1. Kepler might not support it, but Maxwell or Volta at the very least could. GPUs are way more complex and adaptable creatures than they used to be back then... and even then:

2. You could use wrappers to run Glide on non-3dfx hardware...

Ah, I still remember how I finally got Carmageddon 2 to work on a glide wrapper on nV hardware, a world of difference in graphics...
#68
15th Warlock
I really hope nVidia releases a version of this tech that works with existing monitors, something like an HDMI dongle between the monitor and the graphics card, and makes it hardware agnostic please (though I don't see that last part happening...)

Most PC gamers have invested a lot of moola in their monitors (myself included), and if nVidia really wants to see this tech take off, so to speak, they must make this available to as many customers as they can, not just people who invest in new monitors starting next year...
#69
Solidstate89
haswrong: if nv did this in the range of 140-85Hz, id call that a gamers benefit :toast:
but dropping frames to what? 5fps@5Hz and calling it a holy grail????? :confused::twitch:
gimme a f*cking break! :banghead:

im too old to jump this bait, so im kinda happy i cant afford to buy nvidia g-card :laugh:
It won't drop below 30 you dunce. You could have at least spent 5 minutes reading what it does. Clearly, none of you have.
Am*: Give me one reason why it can't work in the way I stated. The only thing Nvidia would be required to do is modify Adaptive V-sync to keep in step with the monitor's refresh rate, and there would be nothing stopping AMD from developing a similar method. Fifty bucks says DisplayPort or one of the other display standard bodies comes up with a non-proprietary way of doing this in the next 5 years.
Are you, like, intentionally playing dumb, or are you just not getting this? The refresh rate of the monitor will always be 60Hz; there is no way right now to modulate it dynamically, on the fly. Period. End of story. V-Sync attempts to lock your frames to 60Hz so it doesn't induce screen tearing, because the monitor operates at 60Hz. You can change it, but not dynamically. It doesn't work that way. But you can still get some screen tearing, and most importantly, you can get input lag because the GPU is rendering frames faster than the monitor can handle.

Adaptive V-Sync simply removes the restriction on frame rates when they drop below 60. So instead of hard locking your game into either 60FPS or 30FPS, if it drops below, it simply reacts as if V-sync isn't enabled, so it can run at 45FPS instead of being locked to intervals of 30. Again, this has nothing to do with the monitor and cannot control it. Why do you even think Adaptive V-sync can change the monitor's refresh rate? How do you expect Adaptive V-sync to change the monitor's refresh rate - on the fly - to 45Hz? It doesn't and it physically can't. There is no standard that allows that to happen. There is no protocol that allows that to happen.

That is what G-Sync is. Maybe one of the consortiums can come up with a hardware agnostic standard...at some point. We don't know when, or if any of the consortiums even care. So please, for the love of god, stop making baseless (and wholly inaccurate and ignorant) assumptions. There isn't a single freaking monitor that works the way you describe these days and no firmware update is going to change that. Ever.
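Since much of the disagreement above comes down to what each mode actually does per frame, here is a compact Python sketch of the three policies as described in this thread. It is a conceptual illustration only, not NVIDIA's driver or module logic; the function names, the 60 Hz fixed refresh and the 30-144 Hz variable range are assumptions taken from the posts above.

```python
# Rough sketch of the presentation policies being argued about in this thread.
# Illustrative only; numbers and behavior are simplified assumptions.

REFRESH_MS = 1000.0 / 60            # fixed-refresh monitor: one scanout every ~16.7 ms

def vsync_interval(frame_ms):
    """Classic double-buffered V-Sync: a frame that misses a refresh waits for
    the next one, so the effective rate snaps to 60, 30, 20 ... fps."""
    missed = max(1, -(-frame_ms // REFRESH_MS))   # ceiling division
    return missed * REFRESH_MS                    # ms between displayed frames

def adaptive_vsync_interval(frame_ms):
    """Adaptive V-Sync: V-Sync stays on while the GPU can keep up with 60 Hz
    (no tearing); when it can't, V-Sync is switched off, so the frame is shown
    as soon as it is ready (tearing possible, but no snap down to 30 fps).
    Either way, the monitor itself still refreshes at a fixed 60 Hz."""
    return REFRESH_MS if frame_ms <= REFRESH_MS else frame_ms

def gsync_interval(frame_ms, min_ms=1000.0 / 144, max_ms=1000.0 / 30):
    """G-SYNC-style variable refresh: the panel's refresh interval follows the
    frame time within its supported range, so there is neither tearing nor a
    snap to a multiple of 16.7 ms. (Past max_ms, i.e. below ~30 fps, the panel
    would re-show the previous frame; that case is simplified away here.)"""
    return min(max(frame_ms, min_ms), max_ms)

if __name__ == "__main__":
    for ft in (12.0, 18.0, 25.0, 40.0):           # example frame times in ms
        print(f"{ft:5.1f} ms frame -> vsync {vsync_interval(ft):5.1f}  "
              f"adaptive {adaptive_vsync_interval(ft):5.1f}  "
              f"gsync {gsync_interval(ft):5.1f}")
```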
#70
haswrong
GC_PaNzerFIN: edit: 30Hz is the minimum, variable between 30Hz and 144Hz (the obvious max limit)
twitter.com/ID_AA_Carmack/status/391300283672051713
imagine a group of players playing at 120fps and a group struggling at 30-50fps.. whos going to get more frags? the first group can react after 1/120th of a second, whereas the other only after 1/30th - 1/50th.. their reactions will be twice as slow. wheres the benefit for gamers? and let me repeat the crucial thing - nvidia wouldnt dare to present this on a crt monitor. and as you posted the carmack link, if you move your view so that the pixel needs to be refreshed faster than 1/30th of a second, it sends a duplicate frame which doesnt contain updated information in the scene, aka a player leaning out of the corner. this can lead to up to four identical frames sent to you, which makes it up to 4*1/30 ~ 133 millisecond later response on your side versus a 120fps guy. is that clearer now? this technology is nvidia's sorry excuse for being lazy to make graphics cards that are able to render 60+ fps at 1600p+ resolution. nothing more, nothing less. so now you see why im upset. and they are able to sell this crap even to you so smoothly. unbelievable.
Solidstate89: It won't drop below 30 you dunce. You could have at least spent 5 minutes reading what it does. Clearly, none of you have.
thanks. as soon as i realized what this technology is, and i realized it as soon as they started talking about lowering the refresh frequency of the monitor, i wasnt exactly in the mood to start searching for where the lowest acceptable limit is for those guys.. im inclined to think that they really have no limit if it boils down to getting money from you, lol!
#71
Solidstate89
Then maybe you shouldn't have made a hyperbolic, inaccurate statement about something you knew nothing about. You know, like most people would.

I've never set up a WC loop, so you don't see me going into sub-forums and threads about setting up WC loops and throwing around BS beliefs on something I clearly know nothing about.
#72
15th Warlock
haswrong: imagine a group of players playing at 120fps and a group struggling at 30-50fps.. whos going to get more frags? the first group can react after 1/120th of a second, whereas the other only after 1/30th - 1/50th.. their reactions will be twice as slow. wheres the benefit for gamers? and let me repeat the crucial thing - nvidia wouldnt dare to present this on a crt monitor. and as you posted the carmack link, if you move your view so that the pixel needs to be refreshed faster than 1/30th of a second, it sends a duplicate frame which doesnt contain updated information in the scene, aka a player leaning out of the corner. this can lead to up to four identical frames sent to you, which makes it up to 4*1/30 ~ 133 millisecond later response on your side versus a 120fps guy. is that clearer now? this technology is nvidia's sorry excuse for being lazy to make graphics cards that are able to render 60+ fps at 1600p+ resolution. nothing more, nothing less. so now you see why im upset. and they are able to sell this crap even to you so smoothly. unbelievable.
Yes! How can they offer more options to gamers?! What impertinence! Let's make sure that all people can only buy graphics cards and monitors that are capped at 30FPS so no one can have an unfair advantage, who needs more alternatives anyways?!!

Who do these people think they are by offering innovation in this field?? Let's boycott Nvidia and burn all their engineers at the stake! And then to have the audacity to sell this technology and make a profit! How dare they?!!

/S :rolleyes:
#73
Frick
Fishfaced Nincompoop
@haswrong: Isn't that how it is now though?
#74
haswrong
Frick: @haswrong: Isn't that how it is now though?
nvidia said that gamers will love it.. this sounds like notebook gamers will love it.. are you ready to ditch your $1300 monitor for an even more expensive one with a variable refresh rate, which doesnt make your response any faster?

ok, i have to give you one thing, you can now watch your ass being fragged without stutter :D
15th Warlock: ...the audacity to sell this technology and make a profit! How dare they?!!
/S :rolleyes:
exactly.. if i decide to please someone, i do it for free..

remember when your momma asked you to take out the litter back then? did you ask for like $270 for the service? the whole world will soon be eaten by the false religion that is the economy, you just wait.
#75
1d10t
I think somebody forgot that fps are not Hz, so 60 fps doesn't mean 60 Hz. The displayed image consists of many frames rendered in one second, while a monitor's refresh rate depends on three factors: horizontal frequency, resolution and response time.
Care to explain where G-Sync comes into the picture?