
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Joined
Sep 28, 2012
Messages
206 (0.11/day)
Likes
44
System Name Bluish Eight
Processor AMD Vishera FX 8350
Motherboard Asus Sabertooth 990FX R.20
Cooling XSPC Raystorm +Dual XSPC Rasa + EK XTC 420 + EK XTX 240 + Swiftech 655B
Memory Gskill F3-2400C10D-16GTX
Video Card(s) Asus R9 290 DCU II OC CrossfireX
Storage Samsung 840 240 Gb +Dual WDC 4TB WD40EZRX
Display(s) BenQ XL2730Z Freesync
Case Powercolor Quake TLC-005
Audio Device(s) Asus Xonar D2X to Logitech z5500
Power Supply Corsair HX1000i
Mouse Logitech G402 Hyperion Fury
Keyboard Logitech G710+
#76
I think somebody forgot that fps is not the same as Hz, so 60 fps doesn't mean 60 Hz. The displayed image consists of many frames rendered in one second, while a monitor's refresh rate depends on three factors: horizontal frequency, resolution and response time.
Care to explain where this G-Sync comes into play?
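To make that concrete: a panel's vertical refresh is roughly its horizontal scan frequency divided by the total number of lines per frame (visible plus blanking). A minimal sketch in Python, where all the numbers are illustrative assumptions rather than figures from any spec sheet:

Code:
# Rough relationship between horizontal frequency, resolution and refresh rate.
# All numbers are assumed example values, not taken from the thread.
h_freq_khz = 67.5        # horizontal scan frequency in kHz (assumed)
visible_lines = 1080     # vertical resolution
blanking_lines = 45      # vertical blanking lines (assumed, CVT-like timing)

total_lines = visible_lines + blanking_lines
refresh_hz = h_freq_khz * 1000 / total_lines
print(f"~{refresh_hz:.1f} Hz vertical refresh")   # ~60.0 Hz for these numbers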
 
Joined
Feb 23, 2008
Messages
511 (0.14/day)
Likes
95
Location
Montreal
System Name Sairikiki / Tesseract
Processor i7 920@3.56/ i5 4690k@4.2
Motherboard GB EX58-UDP4 / GB Z97MX-G5
Cooling H60 / LQ-310
Memory Corsair Something 12 / Corsair 16
Video Card(s) TriX 290 / Devil RX480
Storage Way too many...
Display(s) QNIX 1440p 96Hz / Sony w800b
Case AzzA 1000 / Carbide 240
Audio Device(s) Auzen Forte / board + Yamaha RX-V475 + Pioneer AJ
Power Supply Corsair HX750 / Dark Power PRO10
Software w10 64 / w10 64
Benchmark Scores I don't play benchmarks...
#77
I'm surprised Carmack sounded so positive about this thing. I respect him as much as the next guy, but I can't see it that way at the moment.

I am curious though: how will G-Sync do in, say, fighting games that require to-the-frame accuracy to pull off the best combos? It could be either a real boon or a curse for them.
 
Joined
Sep 19, 2012
Messages
615 (0.32/day)
Likes
75
System Name [WIP]
Processor Intel Pentium G3420 [i7-4790K SOON(tm)]
Motherboard MSI Z87-GD65 Gaming
Cooling [Corsair H100i]
Memory G.Skill TridentX 2x8GB-2400-CL10 DDR3
Video Card(s) [MSI AMD Radeon R9-290 Gaming]
Storage Seagate 2TB Desktop SSHD / [Samsung 256GB 840 PRO]
Display(s) [BenQ XL2420Z]
Case [Corsair Obsidian 750D]
Power Supply Corsair RM750
Software Windows 8.1 x64 Pro / Linux Mint 15 / SteamOS
#78
Yes! How can they offer more options to gamers?! What impertinence! Let's make sure that all people can only buy graphics cards and monitors that are capped at 30 FPS so no one can have an unfair advantage. Who needs more alternatives anyway?!

Who do these people think they are, offering innovation in this field?? Let's boycott Nvidia and burn all their engineers at the stake! And then to have the audacity to sell this technology and make a profit! How dare they?!

/S :rolleyes:
Yes, how dare a corporation care about anything other than making money!

/S :rolleyes:


Just because the world is how it is now doesn't mean it's any good. But what the hell, you guys can salute your new green, blue, red or whatever overlords any way you want; it's not me who's dead inside, shielding myself further and further away from the things that should count more with consumerism gimmicks.

This makes me sick to my stomach, but what can I do...
 
Joined
Aug 16, 2004
Messages
2,998 (0.62/day)
Likes
1,578
Location
Visalia, CA
System Name Crimson Titan 2.5
Processor Intel Core i7 5930K @ 4.5GHz 1.28V
Motherboard Asus ROG Rampage V Extreme
Cooling CPU: Swiftech H220-X, RAM: Geil Cyclone 2, VGA: Custom EK water loop
Memory 8x4GBs G.Skill Ripjaws DDR4 XMP2 3000MHz @ 1.35V
Video Card(s) 2x EVGA GTX Titan X SCs in SLI under full cover EK water blocks
Storage OS: 256GBs Samsung 850 Pro SSD/Games: 3TBs WD Black
Display(s) Acer XB280HK 28" 4K G-Sync - 2x27" Acer HN274s 1920x1080 120Hz
Case Corsair Graphite Black 760T - EK Pump/Reservoir, 360mm EK Radiator
Audio Device(s) SB X-Fi Fatal1ty Pro on Logitech THX Z-5500 5.1/Razer Tiamat 7.1 Headset
Power Supply Silvestone ST1500 1.5 kW
Mouse Cyborg R.A.T. 9
Keyboard Corsair K70 RGB Cherry MX Red
Software Windows 10 Pro 64bit
#79
exactly.. if i decide to please someone, i do it for free..

remember when your momma asked you to take out the litter back then? did you ask for like $270 for the service? the whole world will soon be eaten by the false religion that is the economy, you just wait.
Yes, how dare a corporation care about anything other than making money!

/S :rolleyes:


Just because the world is how it is now doesn't mean it's any good. But what the hell, you guys can salute your new green, blue, red or whatever overlords any way you want; it's not me who's dead inside, shielding myself further and further away from the things that should count more with consumerism gimmicks.

This makes me sick to my stomach, but what can I do...
Aww, how nice of you both, but let me ask you a few questions: Do you have a job? If you do, do you work for free? I mean, do you offer your services without expecting to be remunerated for them? And if that's the case, how do you support yourself and/or your family? Through charity/welfare?

I mean, you have to find a way to pay your bills somehow, am I right?

I would assume that people who work for these companies (and please note that this applies to any given company in our "evil economy") expect some sort of compensation for their work, wouldn't they?

Anyway, I'm not going to discuss the basics of how our society works, as this isn't the right forum to do so, but I just found your counterargument really amusing. Besides, no one is pointing a gun at your face forcing you to buy these new monitors, so there's no reason to get all worked up about this superfluous piece of technology when there are obviously way more important things to worry about and fix, like the state of the economy, world hunger, world peace and other serious matters... :rolleyes:

Back to topic, it sounds like nVidia is genuinely interested in fixing this V-Sync problem that has existed for so long. I for one am excited to see someone focusing some research on addressing this issue; I just hope they offer the results of their findings to more customers than just owners of new monitors and/or Kepler-based cards.
 
Last edited:
Joined
Dec 16, 2010
Messages
1,484 (0.58/day)
Likes
544
System Name My Surround PC
Processor Intel Core i7 4770K @ 4.2 GHz (1.15 V)
Motherboard ASRock Z87 Extreme6
Cooling Swiftech MCP35X / XSPC Rasa CPU / Swiftech MCW82 / Koolance HX-1320 w/ 8 Scythe Fans
Memory 16GB (2 x 8 GB) Mushkin Blackline DDR3-2400 CL11-13-13-31
Video Card(s) MSI Nvidia GeForce GTX 980 Ti Armor 2X
Storage Samsung SSD 850 Pro 256GB, 2 x 4TB HGST NAS HDD in RAID 1
Display(s) 3 x Acer K272HUL 27" in Surround 7860x1440
Case NZXT Source 530
Audio Device(s) Integrated ALC1150 + Logitech Z-5500 5.1
Power Supply Seasonic X-1250 1.25kW
Mouse Gigabyte Aivia Krypton
Keyboard Logitech G15
Software Windows 8.1 Pro x64
#80
nvidia said that gamers will love it.. this sounds like notebook gamers will love it.. are you ready to ditch your $1300 monitor for an even more expensive one with a variable refresh rate, which doesn't make your response any faster?

ok, i have to give you one thing, you can now watch your ass being fragged without stutter :D
By that logic there should be no reason why we should buy new graphics cards to run at more detailed settings because the increased detail doesn't make you respond any faster. Have fun running at "low" settings on your integrated graphics. There's a lot more to video gaming than competition.

Back to topic, it sounds like nVidia is genuinely interested in fixing this V-Sync problem that has existed for so long. I for one am excited to see someone focusing some research on addressing this issue; I just hope they offer the results of their findings to more customers than just owners of new monitors and/or Kepler-based cards.
Thank you. It seems as if for some people NVidia producing any technology first makes it inherently evil. I don't care who invented it, I support the technology. Kudos to NVidia for doing it first.

For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.
Just because no one else supports it yet doesn't mean that no one else ever will. Someone has to be first. Most standards aren't designed by committee but by everyone adopting one manufacturer's implementation for simplicity.
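To make the quoted description a bit more concrete, here is a toy sketch of the idea of "manipulating the timing of the v-blank signal": the source stays in v-blank and only triggers a refresh once a frame has finished rendering, within the panel's supported range. The function names and rate limits are my own assumptions for illustration; this is not NVIDIA's actual driver code.

Code:
import time

# Toy model of variable refresh from the source side: hold v-blank until a frame
# is ready, then signal the display to refresh. Names and limits are assumptions.
MIN_REFRESH_HZ = 30      # assumed panel floor
MAX_REFRESH_HZ = 144     # assumed panel ceiling

def present_loop(render_frame, signal_refresh):
    min_interval = 1.0 / MAX_REFRESH_HZ
    last = time.monotonic()
    while True:
        frame = render_frame()                   # takes however long it takes
        elapsed = time.monotonic() - last
        if elapsed < min_interval:
            time.sleep(min_interval - elapsed)   # don't exceed the panel's max rate
        signal_refresh(frame)                    # end v-blank: display draws now
        last = time.monotonic()
        # A real implementation would also force a repeat refresh if the GPU
        # stalls longer than 1/MIN_REFRESH_HZ, so the panel never goes dark.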
 
Last edited:
Joined
Aug 16, 2004
Messages
2,998 (0.62/day)
Likes
1,578
Location
Visalia, CA
System Name Crimson Titan 2.5
Processor Intel Core i7 5930K @ 4.5GHz 1.28V
Motherboard Asus ROG Rampage V Extreme
Cooling CPU: Swiftech H220-X, RAM: Geil Cyclone 2, VGA: Custom EK water loop
Memory 8x4GBs G.Skill Ripjaws DDR4 XMP2 3000MHz @ 1.35V
Video Card(s) 2x EVGA GTX Titan X SCs in SLI under full cover EK water blocks
Storage OS: 256GBs Samsung 850 Pro SSD/Games: 3TBs WD Black
Display(s) Acer XB280HK 28" 4K G-Sync - 2x27" Acer HN274s 1920x1080 120Hz
Case Corsair Graphite Black 760T - EK Pump/Reservoir, 360mm EK Radiator
Audio Device(s) SB X-Fi Fatal1ty Pro on Logitech THX Z-5500 5.1/Razer Tiamat 7.1 Headset
Power Supply Silvestone ST1500 1.5 kW
Mouse Cyborg R.A.T. 9
Keyboard Corsair K70 RGB Cherry MX Red
Software Windows 10 Pro 64bit
#81
By that logic there should be no reason why we should buy new graphics cards to run at more detailed settings because the increased detail doesn't make you respond any faster. Have fun running at "low" settings on your integrated graphics. There's a lot more to video gaming than competition.



Thank you. It seems as if for some people NVidia producing any technology first makes it inherently evil. I don't care who invented it, I support the technology. Kudos to NVidia for doing it first.

For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:


Just because no one else supports it yet doesn't mean that no one else ever will. Someone has to be first. Most standards aren't designed by committee but by everyone adopting one manufacturer's implementation for simplicity.
Exactly, seems like they have taken addressing this problem to heart, with innovations like Adaptive V-Sync, FCAT and now G-Sync.
 
Joined
Dec 22, 2011
Messages
2,074 (0.95/day)
Likes
1,155
System Name Zimmer Frame Rates
Processor Intel i7 920 @ Stock speeds baby
Motherboard EVGA X58 3X SLI
Cooling True 120
Memory Corsair Vengeance 12GB
Video Card(s) Palit GTX 980 Ti Super JetStream
Storage Of course
Display(s) Crossover 27Q 27" 2560x1440
Case Antec 1200
Audio Device(s) Don't be silly
Power Supply XFX 650W Core
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 10
Benchmark Scores Epic
#83
Sounds like a GTX xx0 card combined with a G-SYNC-enabled monitor will offer a pretty damn sweet BF4 experience.

Oh nVidia, you big meanies, no wonder peeps here are mad.
 
Joined
Dec 16, 2010
Messages
1,484 (0.58/day)
Likes
544
System Name My Surround PC
Processor Intel Core i7 4770K @ 4.2 GHz (1.15 V)
Motherboard ASRock Z87 Extreme6
Cooling Swiftech MCP35X / XSPC Rasa CPU / Swiftech MCW82 / Koolance HX-1320 w/ 8 Scythe Fans
Memory 16GB (2 x 8 GB) Mushkin Blackline DDR3-2400 CL11-13-13-31
Video Card(s) MSI Nvidia GeForce GTX 980 Ti Armor 2X
Storage Samsung SSD 850 Pro 256GB, 2 x 4TB HGST NAS HDD in RAID 1
Display(s) 3 x Acer K272HUL 27" in Surround 7860x1440
Case NZXT Source 530
Audio Device(s) Integrated ALC1150 + Logitech Z-5500 5.1
Power Supply Seasonic X-1250 1.25kW
Mouse Gigabyte Aivia Krypton
Keyboard Logitech G15
Software Windows 8.1 Pro x64
#84
I think people should read Nvidia's own FAQ on G-Sync.

You can draw a conclusion there.
Your comment does not disprove mine. As I quoted, the signaling technology should be possible to reverse engineer. At that point anyone can produce monitors or video outputs that comply with that standard. G-Sync will be NVidia exclusive for a few years just because no one has had time to dissect it. That doesn't mean there will never be generic components compatible with it. The only difference is that third parties won't use the trademarked term "G-Sync."

You also need to consider the source when you read the quote from the FAQ. All manufacturers advertise that their products only work with first-party accessories. It doesn't mean that third parties can't make compatible accessories.
 
Last edited:
Joined
Apr 30, 2012
Messages
2,417 (1.18/day)
Likes
1,332
#85
You also need to consider the source when you read the quote from the FAQ. All manufacturers advertise that their products only work with first-party accessories. It doesn't mean that third parties can't make compatible accessories.
Which is the accessory, the GPU or the G-Sync?

Since G-Sync will be talking to the driver, when was the last time Nvidia let outsiders tinker with that?
 
Joined
Dec 16, 2010
Messages
1,484 (0.58/day)
Likes
544
System Name My Surround PC
Processor Intel Core i7 4770K @ 4.2 GHz (1.15 V)
Motherboard ASRock Z87 Extreme6
Cooling Swiftech MCP35X / XSPC Rasa CPU / Swiftech MCW82 / Koolance HX-1320 w/ 8 Scythe Fans
Memory 16GB (2 x 8 GB) Mushkin Blackline DDR3-2400 CL11-13-13-31
Video Card(s) MSI Nvidia GeForce GTX 980 Ti Armor 2X
Storage Samsung SSD 850 Pro 256GB, 2 x 4TB HGST NAS HDD in RAID 1
Display(s) 3 x Acer K272HUL 27" in Surround 7860x1440
Case NZXT Source 530
Audio Device(s) Integrated ALC1150 + Logitech Z-5500 5.1
Power Supply Seasonic X-1250 1.25kW
Mouse Gigabyte Aivia Krypton
Keyboard Logitech G15
Software Windows 8.1 Pro x64
#86
Which is the accessory, the GPU or the G-Sync?

Since G-Sync will be talking to the driver, when was the last time Nvidia let outsiders tinker with that?
Once you know what commands are being sent over the cable, you can implement them in your own drivers or hardware. For example, if you create a monitor that can read all the signals sent via the G-Sync protocol and respond to them just like a genuine G-Sync monitor, why would this matter to the drivers? A properly reverse-engineered product should be no different from the genuine device. I doubt NVidia wants manufacturers to do this, but I see no reason, engineering or legal, that third-party manufacturers cannot, and the driver shouldn't be able to tell otherwise.

The only hurdle would be the investment required to reverse engineer the protocol, and if genuine G-sync doesn't catch on, then there will be no financial incentive and no third party will bother to do it.
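As a purely hypothetical illustration of that point, a compatible third-party scaler would only need to react to the same messages a genuine module does. The message names and timing below are invented for the sketch; the real G-Sync protocol is not public.

Code:
import time

# Hypothetical firmware loop for a third-party scaler mimicking a reverse-
# engineered "hold v-blank until told to refresh" scheme. "FRAME_READY" and the
# 30 Hz self-refresh floor are assumptions, not documented protocol details.
MAX_HOLD_S = 1.0 / 30    # assume the panel must refresh at least at 30 Hz

def scaler_loop(receive_message, scan_out_frame):
    last_refresh = time.monotonic()
    while True:
        msg = receive_message(timeout=0.001)   # poll the link (stand-in interface)
        now = time.monotonic()
        if msg == "FRAME_READY":
            scan_out_frame()                   # end v-blank, draw the new frame
            last_refresh = now
        elif now - last_refresh > MAX_HOLD_S:
            scan_out_frame()                   # repeat last frame to keep panel alive
            last_refresh = now
        # otherwise: stay in v-blank and keep waiting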
 
Joined
Feb 18, 2005
Messages
1,257 (0.27/day)
Likes
603
Location
South Africa
System Name Firelance
Processor i5-3570K @ 4.6GHz / 1.19V
Motherboard Gigabyte Z77X-UD5H @ F16h mod BIOS
Cooling Corsair H105 + 4x Gentle Typhoon 1850
Memory 2x 8GB Crucial Ballistix Sport DDR3-1600 CL9 @ CL7
Video Card(s) MSI GTX 1070 Armor OC @ 2000 core / 2300 mem
Storage 2x 256GB Samsung 840 Pro (RAID-0) + Hitachi Deskstar 7K3000 (3TB)
Display(s) Dell U2713HM (25x14) + Acer P243W (19x12)
Case Fractal Design ARC XL
Audio Device(s) Logitech G930
Power Supply Seasonic M12-II Bronze Evo Edition 750W
Mouse Logitech G400
Keyboard Logitech G19
Software Windows 7 Professional x64 Service Pack 1
#87
Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.

When was the last time AMD released ANYTHING game-changing? No, Mantle doesn't count, because the world doesn't need another API; we have DirectX and it works just great. No, TrueAudio doesn't count, because no-one gives a shit.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
14,546 (3.98/day)
Likes
8,052
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K at stock (hits 5 gees+ easily)
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (4 x 4GB Corsair Vengeance DDR3 PC3-12800 C9 1600MHz)
Video Card(s) Zotac GTX 1080 AMP! Extreme Edition
Storage Samsung 850 Pro 256GB | WD Green 4TB
Display(s) BenQ XL2720Z | Asus VG278HE (both 27", 144Hz, 3D Vision 2, 1080p)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair HX 850W v1
Software Windows 10 Pro 64-bit
#88
So, all nvidia have done is reverse the sync direction, making the monitor sync with the card's varying frame rate output instead. A simple enough change technically, but it looks like the visual impact is big judging by the PR and articles I've read.

A couple of things that might be worse, though, are motion blur and the shape distortion* of moving objects, both of which are currently fixed by nvidia's LightBoost strobing backlight feature, which my monitor has. The PR doesn't mention LightBoost anywhere, so I expect both of these motion artifacts to be present. The motion blur in particular is horrible and I'd rather have a bit of lag and occasional stutter than put up with this. I'd have to see G-Sync in action to properly judge it, though.

Also, it would be interesting to see this varying video signal on an oscilloscope.

*To check out the shape distortion, just open a window on the desktop, make it stretch from top to bottom but be rather thin, then move it from side to side with the mouse. The shape will change, with the top leading the bottom - moving the mouse faster makes the effect stronger. This is due to the scanning nature of the video signal, where the bottom part of the window (the whole picture, in fact) is quite literally drawn later than the top part. Note that the slower the monitor refresh, the worse the effect. Note also that it's separate from the tearing artifact that you're also likely to see.
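A rough back-of-the-envelope for that lean, under the simplifying assumption that the panel scans out uniformly over one refresh period (the window speed is just an example value I picked):

Code:
# How far the bottom of a moving window trails the top ("shape distortion"),
# assuming a uniform top-to-bottom scan over one refresh period. Example numbers.
def skew_px(speed_px_per_s, refresh_hz):
    scanout_time_s = 1.0 / refresh_hz   # time between drawing top and bottom lines
    return speed_px_per_s * scanout_time_s

for hz in (60, 120, 144):
    print(hz, "Hz:", round(skew_px(2000, hz), 1), "px of lean at 2000 px/s")
# 60 Hz: ~33 px, 120 Hz: ~17 px, 144 Hz: ~14 px -- slower refresh, worse effect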

LightBoost strobing blanks the display and only shows the completed picture, eliminating this effect. Of course, this comes at the expense of maxed out lag. At least the lag is very short at 120Hz. Sometimes you just can't win, lol.
 
Joined
Oct 30, 2008
Messages
1,526 (0.46/day)
Likes
375
System Name Lailalo / Edelweiss
Processor FX 8320 @ 4.5Ghz / i7 3610QM @2.3-3.2Ghz
Motherboard ASrock 990FX Extreme 4 / Lenovo Y580
Cooling Cooler Master Hyper 212 Plus / Big hunk of copper
Memory 16GB Samsung 30nm DDR3 1600+ / 8GB Hyundai DDR3 1600
Video Card(s) XFX R9 390 / GTX 660M 2GB
Storage Seagate 3TB/1TB + OCZ Synapse 64GB SSD Cache / Western Digital 1TB 7200RPM
Display(s) LG Ultrawide 29in @ 2560x1080 / Lenovo 15.6 @ 1920x1080
Case Coolermaster Storm Sniper / Lenovo Y580
Audio Device(s) Asus Xonar DG / Whatever Lenovo used
Power Supply Antec Truepower Blue 750W + Thermaltake 5.25in 250W / Big Power Brick
Software Windows 10 Pro / Windows 10 Home
#89
This honestly isn't worth it, nVidia. VSYNC is not such a terrible thing that it needs a special dedicated chip which... you aren't opening up to the entire industry, which will increase production costs, and which will likely only make the situation worse later when someone comes out with an alternative that does it without all the negatives.

You should have just made the tech and licensed it for everyone to use, then enjoyed the royalties for years. I seriously doubt it requires a Kepler GPU to use it. We already know PhysX will work on non-NV cards. This isn't something special either. You're just setting yourself up for the fall later when someone, maybe even AMD, does it and does it better, and for everyone to use.
 
Joined
Apr 30, 2008
Messages
4,315 (1.23/day)
Likes
1,015
Location
Multidimensional
System Name Derp!
Processor i7 7700 @ 4.2Ghz Turbo On
Motherboard Gigabyte B250 Phoenix Wifi ITX Motherboard
Cooling Noctua NH-L9i LP Cooler || Cooler Master Fan Pro RGB 120mm x 2
Memory 16GB Corsair Vengeance LPX DDR4 2400mhz RAM
Video Card(s) AMD Reference RX 480 8GB
Storage 250GB SS 960 Evo M.2 || WD Blue 500GB SSD || 2TB SG FC SSHD
Display(s) Hisense 1080p Smart LED HDTV 40inch
Case Fractal Node 202 Mini ITX Case
Audio Device(s) Realtek HD Audio / HDMI Audio Via GPU
Power Supply Corsair SFX 600W PSU
Mouse CoolerMaster Masterkeys Lite L RGB Mouse
Keyboard CoolerMaster Masterkeys Lite L RGB Mem-Chanical Keyboard
Software Windows 10 Home 64bit
Benchmark Scores Later
#91


:banghead::banghead:

This site has gone to shit with all the fanboys & trolls, Jesus Christ lolz
 
Last edited:
Joined
Sep 19, 2012
Messages
615 (0.32/day)
Likes
75
System Name [WIP]
Processor Intel Pentium G3420 [i7-4790K SOON(tm)]
Motherboard MSI Z87-GD65 Gaming
Cooling [Corsair H100i]
Memory G.Skill TridentX 2x8GB-2400-CL10 DDR3
Video Card(s) [MSI AMD Radeon R9-290 Gaming]
Storage Seagate 2TB Desktop SSHD / [Samsung 256GB 840 PRO]
Display(s) [BenQ XL2420Z]
Case [Corsair Obsidian 750D]
Power Supply Corsair RM750
Software Windows 8.1 x64 Pro / Linux Mint 15 / SteamOS
#92
Yeah, 'kay. You've obviously got everything figured out. And the only real problem in your life seems to be that you need a better-paying job. Righahahahahat... :roll:


Again, open or bust.

It might have taken a lot of years, but Microsoft is finally on the way out as the de facto PC gaming platform... and why? Yeah, I'll let someone else figure this one out.


But I digress; we shall see. I'm a long way from getting a gaming monitor anytime soon either way.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
14,877 (3.45/day)
Likes
5,411
System Name A dancer in your disco of fire
Processor i3 4130 3.4Ghz
Motherboard MSI B85M-E45
Cooling Cooler Master Hyper 212 Evo
Memory 4 x 4GB Crucial Ballistix Sport 1400Mhz
Video Card(s) Asus GTX 760 DCU2OC 2GB
Storage Crucial BX100 120GB | WD Blue 1TB x 2
Display(s) BenQ GL2450HT
Case AeroCool DS Cube White
Power Supply Cooler Master G550M
Mouse Intellimouse Explorer 3.0
Keyboard Dell SK-3205
Software Windows 10 Pro
#93
It might have taken a lot of years, but Microsoft is finally on the way out as the de facto PC gaming platform... and why? Yeah, I'll let someone else figure this one out.
Just want to point out that no. Not in the least. Alternatives are incoming and to an extent already here, but right now? No way, no how.

EDIT: And I just can't fathom the depths to which this place has plunged. All this rage.. For something they have not seen irl. And if this does what it says it does, you do have to see it irl before you can pass judgement.
 
Joined
Nov 1, 2011
Messages
282 (0.13/day)
Likes
62
System Name 3D Vision & Sound Blaster
Processor Intel Core i5 2500K @ 4.4GHz (stock voltage, 60C on hottest core)
Motherboard Gigabyte P67A-D3-B3
Cooling Thermalright Silver Arrow SB-E Special Edition (with 3x 140mm Black Thermalright fans)
Memory Crucial 16GB (2x8GB 1600MHz CL8)
Video Card(s) Nvidia GTX TITAN X 12288MB Maxwell @1350MHz (from 4890>5870>460>660>680>460)
Storage Samsung 850 EVO 1TB SSD (Steam) + 840 250GB SSD (Origin) + 840 EVO 1TB (work install)
Display(s) Samsung 34" S34E790C (3440x1440) + BenQ XL2420T 120Hz @ 1080p + 24" PHILIPS Touchscreen IPS
Case Fractal Design Define R4 Windowed with 6x 140mm Corsair AFs
Audio Device(s) Creative SoundBlaster Recon3D Fatal1ty + Z506 5.1 speakers/Logitech UE9000
Power Supply Corsair RM750i 750W 80PLUS Gold
Mouse Logitech G700
Keyboard Logitech PS/2 keyboard
Software Windows 7 Pro 64bit (not sidegrading to Windoze 10 until I have to...)
Benchmark Scores 2fast4u,bro...
#94
Are you, like, intentionally playing dumb, or are you just not getting this? The refresh rate of the monitor will always be 60Hz; there is no way right now to modulate it dynamically, on the fly. Period. End of story. V-Sync attempts to lock your frames to 60Hz so it doesn't induce screen tearing, because the monitor operates at 60Hz. You can change it, but not dynamically. Doesn't work that way. But you can still get some screen tearing, and most importantly, you can get input lag because the GPU is rendering frames faster than the monitor can handle.

Adaptive V-Sync simply removes the restriction on frame rates when it drops below 60. So instead of hard locking your game to either 60FPS or 30FPS, if it drops below, it simply reacts as if V-Sync isn't enabled, so it can run at 45FPS instead of being locked to intervals of 30. Again, this has nothing to do with the monitor and cannot control it. Why do you even think Adaptive V-Sync can change the monitor's refresh rate? How do you expect Adaptive V-Sync to change the monitor's refresh rate - on the fly - to 45Hz? It doesn't and it physically can't. There is no standard that allows that to happen. There is no protocol that allows that to happen.

That is what G-Sync is. Maybe one of the consortiums can come up with a hardware-agnostic standard... at some point. We don't know when, or if any of the consortiums even care. So please, for the love of god, stop making baseless (and wholly inaccurate and ignorant) assumptions. There isn't a single freaking monitor that works the way you describe these days, and no firmware update is going to change that. Ever.
Well, quite clearly you're the one playing dumb, seeing how you have yet to explain to me why someone running a 120Hz or even a 75Hz monitor would benefit from DROPPING the refresh rate to the frame rate of the game. Can you point out to me a single person suffering from their monitor's higher refresh rate in games that never even exceed it? It has never been a problem, and Nvidia are yet again trying to fix a problem that never existed in the first place; quite clearly you are being ignorant by ignoring simple facts, and you have no experience with what causes tearing.

If a monitor is running at a refresh rate above the framerate of the GPU, then unless the monitor does some image post-processing, scaling or frame duplication (like those 240Hz TVs), it will only draw the frames it has. End of. That renders this G-SYNC gimmick worthless, because it is trying to solve a problem that was never there in the first place. Whether your monitor runs at 30Hz or 120Hz it will only draw the frames that it has -- if the framerate is less than the refresh rate, it won't affect the monitor either way.

The ONLY problem that currently exists that has anything remotely to do with monitors is that when an old game runs too fast, you have to choose between A) running at a higher framerate and experiencing tearing, or B) capping the framerate with vsync. The biggest problem with vsync is that it drops GPU utilization to the point where the GPU can barely distinguish between idle and 3D load, which causes stuttering, and this is a problem that occurs only with Nvidia cards, even on single-GPU setups (because they have too many clock profiles to switch between). AMD doesn't have this problem because they have idle, 2D (Blu-ray) and 3D load clock profiles, nothing more. This gimmick does nothing whatsoever to fix that; every Nvidia GPU up to this point, from the 200 to the 700 series, has had this problem, and Maxwell will continue to have it until they address every affected game individually in the drivers 1-2 years after initial release. My GTX 285 had this problem, my 460 had this problem until they fixed most of the cases two years back, and my 660 had this problem until I returned it. When they can work out a way to scale their GPU cores/clusters to imitate old cards, they will solve the problem instantly. This G-SYNC crap does not affect this problem in either a positive or a negative way, therefore it is worthless (even more so to 120Hz/144Hz fast gaming monitor users). By all means feel free to explain any benefits of this tech that I am not seeing.

Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.
Please give it a rest, bud. G-Sync and Shield (the joke of an Android tablet slapped onto a 360 controller, with about 30 games on its support list, which almost nobody outside of North America even knows about or gives a single shit about) are not innovations in the slightest, and they are exactly why Nvidia is slowly losing its consumer GPU market share to AMD, as well as the reason why hardcore PC gamers buying into this crap will continue to get ridiculed by our casual PC and console gaming brethren. Instead of investing in features that matter, they keep churning out more pricey gimmicks. If that is what you're into, more power to you; continue buying Nvidia. I, for one, see these "innovations" as gimmicks that add no value whatsoever to their GPUs or to anything else employing this sort of tech, which will come at a premium because of it in comparison to AMD.

Nvidia (and AMD) deserve praise for a lot of things -- G-Sync and Shield are not among them.
 
Last edited:
Joined
Sep 28, 2012
Messages
206 (0.11/day)
Likes
44
System Name Bluish Eight
Processor AMD Vishera FX 8350
Motherboard Asus Sabertooth 990FX R.20
Cooling XSPC Raystorm +Dual XSPC Rasa + EK XTC 420 + EK XTX 240 + Swiftech 655B
Memory Gskill F3-2400C10D-16GTX
Video Card(s) Asus R9 290 DCU II OC CrossfireX
Storage Samsung 840 240 Gb +Dual WDC 4TB WD40EZRX
Display(s) BenQ XL2730Z Freesync
Case Powercolor Quake TLC-005
Audio Device(s) Asus Xonar D2X to Logitech z5500
Power Supply Corsair HX1000i
Mouse Logitech G402 Hyperion Fury
Keyboard Logitech G710+
#95
Thank you. It seems as if for some people NVidia producing any technology first makes it inherently evil. I don't care who invented it, I support the technology. Kudos to NVidia for doing it first.
For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:
Just because no one else supports it yet doesn't mean that no one else ever will. Someone has to be first. Most standards aren't designed by committee but by everyone adopting one manufacturer's implementation for simplicity.
If, and only if, G-Sync tries to manipulate the TMDS and DDC clocks over DisplayPort, there's a high probability this will leave DCP (DisplayPort Content Protection) exposed. Now that will be unpleasant for some. Unless G-Sync communicates over the two return channels (TMDS and DDC), this will cripple frame-sequence rendering across the two channels. And let me guess: this will not work with SLI and/or stereoscopic 3D.

Innovations like SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.
Ah yes... that's great. Let me ask a simple question: do you own an nVidia Shield, or have you at least tried one? Do you use Android phones? Have you ever tried running games on the much cheaper Google Nexus 4?
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
14,546 (3.98/day)
Likes
8,052
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K at stock (hits 5 gees+ easily)
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (4 x 4GB Corsair Vengeance DDR3 PC3-12800 C9 1600MHz)
Video Card(s) Zotac GTX 1080 AMP! Extreme Edition
Storage Samsung 850 Pro 256GB | WD Green 4TB
Display(s) BenQ XL2720Z | Asus VG278HE (both 27", 144Hz, 3D Vision 2, 1080p)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair HX 850W v1
Software Windows 10 Pro 64-bit
#96
Thinking about it a little more, while this reduces lag, you can't eliminate it like nvidia claims and here's why.

At the moment, whether vsync is on or off, once the GPU has rendered the frame, it doesn't get displayed until the next monitor refresh (this is independent of the refresh frequency, of course) so you get lag. The amount of lag will vary too, depending when in that cycle the GPU has rendered the frame. If vsync is off, then you'll get tearing and stutters, regardless of whether the GPU is rendering faster or slower than the refresh rate. Remember there's still lag in the system due to the rendering time of the GPU and general propagation delay through the controller and also the rest of the computer.

Now you turn G-Sync on, what happens? The monitor waits for the GPU, not refreshing until it gets the command from it to do so. This results in the frame being displayed the instant it's ready. Lag may be reduced, but not eliminated. Tearing and stuttering will be eliminated, however, because the monitor will be displaying every frame and, crucially, only ever doing so once.

Lag isn't eliminated, because the GPU still requires time to build the frame. Imagine an extreme case where the GPU is slowed down to around 15-20fps (can easily happen). You'll still have lag corresponding to this variable frame rate and therefore the game will feel horribly laggy. Responsiveness to your controls will be improved however since the monitor displays the frame as soon as it's ready, but more importantly perhaps there will be no tearing or stutters, which is a significant benefit, because both of these effects look bloody awful. NVIDIA obviously gets this.
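To put rough numbers on that, here's a small worked comparison using a deliberately simplified model of my own: with a fixed 60Hz refresh a finished frame waits, on average, about half a refresh period extra, while with a synced refresh it's shown as soon as it's done, but the render time itself still dominates at low frame rates.

Code:
# Rough latency comparison at different GPU frame rates (simplified model,
# assumed numbers). "fixed" = render time + average wait for the next 60 Hz tick
# (about half a period); "synced" = render time only (shown as soon as finished).
REFRESH_MS = 1000 / 60

for fps in (20, 30, 60, 120):
    render_ms = 1000 / fps
    fixed = render_ms + REFRESH_MS / 2
    synced = render_ms
    print(f"{fps:>3} fps: fixed ~{fixed:.0f} ms, synced ~{synced:.0f} ms")
# At 20 fps both are dominated by the ~50 ms render time, so it still feels laggy.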

The only thing I wonder is if the player will notice dynamic artifacts with the game responsiveness, since the frame rate and synced refresh rate vary continuously and significantly? It might lead to some weird effect where the player feels disoriented perhaps? Maybe even inducing nausea in some people? I don't know, but this is something to look out for and I will when I get my hands on some demo hardware. NVIDIA are obviously not gonna tell you about this in a press release, lol.

So, in short, while I can see this technology improving the game play experience, there's still no substitute for putting out frames as fast as possible. I'm currently gaming at 120Hz with very few dropped frames which makes a world of difference over 60Hz. (Adding LightBoost strobing to the mix with its blur elimination then takes this experience to another level altogether). That doesn't change with G-Sync. It would be really interesting to get my hands on demo hardware and see G-Sync for myself.

Well, quite clearly you're the one playing dumb, seeing how you have yet to explain to me why someone running a 120Hz or even a 75Hz monitor would benefit from DROPPING the refresh rate to the frame rate of the game. Can you point out to me a single person suffering from their monitor's higher refresh rate in games that never even exceed it? It has never been a problem, and Nvidia are yet again trying to fix a problem that never existed in the first place; quite clearly you are being ignorant by ignoring simple facts, and you have no experience with what causes tearing.

If a monitor is running at a refresh rate above the framerate of the GPU, then unless the monitor does some image post-processing, scaling or frame duplication (like those 240Hz TVs), it will only draw the frames it has. End of. That renders this G-SYNC gimmick worthless, because it is trying to solve a problem that was never there in the first place. Whether your monitor runs at 30Hz or 120Hz it will only draw the frames that it has -- if the framerate is less than the refresh rate, it won't affect the monitor either way.
I don't think you properly understand what G-Sync does on a technical level or what causes tearing and stutters in the first place. Remember, the only way* to remove stutters is to have the GPU and monitor synced; it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor with the GPU, i.e. synced the opposite way round to how it's done now.

Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.

*Not strictly true, sort of. If I run an old game which is rendering at several hundred fps and leave vsync off, then it looks perfectly smooth, with very little judder or tearing - and noticeably less lag. It seems that brute forcing the problem with a large excess number of frames per second rendered by the GPU can really help, even though those frames are variable.
 
Joined
Sep 24, 2010
Messages
56 (0.02/day)
Likes
5
Processor i7 950
Motherboard Asus P6T Deluxe V2
Cooling Zalman CNPS10X Extreme
Memory Kingston HyperX 3x 2GiB
Video Card(s) MSI GTX 570
Storage OCZ Vertex 3 120GB + 8TB
Power Supply Chieftec 750W
#97
Clearly neither do you, then. Adaptive V-Sync is there to:
a) remove image tearing
b) avoid pretty much halving the framerate when FPS drops below a certain level, like normal V-Sync does
Nvidia page:
NVIDIA's Adaptive VSync fixes both problems by unlocking the frame rate when below the VSync cap, which reduces stuttering, and by locking the frame rate when performance improves once more, thereby minimizing tearing.
Below 60 (120) fps you get no stuttering (basically vsync turns off), but see tearing. Above 60 (120) fps vsync is on as usual, resulting in no tearing and no visible stutter.
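In rough pseudocode, the behaviour that page describes boils down to something like this; it's my own sketch of the described logic, not actual driver code:

Code:
# Sketch of the Adaptive VSync decision as described in the quote: sync when the
# GPU can keep up, otherwise present frames immediately rather than dropping to
# the next divisor (60 -> 30 -> 20 ...). Illustration only, not Nvidia code.
def present_mode(gpu_fps, refresh_hz=60):
    if gpu_fps >= refresh_hz:
        return f"vsync on: capped at {refresh_hz} fps, no tearing"
    return f"vsync off: {gpu_fps} fps shown as-is, some tearing, no hard 30 fps lock"

for fps in (150, 60, 45, 25):
    print(fps, "->", present_mode(fps))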

Really, why would Nvidia develop something that's already solved, and why would a company like Asus join them? Nobody is that stupid. Also, this tech is praised by various tech journalists and devs like John Carmack who have seen it in action.

The only sh1tty thing about this is vendor lock-in. We need this for AMD and Intel too, on every TV and monitor. Let's hope Nvidia won't be stupid and will open it up.
 

tigger

I'm the only one
Joined
Mar 20, 2006
Messages
10,476 (2.45/day)
Likes
1,673
System Name Black to the Core MKIV
Processor Intel I7 6700k
Motherboard Asus Z170 Pro Gaming socket 1151
Cooling NZXT Kraken X61 280mm radiator
Memory 2x8gb Corsair vengeance LPX 2400 DDR4
Video Card(s) XFX Radeon R9 290 4gb ddr5
Storage Patriot Blast 120gb ssd Boot and WD10EADX-22TDHB0 1TB Data
Display(s) Dell 2408WFP 24" 1920x1200
Case Nzxt IS 340
Audio Device(s) Asus xonar dsx pci-e
Power Supply Corsair CX750
Mouse Logitech G502
Software Win 10 Pro x64
#98
Thinking about it a little more, while this reduces lag, you can't eliminate it like nvidia claims and here's why.

At the moment, whether vsync is on or off, once the GPU has rendered the frame, it doesn't get displayed until the next monitor refresh (this is independent of the refresh frequency, of course) so you get lag. The amount of lag will vary too, depending when in that cycle the GPU has rendered the frame. If vsync is off, then you'll get tearing and stutters, regardless of whether the GPU is rendering faster or slower than the refresh rate. Remember there's still lag in the system due to the rendering time of the GPU and general propagation delay through the controller and also the rest of the computer.

Now you turn G-Sync on, what happens? The monitor waits for the GPU, not refreshing until it gets the command from it to do so. This results in the frame being displayed the instant it's ready. Lag may be reduced, but not eliminated. Tearing and stuttering will be eliminated, however, because the monitor will be displaying every frame and, crucially, only ever doing so once.

Lag isn't eliminated, because the GPU still requires time to build the frame. Imagine an extreme case where the GPU is slowed down to around 15-20fps (can easily happen). You'll still have lag corresponding to this variable frame rate and therefore the game will feel horribly laggy. Responsiveness to your controls will be improved however since the monitor displays the frame as soon as it's ready, but more importantly perhaps there will be no tearing or stutters, which is a significant benefit, because both of these effects look bloody awful. NVIDIA obviously gets this.

The only thing I wonder is if the player will notice dynamic artifacts with the game responsiveness, since the frame rate and synced refresh rate vary continuously and significantly? It might lead to some weird effect where the player feels disoriented perhaps? Maybe even inducing nausea in some people? I don't know, but this is something to look out for and I will when I get my hands on some demo hardware. NVIDIA are obviously not gonna tell you about this in a press release, lol.

So, in short, while I can see this technology improving the game play experience, there's still no substitute for putting out frames as fast as possible. I'm currently gaming at 120Hz with very few dropped frames which makes a world of difference over 60Hz. (Adding LightBoost strobing to the mix with its blur elimination then takes this experience to another level altogether). That doesn't change with G-Sync. It would be really interesting to get my hands on demo hardware and see G-Sync for myself.



I don't think you properly understand what G-Sync does on a technical level or what causes tearing and stutters in the first place. Remember, the only way* to remove stutters is to have the GPU and monitor synced; it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor with the GPU, i.e. synced the opposite way round to how it's done now.

Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.

*Not strictly true, sort of. If I run an old game which is rendering at several hundred fps and leave vsync off, then it looks perfectly smooth, with very little judder or tearing - and noticeably less lag. It seems that brute forcing the problem with a large excess number of frames per second rendered by the GPU can really help, even though those frames are variable.
A couple of interesting posts, mate, much more so than the rest of the fanboy crap and uninformed arguing from others.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
14,546 (3.98/day)
Likes
8,052
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K at stock (hits 5 gees+ easily)
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (4 x 4GB Corsair Vengeance DDR3 PC3-12800 C9 1600MHz)
Video Card(s) Zotac GTX 1080 AMP! Extreme Edition
Storage Samsung 850 Pro 256GB | WD Green 4TB
Display(s) BenQ XL2720Z | Asus VG278HE (both 27", 144Hz, 3D Vision 2, 1080p)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair HX 850W v1
Software Windows 10 Pro 64-bit
#99
EDIT: And I just can't fathom the depths to which this place has plunged. All this rage.. For something they have not seen irl. And if this does what it says it does, you do have to see it irl before you can pass judgement.
+1 there. Why do things like this induce such foaming at the mouth? It's fucking ridiculous. If someone doesn't want it, then just don't buy it. No one's forcing them.

Couple of interesting posts mate, much more so than the rest of the fan boy crap and uninformed arguing from others.
+1 again.

Another technical thing I've just thought of about G-Sync.

Regardless of how moving pictures are being displayed, they are still sampled, just like audio. This means that the Nyquist limit or Nyquist frequency applies.

Hence, for fast-moving objects, e.g. during frenetic FPS gaming, you want that limit to be as high as possible, since an object moving fast enough will not just be rendered with only a few frames, but will display sampling artefacts similar to the "reverse spokes" effect in cowboy movies of old. In gaming, you may not even see the object, or it may appear in completely the wrong place and, of course, be heavily lagged. If the GPU drops to its minimum of 30fps, then you can bet you'll see this effect, and in a twitchy FPS that can easily mean the difference between fragging or being fragged.
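A concrete way to see the sampling argument, with toy numbers of my own: sample any periodic motion at the frame rate, and anything above half that rate (the Nyquist limit) aliases, just like those wagon wheels.

Code:
# Toy aliasing demo: a wheel spinning at rev_hz, sampled at fps frames per second.
# Above the Nyquist limit (fps/2) the sampled motion aliases and can even appear
# to spin backwards -- the "reverse spokes" effect. Numbers are illustrative.
def apparent_rotation_hz(rev_hz, fps):
    # fold the true rate into the [-fps/2, +fps/2) band the samples can represent
    return (rev_hz + fps / 2) % fps - fps / 2

for fps in (30, 60, 120):
    print(f"{fps} fps: 25 rev/s looks like {apparent_rotation_hz(25, fps):+.1f} rev/s")
# 30 fps: -5.0 (backwards!), 60 fps: +25.0, 120 fps: +25.0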

So again, while G-Sync looks like a great innovation to me, there remains no substitute for a high framerate as well.
 
Joined
Jul 10, 2009
Messages
467 (0.15/day)
Likes
89
Location
TR
Processor C2duo e6750@ 2800mhz
Motherboard GA-P43T-ES3G
Cooling Xigmatek S1283
Memory 2x2Gb Kingstone HyperX DDR3 1600 (KHX1600C9D3K2/4GX )
Video Card(s) HIS HD 6870
Storage samsung HD103SJ
Case Xigmatek Utgard
Audio Device(s) X-FI Titanium Pci-e
Power Supply Xigmatek NRP-PC702 700W
Software win7 ultimate -64bit
Which innovations? A company acting like a headless chicken because it's losing its main market (GPUs)? Nvidia is having a hard time between AMD APUs, rising tablets and smartphones, dropping PC sales, etc., and is trying to find new markets, nothing more, nothing less.

Btw, who are you to decide about Mantle and TrueAudio on behalf of all people on earth? I don't remember making you spokesperson :slap: . What you don't get is that I want Microsoft-free gaming = no DirectX, FYI.

Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.