
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Joined
Jun 11, 2013
Messages
353 (0.21/day)
Likes
63
Processor Core i5-3350P @3.5GHz
Motherboard MSI Z77MA-G45 (uATX)
Cooling Stock Intel
Memory 2x4GB Crucial Ballistix Tactical DDR3 1600
Video Card(s) |Ξ \/ G /\ GeForce GTX 670 FTW+ 4GB w/Backplate, Part Number: 04G-P4-3673-KR, ASIC 68.5%
Storage some cheap seagates and one wd green
Display(s) Dell UltraSharp U2412M
Case some cheap old eurocase, black, with integrated cheap lit lcd for basic monitoring
Audio Device(s) Realtek ALC892
Power Supply Enermax Triathlor 550W ETA550AWT bronze, non-modular, airflow audible over 300W power draw
Mouse PMSG1G
Keyboard oldschool membrane Keytronic 104 Key PS/2 (big enter, right part of right shift broken into "\" key)
#51
Fair enough.
Not useless, but a walled garden... it's basically just a pile of bones.
If nv did this in the 85-140Hz range, I'd call that a gamer's benefit :toast:
But dropping frames to what? 5fps@5Hz and calling it the holy grail????? :confused::twitch:
Give me a f*cking break! :banghead:

I'm too old to take this bait, so I'm kind of happy I can't afford an nvidia card :laugh:
 
Joined
Sep 28, 2012
Messages
206 (0.11/day)
Likes
44
System Name Bluish Eight
Processor AMD Vishera FX 8350
Motherboard Asus Sabertooth 990FX R.20
Cooling XSPC Raystorm +Dual XSPC Rasa + EK XTC 420 + EK XTX 240 + Swiftech 655B
Memory Gskill F3-2400C10D-16GTX
Video Card(s) Asus R9 290 DCU II OC CrossfireX
Storage Samsung 840 240 Gb +Dual WDC 4TB WD40EZRX
Display(s) BenQ XL2730Z Freesync
Case Powercolor Quake TLC-005
Audio Device(s) Asus Xonar D2X to Logitech z5500
Power Supply Corsair HX1000i
Mouse Logitech G402 Hyperion Fury
Keyboard Logitech G710+
#53
Interesting... so G-Sync could be NVIDIA's mobile version of a lowest-end SKU graphics card that doesn't sell, or it might be a defective Tegra 4 from the NVIDIA Shield /sarcasm
 
Joined
Dec 16, 2010
Messages
1,484 (0.57/day)
Likes
544
System Name My Surround PC
Processor Intel Core i7 4770K @ 4.2 GHz (1.15 V)
Motherboard ASRock Z87 Extreme6
Cooling Swiftech MCP35X / XSPC Rasa CPU / Swiftech MCW82 / Koolance HX-1320 w/ 8 Scythe Fans
Memory 16GB (2 x 8 GB) Mushkin Blackline DDR3-2400 CL11-13-13-31
Video Card(s) MSI Nvidia GeForce GTX 980 Ti Armor 2X
Storage Samsung SSD 850 Pro 256GB, 2 x 4TB HGST NAS HDD in RAID 1
Display(s) 3 x Acer K272HUL 27" in Surround 7860x1440
Case NZXT Source 530
Audio Device(s) Integrated ALC1150 + Logitech Z-5500 5.1
Power Supply Seasonic X-1250 1.25kW
Mouse Gigabyte Aivia Krypton
Keyboard Logitech G15
Software Windows 8.1 Pro x64
#54
As a red team guy, the hatred in this thread is ridiculous (on a related note, AMD setting up a 290X demo across the street from NV's event made me uneasy).

Name non-proprietary NV tech? Someone was blind on a videocardz thread as well. PhysX on the CPU is multiplatform and in many games and consoles! The new FCAT method runs anywhere; FXAA (although I want to say that's one guy); how about helping devs add DX10 or DX11 support? Not all NV-sponsored titles have locks on them, and AMD has done the same in helping Codemasters games be good-looking and efficient.

Sure, GPU PhysX and CUDA are very annoying, I'm not doubting that, and it's not "good guy NVIDIA", but many things start out proprietary to show that they work.

We should be pressuring VESA/DisplayPort/HDMI for an adaptive refresh technique like this :respect:

I don't get why NV doesn't license things out so that everyone can enjoy them, instead of locking them down to their platform only (look at Blu-ray: it's a standard, just a royalty is paid, it's not locked to Sony products).

Just because we didn't end up with a perfect or open world doesn't mean we should destroy it; this is still better than deciding between Matrox + Rendition + 3dfx + whoever else all at once if you wanted to play various games in the '90s :banghead:
Thank you for this post. It seems like for every neutral post on any AMD, NVIDIA or Intel press release there are two from people who read only the first sentence of the article and immediately claim the technology is evil, based on some argument that would be disproved if they had read the entire article. If only more people approached these topics from a neutral point of view and took the time to understand what they were criticizing, this forum would be full of intelligent discussion. I guess this is the internet and you can't expect more... (I always wished there was a "No Thanks" button on TPU to flag irrelevant posts, but I can imagine how it would be abused.)

I support technological advancement, whether it comes from NVIDIA, AMD, Intel, or some other company, and G-Sync looks promising. This is the type of disruptive technology, like AMD's Eyefinity, that no one saw coming, and it will take a few years before everyone else catches up.
 
Joined
Oct 19, 2013
Messages
1 (0.00/day)
Likes
1
#55
Seriously, AMD is the one suffering from a colossal crapload of dropped frames, tearing, frame interleaving and micro-stuttering in their CF systems; that's why they are trying to mitigate the issue with CF over PCIe.

TrueAudio doesn't mean squat, and the R9 290X is not really the answer to Titan at all; you will see that when the review comes out. The only real advantage for AMD now is Mantle, but it remains to be seen whether they will really get it to shine or not.

On the other hand, NVIDIA has always been the brute force of advancement in the PC space; AMD just plays catch-up, and today's announcements just cement that idea.

NVIDIA was the first to introduce the GPU, SLI, frame pacing (FCAT), GPU Boost, Adaptive V-Sync; the first to reach a unified shader architecture on PCs; first with GPGPU, CUDA, PhysX, OptiX (for real-time ray tracing); first with 3D gaming (3D Vision); first with Optimus for mobile flexibility; first with the GeForce Experience program, SLI profiles, ShadowPlay (for recording) and game streaming.

They had better support with CSAA, FXAA, TXAA, driver-side Ambient Occlusion and HBAO+, the TWIMTBP program, better Linux and OpenGL support... and now G-Sync! All of these innovations are sustained and built upon to this day.

And even when AMD beat them to a certain invention, like Eyefinity, NVIDIA didn't stop until they topped it with more features: they answered with Surround, then did 3D Surround and now 4K Surround.

NVIDIA was at the forefront in developing all of these technologies and continues to sustain and expand them to this day; AMD just follows suit. They fight back with stuff they don't really sustain, so it ends up forgotten and abandoned, generating more trouble than it's worth. Just look at the pathetic state of their own Eyefinity and all of its CF frame problems.

In short, AMD is the one feeling the heat, NOT NVIDIA. Heck, NVIDIA now fights two generations of AMD cards with only one generation of their own: Fermi held off the HD 5870 and 6970, Kepler held off the 7970 and the 290X! And who knows about Maxwell!
Audio does matter; better audio = better experience.

Yeah, NV was first with SLI, but not the first to do it "right". FCAT was just an anti-AMD tool to bottleneck AMD sales. PhysX wasn't even NVIDIA's technology; they just bought it to do better marketing than ATI. Better Linux support? Read this and cry. OpenGL and OpenCL are pretty much AMD territory at the moment.

Then what do you mean by NVIDIA fighting two generations of AMD with only one generation of their own? GF10* (gen 1), GF11* (gen 2). Maxwell is far, FAR away (late 2014), and by then AMD will be hitting their next generation as well.
 
Joined
Dec 16, 2010
Messages
1,484 (0.57/day)
Likes
544
System Name My Surround PC
Processor Intel Core i7 4770K @ 4.2 GHz (1.15 V)
Motherboard ASRock Z87 Extreme6
Cooling Swiftech MCP35X / XSPC Rasa CPU / Swiftech MCW82 / Koolance HX-1320 w/ 8 Scythe Fans
Memory 16GB (2 x 8 GB) Mushkin Blackline DDR3-2400 CL11-13-13-31
Video Card(s) MSI Nvidia GeForce GTX 980 Ti Armor 2X
Storage Samsung SSD 850 Pro 256GB, 2 x 4TB HGST NAS HDD in RAID 1
Display(s) 3 x Acer K272HUL 27" in Surround 7860x1440
Case NZXT Source 530
Audio Device(s) Integrated ALC1150 + Logitech Z-5500 5.1
Power Supply Seasonic X-1250 1.25kW
Mouse Gigabyte Aivia Krypton
Keyboard Logitech G15
Software Windows 8.1 Pro x64
#56
Audio does matter; better audio = better experience.
No argument there. How much TrueAudio actually improves audio is still up for debate until AMD releases drivers for it.

Yeah, NV was first with SLI, but not the first to do it "right".
What in the world does this even mean?

FCAT was just an anti-AMD tool to bottleneck AMD sales.
So you're arguing that we shouldn't have any way to measure frame pacing just because AMD couldn't do it properly?

PhysX wasn't even NVIDIA's technology; they just bought it to do better marketing than ATI.
PhysX was doomed to fail even before NVIDIA took it over. There was no way dedicated physics accelerator cards were going to take off, given how dedicated sound cards had died out in the years prior.

Better Linux support? Read this and cry. OpenGL and OpenCL are pretty much AMD territory at the moment.
I don't know enough about these to make a comment.

Then what do you mean by NVIDIA fighting two generations of AMD with only one generation of their own? GF10* (gen 1), GF11* (gen 2). Maxwell is far, FAR away (late 2014), and by then AMD will be hitting their next generation as well.
Maxwell is speculated to launch on 28nm in early 2014 and then move to 20nm in late 2014, according to reports. And what does it matter how many generations each manufacturer puts out? All that matters is performance per price, which is why I don't care one bit about the rebrands both sides do, as long as the rebrands push pricing down.
 
Joined
Apr 30, 2012
Messages
2,441 (1.17/day)
Likes
1,352
#57
Interesting... so G-Sync could be NVIDIA's mobile version of a lowest-end SKU graphics card that doesn't sell, or it might be a defective Tegra 4 from the NVIDIA Shield /sarcasm
I actually wasn't kidding when I said:

I was wondering what Nvidia was going to do with all the unsold Tegra 4 chips

Besides, Nvidia users don't have stuttering or tearing... right guys???

It reminded me of the Tegra 3 short board modules. If they were full board, they'd be almost twins.

[image: NVIDIA GeForce GT 630M]

It's at the lower end for sure, size-wise.
 
Joined
Oct 2, 2004
Messages
12,616 (2.59/day)
Likes
5,998
Location
Europe\Slovenia
System Name Dark Silence 2
Processor Intel Core i7 5820K @ 4.5 GHz (1.15V)
Motherboard MSI X99A Gaming 7
Cooling Cooler Master Nepton 120XL
Memory 32 GB DDR4 Kingston HyperX Fury 2400 MHz @ 2666 MHz
Video Card(s) AORUS GeForce GTX 1080Ti 11GB (2000/11100)
Storage Samsung 850 Pro 2TB SSD (3D V-NAND)
Display(s) ASUS VG248QE 144Hz 1ms (DisplayPort)
Case Corsair Carbide 330R Titanium
Audio Device(s) Creative Sound BlasterX AE-5 + Altec Lansing MX5021 (HiFi capacitors and OPAMP upgrade)
Power Supply BeQuiet! Dark Power Pro 11 750W
Mouse Logitech G502 Proteus Spectrum
Keyboard Cherry Stream XT Black
Software Windows 10 Pro 64-bit (Fall Creators Update)
#58
You clearly have no idea what the point of Adaptive V-Sync is. It's to keep your frame rate from dropping to 30 whenever it falls below 60 (which is what regular V-Sync does: intervals of 30). Basically it was a way to get the best of both worlds.
Clearly neither do you, then. Adaptive V-Sync is there to:
a) remove image tearing
b) avoid roughly halving the framerate when FPS drops below a certain level, like normal V-Sync does

So, why do we need G-Sync to remove image tearing again?

Please link me to where it says it won't, 'cause I'm reaaaaaly curious just where this BS spring flows from. And even if it wouldn't, AMD is in a position to slipstream a DX11 HLSL port into pretty much every multi-platform game coming out in the next half-decade or so, so having a 75% chance that, after a certain point, multi-platform next-generation games can and will be written in Mantle pretty much cements them into API relevancy, with or without a leading share of the market.

Name one nVidia-branded tech that ISN'T proprietary.
Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.

High-level APIs work that way (but are slower as a result, e.g. Direct3D); low-level ones don't (and are faster as a result, e.g. Mantle or Glide).
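To make that layering concrete, here's a toy sketch in C -- every name and packet format below is invented for illustration, it's nobody's actual API. The low-level path writes commands in one GPU's native format with no overhead; the high-level path pays for validation and translation on every call so it can target any vendor:

    #include <stdint.h>
    #include <stdio.h>

    static uint32_t cmdbuf[256];
    static int cmd_len = 0;

    /* "Low-level" path: the caller already knows this one GPU's packet
       format, so the API is a thin pass-through -- fast, but meaningless
       on any other vendor's hardware. */
    static void lowlevel_draw(uint32_t gpu_specific_packet)
    {
        cmdbuf[cmd_len++] = gpu_specific_packet;
    }

    /* "High-level" path: a generic request that is validated and then
       translated into whichever packet format the installed GPU wants --
       portable, but it pays that cost on every single call. */
    static void highlevel_draw(int vertex_count, int vendor)
    {
        if (vertex_count <= 0 || cmd_len >= 256)
            return;                                  /* validation cost  */
        uint32_t packet = (vendor == 0 ? 0xA0000000u /* translation cost */
                                       : 0xB0000000u) | (uint32_t)vertex_count;
        cmdbuf[cmd_len++] = packet;
    }

    int main(void)
    {
        lowlevel_draw(0xA0000003u); /* only valid on "vendor 0" silicon   */
        highlevel_draw(3, 1);       /* runs anywhere, costs more per call */
        printf("%d commands queued\n", cmd_len);
        return 0;
    }

Swap the hardware under the low-level path and every packet it wrote is garbage -- that's the portability trade-off being described.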
 
Joined
Apr 30, 2012
Messages
2,441 (1.17/day)
Likes
1,352
#59
Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.

High-level APIs work that way (but are slower as a result, e.g. Direct3D); low-level ones don't (and are faster as a result, e.g. Mantle or Glide).
Anyone remember Glide wrappers?

Back then it was possible, but nowadays it would be a lawsuit frenzy. That's why there are no CUDA/PhysX wrappers. Slower and buggy, but at least you weren't locked out of the API entirely.
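For anyone who never used one: a wrapper just re-exports the proprietary API's entry points and re-implements them on whatever the installed hardware actually speaks. A minimal sketch in C, with caveats: grDrawTriangle is a real Glide entry point, but the GrVertex struct here is cut down and the backend is stubbed with a printf for illustration:

    #include <stdio.h>

    typedef struct { float x, y; } GrVertex; /* the real struct is much bigger */

    static void backend_triangle(const GrVertex *a, const GrVertex *b,
                                 const GrVertex *c)
    {
        /* a real wrapper would issue OpenGL/Direct3D calls here */
        printf("triangle (%.0f,%.0f) (%.0f,%.0f) (%.0f,%.0f)\n",
               a->x, a->y, b->x, b->y, c->x, c->y);
    }

    /* Exported under the Glide name, so a game loading "glide2x.dll"
       calls into us instead of 3dfx's driver. */
    void grDrawTriangle(const GrVertex *a, const GrVertex *b, const GrVertex *c)
    {
        backend_triangle(a, b, c);
    }

    int main(void)
    {
        GrVertex a = {0, 0}, b = {100, 0}, c = {50, 80};
        grDrawTriangle(&a, &b, &c); /* the game thinks it's talking to a Voodoo */
        return 0;
    }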

Nvidia G-Sync FAQ

Q: Does NVIDIA G-SYNC work for all games?

A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.
Boo!!!


Q: Does G-SYNC work with FCAT?

A: FCAT requires a video capture card to catch the output graphics stream going to a monitor. Since G-SYNC is DP only and G-SYNC manipulates DP in new ways, it is very unlikely that existing capture cards will work. Fortunately, FRAPS is now an accurate reflection of the performance of a G-SYNC enabled system, and FRAPS output can be directly read by FCAT for comparative processing.
:rolleyes:

Q: How much more does G-SYNC add to the cost of a monitor?

A: The NVIDIA G-SYNC Do-it-yourself kit will cost approximately $175.
 
Joined
Sep 25, 2010
Messages
527 (0.20/day)
Likes
482
Location
Bahrain
System Name Shin-Lazarus \ Exp Gamer \ Dino
Processor Intel Core i7 4790K @4.6GHz \ C2Q 9550\ AMD k6-2 450 @500MHz!!!
Motherboard Asrock Z97 Extreme6 \Gigabyte GA-EP35-DS3P \ Gigabyte GA-5AX
Cooling De-lid+Tt Water 3.0 Extreme S \ Custom H50\ Socket7 cooler?
Memory 16GB Corsair Vengeance Pro 2400MHz \ 2x2GB OCZ Reaper PC28500 \1x128MB 2x256MB
Video Card(s) ASUS GTX780 DC II OC \Ref' GTX 280 \ S3 Savage3D + 3dfx voodoo2
Storage OS=Micron C400 256GB + 4x HDD's \2.5" 500GB Toshiba \20GB quantum fireball
Display(s) ASUS VG248QE \ Dell E196FP \-
Case Corsair Obsidian 750D\ Tt V3 BE \ generic ancient case
Audio Device(s) sennheiser GSP300[]SB X-fi fatality-Logitech Z623 \ SB Audigy \ Creative CT4810
Power Supply FSP AURUM PT 850W \Tt Toughpower 600W \kobian 235W
Mouse Logitech G600 \ Microsoft PS2 something something
Keyboard Logitech G810 Orion Spectrum \ Microsoft PS2 something something
Software WIn 10 Pro \Win XP SP3 \ too lazy to fix
Benchmark Scores OVER 9000!!!
#60
This looked like an interesting thing to try, as I bought a VG248QE just a while ago, but it looks like it'll cost a kidney and might not be available for me :ohwell: We'll see when it's released,
and it looks like I might need a new system too, with a Kepler card :banghead:
 
Joined
Oct 2, 2004
Messages
12,616 (2.59/day)
Likes
5,998
Location
Europe\Slovenia
System Name Dark Silence 2
Processor Intel Core i7 5820K @ 4.5 GHz (1.15V)
Motherboard MSI X99A Gaming 7
Cooling Cooler Master Nepton 120XL
Memory 32 GB DDR4 Kingston HyperX Fury 2400 MHz @ 2666 MHz
Video Card(s) AORUS GeForce GTX 1080Ti 11GB (2000/11100)
Storage Samsung 850 Pro 2TB SSD (3D V-NAND)
Display(s) ASUS VG248QE 144Hz 1ms (DisplayPort)
Case Corsair Carbide 330R Titanium
Audio Device(s) Creative Sound BlasterX AE-5 + Altec Lansing MX5021 (HiFi capacitors and OPAMP upgrade)
Power Supply BeQuiet! Dark Power Pro 11 750W
Mouse Logitech G502 Proteus Spectrum
Keyboard Cherry Stream XT Black
Software Windows 10 Pro 64-bit (Fall Creators Update)
#61
@Xzibit
A Glide wrapper means emulation. Emulation means slow. How exactly does that solve anything? Glide wrappers existed because with them you could play games that would then look better. Besides, being able to emulate a Voodoo card's features doesn't mean you can emulate modern stuff today. Stuff was rather simple back then... Vertex shaders had software emulation, but it was slower. Pixel shaders were not even possible to emulate without getting 1 frame per minute... and neither was available on any Voodoo card...

Mantle is a performance-only thing, and if you negate the boost with emulation, wouldn't it be easier to just stay with Direct3D 11 and remain at square one?
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
14,909 (3.43/day)
Likes
5,425
System Name A dancer in your disco of fire
Processor i3 4130 3.4Ghz
Motherboard MSI B85M-E45
Cooling Cooler Master Hyper 212 Evo
Memory 4 x 4GB Crucial Ballistix Sport 1400Mhz
Video Card(s) Asus GTX 760 DCU2OC 2GB
Storage Crucial BX100 120GB | WD Blue 1TB x 2
Display(s) BenQ GL2450HT
Case AeroCool DS Cube White
Power Supply Cooler Master G550M
Mouse Intellimouse Explorer 3.0
Keyboard Dell SK-3205
Software Windows 10 Pro
#62
You guys make my hatred for booth babes and Bethsoft seem rational and levelheaded (which it totally is, btw). :laugh:

Anyway, wasn't qubit going on about something like this some time ago?
 
Joined
May 13, 2010
Messages
4,443 (1.58/day)
Likes
1,623
System Name RemixedBeast
Processor Intel i5 3570K @ 3.4Ghz
Motherboard ASRock Z77 Pro3
Cooling Coolermaster Hyper 212 Evo
Memory 16GB Corsair XMS3
Video Card(s) EVGA Nvidia GTX 650 Ti SSC 1GB
Storage 1.5TB Seagate/128GB Samsung 840
Display(s) Samsung SyncMaster P2350 23in @ 1920x1080 + LG Flatron 19in Widescreen 1440x900
Case Antec Three Hundred Two
Audio Device(s) Beyerdynamic DT770 Pro 80 // Fiio E7 Amp/DAC
Power Supply 620w Antec High Current Gamer HCG-620M
Mouse Logitech G700s/G502
Keyboard Logitech K740
Software Windows Server 2012 x64 Standard
Benchmark Scores Network: APs: Cisco Meraki MR32, Ubiquiti Unifi AP-AC-LR and Lite, Ligowave NFT-3AC
#63
OK, what the hell does this have to do with Glide wrappers?

Explain it like you're teaching a class, not like you're rabid fandogs or salespeople.
 
Joined
Oct 9, 2009
Messages
685 (0.23/day)
Likes
662
Location
Finland
System Name :P~
Processor Intel Core i7-5930K (ES)
Motherboard Asus Rampage V Extreme/3.1
Cooling Phanteks PH-TC14PE
Memory 32GB Corsair Vengeance LPX 2400 MHz
Video Card(s) Asus GTX 1080 Strix
Storage 400GB Intel 750 PCI-E SSD, 512GB Crucial MX100 SSD, 3TB WD RED HDD
Display(s) QNIX QX2710LED OC @ 96 Hz 27"
Case Corsair Obsidian 750D
Audio Device(s) Audioquest Dragon Red + Sennheiser HD 650
Power Supply Corsair HX1000i + Cablemod sleeved cables kit
Mouse Logitech G500s
Keyboard Logitech Ultra X Flat Premium
Software Windows 10 64-bit
#64
Instead of braving 60+ fps, they aim for 30fps@30Hz hahahahahaha :roll: :slap: :nutkick:

Well, don't try that with a CRT monitor or you'll end up in epileptic convulsions in no time :rockout:
This is coming to 144Hz monitors too, meaning they aim to sync everything between 30 and 144fps@144Hz (the refresh rate varies with the fps). You obviously didn't read a thing about this; instead you came in raging like any typical nvidia hater would.

No matter what this company releases, it is always going to get this same hatred. It doesn't matter what the greatest minds of game development think.

The G-SYNC board will support LightBoost too.
http://www.neogaf.com/forum/showpost.php?p=86572603&postcount=539

edit:
30Hz is the minimum; it's variable between 30Hz and 144Hz (the obvious max limit)

https://twitter.com/ID_AA_Carmack/status/391300283672051713
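If those numbers are right, the behavior is simple to model. A toy sketch in C (the clamp logic is just my reading of the 30-144Hz window above, not anything from an official spec): inside the window the panel refreshes whenever a frame is ready; past either edge it clamps, repeating the last frame below 30 fps:

    #include <stdio.h>

    /* Toy model of a 30-144Hz variable-refresh window. */
    static double refresh_interval(double frame_time_s)
    {
        const double min_interval = 1.0 / 144.0; /* fastest the panel can scan   */
        const double max_interval = 1.0 / 30.0;  /* below 30 fps: repeat a frame */
        if (frame_time_s < min_interval) return min_interval;
        if (frame_time_s > max_interval) return max_interval;
        return frame_time_s; /* inside the window the panel just follows the GPU */
    }

    int main(void)
    {
        const double fps[] = { 200.0, 144.0, 90.0, 47.0, 30.0, 20.0 };
        for (int i = 0; i < 6; i++) {
            double t = refresh_interval(1.0 / fps[i]);
            printf("%5.0f fps rendered -> refresh every %5.1f ms (%3.0f Hz)\n",
                   fps[i], t * 1000.0, 1.0 / t);
        }
        return 0;
    }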
 
Joined
Sep 6, 2013
Messages
1,204 (0.75/day)
Likes
576
Location
Athens, Greece
System Name 3 systems: Gaming/Internet/HTPC
Processor 1075T @ 3.7GHz, undervolted @ 1.32V /1055T @ 3.5GHz, undervolted @ 1.27V/A6 5400K
Motherboard Gigabyte GA-990XA-UD3/ASRock 970 Extreme3 R2.0/ASUS FM2+
Cooling CoolerMaster TX3 (2fans)/Zalman CNPS9500 (Nvidia version)/CoolerMaster
Memory 16GB Adata 2133MHz/16GB DDR3 Kingston 2400MHz/8GB Corsair 1600MHz
Video Card(s) ASUS HD7850 2GB /Gigabyte GT620/FM2 iGPU
Storage OCZ ARC 100 + HDDs/Samsung 840 EVO + HDDs/OCZ Agility 3
Display(s) Samsung LE32D550 32'' TV(2 systems connected)/LG 42''
Case Sharkoon Rebel 12/Sharkoon Rebel 9/Xigmatek Midguard
Audio Device(s) onboard
Power Supply 850W Chieftec/560W Cheieftec/400W
Mouse CoolerMaster/Rapoo/Logitech
Keyboard CoolerMaster/Microsoft/Logitech
Software Windows 10 64bit TP + Windows 7 64bit + Windows XP (triple boot,stupidity all over the drives)
#65
I am all-AMD and I hate Nvidia with their locks, but this really looks interesting, and I believe AMD could follow in the future with something similar (who doesn't like to sell more hardware?).
 
Joined
Nov 1, 2011
Messages
282 (0.12/day)
Likes
62
System Name 3D Vision & Sound Blaster
Processor Intel Core i5 2500K @ 4.4GHz (stock voltage, 60C on hottest core)
Motherboard Gigabyte P67A-D3-B3
Cooling Thermalright Silver Arrow SB-E Special Edition (with 3x 140mm Black Thermalright fans)
Memory Crucial 16GB (2x8GB 1600MHz CL8)
Video Card(s) Nvidia GTX TITAN X 12288MB Maxwell @1350MHz (from 4890>5870>460>660>680>460)
Storage Samsung 850 EVO 1TB SSD (Steam) + 840 250GB SSD (Origin) + 840 EVO 1TB (work install)
Display(s) Samsung 34" S34E790C (3440x1440) + BenQ XL2420T 120Hz @ 1080p + 24" PHILIPS Touchscreen IPS
Case Fractal Design Define R4 Windowed with 6x 140mm Corsair AFs
Audio Device(s) Creative SoundBlaster Recon3D Fatal1ty + Z506 5.1 speakers/Logitech UE9000
Power Supply Corsair RM750i 750W 80PLUS Gold
Mouse Logitech G700
Keyboard Logitech PS/2 keyboard
Software Windows 7 Pro 64bit (not sidegrading to Windoze 10 until I have to...)
Benchmark Scores 2fast4u,bro...
#66
This is coming to 144Hz monitors too, meaning they aim to sync everything between 30 and 144fps@144Hz (the refresh rate varies with the fps). You obviously didn't read a thing about this; instead you came in raging like any typical nvidia hater would.

No matter what this company releases, it is always going to get this same hatred. It doesn't matter what the greatest minds of game development think.

https://twitter.com/ID_AA_Carmack/status/391300283672051713
No, it doesn't, especially since they're in Nvidia's pockets, being paid to spread testimonials about useless "tech" like this, if you can even call it that.

Really? You think bi-directional communication, with the monitor able to control the frame delivery of the GPU so it's completely in sync with the monitor, can just be easily implemented via a firmware update?

The official white paper hasn't even been released yet and you have the gall to make such an inaccurate and ballsy statement. What standard do you think it could already use? There isn't a display protocol capable of doing this. From what I've read on AnandTech, it's a modification of the DisplayPort standard - meaning non-standard - meaning it can't be implemented on ANY of today's monitors without that specific hardware bundle.
Give me one reason why it can't work the way I stated. The only thing Nvidia would be required to do is modify Adaptive V-Sync to keep in step with the monitor's refresh rate, and there would be nothing stopping AMD from developing a similar method. Fifty bucks says DisplayPort or one of the other display standards bodies comes up with a non-proprietary way of doing this within the next 5 years.

Nvidia G-Sync FAQ

Q: How much more does G-SYNC add to the cost of a monitor?

A: The NVIDIA G-SYNC Do-it-yourself kit will cost approximately $175.

Q: Does NVIDIA G-SYNC work for all games?

A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.


Wait, what... $175 for more useless bullshit, with a support list of 10-20 games that will never even exceed the refresh rate of a 120Hz monitor unless they run at settings lower than a console's garbage, and 95% of the time you'll need to turn off the "exclusive" shit you've paid out the ass for. :banghead: And people are supporting this shit... if Nvidia (not directed at you, BTW) doesn't have the most braindead and loyal fanboys, I don't know what other company does (other than Apple).

Just because we didn't end up with a perfect or open world doesn't mean we should destroy it; this is still better than deciding between Matrox + Rendition + 3dfx + whoever else all at once if you wanted to play various games in the '90s :banghead:
I would take those days over the price-fixing bullshit Nvidia is pulling now, any day -- at least competition was fierce back then, and you still had ways to run proprietary tech yourself (with Glide wrappers etc.).
 
Joined
Sep 28, 2012
Messages
206 (0.11/day)
Likes
44
System Name Bluish Eight
Processor AMD Vishera FX 8350
Motherboard Asus Sabertooth 990FX R.20
Cooling XSPC Raystorm +Dual XSPC Rasa + EK XTC 420 + EK XTX 240 + Swiftech 655B
Memory Gskill F3-2400C10D-16GTX
Video Card(s) Asus R9 290 DCU II OC CrossfireX
Storage Samsung 840 240 Gb +Dual WDC 4TB WD40EZRX
Display(s) BenQ XL2730Z Freesync
Case Powercolor Quake TLC-005
Audio Device(s) Asus Xonar D2X to Logitech z5500
Power Supply Corsair HX1000i
Mouse Logitech G402 Hyperion Fury
Keyboard Logitech G710+
#67
I support technological advancement, whether it comes from NVIDIA, AMD, Intel, or some other company, and G-Sync looks promising. This is the type of disruptive technology, like AMD's Eyefinity, that no one saw coming, and it will take a few years before everyone else catches up.
Do you remember the monitor technology nVidia introduced around 2010? Yep... 3D Vision, which required very expensive 3D shutter glasses, an emitter and a monitor approved by nVidia. I got one pair of glasses, one emitter and a Samsung 2233RZ 22-inch LCD for $800, not to mention another $350 for a GTX 470. As far as I remember, 3D only "worked" on specific nVidia cards and specific monitors. Later I learned nVidia had just adopted 3D FPR from LG and brought it to the desktop. For the same $1200 I switched to an HD 5850 + a 42-inch LG 240Hz and got the same effect. Meanwhile, a 3D Vision kit plus an Asus VG236H will cost you $650 and only works with a high-end GTX, or you can grab a $250 LG D2343P-BN and pair it with any graphics card out there. Where is that "3D gaming is the future" now?

Personally, I don't hate nVidia for their "innovation" or "breakthrough" technology. They push gaming to a new level, creating a better experience and better enjoyment. I just hate their prices and the so-called "proprietary" tech.

Audio does matter; better audio = better experience.
Yeah, NV was first with SLI, but not the first to do it "right". FCAT was just an anti-AMD tool to bottleneck AMD sales.
A better DSP doesn't translate to better audio; it's only a small fragment. You need a better amplifier to decode the DSP's digital signal, and better speakers to translate the analog signal from the amplifier.
FCAT was surely anti-AMD, because AMD uses AFR rather than nVidia's SFR. Look at the bright side: AMD is now working on better drivers to address the frame pacing issues.
 
Joined
Sep 19, 2012
Messages
615 (0.31/day)
Likes
75
System Name [WIP]
Processor Intel Pentium G3420 [i7-4790K SOON(tm)]
Motherboard MSI Z87-GD65 Gaming
Cooling [Corsair H100i]
Memory G.Skill TridentX 2x8GB-2400-CL10 DDR3
Video Card(s) [MSI AMD Radeon R9-290 Gaming]
Storage Seagate 2TB Desktop SSHD / [Samsung 256GB 840 PRO]
Display(s) [BenQ XL2420Z]
Case [Corsair Obsidian 750D]
Power Supply Corsair RM750
Software Windows 8.1 x64 Pro / Linux Mint 15 / SteamOS
#68
I don't know enough about these to make a comment
Ha! Probably the smartest comment in this thread so far. You deserve a medal.
I know how hard I try to shut up about things I don't know much about yet.

Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.
I know what you're saying, but:

1. Kepler might not support it, but Maxwell or Volta at the very least could. GPUs are way more complex and adaptable creatures than they used to be back then... and even then:

2. You could use wrappers to run Glide on non-3dfx hardware...

Ah, I still remember how I finally got Carmageddon 2 to work through a Glide wrapper on nV hardware -- a world of difference in graphics...
 
Joined
Aug 16, 2004
Messages
2,998 (0.61/day)
Likes
1,578
Location
Visalia, CA
System Name Crimson Titan 2.5
Processor Intel Core i7 5930K @ 4.5GHz 1.28V
Motherboard Asus ROG Rampage V Extreme
Cooling CPU: Swiftech H220-X, RAM: Geil Cyclone 2, VGA: Custom EK water loop
Memory 8x4GBs G.Skill Ripjaws DDR4 XMP2 3000MHz @ 1.35V
Video Card(s) 2x EVGA GTX Titan X SCs in SLI under full cover EK water blocks
Storage OS: 256GBs Samsung 850 Pro SSD/Games: 3TBs WD Black
Display(s) Acer XB280HK 28" 4K G-Sync - 2x27" Acer HN274s 1920x1080 120Hz
Case Corsair Graphite Black 760T - EK Pump/Reservoir, 360mm EK Radiator
Audio Device(s) SB X-Fi Fatal1ty Pro on Logitech THX Z-5500 5.1/Razer Tiamat 7.1 Headset
Power Supply Silvestone ST1500 1.5 kW
Mouse Cyborg R.A.T. 9
Keyboard Corsair K70 RGB Cherry MX Red
Software Windows 10 Pro 64bit
#69
I really hope nVidia releases a version of this tech that works with existing monitors, something like an HDMI dongle between the monitor and the graphics card, and makes it hardware-agnostic, please (though I don't see that last part happening...)

Most PC gamers have invested a lot of moola in their monitors (myself included), and if nVidia really wants to see this tech take off, so to speak, they must make it available to as many customers as they can, not just people who invest in new monitors starting next year...
 
Joined
May 29, 2012
Messages
438 (0.21/day)
Likes
148
System Name The Cube
Processor i7 - 4770K @ 4.2GHz
Motherboard ASUS Maximum VI Hero
Cooling Corsair H110 w/ 2x Cougar Vortex 140mm Fans
Memory 32GB G.Skill Z-series @ 2133Mhz
Video Card(s) 2 x EVGA GTX 1070 FTW
Storage 1 x 256GB Samsung 840 Pro, 1 x 512GB Samsung 850 EVO & 2 x 2TB WD BLACK RAID 0
Display(s) Dell Ultrasharp U3415W
Case Corsair Carbide Air 540
Power Supply Seasonic PRIME 1000W Titanium
Software Windows 10 Pro 64-bit
#70
If nv did this in the 85-140Hz range, I'd call that a gamer's benefit :toast:
But dropping frames to what? 5fps@5Hz and calling it the holy grail????? :confused::twitch:
Give me a f*cking break! :banghead:

I'm too old to take this bait, so I'm kind of happy I can't afford an nvidia card :laugh:
It won't drop below 30, you dunce. You could have at least spent 5 minutes reading what it does. Clearly, none of you have.

Give me one reason why it can't work the way I stated. The only thing Nvidia would be required to do is modify Adaptive V-Sync to keep in step with the monitor's refresh rate, and there would be nothing stopping AMD from developing a similar method. Fifty bucks says DisplayPort or one of the other display standards bodies comes up with a non-proprietary way of doing this within the next 5 years.
Are you intentionally playing dumb, or are you just not getting this? The refresh rate of the monitor is always 60Hz; there is no way right now to modulate it dynamically, on the fly. Period. End of story. V-Sync attempts to lock your frames to 60Hz so it doesn't induce screen tearing, because the monitor operates at 60Hz. You can change the refresh rate, but not dynamically; it doesn't work that way. Even then you can still get some screen tearing, and most importantly, you can get input lag because the GPU is rendering frames faster than the monitor can handle.

Adaptive V-Sync simply removes the restriction on frame rates when they drop below 60. So instead of hard-locking your game to either 60FPS or 30FPS, when the rate drops below 60 it simply behaves as if V-Sync weren't enabled, so it can run at 45FPS instead of being locked to intervals of 30. Again, this has nothing to do with the monitor and cannot control it. Why do you even think Adaptive V-Sync can change the monitor's refresh rate? How do you expect Adaptive V-Sync to change the monitor's refresh rate - on the fly - to 45Hz? It doesn't, and it physically can't. There is no standard that allows that to happen. There is no protocol that allows that to happen.

That is what G-Sync is. Maybe one of the consortiums will come up with a hardware-agnostic standard... at some point. We don't know when, or whether any of the consortiums even care. So please, for the love of god, stop making baseless (and wholly inaccurate and ignorant) assumptions. There isn't a single freaking monitor that works the way you describe these days, and no firmware update is going to change that. Ever.
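The quantization being argued about here is easy to show with numbers. A toy model in C (my own simplification of the two modes at a fixed 60Hz refresh, not driver code): plain V-Sync rounds every frame up to a whole number of scan-out intervals, while Adaptive V-Sync keeps the wait only when you render faster than the refresh:

    #include <math.h>
    #include <stdio.h>

    #define REFRESH_HZ 60.0

    /* Plain V-Sync: a finished frame is held until the next 60Hz scan-out,
       so its effective duration rounds UP to whole vblank intervals --
       which is exactly the 60/30/20 FPS quantization described above. */
    static double vsync_fps(double render_fps)
    {
        double vblanks = ceil(REFRESH_HZ / render_fps);
        return REFRESH_HZ / vblanks;
    }

    /* Adaptive V-Sync: wait for vblank only when rendering faster than the
       refresh; below it, run uncapped (tearing possible, no lock to 30). */
    static double adaptive_fps(double render_fps)
    {
        return render_fps >= REFRESH_HZ ? REFRESH_HZ : render_fps;
    }

    int main(void)
    {
        const double rates[] = { 90.0, 59.0, 45.0, 31.0 };
        for (int i = 0; i < 4; i++)
            printf("render %2.0f fps: v-sync -> %4.1f fps, adaptive -> %4.1f fps\n",
                   rates[i], vsync_fps(rates[i]), adaptive_fps(rates[i]));
        return 0; /* neither mode ever touches the monitor's refresh rate */
    }

Note that a 45 fps render lands on 30 fps under plain V-Sync and stays at 45 under Adaptive, and in neither case does the monitor's refresh rate move -- which is the gap G-Sync is claimed to fill.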
 
Joined
Jun 11, 2013
Messages
353 (0.21/day)
Likes
63
Processor Core i5-3350P @3.5GHz
Motherboard MSI Z77MA-G45 (uATX)
Cooling Stock Intel
Memory 2x4GB Crucial Ballistix Tactical DDR3 1600
Video Card(s) |Ξ \/ G /\ GeForce GTX 670 FTW+ 4GB w/Backplate, Part Number: 04G-P4-3673-KR, ASIC 68.5%
Storage some cheap seagates and one wd green
Display(s) Dell UltraSharp U2412M
Case some cheap old eurocase, black, with integrated cheap lit lcd for basic monitoring
Audio Device(s) Realtek ALC892
Power Supply Enermax Triathlor 550W ETA550AWT bronze, non-modular, airflow audible over 300W power draw
Mouse PMSG1G
Keyboard oldschool membrane Keytronic 104 Key PS/2 (big enter, right part of right shift broken into "\" key)
#71
edit: 30Hz is the minimum; it's variable between 30Hz and 144Hz (the obvious max limit)
https://twitter.com/ID_AA_Carmack/status/391300283672051713
Imagine a group of players playing at 120fps and a group struggling at 30-50fps... who's going to get more frags? The first group can react after 1/120th of a second, whereas the others only after 1/30th to 1/50th... their reactions will be two to four times slower. Where's the benefit for gamers? And let me repeat the crucial thing: nvidia wouldn't dare to present this on a CRT monitor. And as for the Carmack link you posted: if you move your view so that the pixels need to be refreshed faster than once per 1/30th of a second, it sends a duplicate frame which doesn't contain updated scene information, e.g. a player leaning out of a corner. This can lead to up to four identical frames being sent to you, which puts your response up to 4 × 1/30 ≈ 133 milliseconds behind the 120fps guy's. Is that clearer now? This technology is nvidia's sorry excuse for being too lazy to make graphics cards that can render 60+ fps at 1600p+ resolution. Nothing more, nothing less. So now you see why I'm upset. And they manage to sell this crap even to you, so smoothly. Unbelievable.
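For what it's worth, the frame-interval arithmetic in that worst case checks out; a quick sanity check in C, using only the numbers from the paragraph above:

    #include <stdio.h>

    int main(void)
    {
        printf("120 fps -> %5.1f ms per frame\n", 1000.0 / 120.0); /*  8.3 ms */
        printf(" 50 fps -> %5.1f ms per frame\n", 1000.0 / 50.0);  /* 20.0 ms */
        printf(" 30 fps -> %5.1f ms per frame\n", 1000.0 / 30.0);  /* 33.3 ms */

        /* claimed worst case: four duplicate frames at the 30Hz floor */
        printf("4 x 1/30 s = %4.0f ms\n", 4.0 * 1000.0 / 30.0);    /* ~133 ms */
        return 0;
    }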



It won't drop below 30, you dunce. You could have at least spent 5 minutes reading what it does. Clearly, none of you have.
Thanks. As soon as I realized what this technology is - and I realized it as soon as they started talking about lowering the monitor's refresh frequency - I wasn't exactly in the mood to start searching for where the lowest acceptable limit lies for those guys... I'm inclined to think they really have no limit when it boils down to getting money out of you, lol!
 
Joined
May 29, 2012
Messages
438 (0.21/day)
Likes
148
System Name The Cube
Processor i7 - 4770K @ 4.2GHz
Motherboard ASUS Maximum VI Hero
Cooling Corsair H110 w/ 2x Cougar Vortex 140mm Fans
Memory 32GB G.Skill Z-series @ 2133Mhz
Video Card(s) 2 x EVGA GTX 1070 FTW
Storage 1 x 256GB Samsung 840 Pro, 1 x 512GB Samsung 850 EVO & 2 x 2TB WD BLACK RAID 0
Display(s) Dell Ultrasharp U3415W
Case Corsair Carbide Air 540
Power Supply Seasonic PRIME 1000W Titanium
Software Windows 10 Pro 64-bit
#72
Then maybe you shouldn't have made a hyperbolic, inaccurate statement about something you knew nothing about. You know, like most people would.

I've never set up a WC loop, so you don't see me going into sub-forums and threads about setting up WC loops and throwing around BS beliefs about something I clearly know nothing about.
 
Joined
Aug 16, 2004
Messages
2,998 (0.61/day)
Likes
1,578
Location
Visalia, CA
System Name Crimson Titan 2.5
Processor Intel Core i7 5930K @ 4.5GHz 1.28V
Motherboard Asus ROG Rampage V Extreme
Cooling CPU: Swiftech H220-X, RAM: Geil Cyclone 2, VGA: Custom EK water loop
Memory 8x4GBs G.Skill Ripjaws DDR4 XMP2 3000MHz @ 1.35V
Video Card(s) 2x EVGA GTX Titan X SCs in SLI under full cover EK water blocks
Storage OS: 256GBs Samsung 850 Pro SSD/Games: 3TBs WD Black
Display(s) Acer XB280HK 28" 4K G-Sync - 2x27" Acer HN274s 1920x1080 120Hz
Case Corsair Graphite Black 760T - EK Pump/Reservoir, 360mm EK Radiator
Audio Device(s) SB X-Fi Fatal1ty Pro on Logitech THX Z-5500 5.1/Razer Tiamat 7.1 Headset
Power Supply Silvestone ST1500 1.5 kW
Mouse Cyborg R.A.T. 9
Keyboard Corsair K70 RGB Cherry MX Red
Software Windows 10 Pro 64bit
#73
Imagine a group of players playing at 120fps and a group struggling at 30-50fps... who's going to get more frags? The first group can react after 1/120th of a second, whereas the others only after 1/30th to 1/50th... their reactions will be two to four times slower. Where's the benefit for gamers? And let me repeat the crucial thing: nvidia wouldn't dare to present this on a CRT monitor. And as for the Carmack link you posted: if you move your view so that the pixels need to be refreshed faster than once per 1/30th of a second, it sends a duplicate frame which doesn't contain updated scene information, e.g. a player leaning out of a corner. This can lead to up to four identical frames being sent to you, which puts your response up to 4 × 1/30 ≈ 133 milliseconds behind the 120fps guy's. Is that clearer now? This technology is nvidia's sorry excuse for being too lazy to make graphics cards that can render 60+ fps at 1600p+ resolution. Nothing more, nothing less. So now you see why I'm upset. And they manage to sell this crap even to you, so smoothly. Unbelievable.
Yes! How can they offer more options to gamers?! What impertinence! Let's make sure everyone can only buy graphics cards and monitors capped at 30FPS so no one has an unfair advantage; who needs more alternatives anyway?!!

Who do these people think they are, offering innovation in this field?? Let's boycott Nvidia and burn all their engineers at the stake! And then they have the audacity to sell this technology and make a profit! How dare they?!!

/S :rolleyes:
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
14,909 (3.43/day)
Likes
5,425
System Name A dancer in your disco of fire
Processor i3 4130 3.4Ghz
Motherboard MSI B85M-E45
Cooling Cooler Master Hyper 212 Evo
Memory 4 x 4GB Crucial Ballistix Sport 1400Mhz
Video Card(s) Asus GTX 760 DCU2OC 2GB
Storage Crucial BX100 120GB | WD Blue 1TB x 2
Display(s) BenQ GL2450HT
Case AeroCool DS Cube White
Power Supply Cooler Master G550M
Mouse Intellimouse Explorer 3.0
Keyboard Dell SK-3205
Software Windows 10 Pro
#74
@haswrong: Isn't that how it is now though?
 
Joined
Jun 11, 2013
Messages
353 (0.21/day)
Likes
63
Processor Core i5-3350P @3.5GHz
Motherboard MSI Z77MA-G45 (uATX)
Cooling Stock Intel
Memory 2x4GB Crucial Ballistix Tactical DDR3 1600
Video Card(s) |Ξ \/ G /\ GeForce GTX 670 FTW+ 4GB w/Backplate, Part Number: 04G-P4-3673-KR, ASIC 68.5%
Storage some cheap seagates and one wd green
Display(s) Dell UltraSharp U2412M
Case some cheap old eurocase, black, with integrated cheap lit lcd for basic monitoring
Audio Device(s) Realtek ALC892
Power Supply Enermax Triathlor 550W ETA550AWT bronze, non-modular, airflow audible over 300W power draw
Mouse PMSG1G
Keyboard oldschool membrane Keytronic 104 Key PS/2 (big enter, right part of right shift broken into "\" key)
#75
@haswrong: Isn't that how it is now though?
nvidia said that gamers will love it... this sounds more like notebook gamers will love it... are you ready to ditch your $1300 monitor for an even more expensive one with a variable refresh rate, which doesn't make your response any faster?

OK, I have to give you one thing: you can now watch your ass getting fragged without stutter :D

...the audacity to sell this technology and make a profit! How dare they?!!
/S :rolleyes:
Exactly... if I decide to please someone, I do it for free...

Remember when your momma asked you to take out the trash back in the day? Did you ask for like $270 for the service? The whole world will soon be eaten by the false religion that is the economy, you just wait.