
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Joined
May 29, 2012
Messages
429 (0.21/day)
Likes
144
System Name The Cube
Processor i7 - 4770K @ 4.2GHz
Motherboard ASUS Maximum VI Hero
Cooling Corsair H110 w/ 2x Cougar Vortex 140mm Fans
Memory 32GB G.Skill Z-series @ 2133Mhz
Video Card(s) 2 x EVGA GTX 1070 FTW
Storage 1 x 256GB Samsung 840 Pro, 1 x 512GB Samsung 850 EVO & 2 x 2TB WD BLACK RAID 0
Display(s) Dell Ultrasharp U3415W
Case Corsair Carbide Air 540
Power Supply Seasonic PRIME 1000W Titanium
Software Windows 10 Pro 64-bit
#26
Yet again, more useless, proprietary and gimmicky crap from Nvidia. This is nothing that can't be implemented as a firmware upgrade for literally every monitor available, by forcing it to drop excessive frames or simply stopping the GPU from rendering more than the refresh rate allows in the drivers. I'm pretty certain several monitors, including mine, already implement something like this unofficially, as I see no tearing even in ancient games like COD4 and Q3A that run well over 250 FPS.
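For what it's worth, the driver-side cap described here is easy to sketch (a toy illustration in Python, not actual driver code; `capped_loop` and `render` are stand-in names). Note that a cap only limits how many frames are produced; it cannot align scanout with frame completion the way syncing the monitor to the GPU would:

```python
import time

def capped_loop(render, refresh_hz=60.0, frames=10):
    """Naive frame-rate cap: sleep so the loop never outpaces the
    monitor's refresh rate. This drops excess frames but cannot align
    scanout with frame completion the way a synced monitor could."""
    interval = 1.0 / refresh_hz
    for _ in range(frames):
        start = time.monotonic()
        render()  # stand-in for the game's per-frame work
        leftover = interval - (time.monotonic() - start)
        if leftover > 0:
            time.sleep(leftover)

# With a no-op render, 10 capped frames at 60 Hz take at least ~167 ms
t0 = time.monotonic()
capped_loop(lambda: None)
elapsed = time.monotonic() - t0
```

Tearing can still occur under a cap because the monitor's refresh and the loop's frame boundaries remain unsynchronized clocks.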

Also, have these guys ever heard of Dxtory?

Please GTFO with more of this proprietary, overpriced and useless bullshit to further fragment PC gaming, Nvidia.

EDIT: and good lord, that stupid name...G-SYNC? Sounds like the name of an Nsync tribute band.
 
Joined
Oct 29, 2010
Messages
2,944 (1.13/day)
Likes
1,379
System Name The Sheriff / The Deputy
Processor 2500K / G4560
Motherboard ASRock P67Extreme4 / Gigabyte GA-Z170-HD3 DDR3
Cooling CM Hyper TX3 / Intel stock
Memory 16 GB Kingston HyperX / 8 GB Kingston HyperX
Video Card(s) EVGA GTX 960 SSC / Gigabyte GTX 1050 Ti LP
Storage SSD, some WD and lots of Samsungs
Display(s) BenQ GW2470 / Samsung SyncMaster T220
Case Cooler Master CM690 II Advanced / Thermaltake Core v31
Audio Device(s) Asus Xonar D1/Denon PMA500AE/Wharfedale D 10.1/Senn HD518
Power Supply Corsair TX650 / Corsair TX650M
Mouse Steelseries Rival 100/ Mionix Avior 7000
Keyboard Sidewinder
Software Windows 10 / Windows 10 Pro
#27
Really? You think bidirectional communication, with the monitor able to control the frame delivery of the GPU so it's completely in sync with the monitor, can just be easily implemented via a firmware update?

The official white paper hasn't even been released yet and you have the gall to make such an inaccurate and ballsy statement. What standard do you think it can already use? There isn't a display protocol that is capable of doing that. From what I've read on Anandtech, it's a modification of the DisplayPort standard - meaning non-standard - meaning it can't be implemented on ANY of today's monitors without that specific hardware bundle.
This is needed in the monitor:

 

tigger

I'm the only one
Joined
Mar 20, 2006
Messages
10,476 (2.44/day)
Likes
1,673
System Name Black to the Core MKIV
Processor Intel I7 6700k
Motherboard Asus Z170 Pro Gaming socket 1151
Cooling NZXT Kraken X61 280mm radiator
Memory 2x8gb Corsair vengeance LPX 2400 DDR4
Video Card(s) XFX Radeon R9 290 4gb ddr5
Storage Patriot Blast 120gb ssd Boot and WD10EADX-22TDHB0 1TB Data
Display(s) Dell 2408WFP 24" 1920x1200
Case Nzxt IS 340
Audio Device(s) Asus xonar dsx pci-e
Power Supply Corsair CX750
Mouse Logitech G502
Software Win 10 Pro x64
#28
Could they not do something similar to this that goes between the card and monitor in the cable, maybe a box, so it will work without a new monitor?
 
Joined
Dec 16, 2010
Messages
1,484 (0.58/day)
Likes
544
System Name My Surround PC
Processor Intel Core i7 4770K @ 4.2 GHz (1.15 V)
Motherboard ASRock Z87 Extreme6
Cooling Swiftech MCP35X / XSPC Rasa CPU / Swiftech MCW82 / Koolance HX-1320 w/ 8 Scythe Fans
Memory 16GB (2 x 8 GB) Mushkin Blackline DDR3-2400 CL11-13-13-31
Video Card(s) MSI Nvidia GeForce GTX 980 Ti Armor 2X
Storage Samsung SSD 850 Pro 256GB, 2 x 4TB HGST NAS HDD in RAID 1
Display(s) 3 x Acer K272HUL 27" in Surround 7860x1440
Case NZXT Source 530
Audio Device(s) Integrated ALC1150 + Logitech Z-5500 5.1
Power Supply Seasonic X-1250 1.25kW
Mouse Gigabyte Aivia Krypton
Keyboard Logitech G15
Software Windows 8.1 Pro x64
#29

The mounting holes are evenly spaced around the "processor," which indicates to me that this needs a heatsink when operating. I hope it doesn't need active cooling; having a small, loud fan in a monitor is certainly not a good thing.
#30
The mounting holes are evenly spaced around the "processor," which indicates to me that this needs a heatsink when operating. I hope it doesn't need active cooling; having a small, loud fan in a monitor is certainly not a good thing.
Looks like a MXM card for laptops

 
Joined
Sep 29, 2013
Messages
96 (0.06/day)
Likes
25
Processor Intel i7 4960x Ivy-Bridge E @ 4.6 Ghz @ 1.42V
Motherboard x79 AsRock Extreme 11.0
Cooling EK Supremacy Copper Waterblock
Memory 65.5 GBs Corsair Platinum Kit @ 666.7Mhz
Video Card(s) PCIe 3.0 x16 -- Asus GTX Titan Maxwell
Storage Samsung 840 500GBs + OCZ Vertex 4 500GBs 2x 1TB Samsung 850
Audio Device(s) Soundblaster ZXR
Power Supply Corsair 1000W
Mouse Razer Naga
Keyboard Corsair K95
Software Zbrush, 3Dmax, Maya, Softimage, Vue, Sony Vegas Pro, Acid, Soundforge, Adobe Aftereffects, Photoshop
#31

I can only see G-Sync being useful or desired by consumers if it did the following:

1. Increased frame rate performance. If this gimmick actually increased your FPS, as if there were some form of loss or leakage in performance that G-Sync prevented, it would be worth something. Instead of actually getting 60 FPS in Tomb Raider with maxed-out graphics settings, you're only getting 32 FPS; G-Sync would push it closer to 60 FPS.

To imply that NVidia users need a hardware component in the monitor to improve video fidelity also implies that NVidia cards still suffer from their own form of micro-stutter and screen tearing.

Sadly, in my opinion, this is the only innovative niche NVidia could come up with in such a short period of time, especially now that the spotlight just got wider and was placed on AMD, since the R9 series, consoles and AMD Mantle have been the top buzz in the news as of late. AMD is gaining momentum. NVidia wants to push its own game bundle; no surprise there: copying the concept from AMD to continue the competition and react to the current situation. AMD is pushing Mantle (which can be used by both AMD and NVidia), TrueAudio, DX11.2 support, CrossFire over the PCIe bus, and there's buzz about the R9 290 possibly outperforming the GTX Titan. It seems like AMD, besides Microsoft, lit a match under NVidia's foot. Now NVidia is reacting...

If this niche is proprietary, nobody is going to buy it unless third-party benchers like Techpowerup.com, Anandtech.com and others glorify it. Otherwise, I see this as an unnecessary, supplemental feature that'll cause the NVidia consumer base to purchase more NVidia products at outrageous prices with very little improvement...
 
Joined
Mar 26, 2009
Messages
60 (0.02/day)
Likes
30
#32
NVIDIA will allow people to purchase the G-Sync module and install it by modding their monitors; you don't actually have to buy a new monitor at all!


Alternatively, if you’re a dab hand with a Phillips screwdriver, you can purchase the kit itself and mod an ASUS VG248QE monitor at home. This is of course the cheaper option, and you’ll still receive a 1-year warranty on the G-SYNC module, though this obviously won’t cover modding accidents that are a result of your own doing. A complete installation instruction manual will be available to view online when the module becomes available, giving you a good idea of the skill level required for the DIY solution; assuming proficiency with modding, our technical gurus believe installation should take approximately 30 minutes.

If you prefer to simply buy a monitor off the shelf from a retailer or e-tailer, NVIDIA G-SYNC monitors developed and manufactured by monitor OEMs will be available for sale next year. These monitors will range in size and resolution, scaling all the way up to deluxe 3840x2160 “4K” models, resulting in the ultimate combination of image quality, image smoothness, and input responsiveness.
http://www.geforce.com/whats-new/ar...evolutionary-ultra-smooth-stutter-free-gaming
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
20,919 (6.24/day)
Likes
10,020
Location
IA, USA
System Name BY-2015
Processor Intel Core i7-6700K (4 x 4.00 GHz) w/ HT and Turbo on
Motherboard MSI Z170A GAMING M7
Cooling Scythe Kotetsu
Memory 2 x Kingston HyperX DDR4-2133 8 GiB
Video Card(s) PowerColor PCS+ 390 8 GiB DVI + HDMI
Storage Crucial MX300 275 GB, Seagate 6 TB 7200 RPM
Display(s) Samsung SyncMaster T240 24" LCD (1920x1200 HDMI) + Samsung SyncMaster 906BW 19" LCD (1440x900 DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay
Audio Device(s) Realtek Onboard, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse SteelSeries Sensei RAW
Keyboard Tesoro Excalibur
Software Windows 10 Pro 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
#33
Unless it is GPU-manufacturer neutral and standardized, it's a gimmick just like the rest. If NVIDIA was serious about improving monitor tech, it would create a forum so all manufacturers can implement it (even if there is a reasonable licensing fee involved).
 
#34
I can only see G-Sync being useful or desired by consumers if it did the following:

1. Increased Frame Rate Performance. If this gimmick actually increased your FPS. As if there was some form of loss or leakage in performance, and G-Sync prevented that from happening. Instead of actually getting 60 FPS on Tomb Raider with maxed out graphics settings, you're only getting 32 FPS. G-Sync would push it closer to 60 FPS.

To imply that NVidia users need a hardware component in the monitor to improve video fidelity, also implies that NVidia cards still suffer from their own form of micro stutters and screen tearing.

Sadly, in my opinion, I think this is the only innovative niche NVidia could come up with in such a short period of time. Especially when the spot-light just got wider, and it was placed on AMD since RX9 series, Consoles, and AMD Mantle have been the top buzz in the news as of late. AMD is gaining momentum. NVidia wants to push it's own game bundle. No surprise there. Copying the concept from AMD to continue competition, and react to the current situation. AMD is pushing AMD Mantle (can be used by both AMD and NVidia), TrueAudio, DX11.2 Support, PCIe CrossfireX through the PCIe Bus, and the buzz about the RX9-290 possibly outperforming GTX Titan. It seems like AMD lit a match under NVidia's foot besides Microsoft. Now, NVidia is reacting...
Seriously, AMD is the one suffering from a colossal crapload of dropped frames, tearing, frame interleaving and micro-stuttering in their CrossFire systems; that's why they are trying to mitigate the issue with CrossFire over PCIe.

TrueAudio doesn't mean squat, and the R9 290X is not really the answer to Titan at all; you will see that when the reviews come out. The only real advantage for AMD now is Mantle, but it remains to be seen whether they will really get it to shine or not.

On the other hand, NVIDIA has always been the brute force of advancement in the PC space; AMD just plays catch-up, and today's announcements just cement that idea.

NVIDIA was the first to introduce the GPU, SLI, frame pacing (FCAT), GPU Boost and Adaptive V-Sync; the first to reach a unified shader architecture on PCs; first with GPGPU, CUDA, PhysX and OptiX (for real-time ray tracing); first with 3D gaming (3D Vision); first with Optimus for mobile flexibility; first with the GeForce Experience program, SLI profiles, ShadowPlay (for recording) and game streaming.

They had better support with CSAA, FXAA, TXAA, driver-side ambient occlusion and HBAO+, the TWIMTBP program, and better Linux and OpenGL support... and now G-Sync! All of these innovations are sustained and built upon to this day.

And even when AMD beat them to a certain invention, like Eyefinity, NVIDIA didn't stop until they topped it with more features: they answered with Surround, then did 3D Surround and now 4K Surround.

NVIDIA was at the forefront in developing all of these technologies and continues to sustain and expand them; AMD just follows suit, fighting back with stuff they don't really sustain, so it ends up forgotten and abandoned, even generating more trouble than it's worth. Just look at the pathetic state of their own Eyefinity and all of its CrossFire frame problems.

In short, AMD is the one feeling the heat, NOT NVIDIA. Heck, NVIDIA now fights two generations of AMD cards with only one generation of their own: Fermi held off the HD 5870 and 6970, Kepler held off the 7970 and the 290X, and who knows about Maxwell!
 
Joined
Oct 2, 2004
Messages
12,361 (2.56/day)
Likes
5,816
Location
Europe\Slovenia
System Name Dark Silence 2
Processor Intel Core i7 5820K @ 4.5 GHz (1.15V)
Motherboard MSI X99A Gaming 7
Cooling Cooler Master Nepton 120XL
Memory 32 GB DDR4 Kingston HyperX Fury 2400 MHz @ 2666 MHz 15-15-15-32 1T (1.25V)
Video Card(s) AORUS GeForce GTX 1080Ti 11GB (1950/11000 OC Mode)
Storage Samsung 850 Pro 2TB SSD (3D V-NAND)
Display(s) ASUS VG248QE 144Hz 1ms (DisplayPort)
Case Corsair Carbide 330R Titanium
Audio Device(s) Creative Sound BlasterX AE-5 + Altec Lansing MX5021 (HiFi capacitors and OPAMP upgrade)
Power Supply BeQuiet! Dark Power Pro 11 750W
Mouse Logitech G502 Proteus Spectrum
Keyboard Cherry Stream XT Black
Software Windows 10 Pro 64-bit (Fall Creators Update)
#35
Desperate AMD fanboys like AMD's "exclusive" features: onscreen tearing, stuttering and lag. :rolleyes:



The local pharmacy closed early today. Did you attack them because they are selling useless, proprietary and gimmicky drugs from pharma companies?



Mantle won't work on Nvidia. Why is it not useless?
You can't compare awesome Mantle with useless G-Sync. Besides, isn't Adaptive V-Sync supposed to solve all the image tearing problems? I guess it wasn't as wonderful after all if they need to design some crappy special monitor to counter that...
 
#36
It's funny when the greatest minds of game development speak highly of G-Sync and its application, while AMD fanboys try to shoot it down on the basis of nothing but their ignorance!
 
#37
I can only see G-Sync being useful or desired by consumers if it did the following:

1. Increased Frame Rate Performance. If this gimmick actually increased your FPS. As if there was some form of loss or leakage in performance, and G-Sync prevented that from happening. Instead of actually getting 60 FPS on Tomb Raider with maxed out graphics settings, you're only getting 32 FPS. G-Sync would push it closer to 60 FPS.

To imply that NVidia users need a hardware component in the monitor to improve video fidelity, also implies that NVidia cards still suffer from their own form of micro stutters and screen tearing.
I want to clarify what G-Sync is and why it is different from the other technologies posters have presented.

G-Sync is meant to be used in conjunction with frame pacing; the two solve different problems. G-Sync does not address an NVidia-specific issue; the problem exists for any output device with a fixed refresh rate. Monitors currently only draw whole frames at a fixed interval, so if the frame rate output by your video card is lower than your monitor's refresh rate, some frames will be duplicated and others not, resulting in judder.

Let me explain the difference using an example where you are displaying a game at 45 fps on a 60 Hz monitor; this should be helpful to anyone who is confused about the purpose of G-Sync. Let's look at a small portion of this scenario (1/15 of a second, or four 60 Hz refreshes).


Scenario 1 has no frame pacing or G-Sync:

2ms - Frame 1 is written to the video buffer
4ms - Frame 2 is written to the video buffer
16.7ms - Frame 2 is displayed on the monitor since it is the most recent frame
33.4ms - Frame 2 is displayed on the monitor again since it is still the most recent frame
35ms - Frame 3 is written to the video buffer
50ms - Frame 3 is displayed on the monitor since it is the most recent frame
66.7ms - Frame 3 is displayed on the monitor again since it is still the most recent frame

In Scenario 1, the effective frame rate is 30fps since only two distinct frames were displayed by the monitor. The frames were displayed for identical periods of time, so the framerate appears smooth.


Scenario 2 has frame pacing but no G-Sync:

2ms - Frame 1 is written to the video buffer
4ms - Frame 2 is written to the video buffer
16.7ms - Frame 1 is displayed on the monitor since it is the oldest frame in the video buffer
33.4ms - Frame 2 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 1, and Frame 1 is deleted from the video buffer
35ms - Frame 3 is written to the video buffer
50ms - Frame 3 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 2, and Frame 2 is deleted from the video buffer
66.7ms - Frame 3 is displayed on the monitor again since no newer frames are available

In Scenario 2, the effective frame rate is 45fps since three distinct frames were displayed by the monitor. However, the third frame was displayed for twice as long as either of the first two, so judder is experienced.


Scenario 3 has frame pacing and G-Sync:

0ms - G-Sync realizes that the graphics card is generating frames at an average of 45fps and adjusts the monitor's refresh rate to 45Hz
2ms - Frame 1 is written to the video buffer
4ms - Frame 2 is written to the video buffer
22.2ms - Frame 1 is displayed on the monitor since it is the oldest frame in the video buffer
35ms - Frame 3 is written to the video buffer
44.4ms - Frame 2 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 1, and Frame 1 is deleted from the video buffer
66.6ms - Frame 3 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 2, and Frame 2 is deleted from the video buffer

In Scenario 3, the effective frame rate is 45fps since three distinct frames were displayed by the monitor. In addition, all three frames were displayed for equal amounts of time, so no judder is experienced.


The whole point of this is to have a variable refresh rate on the monitor. Standard frame pacing can't adjust the monitor's refresh rate; it only helps to make sure that all the frames generated by the graphics card are displayed. Nothing currently can remove all judder when the frame rate is different from the monitor's refresh rate. This is what G-Sync aims to solve.

G-Sync won't improve frame rate, but it will make lower frame rates look better. This is something that cannot be solved with frame pacing alone.
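The three timelines above can be reproduced with a small simulation (a toy model using the post's illustrative timings; all names are mine):

```python
# Frames finish rendering at 2 ms, 4 ms and 35 ms, as in the scenarios.
RENDER_TIMES = {1: 2.0, 2: 4.0, 3: 35.0}  # frame id -> time ready (ms)

def scanout(refresh_ms, ticks, pacing):
    """Return the frame id shown at each monitor refresh."""
    shown, last = [], 0
    for k in range(1, ticks + 1):
        t = k * refresh_ms
        ready = [f for f, r in RENDER_TIMES.items() if r <= t]
        if pacing:
            # frame pacing: show the oldest not-yet-displayed frame,
            # repeating the last one if nothing newer is available
            queued = [f for f in ready if f > last]
            frame = queued[0] if queued else last
        else:
            # no pacing: always show the most recently rendered frame
            frame = ready[-1]
        shown.append(frame)
        last = frame
    return shown

print(scanout(1000 / 60, 4, pacing=False))  # scenario 1: [2, 2, 3, 3]
print(scanout(1000 / 60, 4, pacing=True))   # scenario 2: [1, 2, 3, 3]
print(scanout(1000 / 45, 3, pacing=True))   # scenario 3: [1, 2, 3]
```

Scenario 1 shows only two distinct frames (effective 30 fps); scenario 2 shows all three but holds the last one twice as long (judder); the 45 Hz G-Sync case shows all three frames for equal time.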

Adaptive V-Sync supposed to solve all the image tearing problems? I guess it wasn't as wonderful after all if they need to design some crappy special monitor to counter that...
Adaptive V-sync was meant to reduce the performance impact of V-sync; it doesn't change anything visually. The idea behind Adaptive V-sync is that if your monitor's refresh rate is higher than the frame rate output of your graphics card at any given time, then V-sync does nothing and the extra computational power required by V-sync is wasted. Adaptive V-sync turns off V-sync in this situation, thus yielding extra performance when the frame rate is low.

See http://www.geforce.com/hardware/technology/adaptive-vsync/technology
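The decision described above amounts to a per-frame toggle; a minimal sketch of that policy (my own hypothetical illustration, not NVIDIA's actual driver logic):

```python
def adaptive_vsync(fps, refresh_hz=60.0):
    """Policy sketch: keep V-Sync on only while the GPU can match the
    monitor's refresh rate; below that, switch V-Sync off so the frame
    rate isn't locked to a divisor of the refresh rate, at the cost of
    allowing some tearing."""
    return "vsync on" if fps >= refresh_hz else "vsync off"

assert adaptive_vsync(75) == "vsync on"   # GPU keeps up: sync normally
assert adaptive_vsync(48) == "vsync off"  # below 60: uncap to keep 48 fps
```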
 
#38
You can't compare awesome Mantle with useless G-Sync. Besides, isn't Adaptive V-Sync supposed to solve all the image tearing problems? I guess it wasn't as wonderful after all if they need to design some crappy special monitor to counter that...
You clearly have no idea what the point of Adaptive V-sync is. It's to keep your frame rate from dropping to 30 whenever it falls below 60 (which is what plain V-sync does: it steps down in intervals of 30). Basically it was a way to get the best of both worlds.
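The "intervals of 30" behavior follows from double-buffered V-Sync: a frame that misses a refresh waits for the next one, so the delivered rate snaps to the refresh rate divided by a whole number. A quick sketch of that arithmetic (function name is mine):

```python
import math

def vsync_locked_fps(render_fps, refresh_hz=60):
    """Effective rate under double-buffered V-Sync: each frame occupies
    a whole number of refresh intervals, so the delivered rate snaps to
    refresh_hz / 1, refresh_hz / 2, refresh_hz / 3, ... (60, 30, 20, ...)."""
    if render_fps >= refresh_hz:
        return float(refresh_hz)
    return refresh_hz / math.ceil(refresh_hz / render_fps)

assert vsync_locked_fps(59) == 30.0  # just missing 60 Hz halves the rate
assert vsync_locked_fps(45) == 30.0  # the post's 45 fps example
assert vsync_locked_fps(25) == 20.0  # 60 / ceil(60/25) = 60/3
```

This is exactly the cliff Adaptive V-Sync avoids by disabling V-Sync below the refresh rate.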
 
Joined
Sep 19, 2012
Messages
615 (0.32/day)
Likes
75
System Name [WIP]
Processor Intel Pentium G3420 [i7-4790K SOON(tm)]
Motherboard MSI Z87-GD65 Gaming
Cooling [Corsair H100i]
Memory G.Skill TridentX 2x8GB-2400-CL10 DDR3
Video Card(s) [MSI AMD Radeon R9-290 Gaming]
Storage Seagate 2TB Desktop SSHD / [Samsung 256GB 840 PRO]
Display(s) [BenQ XL2420Z]
Case [Corsair Obsidian 750D]
Power Supply Corsair RM750
Software Windows 8.1 x64 Pro / Linux Mint 15 / SteamOS
#39
LOL, hypocrisy at its finest level: nVidia saying PC gaming is awesome this and that... then releasing sum'more proprietary crapware to fragment it even further.

Jesus, I kinda wish Intel had pursued that dGPU thing... 3 GPU players = pretty much no one affording to fragment an already mother-less and scattered segment of modern video gaming...


AMD might not be perfect, but at least they are more open-minded and aware that walled-garden features aren't the future... They're the Linux of computer hardware...

Give me SteamOS/Mint + OGL w/ good driver support & a low-level API and I'm game.
 
Joined
Aug 5, 2012
Messages
11 (0.01/day)
Likes
2
#40
I have been waiting for this feature in a monitor for years now, tearing and stuttering have always annoyed me more than overall FPS / Picture quality. If this really can fix these issues, I would pay just about anything to get it.

In regards to some other posts, this cannot be done with a firmware upgrade for monitors, and no current video card/monitor combination is free from the terrors of constant or even occasional micro-stuttering / tearing... Some people's eyes just seem to be less sensitive to it.

Can't wait for a test drive, but it will make me sad to have to trade off my CrossFire 7950 setup...
 
#41
LOL, hypocrisy at its finest level: nVidia saying PC gaming is awesome this and that... then releasing sum'more proprietary crapware to fragment it even further.
Can we at least let NVidia confirm that it's proprietary before you complain? The only thing announced was that it was running on NVidia hardware and an NVidia display controller. NVidia said nothing about disallowing other manufacturers from producing control boards.
 
#42
Mantle won't work on Nvidia. Why is it not useless?
Please link me to where it says it won't, cause I'm reaaaaaly curious just where that BS spring blows from. And even if it wouldn't, AMD is in a position to slipstream a Mantle path next to the DX11 HLSL one in pretty much every multi-platform game coming out in the next half-decade or so, so having a 75% chance that, after a certain point, multi-platform next-generation games can and will be written in Mantle pretty much cements them into API relevancy, w/ or w/o a leading share in the market.

Can we at least let NVidia confirm that it's proprietary before you complain? The only thing announced was that it was running on NVidia hardware and an NVidia display controller. NVidia said nothing about disallowing other manufacturers from producing control boards.
Name one nVidia branded tech that ISN'T proprietary.
 
#43
Name one nVidia branded tech that ISN'T proprietary.
There's no point in debating an assumption. I'm going to wait to hear more before I make any conclusions.
 
Joined
Oct 9, 2009
Messages
675 (0.23/day)
Likes
656
Location
Finland
System Name :P~
Processor Intel Core i7-5930K (ES)
Motherboard Asus Rampage V Extreme/3.1
Cooling Phanteks PH-TC14PE
Memory 32GB Corsair Vengeance LPX 2400 MHz
Video Card(s) Asus GTX 1080 Strix
Storage 400GB Intel 750 PCI-E SSD, 512GB Crucial MX100 SSD, 3TB WD RED HDD
Display(s) QNIX QX2710LED OC @ 96 Hz 27"
Case Corsair Obsidian 750D
Audio Device(s) Audioquest Dragon Red + Sennheiser HD 650
Power Supply Corsair HX1000i + Cablemod sleeved cables kit
Mouse Logitech G500s
Keyboard Logitech Ultra X Flat Premium
Software Windows 10 64-bit
#44

You know what this means...

good bye SLi microstuttering. Give me naooo :eek:

I can't believe you guys are calling this useless. :mad:

Guru3D - NVIDIA G-SYNC Overview - Shaky Cam Voice ...
#45
There's no point in debating an assumption. I'm going to wait to hear more before I make any conclusions.
Fair enough.

You know what this means...

good bye SLi microstuttering. Give me naooo :eek:

I can't believe you guys are calling this useless. :mad:

Guru3D - NVIDIA G-SYNC Overview - Shaky Cam Voice ...
Not useless, but walled-garden gaming isn't my idea of PC gaming... or entertainment in general... Also it's very, very debatable how effective or easy to implement this will actually be (yay, from the really small number of choices for gaming displays, now we'll get an even smaller palette!). If TXAA, 3D Vision or PhysX are anything to go by... it will be a stillborn horse that nVidia (with its massive budget) will still be beating for years and years, even after it's basically just a pile of bones.
 
Joined
Sep 13, 2011
Messages
252 (0.11/day)
Likes
55
Location
Malaysia
Processor Intel Core i3-2100 @ 3.1 GHz
Motherboard Asus P8H61-M LX (B3)
Cooling Intel stock HSF
Memory 2x 4GB Kingston DDR3 @ 1333 MHz
Video Card(s) Gigabyte Radeon HD 6750 OC
Storage Western Digital Caviar Blue 500GB SATA 3
Display(s) HP W2072a 20"
Case Cooler Master Elite 430 Black
Audio Device(s) Integrated (Realtek ALC887)
Power Supply Cooler Master eXtreme Power Plus 500W
Software Windows 7 Home Premium x64
#46
Like always, a hate fest! Why am I not surprised? Typical..
 
Joined
Apr 30, 2012
Messages
2,419 (1.18/day)
Likes
1,333
#47
Another question is...

With the current Asus VG248QE going for as low as $249-$269, is the $130+ mark-up for the G-Sync version ($399) worth it, or are you better off saving that money toward a second monitor, since it's about 50% of the price?

Will it be a set price, or will it scale with screen size?
 
Joined
Feb 9, 2009
Messages
1,469 (0.45/day)
Likes
381
Location
Toronto
Processor i7-2670QM / Q9550 3.6ghz
Motherboard laptop / Asus P5Q-E
Cooling laptop / Cooler Master Hyper 212
Memory 2x4gb ddr3sd / 2x2gb ddr2
Video Card(s) 570m / MSI 660 Gaming OC
Storage ST9750420AS / ST1000DM003
Display(s) BenQ FP241VW / BenQ GW2265HM
Case MSI gx780 / Corsair 500r
Audio Device(s) onboard
Power Supply laptop / Corsair 750tx
Mouse Steelseries Kinzu V2 / Logitech M120
Keyboard Logitech Deluxe 250 / Logitech K120
Software Windows 7
#48
as a red team guy, the hatred in this thread is ridiculous (on a related note, amd setting up a 290x demo across the street from nv's event made me uneasy)

name non proprietary nv tech? someone was blind on a videocardz thread as well, physx on the cpu is multiplatform & in many games+consoles! the new FCAT method runs anywhere, FXAA (although i want to say that's one guy), how about helping devs add DX10 or 11 support? not all nv sponsored titles have locks on them, amd has done the same in helping codemasters games be good looking & efficient

sure GPU physx & cuda are very annoying, not doubting that, it's not 'good guy nvidia', but many things start out proprietary to show that they work

we should be pressuring VESA/displayport/hdmi for an adaptive refresh technique like this :respect:

i don't get why nv doesn't license things out so that everyone can enjoy instead of locking down to only their platform (look at bluray, it's standard, just a royalty is paid, it's not locked to sony products)

just cuz we didn't hit a perfect or open world doesn't mean we should destroy it, this is still better than deciding between matrox+rendition+3dfx+whoever else all at once if you want to play various games in the 90s :banghead:
 
Joined
Jun 20, 2009
Messages
279 (0.09/day)
Likes
40
System Name Nazgul
Processor Core i5 6600k
Motherboard MSI Krait Gaming 170a
Cooling Phanteks Dual Tower
Memory 8gb Corsair DDR4 3600mhz
Video Card(s) EVGA FTW+ GTX970
Storage Samsung 128gb SSD OS, 500gb 7200rpm storage
Display(s) Benq 144hz 1080p
Case Fractal Design Define r5 titanium window
Power Supply Corsair AX850
Mouse Logitech
Keyboard Razer
Software Windows 7 Pro 64 bit
#49
I'm skeptical; I would like to sit down and play some games with this to see if I can actually tell a difference.
 
Joined
Jun 11, 2013
Messages
353 (0.21/day)
Likes
63
Processor Core i5-3350P @3.5GHz
Motherboard MSI Z77MA-G45 (uATX)
Cooling Stock Intel
Memory 2x4GB Crucial Ballistix Tactical DDR3 1600
Video Card(s) |Ξ \/ G /\ GeForce GTX 670 FTW+ 4GB w/Backplate, Part Number: 04G-P4-3673-KR, ASIC 68.5%
Storage some cheap seagates and one wd green
Display(s) Dell UltraSharp U2412M
Case some cheap old eurocase, black, with integrated cheap lit lcd for basic monitoring
Audio Device(s) Realtek ALC892
Power Supply Enermax Triathlor 550W ETA550AWT bronze, non-modular, airflow audible over 300W power draw
Mouse PMSG1G
Keyboard oldschool membrane Keytronic 104 Key PS/2 (big enter, right part of right shift broken into "\" key)
#50
instead of braving the 60+ fps, they aim for 30fps@30Hz hahahahahaha :roll: :slap: :nutkick:


I'm skeptical, I would like to sit down and play some games with this to see if I can actually tell a difference
well, don't try that with a CRT monitor or you'll end up in epileptic convulsions in no time :rockout:
 