NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

Joined
Aug 6, 2017
Messages
2,755 (5.59/day)
Likes
1,636
Location
Poland
System Name skurwiel szatan
Processor i7 5775c OC:4.3/1.385v/EDRAM @2GHz Power save: 3.3/1.06v
Motherboard Z97X Gaming 5
Cooling Noctua D15S
Memory Crucial Ballistix Tactical VLP 1600 CL8 1.35v OC @2400 CL10 1.6v
Video Card(s) MSI GTX 1080 Ti Gaming X Trio 2GHz
Storage SU900 128 (OS),850 Pro 256+256+ 512+860 Evo 500 (games),4TB of HDDs (3+1)
Display(s) Acer XB241YU+Dell S2716DG dual monitor setup
Case Full tower
Audio Device(s) Mad Catz FREQ wireless
Power Supply Superflower Leadex Gold 850W
Mouse G403 wireless + Steelseries DeX + Roccat rest
Keyboard Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
#26
You can't just take whatever you find written on the internet by someone who calls themselves an expert and use it to argue against the personal experience of just about anyone who has jumped from 60 Hz to 120 Hz or higher. This is ridiculous. But I know you'd rather take a French leave when people point out the obvious flaws in your thinking and the gaps in your experience and knowledge than admit that even an expert can be 100% wrong, and therefore so can you.

And why is it that people's first reaction when someone calls them out on their statements is to block or ignore them? I've had dozens of people point out flaws in my thinking, and among hundreds of things I don't agree with I can still find plenty that changed my perspective. Nobody is 100% right all the time, but blocking whatever you see that you don't like is cowardly.
 
Joined
Oct 17, 2014
Messages
1,760 (1.16/day)
Likes
796
Location
USA
System Name $170 family PC, rest was reused old parts.
Processor Ryzen Athlon 200GE @ 3.2Ghz ($55)
Motherboard Biostar A320M Pro ($42)
Memory 8gb (2x4) DDR4 2666 CAS 15-15-15 ($63)
Video Card(s) Vega 3 Integrated
Storage 120GB SSD
Display(s) AOC 22V2H 21.5" Frameless 75hz Freesync
Case Thermaltake View 22
Audio Device(s) Schiit Modi 3 + Custom Tube Amp + Sennheiser HD58X
Power Supply Corsair 750w Bronze
Mouse Roccat Kone AIMO
Keyboard Logitech G610+ Cherry MX Red
Software Ubuntu
#27
Yeah, people have no fuckin' idea what G-Sync and FreeSync do. Running a monitor at 120 Hz+ without VRR technology doesn't make tearing go away; you need to sync up the frames between the video card and the monitor.

I have never once seen tearing on my G-Sync monitor. I use RivaTuner to cap the frame rate 4 FPS below the maximum refresh, so G-Sync never turns off.
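To put numbers on that cap, here is a minimal sketch of the arithmetic, assuming a hypothetical 144 Hz panel and the 4 FPS margin described above:

```python
# Why cap a few FPS below the panel's maximum refresh: if the game renders
# faster than the panel can refresh, the monitor falls back to V-Sync-style
# waiting (or tearing), so the cap keeps every frame inside the VRR window.

PANEL_MAX_HZ = 144      # assumed panel maximum refresh rate
CAP_MARGIN_FPS = 4      # the margin suggested in the post above

fps_cap = PANEL_MAX_HZ - CAP_MARGIN_FPS       # 140 FPS
frame_time_ms = 1000 / fps_cap                # ~7.14 ms between frames
panel_min_period_ms = 1000 / PANEL_MAX_HZ     # ~6.94 ms fastest refresh

print(f"cap at {fps_cap} FPS -> {frame_time_ms:.2f} ms/frame "
      f"(panel minimum: {panel_min_period_ms:.2f} ms/refresh)")
# Every frame interval stays above the panel's shortest refresh period,
# so the panel can always sync to the frame and G-Sync never disengages.
```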
 
Joined
Aug 2, 2011
Messages
1,109 (0.41/day)
Likes
547
Processor Core i7 7820X @ 4.8GHz
Motherboard Asrock Taichi XE
Cooling Custom loop
Memory 4x8GB Corsair Vengence LED @3000MHz
Video Card(s) Gigabyte Aorus GTX 1080 Ti @ +25core +450mem
Storage 960 EVO 1TB, Micron 1100 2TB SSD, OCZ Vector 512GB, 750GB Samsung, 1.5TB Caviar Green
Display(s) Acer X34P, Qnix QX2710
Case Corsair Carbide 500R White
Audio Device(s) ASUS Xonar DX, Klipsch Promedia Ultra 5.1, Corsair Void Pro RGB
Power Supply EVGA P2 850W
Mouse Logitech G502 Proteus Spectrum
Keyboard Razer Black Widow Ultimate 2016 Edition
Software Win 10 x64
#28
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.

One of my PCs is hooked up to my TV every now and then, and there's always a heartbeat between physically moving the mouse and the action taking place on the screen.

Head on over to www.rtings.com, read through some of their reviews, and look at the response time section. They are no-nonsense and will give you just the facts on the TVs.
 
Joined
Jan 8, 2017
Messages
3,462 (4.93/day)
Likes
2,570
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Scythe Katana 4 - 3x 120mm case fans
Memory 16GB - Corsair Vengeance LPX
Video Card(s) ASUS GTX 1060 Turbo 6GB ~ 2139 Mhz / 9.4 Gbps
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Zalman R1
Power Supply 500W
#29
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.
Never said input lag doesn't exist; I simply find it unbelievable that people can seriously claim to perceive differences on the order of 10-20 ms. As in 120 Hz synchronized vs. not. We are talking 8 ms or less, which is simply minuscule.
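For reference, the millisecond figures being argued about fall straight out of the refresh rates; a quick sketch of the arithmetic, with no assumptions beyond the rates named in this thread:

```python
# Refresh period per rate -- the millisecond figures this argument is about.
for hz in (60, 120, 144):
    print(f"{hz:3d} Hz -> {1000 / hz:5.2f} ms per refresh")
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms.
# The disagreement in this thread is whether a delta of roughly one
# refresh period (~8 ms at 120 Hz) is perceivable against a human
# reaction time of ~250 ms.
```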
 

SoTOP

New Member
Joined
Jun 25, 2018
Messages
1 (0.01/day)
Likes
1
#30
Never said input lag doesn't exist; I simply find it unbelievable that people can seriously claim to perceive differences on the order of 10-20 ms. As in 120 Hz synchronized vs. not. We are talking 8 ms or less, which is simply minuscule.
It doesn't matter what you believe. http://www.100fps.com/how_many_frames_can_humans_see.htm Tests with Air Force pilots have shown that they could identify a plane in a picture flashed for only 1/220th of a second.
 
Joined
Feb 14, 2012
Messages
1,915 (0.77/day)
Likes
699
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
#31
There is no way that VRR needs a $2,000 FPGA; there is something wrong with this info. There has been plenty of time to spin a custom chip to do what's needed. VRR is not magic, it's a very basic concept: show the current frame until the next one arrives. That does not require loads of tricky transistors.
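To make that concrete, here is a toy simulation of the concept, assuming a hypothetical panel with a 30-144 Hz VRR range. It sketches the principle only, not NVIDIA's actual module:

```python
# Toy model of "show the current frame until the next one arrives":
# the panel starts a new scanout whenever a frame is ready, subject only
# to its minimum refresh period.

MIN_PERIOD_MS = 1000 / 144   # fastest the panel can refresh (144 Hz)
MAX_PERIOD_MS = 1000 / 30    # slowest before it must redraw the old frame

def vrr_scanout(frame_ready_ms):
    """Return the time each frame actually starts drawing."""
    starts = []
    next_allowed = 0.0
    for t in frame_ready_ms:
        # If no frame arrived within MAX_PERIOD_MS, a real panel would
        # redraw the previous frame; this sketch ignores that case.
        start = max(t, next_allowed)   # wait for the frame, honor min period
        starts.append(start)
        next_allowed = start + MIN_PERIOD_MS
    return starts

arrivals = [0.0, 9.5, 16.0, 31.0, 38.5]   # irregular game frame times (ms)
for ready, drawn in zip(arrivals, vrr_scanout(arrivals)):
    print(f"frame ready at {ready:5.1f} ms -> drawn at {drawn:5.1f} ms")
# No frame waits for a fixed refresh tick (no V-Sync queueing) and no frame
# replaces another mid-scanout (no tearing) -- the whole trick in one loop.
```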
 
Joined
Mar 10, 2014
Messages
1,207 (0.69/day)
Likes
329
#32
Never said input lag doesn't exist; I simply find it unbelievable that people can seriously claim to perceive differences on the order of 10-20 ms. As in 120 Hz synchronized vs. not. We are talking 8 ms or less, which is simply minuscule.
Well, look at TFT Central's reviews; they have quite a good explanation of input lag. In short, it is tied to the panel's refresh rate: the higher the refresh rate, the lower the input lag has to be. So a 60 Hz screen with about 16 ms of input lag is actually better off than a 120 Hz screen with 15 ms.
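Restating that comparison as arithmetic, with the exact numbers from the post (reading the TFT Central argument as "judge lag in units of the panel's own refresh period"):

```python
# The post's comparison, restated: judge measured input lag against the
# panel's own refresh period rather than as a raw millisecond number.
cases = [(60, 16.0), (120, 15.0)]    # (refresh rate Hz, measured lag ms)
for hz, lag_ms in cases:
    period_ms = 1000 / hz
    print(f"{hz:3d} Hz: {lag_ms:.0f} ms lag = "
          f"{lag_ms / period_ms:.2f} refresh periods")
# 60 Hz:  16 ms is under one 16.7 ms refresh -- essentially no added delay.
# 120 Hz: 15 ms is ~1.8 of its 8.3 ms refreshes -- nearly two full frames
# of processing lag on top of what the faster panel should deliver.
```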

There is no way that VRR needs a $2,000 FPGA; there is something wrong with this info. There has been plenty of time to spin a custom chip to do what's needed. VRR is not magic, it's a very basic concept: show the current frame until the next one arrives. That does not require loads of tricky transistors.
People seem to forget that it replaces the monitor's whole mainboard, so it's not an automatic +$xxx over a non-G-Sync monitor. And yeah, there's no chance in hell that NVIDIA buys those FPGAs at $2,000 each, either.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
15,718 (3.91/day)
Likes
9,321
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K at stock (hits 5 gees+ easily)
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (4 x 4GB Corsair Vengeance DDR3 PC3-12800 C9 1600MHz)
Video Card(s) Zotac GTX 1080 AMP! Extreme Edition
Storage Samsung 850 Pro 256GB | WD Green 4TB
Display(s) BenQ XL2720Z | Asus VG278HE (both 27", 144Hz, 3D Vision 2, 1080p)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair HX 850W v1
Software Windows 10 Pro 64-bit
#33
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.

One of my PCs is hooked up to my TV every now and then, and there's always a heartbeat between physically moving the mouse and the action taking place on the screen.

Head on over to www.rtings.com, read through some of their reviews, and look at the response time section. They are no-nonsense and will give you just the facts on the TVs.
If a TV's slow enough, you may even notice it when pressing buttons on the remote control, lol.
 
Joined
Sep 1, 2009
Messages
884 (0.26/day)
Likes
134
Location
Manteca, Ca
System Name Rebirth
Processor Intel i5 2500k @4.5Ghz
Motherboard Asus P8P67 Pro
Cooling Megahalem 120x25 x2 GT AP-15 Push/Pull
Memory 2x4Gb Corsair Vengeance
Video Card(s) Sapphire HD7950 Vapor-X + MSI HD7950 TF3
Storage Samsung 840 Pro 120 SSD + Seagate 7200.12 1TB + 500gig WD + 3TB Hitachi
Display(s) X-Star Glossy DP2710
Case Antec 1200
Audio Device(s) Asus Xonar STX
Power Supply Antec CP-850
Software Microsoft Windows 8 Pro x64
#34
NVIDIA will be forced to pick up the Adaptive-Sync standard and stop using an FPGA. HDMI 2.1 and consoles are now starting to use VRR. NVIDIA's answer is its big-format gaming displays, but who wants to buy an OLED for movies and an overpriced NVIDIA TV for games? Samsung has already begun shipping VRR in its TVs, and NVIDIA will feel the squeeze, because most people can't buy more than one TV.
 
#36
Never said input lag doesn't exist; I simply find it unbelievable that people can seriously claim to perceive differences on the order of 10-20 ms. As in 120 Hz synchronized vs. not. We are talking 8 ms or less, which is simply minuscule.
Of course they can. If you play with V-Sync at 60 FPS and you constantly get tons of lag, then adding another 10-20 ms will not be as noticeable. The point we're making here is that when you play at 30 ms and get used to it, adding 20 ms on top of that feels bad instantly. It's another example of how you don't account for perspective and always try to find some equivalence between cases that are completely different.
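The same point as back-of-the-envelope arithmetic, using the baselines from this post plus an assumed 100 ms case for contrast:

```python
# The same 20 ms means very different things relative to the baseline chain.
ADDED_MS = 20
for baseline_ms in (30, 60, 100):   # 100 ms is an assumed "laggy setup" case
    pct = ADDED_MS / baseline_ms * 100
    print(f"{baseline_ms:3d} ms baseline + {ADDED_MS} ms -> "
          f"+{pct:.0f}% total delay")
# +67% on a tight 30 ms chain is immediately obvious; +20% on an already
# laggy chain is much easier to shrug off -- both experiences can be real.
```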

I remember playing the COD WWII beta, using V-Sync at 60 FPS and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.
60 FPS with V-Sync felt great when I was running games on a 60 Hz display, but it has never felt the same since I got used to 90+ FPS with G-Sync. That doesn't mean 60 FPS is all you'll ever need.
 

Steeda

New Member
Joined
Jun 25, 2018
Messages
1 (0.01/day)
Likes
0
Location
Boise, Idaho
#37
This is pathetic; I can't believe people have to pay this kind of money for this. Every time I see something about G-Sync I want to throw up in my mouth. I would love to buy G-Sync, but at this pricing it is completely and utterly stupid to do so. I have had NVIDIA GPUs for quite some time because AMD cannot give me the performance I want in gaming. So I am stuck with a 35" ultrawide with no G-Sync, because the damn thing is too expensive for the technology.

FYI - just venting
 
#38
I agree NVIDIA is charging an arm and a leg for these, but you get ULMB plus adaptive sync in the NVIDIA Control Panel, and those who want great response with no blur know it is friggin' worth it. Given the choice between a monitor with G-Sync implemented like FreeSync for a lower price and a normal G-Sync one with ULMB and the whole 30-1xx/2xx Hz range guaranteed to work, I'd gladly pay the premium.
 
Joined
Jun 3, 2007
Messages
22,599 (5.37/day)
Likes
9,133
Location
'Merica. The Great SOUTH!
System Name TheMailbox 5.0 / The Mailbox 4.5
Processor RYZEN 1700X / Intel i7 2600k @ 4.2GHz
Motherboard Fatal1ty X370 Gaming K4 / Gigabyte Z77X-UP5 TH Intel LGA 1155
Cooling MasterLiquid PRO 280 / Scythe Katana 4
Memory ADATA RGB 16GB DDR4 2666 16-16-16-39 / G.SKILL Sniper Series 16GB DDR3 1866: 9-9-9-24
Video Card(s) MSI 1080 "Duke" with 8Gb of RAM. Boost Clock 1847 MHz / ASUS 780ti
Storage 256Gb M4 SSD / 128Gb Agelity 4 SSD , 500Gb WD (7200)
Display(s) LG 29" Class 21:9 UltraWide® IPS LED Monitor 2560 x 1080 / Dell 27"
Case Cooler Master MASTERBOX 5t / Cooler Master 922 HAF
Audio Device(s) Realtek ALC1220 Audio Codec / SupremeFX X-Fi with Bose Companion 2 speakers.
Power Supply Seasonic FOCUS Plus Series SSR-750PX 750W Platinum / SeaSonic X Series X650 Gold
Mouse SteelSeries Sensei (RAW) / Logitech G5
Keyboard Razer BlackWidow / Logitech (Unknown)
Software Windows 10 Pro (64-bit)
Benchmark Scores Benching is for bitches.
#39
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.

One of my PCs is hooked up to my TV every now and then, and there's always a heartbeat between physically moving the mouse and the action taking place on the screen.

Head on over to www.rtings.com, read through some of their reviews, and look at the response time section. They are no-nonsense and will give you just the facts on the TVs.
This is true. The Xbox just upped its refresh rate to 120 Hz in the last update to address this.

https://news.xbox.com/en-us/2018/04/20/may-xbox-update/
 
Joined
Jan 24, 2008
Messages
807 (0.20/day)
Likes
191
Location
Belgium ⇒ Limburg
System Name Meshify C Ryzen 2018
Processor AMD Ryzen 2700x @ ( 8 x 4.15Ghz )
Motherboard X470 AORUS ULTRA GAMING
Cooling AMD Wraith Prism LED Cooler
Memory 32GB DDR4 ( F4-3200C16D-32GTZKW, 16-16-16-36 @ 3000Mhz )
Video Card(s) Gigabyte RX Vega 56 ( with Vega 64 bios )
Storage Samsung Evo 960
Display(s) EIZO FS2434-BK
Case Fractal Design Meshify C – Dark TG
Audio Device(s) Sennheiser GSP 300 ( Headset )
Power Supply Seasonic FOCUS Plus Series 650W
Mouse Logitech G502
Keyboard Logitech G 15
Software Windows 10 Pro 64bit
#40
FreeSync and G-Sync are very welcome technologies. It is completely up to the individual whether he notices the advantages or not. These adaptive synchronization technologies can eliminate microstutter and tearing, with none of the added input lag that regular vertical synchronization induces, at any supported refresh rate defined by the monitor at hand.

My personal opinion is that tearing is still noticeable at refresh rates above 120 Hz, but there it doesn't bother me, unlike at 60 Hz, where it certainly does. I can definitely enjoy games with tearing that subtle. However, depending on the game being played, microstutter can occur for various reasons, which adaptive sync is often able to reduce or completely eliminate.
You think NVIDIA cares about that? If they can keep this up for a couple of years, they'll have earned millions. So what if the technology doesn't survive?
 
Joined
Mar 18, 2015
Messages
1,238 (0.91/day)
Likes
718
Location
Long Island
#41
Two pages of posts about G-Sync technology without a mention of the hardware module's MBR function?

G-Sync - Oddly, the term "G-Sync" is still used even when the user turns G-Sync off. A "G-Sync" monitor can be used two ways, and using adaptive sync to match frame rates is only one of them. The sync technology has its most noticeable impact from 30 to 75-ish FPS, at which point users will often choose to switch to ULMB, which requires G-Sync to be disabled. The NVIDIA hardware module provides the backlight strobing required for motion blur reduction (MBR).

Freesync - FreeSync provides a similar adaptive sync technology and again has its most significant impact from 40 to 75-ish FPS. Like G-Sync, it continues to have an impact above 75 FPS, but in both cases it trails off quickly. But here's the kicker: FreeSync has no hardware module and is therefore incapable of providing motion blur reduction (backlight strobing), which virtually eliminates ghosting. Some monitor manufacturers have provided such technology on their own ... the problem is there's a myriad of designs, and results vary. And when it's done well, the cost of the necessary MBR hardware erases FreeSync's cost advantage.
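A rough sketch of why the strobing hardware matters, using the common persistence-blur approximation (blur in pixels equals eye-tracking speed times image persistence); the pulse width and tracking speed below are assumed for illustration, not measurements of any monitor:

```python
# Standard persistence-blur approximation: the eye tracks a moving object,
# so a frame held on screen for `persistence` seconds smears across
#   blur_px = speed_px_per_s * persistence_s
# Sample-and-hold persistence is roughly the full refresh period; strobing
# cuts it to the backlight pulse width. Numbers below are assumed.

speed_px_per_s = 960            # object tracked across the screen
refresh_hz = 120

sample_and_hold_s = 1 / refresh_hz   # ~8.3 ms held image
strobe_pulse_s = 0.002               # assumed 2 ms ULMB backlight pulse

for label, persistence_s in (("sample-and-hold", sample_and_hold_s),
                             ("backlight strobing", strobe_pulse_s)):
    print(f"{label}: ~{speed_px_per_s * persistence_s:.1f} px of smear")
# ~8 px of smear drops to ~2 px -- the "virtually eliminates ghosting"
# effect described above, traded against backlight brightness.
```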

If you are running a top-tier NVIDIA card on an Acer XB271HU or Asus PG279Q and haven't tried disabling G-Sync in favor of ULMB, you're missing out. When the new 4K 144 Hz versions of those monitors drop, I'd expect users to bounce between the two settings depending on frame rate.
 
Joined
Feb 13, 2009
Messages
291 (0.08/day)
Likes
116
Processor Intel Core i7 8700K
Motherboard ROG STRIX Z370-G GAMING AC
Cooling Corsair H115i Pro RGB
Memory G.Skill Trident Z RGB 16GB DDR4 3200Mhz
Video Card(s) Gigabyte GTX 1070 G1
Storage Samsung 970 Evo 500GB
Display(s) Dell S2417DG 165Hz
Case Corsair Obsidian 350D
Power Supply Corsair AX760
Mouse Razer Deathadder Chroma
Keyboard Cooler Master - Masterkeys Pro L RGB
Software Windows 10 Pro 64Bit
#42
I love my G-Sync monitor. My second monitor is only 60 Hz, while my G-Sync monitor runs at 165 Hz. The difference in smoothness, without screen tearing, is immense. Even desktop usage is so much better.

I use G-Sync mode; I have tried ULMB mode, but meh, I still prefer G-Sync.

Don't knock it till you try it.
 
Joined
Nov 4, 2005
Messages
10,145 (2.12/day)
Likes
2,521
System Name MoFo 2
Processor AMD PhenomII 1100T @ 4.2Ghz
Motherboard Asus Crosshair IV
Cooling Swiftec 655 pump, Apogee GT,, MCR360mm Rad, 1/2 loop.
Memory 8GB DDR3-2133 @ 1900 8.9.9.24 1T
Video Card(s) HD7970 1250/1750
Storage Agility 3 SSD 6TB RAID 0 on RAID Card
Display(s) 46" 1080P Toshiba LCD
Case Rosewill R6A34-BK modded (thanks to MKmods)
Audio Device(s) ATI HDMI
Power Supply 750W PC Power & Cooling modded (thanks to MKmods)
Software A lot.
Benchmark Scores Its fast. Enough.
#43
I'd love to know what G-SYNC does so much better than a FreeSync monitor, to achieve a similar result, that it takes so much more engineering and money. I can only think that the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC vs. free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
It's running security checks to ensure only NVIDIA cards are connected; that requires a VM, which needs RAM and a fast processor.......
 
Joined
Nov 29, 2016
Messages
496 (0.67/day)
Likes
179
System Name Unimatrix
Processor Intel Xeon X5675 @ 4.2GHz
Motherboard Asus P6T6 WS Revolution
Cooling Enermax AIO
Memory 12GB Corsair Dominator DDR3 @ 1600
Video Card(s) EVGA 2080 XC
Storage MyDigitalSSD BPX 512GB M.2 SSD, WD Black 2TB
Display(s) Alienware 34" Ultrawide 3440x1440
Case Corsair
Power Supply Enermax Revolution 85+ 850W
Keyboard Corsair K75
Benchmark Scores Really High
#44
I'd love to know what G-SYNC does so much better than a FreeSync monitor, to achieve a similar result, that it takes so much more engineering and money. I can only think that the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC vs. free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
In order to get G-Sync certification, minimum technical specifications and features must be present and certified to work correctly. FreeSync is a free-for-all.

https://www.rtings.com/monitor/guide/freesync-amd-vs-gsync-nvidia

It's running security checks to ensure only NVIDIA cards are connected; that requires a VM, which needs RAM and a fast processor.......
Stop lying. You can use G-Sync monitors with AMD cards.
 
Joined
May 22, 2015
Messages
4,777 (3.67/day)
Likes
1,971
Processor Intel i5-6600k
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x8GB DDR4 2400 G.Skill
Video Card(s) EVGA GTX 1060 SC
Storage 128 and 256GB OCZ Vertex4, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Chieftec BX01
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
#46
I'd love to know what G-SYNC does so much better than a FreeSync monitor, to achieve a similar result, that it takes so much more engineering and money. I can only think that the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC vs. free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
First-gen G-Sync did ULMB and didn't have that annoying FPS restriction that came with FreeSync.

But $500 for a G-Sync 2 module is actually great news. It means the death of G-Sync is that much closer, so we can settle on one standard, like, you know, sane people.
 

qubit

#47
First-gen G-Sync did ULMB and didn't have that annoying FPS restriction that came with FreeSync.

But $500 for a G-Sync 2 module is actually great news. It means the death of G-Sync is that much closer, so we can settle on one standard, like, you know, sane people.
Sane people. Hmmmm....
 
Joined
Feb 18, 2017
Messages
255 (0.39/day)
Likes
163
#49
I'd love to know what G-SYNC does so much better than a FreeSync monitor, to achieve a similar result, that it takes so much more engineering and money. I can only think that the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC vs. free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
Just check how many FreeSync and how many G-Sync monitors are on the market. Even the second-biggest TV manufacturer enabled FreeSync support on some of its 2018 models. People can taunt AMD, but getting a much cheaper monitor with the same specifications is good for everyone.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
23,484 (6.33/day)
Likes
12,529
Location
IA, USA
System Name BY-2015
Processor Intel Core i7-6700K (4 x 4.00 GHz) w/ HT and Turbo on
Motherboard MSI Z170A GAMING M7
Cooling Scythe Kotetsu
Memory 2 x Kingston HyperX DDR4-2133 8 GiB
Video Card(s) PowerColor PCS+ 390 8 GiB DVI + HDMI
Storage Crucial MX300 275 GB, Seagate 6 TB 7200 RPM
Display(s) Samsung SyncMaster T240 24" LCD (1920x1200 HDMI) + Samsung SyncMaster 906BW 19" LCD (1440x900 DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay
Audio Device(s) Realtek Onboard, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse SteelSeries Sensei RAW
Keyboard Tesoro Excalibur
Software Windows 10 Pro 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
#50
I laugh my ass off every time I see this. The average human reaction time is something like 250 ms; whoever seriously thinks that a time frame of 16 ms or less can make a perceivable difference is being delusional.

I remember playing the COD WWII beta, using V-Sync at 60 FPS and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.
It's persistence of vision, or lack thereof. Reaction time isn't just the visual component; it's understanding what you're looking at, then the brain telling the muscles to move, and then the muscles actually moving. Persistence of vision being just *slightly* off can cause motion sickness or dizziness.

In the end, the expensive G-SYNC vs. free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
NVIDIA would rather sell you a bridge to nowhere than put driver resources and certification testing into adopting adaptive sync.
 