Monday, June 25th 2018

NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

PCPer had the opportunity to disassemble the ASUS ROG Swift PG27UQ, a 27" 4K 144 Hz G-Sync HDR monitor, and found that the G-Sync module is a newer version than the one used on 1st-generation G-Sync monitors (which of course do not support 4K / 144 Hz / HDR). The module is powered by an FPGA made by Altera (Intel-owned since 2015). The exact model number is Arria 10 GX 480, a high-performance 20-nanometer part that provides enough bandwidth and LVDS pins to process the data stream.

The FPGA is sold in low quantities for $2,000 at Digi-Key and Mouser. Assuming that NVIDIA buys thousands, PCPer estimates that this chip alone adds around $500 to monitor cost. The BOM cost is further increased by 3 GB of DDR4 memory on the module. Combined with licensing fees for G-SYNC, this explains why these monitors are so expensive.
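For scale, a back-of-the-envelope estimate (not taken from the PCPer review; raw figures only, ignoring blanking intervals and link coding) of the data stream such a module has to handle:

```python
# Rough bandwidth estimate for a 4K / 144 Hz / 10-bit HDR video stream.

def stream_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw (unencoded, no blanking) video bandwidth in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 10-bit HDR RGB = 30 bits per pixel
print(f"{stream_gbps(3840, 2160, 144, 30):.1f} Gbit/s")  # ~35.8 Gbit/s
```

That exceeds DisplayPort 1.4's roughly 25.9 Gbit/s of payload bandwidth, which is why these monitors fall back to 4:2:2 chroma subsampling at 144 Hz.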
Sources: PCPer Review, Altera Product Page

89 Comments on NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

#1
lynx29
Gasaraki said:
Yeah, people have no fuckin idea what G-Sync and Freesync does. Running a monitor at 120Hz+ without VRR technology doesn't make tearing go away, you need to sync up the frames between the video card and monitor.
i have never seen tearing on my gsync monitor once, i use rivatuner to cap frame 4 fps below max. so gsync never turns off.
Posted on Reply
#2
Slizzo
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.

One of my PCs is hooked up to my TV every now and then, and there's always a heartbeat between moving the mouse physically and the action taking place on the screen.

Head on over to www.rtings.com and read through some of their reviews and look at the response time section. They are no nonsense and will give you just the facts on the TVs.
Posted on Reply
#3
Vya Domus
Slizzo said:
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.
Never said input lag doesn't exist, I simply find it unbelievable people can seriously claim they can perceive differences on the order of 10-20 ms. As in 120 Hz synchronized vs not. We are talking 8 ms or less; that's simply minuscule.
Posted on Reply
#4
SoTOP
Vya Domus said:
Never said input lag doesn't exist, I simply find it unbelievable people can seriously claim they can perceive differences on the order of 10-20 ms. As in 120 Hz synchronized vs not. We are talking 8 ms or less; that's simply minuscule.
It doesn't matter what you believe. http://www.100fps.com/how_many_frames_can_humans_see.htm Tests with Air Force pilots have shown that they could identify a plane in a picture that was flashed for only 1/220th of a second.
Posted on Reply
#5
xorbe
There is no way that VRR needs a $2000 fpga, there is something wrong with this info. There has been plenty of time to spin a custom chip to do what's needed. VRR is not magic, it's a very basic concept. Show the current frame until the next one arrives. That does not require loads of tricky transistors.
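The concept described here can be sketched as a toy model (hypothetical code, greatly simplified; real scalers also handle overdrive retuning and low-framerate compensation):

```python
def scanout_times(frame_times, min_hz=30, max_hz=144):
    """Toy VRR scheduler: scan out each frame as soon as it is ready,
    but never faster than max_hz allows, and repeat the previous frame
    whenever the panel's min_hz deadline expires before a new one arrives."""
    lo, hi = 1.0 / max_hz, 1.0 / min_hz   # refresh window in seconds
    scans, last = [], 0.0
    for t in frame_times:
        start = max(t, last + lo)         # respect the panel's max refresh
        while start - last > hi:          # frame is late: repeat old frame
            last += hi
            scans.append(last)
        scans.append(start)
        last = start
    return scans
```

For frames finishing at 10 ms and 30 ms the panel refreshes at exactly those moments; a frame that misses the 1/30 s deadline forces one repeat of the previous frame first.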
Posted on Reply
#6
jabbadap
Vya Domus said:
Never said input lag doesn't exist, I simply find it unbelievable people can seriously claim they can perceive differences on the order of 10-20 ms. As in 120 Hz synchronized vs not. We are talking 8 ms or less; that's simply minuscule.
Well, look at TFT Central's reviews; they have quite a good explanation of input lag. In short, it is tied to panel refresh rate: the higher the refresh rate, the lower the input lag has to be. So a 60 Hz screen with about 16 ms of input lag is better than a 120 Hz screen with 15 ms of input lag.
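As a rough illustration of that point (an assumed breakdown for the sake of the numbers, not TFT Central's exact methodology), a measured lag figure can be read as processing delay plus the average half-refresh wait for the next scanout:

```python
def processing_lag_ms(measured_lag_ms, refresh_hz):
    """Estimate the electronics' own processing delay by subtracting the
    average half-refresh wait from a total measured input lag."""
    return measured_lag_ms - (1000.0 / refresh_hz) / 2

# 60 Hz screen measured at 16 ms vs 120 Hz screen measured at 15 ms:
print(round(processing_lag_ms(16, 60), 1))   # 7.7 ms of processing
print(round(processing_lag_ms(15, 120), 1))  # 10.8 ms of processing
```

On these assumed numbers, the 60 Hz monitor's electronics are actually the faster of the two.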

xorbe said:
There is no way that VRR needs a $2000 fpga, there is something wrong with this info. There has been plenty of time to spin a custom chip to do what's needed. VRR is not magic, it's a very basic concept. Show the current frame until the next one arrives. That does not require loads of tricky transistors.
People seem to forget that it replaces the whole mainboard of the monitor, so it's not an automatic +$xxx over a non-G-Sync monitor. And yeah, there's no chance in hell that NVIDIA buys those FPGAs at $2000 each either.
Posted on Reply
#7
qubit
Overclocked quantum bit
Slizzo said:
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.

One of my PCs is hooked up to my TV every now and then, and there's always a heartbeat between moving the mouse physically and the action taking place on the screen.

Head on over to www.rtings.com and read through some of their reviews and look at the response time section. They are no nonsense and will give you just the facts on the TVs.
If a TV's slow enough, you may even notice it when pressing buttons on the remote control lol.
Posted on Reply
#8
dir_d
NVIDIA will be forced to pick up the Adaptive-Sync standard and stop using an FPGA. HDMI 2.1 and consoles are now starting to use VRR. NVIDIA responds with their big gaming displays to play on, but who wants to buy an OLED for movies and an overpriced NVIDIA TV for games? Samsung has already begun shipping VRR in their TVs, and NVIDIA will feel the squeeze because most people can't buy more than one TV.
Posted on Reply
#9
jabbadap
qubit said:
If a TVs slow enough, you may even notice it on pressing the buttons on the remote control lol.
Most TVs have a gaming mode (for consoles) though, which might help with input lag.
Posted on Reply
#10
cucker tarlson
Vya Domus said:
Never said input lag doesn't exist, I simply find it unbelievable people can seriously claim they can perceive differences on the order of 10-20 ms. As in 120 Hz synchronized vs not. We are talking 8 ms or less; that's simply minuscule.
Of course they can. If you play at v-sync 60 fps and get tons of lag constantly, then adding another 10-20 ms will not be as noticeable. The point we're making here is that when you try playing at 30 ms and get used to it, adding 20 ms on top of that is gonna feel bad instantly. Another example of how you just don't understand perspective and always try to find some equivalency between cases that are completely different.
I remember playing the COD WWII beta, using v-sync at 60 fps and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.
60 fps v-sync felt great when I was running games on a 60 Hz display, but it never feels the same since I got used to 90+ fps with G-Sync. Doesn't mean 60 fps is all you ever need.
Posted on Reply
#11
Steeda
This is pathetic, I can't believe people have to pay this kind of money for this. Every time I see something about G-Sync I want to throw up in my mouth. I would love to buy G-Sync, but at this pricing it is completely and utterly stupid to do so. I have had NVIDIA GPUs for quite some time b/c AMD cannot give me the performance I want in gaming. So I am stuck with a 35" ultrawide with no G-Sync b/c the damn thing is too expensive for the technology.

FYI - just venting
Posted on Reply
#12
cucker tarlson
I agree NVIDIA is charging an arm and a leg for those, but you get ULMB + adaptive sync in NVCP; those who want great response with no blur know it is friggin worth it. If I could choose between one with G-Sync implemented like FreeSync for a lower price and a normal G-Sync one with ULMB and the whole 30-1xx/2xx Hz range guaranteed to work, I'd gladly pay the premium.
Posted on Reply
#13
TheMailMan78
Big Member
Slizzo said:
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.

One of my PCs is hooked up to my TV every now and then, and there's always a heartbeat between moving the mouse physically and the action taking place on the screen.

Head on over to www.rtings.com and read through some of their reviews and look at the response time section. They are no nonsense and will give you just the facts on the TVs.
This is true. The Xbox just upped its refresh rate to 120 Hz in the last update to address this.

https://news.xbox.com/en-us/2018/04/20/may-xbox-update/
Posted on Reply
#14
Jelle Mees
GamerNerves said:
Freesync and G-Sync are very welcome technologies. It is completely up to the individual person whether he notices the advantages or not. These adaptive synchronization technologies can eliminate microstutter and tearing with no added input lag, which regular vertical synchronization will induce, at any supported refresh rate defined by the monitor at hand.

My personal opinion is that at refresh rates above 120 Hz tearing is still noticeable, but it doesn't bother me, unlike at 60 Hz when it certainly does. I can definitely enjoy games with such barely noticeable tearing. However, depending on the game being played, microstutter can occur for various reasons, which adaptive sync is often able to reduce or completely eliminate.
You think NVIDIA cares about that? If they can keep this up for a couple of years, they'll have earned millions. So what if the technology doesn't survive?
Posted on Reply
#15
John Naylor
Two pages of posts about G-Sync technology without a mention of the hardware module's MBR function?

G-Sync - The term "G-Sync" oddly is still used even when the user turns G-Sync off. A "G-Sync" monitor can be used two ways; using adaptive sync to sync frame rates is only one of them. The sync technology provided has its most noticeable impact from 30 to 75-ish fps, beyond which users will often choose to switch to ULMB, the usage of which requires G-Sync to be disabled. The NVIDIA hardware module provides the backlight strobing required for motion blur reduction (MBR).

Freesync - FreeSync provides a similar adaptive sync technology and again has its most significant impact from 40 to 75-ish fps. Like G-Sync it continues to have an impact above 75 fps, but in both instances it trails off quickly. But here's the kicker: FreeSync has no hardware module and is therefore incapable of providing motion blur reduction technology (backlight strobing), which virtually eliminates ghosting. Some monitor manufacturers have provided such technology on their own... the problem is there's a myriad of designs and results vary. And when it's done well, the cost of providing the necessary MBR hardware erases FreeSync's cost advantage.

If you are running a top-tier NVIDIA card on an Acer XB271HU or Asus PG279Q and never disable G-Sync to use ULMB instead, you're missing out. When the new 4K 144 Hz versions of those monitors drop, I'd expect users will bounce between the two settings depending on frame rates.
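The blur-reduction effect of backlight strobing described above can be sketched with a simple persistence calculation (illustrative numbers; 960 px/s is the speed commonly used in pursuit-camera motion tests):

```python
def blur_px(persistence_ms, speed_px_per_s):
    """Perceived smear, in pixels, for an eye tracking an object moving at
    speed_px_per_s while each frame stays lit for persistence_ms."""
    return speed_px_per_s * persistence_ms / 1000.0

refresh_hz = 120
sample_and_hold_ms = 1000.0 / refresh_hz   # full-persistence panel: ~8.3 ms
strobe_ms = 1.0                            # brief backlight flash per refresh

print(blur_px(sample_and_hold_ms, 960))    # 8.0 px of smear
print(blur_px(strobe_ms, 960))             # ~1 px of smear
```

Shortening how long each frame stays lit, rather than raising the frame rate, is what makes strobing so effective against ghosting and smear.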
Posted on Reply
#16
Dimi
I love my G-Sync monitor. My second monitor is only 60hz while my G-Sync monitor is at 165hz. The difference in smoothness without screen tearing is immense. Even desktop usage is so much better.

I use G-Sync mode; I have tried ULMB mode, but meh, I still prefer G-Sync mode.

Don't knock it till you try it.
Posted on Reply
#17
Steevo
qubit said:
I'd love to know what G-SYNC does so much better over a FreeSync monitor to achieve a similar result that takes so much more engineering and money. I can only think that the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC v the free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
It's running security checks to ensure only Nvidia cards are connected, that requires a VM, which needs RAM, and a fast processor.......
Posted on Reply
#18
Gasaraki
qubit said:
I'd love to know what G-SYNC does so much better over a FreeSync monitor to achieve a similar result that takes so much more engineering and money. I can only think that the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC v the free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
In order to get the G-Sync certification, minimum technical specifications and features must be available and certified to work correctly. Freesync is free for all.

https://www.rtings.com/monitor/guide/freesync-amd-vs-gsync-nvidia

Steevo said:
It's running security checks to ensure only Nvidia cards are connected, that requires a VM, which needs RAM, and a fast processor.......
Stop lying. You can use G-Sync monitors with AMD cards.
Posted on Reply
#19
cucker tarlson
Gasaraki said:
In order to get the G-Sync certification, minimum technical specifications and features must be available and certified to work correctly. Freesync is free for all.

https://www.rtings.com/monitor/guide/freesync-amd-vs-gsync-nvidia



Stop lying. You can use G-Sync monitors with AMD cards.
Yes, but all you get is Fast Sync, or whatever it is called on AMD. Enhanced Sync, I think.
Posted on Reply
#20
bug
qubit said:
I'd love to know what G-SYNC does so much better over a FreeSync monitor to achieve a similar result that takes so much more engineering and money. I can only think that the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC v the free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
First-gen GSync did ULMB and didn't have that annoying FPS restriction that came with FreeSync.

But $500 for a GSync2 module is actually great news. It means the death of GSync is that much closer, so we can settle on one standard, like, you know, sane people.
Posted on Reply
#21
qubit
Overclocked quantum bit
bug said:
First-gen GSync did ULMB and didn't have that annoying FPS restriction that came with FreeSync.

But $500 for a GSync2 module is actually great news. It means the death of GSync is that much closer, so we can settle on one standard, like, you know, sane people.
Sane people. Hmmmm....
Posted on Reply
#22
bug
qubit said:
Sane people. Hmmmm....
Like sane people. Big difference ;)
Posted on Reply
#23
B-Real
qubit said:
I'd love to know what G-SYNC does so much better over a FreeSync monitor to achieve a similar result that takes so much more engineering and money. I can only think that the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC v the free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
Just check how many FreeSync and how many G-Sync monitors are on the market. Even the second-biggest TV manufacturer added FreeSync support to some of their 2018 models. People can taunt AMD, but getting a much cheaper monitor with the same specifications is good for everyone.
Posted on Reply
#24
FordGT90Concept
"I go fast!1!11!1!"
Vya Domus said:
I laugh my ass off every time I see this. The average human reaction time is something like 250 ms; whoever seriously thinks that a time frame of 16 ms or less can make a perceivable difference is being delusional.

I remember playing the COD WWII beta, using v-sync at 60 fps and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.
It's persistence of vision, or lack thereof. Reaction time isn't just the visual component; it's understanding what you're looking at, then the brain telling the muscles to move, and then the muscles actually moving. Persistence of vision being just *slightly* off can cause motion sickness or dizziness.

qubit said:
In the end, the expensive G-SYNC v the free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
NVIDIA would rather sell you a bridge to nowhere than put driver resources and certification testing into adopting adaptive sync.
Posted on Reply
#25
bug
FordGT90Concept said:
It's persistence of vision, or lack thereof. Reaction time isn't just the visual component; it's understanding what you're looking at, then the brain telling the muscles to move, and then the muscles actually moving. Persistence of vision being just *slightly* off can cause motion sickness or dizziness.
Slightly off from what? It's not like you're also seeing a reference frame while gaming.
I don't often agree with Vya, but making a big fuss over a 16 ms delay when your reaction time is 15x that is a little out there.
The only time that would make a difference is if your mouse movement lagged perceivably behind. But you can test that on the desktop: it doesn't. The only reason these things are advertised is that professional monitors value color accuracy above everything else, so they don't tend to use overdrive and can have latencies in the 30-50 ms range (or worse). That could cause the mouse to lag a little behind fast movements.
Posted on Reply