Monday, June 25th 2018

NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

PCPer had the opportunity to disassemble the ASUS ROG Swift PG27UQ, a 27-inch 4K 144 Hz G-Sync HDR monitor, and found that the G-Sync module is a newer version than the one used in first-generation G-Sync monitors (which, of course, do not support 4K / 144 Hz / HDR). The module is powered by an FPGA made by Altera (Intel-owned since 2015). The exact model number is Arria 10 GX 480, a high-performance 20-nanometer SoC that provides enough bandwidth and LVDS pins to process the data stream.

The FPGA sells for about $2,000 in low quantities at Digi-Key and Mouser. Even assuming NVIDIA buys in the thousands, PCPer estimates that this chip alone adds roughly $500 to the monitor's cost. The BOM cost is further increased by the 3 GB of DDR4 memory on the module. Add licensing fees for G-SYNC, and it becomes clear why these monitors are so expensive.
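As a back-of-the-envelope check on PCPer's figure, the module math can be sketched as below. Only the ~$2,000 low-quantity FPGA price comes from the article; the volume discount, DRAM cost, and licensing fee are illustrative assumptions, not reported numbers.

```python
# Rough estimate of the G-Sync HDR module's contribution to monitor cost.
# Assumed figures are placeholders chosen to match PCPer's ~$500 estimate.
fpga_unit_price = 2000.0   # Digi-Key/Mouser low-quantity price (USD, from article)
volume_discount = 0.75     # assumed discount at NVIDIA's volumes
dram_cost = 25.0           # assumed cost of the module's 3 GB DDR4 (USD)
licensing_fee = 50.0       # assumed per-unit G-SYNC licensing (USD)

fpga_volume_price = fpga_unit_price * (1 - volume_discount)  # ~$500
module_cost = fpga_volume_price + dram_cost + licensing_fee
print(f"FPGA at volume: ${fpga_volume_price:.0f}, "
      f"estimated module BOM impact: ${module_cost:.0f}")
```

Even with a steep assumed discount, the FPGA alone dominates the module's bill of materials.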
Sources: PCPer Review, Altera Product Page

89 Comments on NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

#2
Fluffmeister
Is it true HDR? And is it ironic that FreeSync is a trademark?

The reality is a lot of this display nonsense is expensive and it will get cheaper over time, I just can't get invested in the salty tears of the dodgy supply chain.
#3
Xzibit
I think Intel is missing an opportunity here. Intel owns Altera

All G-Sync monitors should have
#4
Fluffmeister
Overpriced Intel Altera tech and DRAM companies price-fixing. We spend years waiting for the competition to turn up, then suddenly some get into bed together with some weird Kaby G VegaPolaris M Frankenstein lark.

We all have Intel inside, God bless them.
#5
Vya Domus
FordGT90Concept said:
Persistance of vision just being *slightly* off can cause motion sickness or dizziness.
And single-digit ms variations can cause that? We are talking a few thousandths of a second. I was unable to find any study or piece of information that confirms humans can spot variations that small and respond to them in a consistent way. Not to mention humans can't even maintain a consistent reaction time in the first place; it always varies by 20, 30, 50 ms. You're telling me something like 16 ms can have a measurable effect?
#6
qubit
Overclocked quantum bit
B-Real said:
Just check how many Freesync and how many G-Sync monitors are on the market. Even the second biggest TV manufacturer allowed Freesync support on some of their 2018 models. People can taunt AMD but getting a much cheaper monitor with the same specifications is good for everyone.
Yeah, that's what I'm saying, FreeSync is becoming the de facto standard. NVIDIA better wake up and smell the roses before it's too late.
#7
TheGuruStud
las said:
Gsync and Freesync are a joke, unless you play in the 30-60 fps range. Tearing is not an issue at 120+ fps using 120+ Hz.

No serious gamer should use VSYNC. Adds input lag.

I have Gsync. I use ULMB instead. Way better. Any gaming LCD should use black frame insertion. Much less blur in fast paced games.
I used to play exclusively at 120 fps/120 hz on a huge CRT back in the day and even then I needed vsync. Get a clue, tearing is monumentally distracting and drives me nuts at any rate. You must be blind. Adaptive sync is awesome and the future for all gaming. Go get some more milk from mommy.
#8
Jism
Vya Domus said:
I laugh my ass off every time I see this. The average human reaction time is something like 250 ms; whoever seriously thinks that a time frame of 16 ms or less can make a perceivable difference is being delusional.

I remember playing the COD WWII beta using v-sync at 60 fps and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.
Yeah, and why do you think there is a professional gaming market playing on systems at 144 FPS?

I can feel the difference between Vsync on (capped at 72 Hz), 72 Hz without Vsync, and 110 FPS produced by my GPU. The difference is that I can literally see the input lag when sweeping left to right in PUBG, for example. Your GPU is being constrained to your refresh rate, and this constraint adds input lag. Maybe you or your neighbour doesn't notice it, but anyone playing an FPS game will notice and feel the difference.

Play long enough and you'll understand that at some point 60 Hz / 60 FPS becomes a limitation in any fast-paced FPS game.
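For reference, the frame times behind the numbers being argued over in this thread work out as below. The "worst-case wait" is a simplified model (a frame that misses its refresh window waits up to one full refresh interval), not a full account of v-sync pipeline latency.

```python
# Frame times at common refresh rates, plus a simplified worst-case
# figure for the extra wait v-sync can add when a frame misses the
# refresh window (up to one full refresh interval).
for hz in (60, 72, 120, 144):
    frame_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> {frame_ms:5.2f} ms per frame "
          f"(v-sync worst-case added wait: up to {frame_ms:.2f} ms)")
```

This is where the oft-quoted 16 ms figure comes from: it is simply the frame interval at 60 Hz, while 144 Hz cuts it to under 7 ms.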
#9
Prima.Vera
TheLostSwede said:
Yet G-sync feels like an overrated technology on the whole. Not overly impressed by my own screen, just glad I didn't pay the full price for it.
The fact they're using an FPGA suggests Nvidia doesn't expect to sell the kind of volume of these screens where a custom ASIC would make sense from a cost perspective, which further shows how overrated G-sync is.
I completely agree. I also have a G-Spot monitor, but to be honest, playing at 1440p with a GTX 1080 I cannot go past 100 fps anyway, and it always averages between 40-80 fps. So I never get any tearing, whether those x-SYNCs are enabled or not... Plus fluidity... hmmm
#10
FordGT90Concept
"I go fast!1!11!1!"
Vya Domus said:
And single-digit ms variations can cause that? We are talking a few thousandths of a second. I was unable to find any study or piece of information that confirms humans can spot variations that small and respond to them in a consistent way. Not to mention humans can't even maintain a consistent reaction time in the first place; it always varies by 20, 30, 50 ms. You're telling me something like 16 ms can have a measurable effect?
https://en.wikipedia.org/wiki/Subliminal_stimuli

Just because the higher orders of the brain ignore something doesn't mean it didn't register with lower orders of brain function.

Higher refresh rates and more frames translate to a smoother, clearer picture. Example: get in a vehicle and drive the speed limit, then focus on the grass just beyond the road out the passenger window. Your brain will take all of that data, force your eyes to hook on to a specific reference point, and take a snapshot of it. In that instant, you'll have clarity. Try to do the same thing with video and the clarity simply isn't there. There are huge gaps in the data between frames. Your brain will end up hooking on to a single frame and recalling that picture.

Again, this has nothing to do with reaction time and everything to do with how the brain handles eyesight:
https://en.wikipedia.org/wiki/Persistence_of_vision
#11
cucker tarlson
B-Real said:
Just check how many Freesync and how many G-Sync monitors are on the market. Even the second biggest TV manufacturer allowed Freesync support on some of their 2018 models. People can taunt AMD but getting a much cheaper monitor with the same specifications is good for everyone.
Freesync equivalents lack strobing, which is probably the best technology for gaming ATM as long as you can keep a constant, high fps.
Vya Domus said:
And single-digit ms variations can cause that? We are talking a few thousandths of a second. I was unable to find any study or piece of information that confirms humans can spot variations that small and respond to them in a consistent way. Not to mention humans can't even maintain a consistent reaction time in the first place; it always varies by 20, 30, 50 ms. You're telling me something like 16 ms can have a measurable effect?
You're confusing reaction time with what one can observe. Still confused. And why do you need a study? There are a million people who can tell you this, and better yet, you can get a high-refresh display yourself instead of relying on some digits taken out of context. What "human reaction" varies by 50 ms? Certainly not players' muscle memory when they play shooters. You're taking stuff out of context again.
#13
cucker tarlson
FordGT90Concept said:


"Surprisingly, there’s no overdrive setting in the OSD menu. Instead, the ‘Response Time’ setting offers Standard, Faster, and Fastest options. Both Faster and Fastest modes enable backlight strobing which delivers the specified 1ms response time speed."

G-SYNC vs FreeSync is like SLI versus Crossfire. The former is hardware, the latter is software.
Ah, good find.
A ‘1ms MPRT’ (Moving Picture Response Time) is specified, achieved using the ‘Impulsive Scanning’ strobe backlight mode.
Though I wonder how dark a VA panel gets in strobing mode.
#14
las
TheGuruStud said:
I used to play exclusively at 120 fps/120 hz on a huge CRT back in the day and even then I needed vsync. Get a clue, tearing is monumentally distracting and drives me nuts at any rate. You must be blind. Adaptive sync is awesome and the future for all gaming. Go get some more milk from mommy.
You used VSYNC? Hahaha... You like input lag? Casual gamer detected. Case closed.
#16
TheGuruStud
las said:
You used VSYNC? Hahaha... You like input lag? Casual gamer detected. Case closed.
Ban this retard, please. He knows nothing. He's never even used a CRT nor even knows the amount of input lag incurred from anything. And thinks that you can actually see anything while gaming with tearing. Mommy should have aborted this mouth breather.
#17
las
TheGuruStud said:
Ban this retard, please. He knows nothing. He's never even used a CRT nor even knows the amount of input lag incurred from anything. And thinks that you can actually see anything while gaming with tearing. Mommy should have aborted this mouth breather.
Read and learn - https://www.blurbusters.com/faq/motion-blur-reduction/

Professional gamers don't use Gsync or Freesync. Why do you think?

Btw, how old are you? Talking about CRTs yet acting like a teen :laugh:
#18
TheGuruStud
las said:
Read and learn - https://www.blurbusters.com/faq/motion-blur-reduction/

Professional gamers don't use Gsync or Freesync. Why do you think?

Btw, how old are you? Talking about CRTs yet acting like a teen :laugh:
Do you think you're a pro gamer? Lolololololololololololololololololol. I bet you also think you're a world class driver in mommy's Civic, dumb ass. Tearing ruins gaming a lot more than 10ms.

And it's funny, vsync didn't stop me from ROFL stomping wannabes like yourself online in every FPS.
#19
Pure Wop
las said:
Gsync and Freesync are a joke, unless you play in the 30-60 fps range. Tearing is not an issue at 120+ fps using 120+ Hz.

No serious gamer should use VSYNC. Adds input lag.

I have Gsync. I use ULMB instead. Way better. Any gaming LCD should use black frame insertion. Much less blur in fast paced games.
The problem is that even on my 3440×1440 monitor, 120+ fps requires SLI and good SLI support in the game, not to mention 4K ones.

To my surprise I can't find ULMB ultrawides either.
#20
las
TheGuruStud said:
Do you think you're a pro gamer? Lolololololololololololololololololol. I bet you also think you're a world class driver in mommy's Civic, dumb ass. Tearing ruins gaming a lot more than 10ms.

And it's funny, vsync didn't stop me from ROFL stomping wannabes like yourself online in every FPS.
You are clueless. Pro gamers use what's best, and they don't use Gsync or Freesync.
Nah, I'm not a pro, but I know how games are supposed to run: no motion blur and the lowest possible input lag.

Seriously, how old are you? Hahaha. You act like a mad teen. Ragekid?
#22
bug
las said:
You are clueless. Pro gamers use what's best, and they don't use Gsync or Freesync.
That is where you're wrong. In the absence of statistics showing that gamers using fast-refresh screens consistently win more than others, the more likely explanation is that gamers use whatever their sponsors make available to them.
If refresh rate were paramount, there'd be a plethora of CRTs at every competition.

Also, judging what's best for you based on what professional gamers use is like looking at Formula 1 or NASCAR to see what car you need to buy next.
#23
las
bug said:
That is where you're wrong. In the absence of statistics showing that gamers using fast-refresh screens consistently win more than others, the more likely explanation is that gamers use whatever their sponsors make available to them.
If refresh rate were paramount, there'd be a plethora of CRTs at every competition.

Also, judging what's best for you based on what professional gamers use is like looking at Formula 1 or NASCAR to see what car you need to buy next.
Not really the same. They are not superhumans; anyone will benefit from no motion blur. LCD/OLED has tons of motion blur without a proper blur-reduction mode.
Have you tried gaming with a motion-blur-reduction mode? It's like playing on a CRT again.

#24
bug
las said:
Not really the same. They are not superhumans; anyone will benefit from no motion blur. LCD/OLED has tons of motion blur without a proper blur-reduction mode.
Have you tried gaming with a motion-blur-reduction mode? It's like playing on a CRT again.
You continue to assume gaming can only happen on the very high end...
#25
lexluthermiester
TheGuruStud said:
Go get some more milk from mommy.
You were making a good point up until that.
TheGuruStud said:
Mommy should have aborted this mouth breather.
Really?

Maybe you're the one that needs growing up, eh?

I do agree with adaptive-sync being very useful, but whether or not it's the future remains to be seen.

las said:
It's like playing on CRT again.
CRTs had motion blur as well. It was less pronounced at lower refresh rates, but it was still there.