
NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

W1zzard

PCPer had the opportunity to disassemble the ASUS ROG Swift PG27UQ, a 27" 4K 144 Hz G-Sync HDR monitor, and found that the G-Sync module is a newer version than the one used in 1st-generation G-Sync monitors (which of course do not support 4K / 144 Hz / HDR). The module is powered by an FPGA made by Altera (Intel-owned since 2015). The exact model number is Arria 10 GX 480, a high-performance 20-nanometer SoC that provides enough bandwidth and LVDS pins to process the data stream.

The FPGA sells in low quantities for around $2,000 at Digikey and Mouser. Assuming that NVIDIA buys thousands, PCPer suggests that this chip alone adds $500 to the monitor's cost. The BOM cost is further increased by 3 GB of DDR4 memory on the module. With added licensing fees for G-SYNC, this explains why these monitors are so expensive.
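As a rough, hypothetical sketch of how the quoted figures might combine (only the $2,000 low-quantity FPGA price and the ~$500 chip estimate come from the article; the volume discount, DDR4 and licensing figures below are assumptions for illustration, not confirmed prices):

```python
# Back-of-envelope BOM estimate for the G-Sync HDR module (illustrative only).

fpga_list_price = 2000.0          # Digikey/Mouser low-quantity price (USD)
assumed_volume_discount = 0.75    # assumption: NVIDIA pays ~25% of list at volume
fpga_cost = fpga_list_price * (1 - assumed_volume_discount)   # ~$500, per PCPer's estimate

ddr4_cost = 30.0                  # assumption: 3 GB of DDR4 on the module
gsync_license = 50.0              # assumption: per-unit G-SYNC licensing fee

print(f"Estimated added cost per monitor: ${fpga_cost + ddr4_cost + gsync_license:.0f}")
```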



View at TechPowerUp Main Site
 
The price you pay for high resolution and HDR. Between this module, the panel, and the GPU, it adds up fast compared to a 1440p/144 Hz setup.
 
Yet G-sync feels like an overrated technology on the whole. Not overly impressed by my own screen, just glad I didn't pay full price for it.
The fact they're using an FPGA suggests Nvidia doesn't expect to sell the kind of volume of these screens where a custom ASIC would make sense from a cost perspective, which further shows how overrated G-sync is.
 
Yet G-sync feels like an overrated technology on the whole. Not overly impressed by my own screen, just glad I didn't pay full price for it.
The fact they're using an FPGA suggests Nvidia doesn't expect to sell the kind of volume of these screens where a custom ASIC would make sense from a cost perspective, which further shows how overrated G-sync is.
I think it depends on both the games you play and your setup. On a 144 Hz setup with fast-paced FPS games, it makes a huge difference in smoothness and usability vs. using vsync. But in slower games like RTS titles, or at lower refresh rates, GSYNC's usefulness is diminished.

Its biggest advantage is not having to use vsync, which helps in any reaction-based game.
 
Gsync and Freesync are a joke unless you play in the 30-60 fps range. Tearing is not an issue at 120+ fps on a 120+ Hz monitor.

No serious gamer should use VSYNC. Adds input lag.

I have Gsync. I use ULMB instead. Way better. Any gaming LCD should use black frame insertion. Much less blur in fast paced games.
 
No serious gamer should use VSYNC. Adds input lag.

I laugh my ass off every time I see this. The average human reaction time is something like 250 ms; whoever seriously thinks that a time frame of 16 ms or less can make a perceivable difference is being delusional.

I remember playing the COD WWII beta, using v-sync at 60 fps and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.
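For context on the numbers being argued about here, the frame times in question follow directly from the refresh rate; a quick sketch:

```python
# Frame time in milliseconds at common refresh rates; the "16 ms" above is simply 1000 / 60.
for hz in (60, 120, 144, 165, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
```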
 
I laugh my ass off every time I see this. The average human reaction time is something like 250 ms; whoever seriously thinks that a time frame of 16 ms or less can make a perceivable difference is being delusional.

I remember playing the COD WWII beta, using v-sync at 60 fps and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.

Delusional? Haha, I can feel it instantly. You sound like a casual gamer, and you probably are with that CPU. Your 60 Hz TV has tons of input lag too. No wonder you can't tell the difference between VSYNC on and off.

Go try a low-input-lag 120-240 Hz monitor at 120+ fps...
 
I think it depends on both the games you play and your setup. On a 144 Hz setup with fast-paced FPS games, it makes a huge difference in smoothness and usability vs. using vsync. But in slower games like RTS titles, or at lower refresh rates, GSYNC's usefulness is diminished.

Its biggest advantage is not having to use vsync, which helps in any reaction-based game.

You can certainly still see tearing at 144 Hz despite what some people say, and G-Sync can of course help with this, but a tear at that high a refresh rate doesn't last as long as it does at lower refresh rates, so it's wrong to say G-Sync makes a "huge" difference when you're reaching that kind of frame rate. Yes, it will help, but it would be more beneficial and appreciated at lower frame rates where, without it, you'd be more aware of the tearing.

I'm all for G-Sync though, and yes, it means you don't need V-Sync, and ultimately 144 Hz with G-Sync is definitely going to give the smoothest gaming experience. Of course, if your GPU can't even get close to pushing the monitor that high, it's a moot point. Let's also not forget that input lag and certain monitor panel characteristics (smearing, ghosting, gamma shift, etc.) can factor heavily into the overall gaming experience, with some people being far more sensitive than others to these things.
 
Gaming feels smoother to me when I use G-Sync, so I don't know what to tell the naysayers. I guess turn it off and don't enjoy it. I personally notice a difference in my overall gaming experience being smoother.
 
Gaming feels smoother to me when I use G-Sync

Of course you do; variable refresh rate ensures you see only unique frames where the frame time variation is minimal. That's a huge chunk of what constitutes a smooth gaming experience, not the lack of 10 ms or whatever in terms of response time.
 
Of course you do; variable refresh rate ensures you see only unique frames where the frame time variation is minimal. That's a huge chunk of what constitutes a smooth gaming experience, not the lack of 10 ms or whatever in terms of response time.


Yeah, people have no idea what G-Sync and Freesync do. Running a monitor at 120 Hz+ without VRR technology doesn't make tearing go away; you need to sync up the frames between the video card and the monitor.
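A toy sketch of what "syncing up the frames" means here (purely illustrative; the render times and the simple 60 Hz scanout model are assumptions, not real driver or display code):

```python
# Without sync, a frame that completes while the panel is mid-scanout starts being
# displayed part-way down the screen, which is where the tear line shows up.
# With VRR the panel instead begins a fresh scanout when the frame is ready, so no tear.
import random

refresh_interval_ms = 1000 / 60   # hypothetical fixed 60 Hz panel
random.seed(0)

t = 0.0
for frame in range(5):
    t += random.uniform(8, 25)    # hypothetical per-frame render times (ms)
    tear_pos = (t % refresh_interval_ms) / refresh_interval_ms
    print(f"frame {frame}: fixed refresh -> tear line at ~{tear_pos:.0%} of screen height; "
          f"VRR -> no tear")
```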
 
It is kind of weird to have a small PC powering a monitor, only to then connect it to an actual computer :D
 
Freesync and G-Sync are very welcome technologies. It is completely up to the individual person whether he notices the advantages or not. These adaptive synchronization technologies can eliminate microstutter and tearing with no added input lag, which regular vertical synchronization will induce, at any supported refresh rate defined by the monitor at hand.

My personal opinion is that at refresh rates above 120 Hz tearing is noticeable, but it doesn't bother me, unlike at 60 Hz when it certainly does. I can definitely enjoy games with such unnoticeable tearing. However, depending on the game being played, microstutter can occur for various reasons, which adaptive sync is often able to reduce or completely eliminate.
 
I'd love to know what G-SYNC does so much better than a FreeSync monitor to achieve a similar result that it takes so much more engineering and money. I can only think that the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the difference between expensive G-SYNC and free FreeSync will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
 
Yeah, people have no fuckin idea what G-Sync and Freesync do. Running a monitor at 120 Hz+ without VRR technology doesn't make tearing go away; you need to sync up the frames between the video card and the monitor.

It does not matter when frames are replaced that fast. No pro gamers use VSYNC or adaptive sync. Wonder why...
VRR is mainly for low-fps gaming. Both Gsync and Freesync have been shown to add input lag vs. VSYNC OFF; how much depends on the game.

Once again, ULMB is far superior to Gsync. I have Gsync and it's a joke at high fps. ULMB delivers CRT-like motion with no blur. Why on earth would I use Gsync when I can have buttery smooth motion?

Most people here don't have a clue about how smooth games CAN run. 120+ fps at 120+ Hz with ULMB... Come again when you have tried it.
 
I'd love to know what G-SYNC does so much better than a FreeSync monitor to achieve a similar result that it takes so much more engineering and money. I can only think that the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the difference between expensive G-SYNC and free FreeSync will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.

I agree. I think they need to step up their game and use the hardware in the graphics card to do most of the work.
 
Ahahahaahahahahahahah how pathetic...
 
I'd love to know what G-SYNC does so much better than a FreeSync monitor to achieve a similar result that it takes so much more engineering and money. I can only think that the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the difference between expensive G-SYNC and free FreeSync will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up creating GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
One simple answer: the backing of nvidia GPUs.

You want to use adaptive sync with a 1080 Ti? You must use gsync. The 144 Hz freesync limit doesn't matter all that much because AMD doesn't make a GPU that can actually push that frame rate at any reasonable detail level. When freesync was being pushed hard, the 480 was the best AMD had to offer.

I imagine there is a lot more you can do with hardware-based monitor modules vs. the freesync standard, but I doubt nvidia will pursue that path until AMD can bother to compete.
 
One simple answer: the backing of nvidia GPUs.

You want to use adaptive sync with a 1080 Ti? You must use gsync. The 144 Hz freesync limit doesn't matter all that much because AMD doesn't make a GPU that can actually push that frame rate at any reasonable detail level. When freesync was being pushed hard, the 480 was the best AMD had to offer.

I imagine there is a lot more you can do with hardware-based monitor modules vs. the freesync standard, but I doubt nvidia will pursue that path until AMD can bother to compete.
Thing is, attempting consumer or business lock-in is something most companies try at some point, but it can easily backfire on them when they're undercut by the cheaper rival, regardless of technical merit. I think that's happening here between G-SYNC and FreeSync.
 
Thing is, attempting consumer or business lock-in is something most companies try at some point, but it can easily backfire on them when they're undercut by the cheaper rival, regardless of technical merit. I think that's happening here between G-SYNC and FreeSync.
I want to agree, but I don't see freesync winning any victories here, as nvidia GPUs dominate Steam's numbers and AMD faffs around with Vega.

The nvidia lock-in will absolutely backfire on them the moment AMD gets their act together.
 
I mean, is that because of Nvidia, or just because HDR is still teething...?

How many of said modules are out there at the moment?

Seems like an issue that will fix itself with economies of scale / less bloated implementations of HDR G-Sync boards.
 
Vya Domus is just so clueless, just like in most of his posts. Thinks he knows best. Plays on a TV - says HRR/ULMB is a gimmick. Plays on an FX-6300 - says his CPU doesn't bottleneck GPUs in new games. Fights against nvidia anti-consumer practices - runs a 1060. This man is a walking casserole of contradiction and nonsense.

Of course you do; variable refresh rate ensures you see only unique frames where the frame time variation is minimal. That's a huge chunk of what constitutes a smooth gaming experience, not the lack of 10 ms or whatever in terms of response time.

I agree with the first sentence 110% and I think you hit the nail on the head there, but what you wrote at the end of the second one is so stupid it cancels the right part out. :laugh: Of course low lag is important, as is fast pixel response. That's why I'll never buy a VA panel for FPS games.

@las
ULMB is better than g-sync, but requires A LOT more CPU and GPU horsepower to run. Strobing is very hard on my eyes at anything less than 120 Hz. Plus, running ULMB with a wide range of fps (let's say 90-130 fps) and vsync off produces less fluid animation, unless you play with fast sync, but from my experience that only produces the desired effect at very high framerates (~200 fps or higher for me). ULMB with v-sync on feels very fluid and has very, very little lag. G-sync is incredible for making the animation look smooth at lower fps, with very little added lag and no tearing. Of course there's more blur than ULMB, but the game still feels very, very fluid.

I'd say for me the hierarchy goes like this:

1. ULMB @ 120 fps locked, vsync on - but that's just impossible to run in most modern games.
2. G-sync at an avg. of 90 fps or higher - this is the one I most often use due to the insane requirements for no. 1.
3. ULMB at an avg. of 100+ fps with fast sync.

I prefer fluid animation with no stutter. I pick that up instantly when I play with v-sync off, be it even at 165 Hz and 150 fps. It's not even about tearing, though I see it at 165 Hz too. I notice a lack of frame synchronisation immediately; that's just me. That's why to me g-sync is the best thing I've seen implemented in gaming in recent years, by a mile.
 
Vya Domus is just so clueless, just like in most of his posts. Thinks he knows best. Plays on a TV - says HRR/ULMB is a gimmick. Plays on an FX-6300 - says his CPU doesn't bottleneck GPUs in new games. Fights against nvidia anti-consumer practices - runs a 1060. This man is a walking contradiction.

Honestly, I am tired of reading your nonsense everywhere. I am pretty tolerant of this sort of stuff, but I genuinely feel like I am reading spam. You're always picking a fight and trying to insult me before you even try to argue with what I say, and as unsuccessful as you are at doing that, you can't contain yourself from showing how toxic you are at every occasion. You've proven to me numerous times that you can't read and interpret information properly, so I have no need to waste my time with you anymore. You don't need to either, since you'll be on ignore from now on.
 