Thursday, January 5th 2017

NVIDIA Announces the G-SYNC HDR Technology

NVIDIA today announced the G-SYNC HDR technology. An evolution of the company's proprietary adaptive display-sync technology, which keeps the display's refresh rate dynamically in sync with the graphics card's frame rate, G-SYNC HDR, as its name suggests, adds support for HDR (high dynamic range) displays. NVIDIA's display-manufacturer partners, such as Acer and ASUS, have each announced displays with this technology, which will be available later this year.
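As a rough mental model of what adaptive sync does, here is an illustrative sketch only; it is not NVIDIA's implementation, and the function names, refresh window, and numbers are assumptions chosen for the example:

```python
import math

# Illustrative sketch only, not NVIDIA's implementation: when a frame becomes
# ready relative to the last refresh, a fixed-refresh display with V-Sync must
# wait for the next refresh boundary, while an adaptive-sync display can
# refresh as soon as the frame arrives, within the panel's supported window.

def fixed_vsync_scanout_ms(frame_ready_ms, refresh_hz=60):
    interval = 1000.0 / refresh_hz
    return math.ceil(frame_ready_ms / interval) * interval  # wait for the next tick

def adaptive_sync_scanout_ms(frame_ready_ms, min_hz=30, max_hz=144):
    min_interval = 1000.0 / max_hz   # panel can't refresh faster than this
    max_interval = 1000.0 / min_hz   # panel must refresh at least this often
    return min(max(frame_ready_ms, min_interval), max_interval)

# A frame finished 21 ms after the previous refresh (about 48 fps):
print(fixed_vsync_scanout_ms(21))    # 33.33... ms (held to the next 60 Hz tick)
print(adaptive_sync_scanout_ms(21))  # 21 ms (shown as soon as it is ready)
```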

NVIDIA worked with display panel maker AU Optronics to develop G-SYNC HDR. The technology leverages full-array 384-zone LED backlights and quantum-dot technology. The monitors rely on wide color gamuts with 10-bit color depth (a palette of 1.07 billion colors) to bring HDR to life, and come with support for the HDR10 standard. One of the year's most anticipated games, "Mass Effect: Andromeda," will ship with support for G-SYNC HDR.
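For reference, the 1.07 billion figure quoted above follows directly from 10 bits per channel across the three color channels; the quick check below assumes plain RGB with no alpha:

```python
# 10 bits per channel over R, G and B gives 2^30 distinct colors.
colors_10bit = (2 ** 10) ** 3   # 1,073,741,824 -- the "1.07 billion" above
colors_8bit  = (2 ** 8) ** 3    #    16,777,216 -- typical SDR, for comparison
print(f"{colors_10bit:,} vs {colors_8bit:,}")
```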

12 Comments on NVIDIA Announces the G-SYNC HDR Technology

#1
ZoneDymo
always something new on the horizon to wait for
#2
xorbe
HDR is not just 10 bits + wider gamut, but that's the marketing dept for you. Standardization on 10 bits per channel and wide gamut handling would be nice. The latter is more difficult to achieve since there's a real cost.
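For context on the "not just 10 bits + wider gamut" point: the HDR10 standard mentioned in the article also specifies the SMPTE ST 2084 "PQ" transfer function, which maps code values to absolute luminance. Below is a minimal sketch of the PQ EOTF using the constants published in that standard; the helper name is just for illustration:

```python
# SMPTE ST 2084 (PQ) EOTF: maps an HDR10 code value to absolute luminance
# in cd/m^2, up to the 10,000-nit ceiling defined by the standard.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf_nits(code, bits=10):
    n = code / (2 ** bits - 1)            # normalize the code value to [0, 1]
    p = n ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf_nits(1023))   # 10000.0: full-scale code is 10,000 cd/m^2
print(pq_eotf_nits(512))    # ~93: mid-scale is still fairly dark in PQ
```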
#3
Chaitanya
xorbe said:
HDR is not just 10 bits + wider gamut, but that's the marketing dept for you. Standardization on 10 bits per channel and wide gamut handling would be nice. The latter is more difficult to achieve since there's a real cost.
G-Sync already carries an unwarranted NVIDIA tax; having 10-bit panels with decent resolutions supporting G-Sync would certainly mean breaking the bank.
#4
Cybrnook2002
I don't like the fact that it's down to the game devs to support this technology; see the last sentence about Mass Effect supporting it... (just something else to be polarized about)
#5
Zeki
Can you at least make G-Sync work perfectly before milking G-Sync 2 out of our pockets?
It still stutters much more than you made us believe it would in most popular games.
#6
Beastie
Zeki said:
Can you at least make G-Sync work perfectly before milking G-Sync 2 out of our pockets?
It still stutters much more than you made us believe it would in most popular games.
I've had zero issues with gsync.

I don't play loads of different games so I may have just got lucky. But it has always worked as advertised for me.
#7
Zeki
I am glad for all of you who have had a great experience with G-Sync. I also find it a world apart better than V-Sync or no sync, but there is still an abundant amount of stutter here and there that I was led to believe could be remedied with G-Sync. I have been playing with it since 2014 and have had all of my main components and Windows upgraded/reinstalled since, like most of you gamers.
#8
Ikaruga
xorbe said:
HDR is not just 10 bits + wider gamut, but that's the marketing dept for you. Standardization on 10 bits per channel and wide gamut handling would be nice. The latter is more difficult to achieve since there's a real cost.
The former also induces a "cost", since you would preferably need an engine that processes and renders everything in 10-bit too, and that means more GPU work.
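As a rough illustration of that cost: stepping the main render targets up from an 8-bit-per-channel format to a 16-bit-float format (one common way engines handle HDR; the exact formats and 4K size below are assumptions) doubles the memory and bandwidth per pixel:

```python
# Back-buffer size for an assumed 4K render target; real engines vary.
width, height = 3840, 2160
rgba8_mib   = width * height * 4 / 2**20   # 8-bit RGBA:        ~31.6 MiB
rgba16f_mib = width * height * 8 / 2**20   # 16-bit float RGBA: ~63.3 MiB
print(rgba8_mib, rgba16f_mib)
```

A 10-bit unorm swapchain format such as R10G10B10A2 is still 32 bits per pixel, so much of the extra work typically comes from wider intermediate targets and tone-mapping passes rather than the final buffer.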
#9
Prima.Vera
Zeki said:
I am glad for all of you who have had a great experience with G-Sync. I also find it a world apart better than V-Sync or no sync, but there is still an abundant amount of stutter here and there that I was led to believe could be remedied with G-Sync. I have been playing with it since 2014 and have had all of my main components and Windows upgraded/reinstalled since, like most of you gamers.
Sorry man, I still don't get what kind of stutter you are talking about. I have almost the same specs as you, even lower, and my monitor's resolution is higher; even so, I have yet to experience any stutter, even in low-fps scenarios. What games are you playing where you found the issue?
#10
Zeki
Prima.Vera said:
Sorry man, I still don't get what kind of stutter you are talking about. I have almost the same specs as you, even lower, and my monitor's resolution is higher; even so, I have yet to experience any stutter, even in low-fps scenarios. What games are you playing where you found the issue?
Well, there are Far Cry 3 and 4 and Counter-Strike: GO, which I consider among the worst with G-Sync; I would rate them 1/10. On the opposite side of the spectrum, at 10/10, there are Crysis 1 to 3 and Borderlands 1, which just refuse to stutter no matter the framerate and scene at play. Borderlands 2, at 9/10, stutters in specific scenes... Things are also different, in my experience, depending on the Windows generation. On Windows 10, I find that Far Cry 3 just stutters like V-Sync no matter what options I select in the 3D settings manager.
#11
Prima.Vera
Zeki said:
Well, there are Far Cry 3 and 4 and Counter-Strike: GO, which I consider among the worst with G-Sync; I would rate them 1/10. On the opposite side of the spectrum, at 10/10, there are Crysis 1 to 3 and Borderlands 1, which just refuse to stutter no matter the framerate and scene at play. Borderlands 2, at 9/10, stutters in specific scenes... Things are also different, in my experience, depending on the Windows generation. On Windows 10, I find that Far Cry 3 just stutters like V-Sync no matter what options I select in the 3D settings manager.
Well, the explanation is simple, actually. If you get more than 144 FPS in games, you should disable not only V-Sync but also G-Sync. G-Sync is best used for framerates lower than 60 FPS; otherwise it will just induce extra input lag and can create stuttering, especially in games like CS.
#12
efikkan
LCD panels are unable to represent the full range of HDR, so they "cheat" by using local dimming, which of course increases the total range but also introduces large local color deviations. No thanks.
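For a sense of scale on the local-dimming point, assuming the article's 384-zone backlight sits behind a 4K panel (the resolution is an assumption about the announced monitors):

```python
# With 384 zones behind a 4K panel, each zone lights tens of thousands of
# pixels, which is why bright objects on dark backgrounds can halo locally.
pixels_per_zone = (3840 * 2160) // 384
print(pixels_per_zone)   # 21600 pixels share a single dimming zone
```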