
NVIDIA Announces the G-SYNC HDR Technology

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,362 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
NVIDIA today announced its G-SYNC HDR technology. An evolution of the company's proprietary adaptive display-sync technology, which keeps the display's refresh rate dynamically in sync with the graphics card's frame rate, G-SYNC HDR, as its name suggests, adds support for HDR (high dynamic range) displays. NVIDIA's partner display manufacturers such as Acer and ASUS have each announced displays with this technology, which will be available later this year.

NVIDIA worked with display panel maker AU Optronics to develop G-SYNC HDR. It leverages a full-array 384-zone LED backlight and quantum-dot technology. The monitors rely on wide color gamuts with 10-bit color (a 1.07 billion-color palette) to bring HDR to life, and come with support for the HDR10 standard. The year's most anticipated game, "Mass Effect: Andromeda," will ship with support for G-SYNC HDR.
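
For reference, the 1.07-billion figure follows directly from the bit depth: 10 bits per channel gives 2^10 = 1,024 shades of each primary, and 1,024^3 ≈ 1.07 billion combinations, versus roughly 16.7 million for a typical 8-bit panel. A quick sketch of that arithmetic:

```python
# Back-of-the-envelope check of the color counts quoted above.
for bits in (8, 10):
    shades = 2 ** bits              # levels per channel (R, G, B)
    total = shades ** 3             # all RGB combinations
    print(f"{bits}-bit: {shades} shades/channel, "
          f"{total:,} colors (~{total / 1e9:.2f} billion)")
# 8-bit : 256 shades/channel, 16,777,216 colors (~0.02 billion)
# 10-bit: 1024 shades/channel, 1,073,741,824 colors (~1.07 billion)
```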



View at TechPowerUp Main Site
 
Joined
Feb 11, 2009
Messages
5,397 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> ... nope still the same :'(
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case Antec 600 -> Thermaltake Tenor HTPC case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Always something new on the horizon to wait for.
 
Joined
Feb 14, 2012
Messages
2,323 (0.52/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
HDR is not just 10 bits + wider gamut, but that's the marketing dept for you. Standardization on 10 bits per channel and wide gamut handling would be nice. The latter is more difficult to achieve since there's a real cost.
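
To put one concrete thing behind the "not just 10 bits" point: HDR10 also specifies an absolute-luminance transfer function, the SMPTE ST 2084 "PQ" EOTF, which maps 10-bit code values onto luminance up to 10,000 nits. A minimal sketch of that curve, using the constants from the ST 2084 spec (the helper function is just for illustration):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized 10-bit code value to absolute
# luminance in cd/m^2 (nits), up to the 10,000-nit ceiling HDR10 allows.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code: int, bits: int = 10) -> float:
    e = code / (2 ** bits - 1)              # normalize code value to [0, 1]
    p = e ** (1 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
    return 10000.0 * y                      # absolute luminance in nits

for code in (0, 512, 1023):
    print(code, round(pq_eotf(code), 1))    # code 1023 -> 10000.0 nits
```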
 
Joined
Mar 7, 2011
Messages
3,925 (0.82/day)
HDR is not just 10 bits + wider gamut, but that's the marketing dept for you. Standardization on 10 bits per channel and wide gamut handling would be nice. The latter is more difficult to achieve since there's a real cost.
G-Sync already carries an unwarranted NVIDIA tax; having 10-bit panels with decent resolutions supporting G-Sync would certainly mean breaking the bank.
 
Joined
Nov 3, 2007
Messages
1,700 (0.28/day)
I don't like the fact that it's down to the game devs to support this technology; note the last sentence about Mass Effect supporting it... (just something else to be polarized about)
 
Joined
Jul 30, 2015
Messages
25 (0.01/day)
Location
Québec
System Name WTF
Processor i5-9600K @ 5.1
Motherboard asus z370
Cooling H90
Memory 32gb
Video Card(s) ASUS STRIX GTX1080
Storage 30TB
Display(s) ROG PG278Q
Case HAF 912
Audio Device(s) integrated
Power Supply 1000w
Mouse rat 7 mad catz
Keyboard logitech g15
Software yes please
Benchmark Scores yes
Can you at least make G-Sync work perfectly before milking G-Sync2 out of our pockets?
It still stutters much more than you made us believe it would in most popular games.
 
Joined
Jan 26, 2016
Messages
407 (0.14/day)
Location
UK
System Name it needs a name?
Processor Xeon E3-1241 v3 @3.5GHz- standard clock
Motherboard Asus Z97A 3.1
Cooling Bequiet! Dark Rock 3 CPU cooler, 2 x 140mm intake and 1 x 120mm exhaust PWM fans in the case
Memory 16 GB Crucial DDR3 1600MHz CL9 Ballistix Sport 2 x 8 GB
Video Card(s) Palit 980ti Super Jetscream
Storage Sandisk X110 256GB SSD, Sandisk Ultra II 960GB SSD, 640GB WD Blue, 12TB Ultrastar
Display(s) Acer XB270HU
Case Lian Li PC 7H
Audio Device(s) Focusrite Scarlett 6i6 USB interface
Power Supply Seasonic P660
Mouse cheapo logitech wireless
Keyboard some keyboard from the 90's
Software Win10pro 64bit
Can you at least make G-Sync work perfectly before milking G-Sync2 out of our pockets?
It still stutters much more than you made us believe it would in most popular games.
I've had zero issues with G-Sync.

I don't play loads of different games so I may have just got lucky. But it has always worked as advertised for me.
 
Joined
Jul 30, 2015
Messages
25 (0.01/day)
Location
Québec
System Name WTF
Processor i5-9600K @ 5.1
Motherboard asus z370
Cooling H90
Memory 32gb
Video Card(s) ASUS STRIX GTX1080
Storage 30TB
Display(s) ROG PG278Q
Case HAF 912
Audio Device(s) integrated
Power Supply 1000w
Mouse rat 7 mad catz
Keyboard logitech g15
Software yes please
Benchmark Scores yes
I am glad for all of you who have had a great experience with G-Sync. I also find it worlds better than V-Sync or no sync, but there is still an abundant amount of stutter here and there that I was led to believe could be remedied with G-Sync. I have been playing with it since 2014 and have had all of my main components and Windows upgraded/reinstalled since, like most of you gamers.
 
Joined
Feb 18, 2011
Messages
1,259 (0.26/day)
HDR is not just 10 bits + wider gamut, but that's the marketing dept for you. Standardization on 10 bits per channel and wide gamut handling would be nice. The latter is more difficult to achieve since there's a real cost.
The former also induces a "cost," since you would preferably need an engine that processes and renders everything in 10-bit too, and that means more GPU work.
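
As a rough illustration of that extra GPU cost (a sketch, assuming a hypothetical 4K target and common render-target formats): the final 10-bit output still fits in 32 bits per pixel, but HDR pipelines typically render their intermediate targets in 16-bit float, which roughly doubles framebuffer memory and bandwidth.

```python
# Rough framebuffer size comparison for a hypothetical 3840x2160 target.
WIDTH, HEIGHT = 3840, 2160
formats = {
    "RGBA8    (8-bit SDR back buffer)": 4,  # bytes per pixel
    "RGB10A2  (10-bit output)":         4,
    "RGBA16F  (HDR render target)":     8,
}
for name, bpp in formats.items():
    mib = WIDTH * HEIGHT * bpp / 2**20
    print(f"{name:34s} {mib:6.1f} MiB per buffer")
```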
 
Joined
Sep 15, 2011
Messages
6,467 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
I am glad for all of you who have had a great experience with G-Sync. I also find it worlds better than V-Sync or no sync, but there is still an abundant amount of stutter here and there that I was led to believe could be remedied with G-Sync. I have been playing with it since 2014 and have had all of my main components and Windows upgraded/reinstalled since, like most of you gamers.
Sorry man, I still don't get what kind of stutter you are talking about. I have almost the same specs as you, even lower, except the monitor resolution, which is higher, and even so I have yet to experience any stutter, even in low-FPS scenarios. What games are you playing where you found the issue?
 
Joined
Jul 30, 2015
Messages
25 (0.01/day)
Location
Québec
System Name WTF
Processor i5-9600K @ 5.1
Motherboard asus z370
Cooling H90
Memory 32gb
Video Card(s) ASUS STRIX GTX1080
Storage 30TB
Display(s) ROG PG278Q
Case HAF 912
Audio Device(s) integrated
Power Supply 1000w
Mouse rat 7 mad catz
Keyboard logitech g15
Software yes please
Benchmark Scores yes
Sorry man, I still don't get what kind of stutter you are talking about. I have almost the same specs as you, even lower, except the monitor resolution, which is higher, and even so I have yet to experience any stutter, even in low-FPS scenarios. What games are you playing where you found the issue?
Well, there are Far Cry 3 and 4 and Counter-Strike: GO, which I consider among the worst with G-Sync; I would rate them 1/10. On the opposite side of the spectrum, at 10/10, there are Crysis 1 to 3 and Borderlands 1, which just refuse to stutter no matter the framerate or the scene at play. Borderlands 2, 9/10, stutters at specific scenes... Also, things are different, in my experience, depending on the Windows generation. On Windows 10, I find that Far Cry 3 just stutters like V-Sync no matter what options I select in the 3D settings manager.
 
Joined
Sep 15, 2011
Messages
6,467 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Well, there are Far Cry 3 and 4 and Counter-Strike: GO, which I consider among the worst with G-Sync; I would rate them 1/10. On the opposite side of the spectrum, at 10/10, there are Crysis 1 to 3 and Borderlands 1, which just refuse to stutter no matter the framerate or the scene at play. Borderlands 2, 9/10, stutters at specific scenes... Also, things are different, in my experience, depending on the Windows generation. On Windows 10, I find that Far Cry 3 just stutters like V-Sync no matter what options I select in the 3D settings manager.

Well, the explanation is simple, actually. If you get more than 144 FPS in games, you should disable not only V-Sync but also G-Sync. G-Sync is best used for FPS lower than 60, otherwise it will just induce extra input lag and can create stuttering, especially in games like CS.
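
For context on why the frame-rate ceiling matters here: a variable-refresh panel can only track frame times that fall inside its supported refresh window, so frames arriving faster than the maximum refresh rate are outside what G-Sync can handle and the driver falls back to V-Sync-style behavior or tearing. A rough sketch of that arithmetic, assuming a hypothetical 30-144 Hz window:

```python
# Does a given frame rate fall inside a (hypothetical) 30-144 Hz VRR window?
VRR_MIN_HZ, VRR_MAX_HZ = 30, 144

def in_vrr_window(fps: float) -> bool:
    frame_time_ms = 1000.0 / fps
    return 1000.0 / VRR_MAX_HZ <= frame_time_ms <= 1000.0 / VRR_MIN_HZ

for fps in (45, 100, 144, 250):
    print(f"{fps:3d} FPS -> frame time {1000/fps:5.1f} ms, "
          f"inside VRR window: {in_vrr_window(fps)}")
# 250 FPS (4 ms frames) exceeds the 144 Hz ceiling, so the display falls back
# to V-Sync-like behavior or tearing depending on driver settings.
```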
 
Joined
Jun 10, 2014
Messages
2,900 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
LCD panels are unable to represent the full luminance range of HDR, so they "cheat" by using local dimming, which of course increases the total range but also introduces large local color deviations. No thanks.
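
For a sense of scale on those local deviations: spreading the 384 zones from the article across a hypothetical 4K panel leaves each backlight zone covering on the order of twenty thousand pixels, so any bright highlight smaller than a zone brightens a much larger area around it. A rough sketch, assuming a 24 x 16 zone grid:

```python
# Rough size of one local-dimming zone on a hypothetical 4K, 384-zone panel.
WIDTH, HEIGHT = 3840, 2160
ZONES_X, ZONES_Y = 24, 16            # assumed grid; 24 * 16 = 384 zones

zone_w, zone_h = WIDTH // ZONES_X, HEIGHT // ZONES_Y
print(f"zone size: {zone_w} x {zone_h} px "
      f"({zone_w * zone_h:,} pixels per zone)")   # 160 x 135 = 21,600 px
```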
 