
AMD Showcases FreeSync 2 HDR Technology With Oasis Demo

crazyeyesreaper

Not a Moderator
Staff member
Joined
Mar 25, 2009
Messages
9,753 (1.78/day)
Location
04578
System Name Old reliable
Processor Intel 8700K @ 4.8 GHz
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling Custom Water
Memory 32 GB Crucial Ballistix 3666 MHz
Video Card(s) MSI GTX 1080 Ti Gaming X
Storage 3x SSDs 2x HDDs
Display(s) Dell U2412M + Samsung TA350
Case Thermaltake Core P3 TG
Audio Device(s) Samson Meteor Mic / Generic 2.1 / KRK KNS 6400 headset
Power Supply Zalman EBT-1000
Mouse Mionix NAOS 7000
Keyboard Mionix
AMD is looking to further push the adoption of FreeSync with the release of FreeSync 2 HDR technology. The primary goal of the new standard is to take what FreeSync already offered, including wide variable refresh rate ranges and low framerate compensation, and pair it with HDR for a truly immersive experience. To show off what FreeSync 2 can do while also pushing for broader adoption, AMD has created its new Oasis demo. Following the familiar principle that seeing is believing, AMD will use the demo at retail locations to compare FreeSync 2 monitors against their non-HDR counterparts, letting consumers see the difference for themselves in a way static images and YouTube videos cannot convey. The demo itself has been built on Unreal Engine 4 and fully supports the HDR10 and FreeSync 2 HDR transport protocols. As for settings, the demo packs numerous options, including FPS limits with various presets or custom values, vertical sync on/off, FreeSync on/off, content modes, and more. You can view AMD's overview of the demo in the video below.


View at TechPowerUp Main Site
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
After trying out the Samsung C27HG70 for a week, I can say that HDR on PC is still a joke. This monitor is DisplayHDR 600 certified. I wouldn't even want to see 400-certified monitors... basically useless for HDR. Like all those fake HDR 4K TVs.

It will take years before PC will have proper HDR gaming like consoles have now (in some games) - the software and hardware simply aren't there yet.

HDR has only impressed me on OLED so far. I have an LG C7. I can't wait till Micro LED replaces LCD as the standard.
I doubt we'll see proper HDR for PC till then (unless you're going to use an OLED TV, maybe). Software will mature too.

Tbh I think HDR is overrated. Tons of movies/series and games claim HDR support, when in reality there's a big difference in how well that HDR is actually implemented. It's still early days...
 
Last edited:
Joined
Aug 6, 2017
Messages
7,412 (3.05/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
I had a chance to see and compare quite a few TV sets in a big electronics shop, and a good non-HDR TV looks miles better than an entry-level one with crappy HDR. Once you've got a good implementation of HDR on a quality TV it looks amazing, but that's what's required. Stay away from those budget HDR monitors, please.
 

las

Just look for Ultra HD Premium certification when buying an HDR TV :p You'll get a native 10-bit panel and 1000 nits then.. (for LCD)

There are TONS of 4K HDR TVs with 8-bit panels. They can RECEIVE HDR content, but not actually show it. They are still allowed to be sold as 4K HDR TVs. This is how most PC monitors do HDR, I guess :p

I'm fine with SDR on PC for a few more years..
 
Joined
Feb 2, 2015
Messages
2,707 (0.81/day)
Location
On The Highway To Hell \m/
Does HDR help with, or better yet eliminate, color banding? I was just looking at one of these FreeSync 2 HDR400 monitors, and it says "HDR400 specifies true 8-bit image quality". And I'm like... you lost me at 8-bit... see ya! Already made that mistake and am deeply regretting it. If you can even call it a mistake, since it's almost never advertised whether a specific monitor is 8-bit or not. So I had no clue, nor any way of knowing ahead of time, what I was getting. Not only that, though - I didn't even know what it was or why I should care. How come that's not pretty much the most important specification given, or even a major topic of discussion? I mean... Jesus... is there anything worse than color banding? I have to avoid playing games where it shows up now... or I feel like throwing this $450 POS out the window. Sickening...
 
Joined
Oct 1, 2006
Messages
4,884 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
After trying out the Samsung C27HG70 for a week, I can say that HDR on PC is still a joke. This monitor is DisplayHDR 600 certified. I wouldn't even want to see 400-certified monitors... basically useless for HDR. Like all those fake HDR 4K TVs.

It will take years before PC will have proper HDR gaming like consoles have now (in some games) - the software and hardware simply aren't there yet.

HDR has only impressed me on OLED so far. I have an LG C7. I can't wait till Micro LED replaces LCD as the standard.
I doubt we'll see proper HDR for PC till then (unless you're going to use an OLED TV, maybe). Software will mature too.

Tbh I think HDR is overrated. Tons of movies/series and games claim HDR support, when in reality there's a big difference in how well that HDR is actually implemented. It's still early days...
For PC monitors the CHG70 is already miles ahead of most "gaming" monitors.
Many of those don't even run true 8-bit; instead they run 6-bit + FRC rubbish.
Another issue with HDR on PC is color mapping - if it's not done properly, HDR mode will look WORSE than SDR.
It also doesn't help that in order to watch HDR on YouTube you need to turn on some flags in M$ Edge that are off by default.
Few games actually run FreeSync 2 HDR, and the color mapping on nVidia GPUs for HDR is IMO crap.
Especially after seeing the rip-off 4K 144 Hz "HDR1000" G-Sync monitor from Asus in person - the HDR with that on BF5 (which is an RTX game) is absolute garbage.
Oh, and don't get me started on the HDR in Anthem; it often bugs out and the gamma goes through the roof instead of 2.2~2.4.
 
Last edited:
Joined
Sep 15, 2016
Messages
473 (0.17/day)
Is this video a joke? It looks like every 6th grader's first PowerPoint presentation when they discover transition effects.
 
Joined
Jun 3, 2010
Messages
2,540 (0.50/day)
Does HDR help with, or better yet eliminate, color banding? I was just looking at one of these FreeSync 2 HDR400 monitors, and it says "HDR400 specifies true 8-bit image quality". And I'm like... you lost me at 8-bit... see ya! Already made that mistake and am deeply regretting it. If you can even call it a mistake, since it's almost never advertised whether a specific monitor is 8-bit or not. So I had no clue, nor any way of knowing ahead of time, what I was getting. Not only that, though - I didn't even know what it was or why I should care. How come that's not pretty much the most important specification given, or even a major topic of discussion? I mean... Jesus... is there anything worse than color banding? I have to avoid playing games where it shows up now... or I feel like throwing this $450 POS out the window. Sickening...
The DisplayHDR specification as a whole rests upon HDR10 with 8-bit displays; Dolby Vision is for 10-bit displays.
 

las

For PC monitors the CHG70 is already miles ahead of most "gaming" monitors.
Many of those don't even run true 8-bit; instead they run 6-bit + FRC rubbish.
Another issue with HDR on PC is color mapping - if it's not done properly, HDR mode will look WORSE than SDR.
It also doesn't help that in order to watch HDR on YouTube you need to turn on some flags in M$ Edge that are off by default.
Few games actually run FreeSync 2 HDR, and the color mapping on nVidia GPUs for HDR is IMO crap.
Especially after seeing the rip-off 4K 144 Hz "HDR1000" G-Sync monitor from Asus in person - the HDR with that on BF5 (which is an RTX game) is absolute garbage.
Oh, and don't get me started on the HDR in Anthem; it often bugs out and the gamma goes through the roof instead of 2.2~2.4.

Miles ahead of most gaming monitors in HDR, maybe - it's still bad HDR tho - and everything else about it is mediocre if you ask me. Not much of a gaming monitor IMO. I have tested tons of VA, IPS and TN gaming monitors over the past 8-10 years. VA has never impressed me in faster-paced games. Slow pixel response time continues to be a problem for this tech.

The blur on the CHG70, even at the fastest mode with low input lag enabled, is way too much compared to even high refresh rate IPS/AHVA panels.
Pixel response time is simply too slow, especially in dark scenes. Too much smearing. Playing fast-paced games is still a problem.

I was playing Apex Legends side-by-side with the PG279Q and the difference in smoothness and responsiveness was simply huge.
Oh, I still have the CHG70 standing on the floor here. I will test it some more over the next few days, vs Dell's new S2719DGF and the Asus PG278QR, two 8-bit TN 1440p monitors.

So far tho, the CHG70 does not have the nice black levels good VA monitors usually have. I remember testing the Eizo FG2421; it had much better blacks and colors in general (also 5000:1 contrast). It failed in other aspects tho (pixel response time especially, once again, plus the small size/res).

I like VA panels for TVs, if you don't need good viewing angles, but for PC gaming I would not recommend them unless you are playing slower-paced games or can accept some blur/smearing. It's good enough for RPGs, MMOs etc. It has some issues with text sharpness tho (because of how the subpixels are driven); this is true for all Samsung and AUO VA panels.

In the end there's no perfect monitor. This is why I like to keep testing new monitors.

6-bit + FRC is typically low-end TN. There are many true 8-bit TN panels (mostly 1440p), and there are also a lot of true 8-bit high refresh rate IPS/AHVA panels, plus some 8-bit + FRC.
These 8-bit 1440p TN panels have way better image quality than those cheap 6-bit + FRC 1080p TN panels.
 
Last edited:
Joined
Sep 27, 2014
Messages
550 (0.16/day)
This monitor is DisplayHDR 600 certified. I wouldn't even want to see 400-certified monitors... basically useless for HDR. Like all those fake HDR 4K TVs.
....
It will take years before PC will have proper HDR gaming like consoles have now (in some games)

I don't get the logic. Consoles came with better monitors? I think this is a case of placebo effect...

monitor from Asus
I have bought a cheap-ish Acer monitor myself. Bad colors, broke down under warranty (had to pay shipping to get it fixed), came back still looking yucky. It didn't help that I was running it dual-monitor next to a 24" AOC HDTV display with much better colors.
I saw an Asus laptop display (my son's). Yuck again.

Right now I am using an AOC U2879VF. Specs say 4K FreeSync, 1 ms, with 10-bit (8-bit + FRC). But since it has a max of 300 cd/m2, it doesn't qualify for HDR 400. It came with a color calibration profile file, and that's important to me. Even though I have a Spyder calibration tool, I found out that if the manufacturer doesn't provide that file, it usually means you can't actually fully calibrate that monitor.

IMO, HDR is over-hyped. Another way to convince us to spend money.
Basically, if you don't run an HDR monitor at the highest luminosity, it doesn't meet the "HDR" specifications. Try that at night!
 
Last edited:

las

I don't get the logic. Consoles came with better monitors? I think this is a case of placebo effect...


I have bought an Asus monitor myself. Bad colors, broke down under warranty (had to pay shipping to get it fixed), came back still looking yucky. Didn't help that I was running it dual-monitor next to a 24" HDTV display with much better colors.
I saw an Asus laptop display at a friend's. Yuck again.

It's easier for devs to get HDR working right on consoles. They don't have to support thousands of different software and hardware configs. This is pretty much fact.

For proper HDR on consoles you still need an Ultra HD Premium certified TV or, better yet, an OLED like I've got. These can show HDR properly. HDR on the CHG70 looks terrible compared to OLED HDR. It does not even have FALD. Edge-lit LED with very few zones = bad HDR. Again, facts. Go ask in an AV forum ;)

I wonder why Asus gaming monitors generally score very high in reviews if that's the case.
"Asus laptop display" - it's not like Asus made the panel. Your friend probably cheaped out and got a 6-bit TN panel. Asus has plenty of laptops with good panels, as in IPS.

HDR is not overrated; come again when you have seen proper HDR content on an OLED 4K TV.
HDR on LCD, on the other hand, is kind of a joke. You need FALD on an LCD to make it decent, and even then it's much worse than OLED. Backlight < self-emitting pixels.
Pretty much no PC monitors use FALD. There are a few tho, with a $2,000 price tag or so.

"HDR is over-hyped" is what people who have never seen proper HDR always claim.
 
Last edited:
Joined
Feb 2, 2015
Messages
2,707 (0.81/day)
Location
On The Highway To Hell \m/
DisplayHDR specification as a whole of HDR10 rests upon 8-bit displays. Dolby Vision is 10-bit displays.
Any idea what the answers to my questions would be? Or should I ask the questions differently, or ask different questions? Let me try one more time.

I recently bought a 1440p 144 Hz FreeSync "gaming" monitor. I was under no impression that it was HDR capable, which it isn't. But I was also not aware that it was 8-bit, nor did I even know why being 8-bit mattered or made any noticeable difference. I came to find out later that it does. I'm extremely satisfied with my monitor other than the color banding, which I'm hearing is due to it being 8-bit, and that 10-bit is the solution. I came from gaming on CRTs and LED TVs, which I had never noticed color banding on. So this was all news to me.

Questions
  • It sounds like HDR would help with, or eliminate, color banding by producing a "wider color gamut". Is this true?
  • If it is true, to what extent is it true?
  • Or, if all HDR10 is effectively 10-bit (which I'm reading it is, or it would be called HDR8, making the fact that the display itself is 8-bit irrelevant), is it equivalent to SDR 10-bit in terms of helping with or eliminating color banding?
  • Basically, what I'm asking in a nutshell is: if I hate color banding, do I need HDR or SDR 10-bit to get rid of it? Either one? Neither one? Dolby Vision (which in fact is 12-bit capable)?
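For what it's worth, the banding question largely comes down to how coarse the tonal steps are at a given bit depth; the panel's actual bit depth matters more than the HDR badge. A rough back-of-the-envelope sketch (plain Python, purely illustrative and not tied to any particular monitor):

```python
# Rough illustration: tonal steps per color channel at each bit depth, and how
# coarse one step is across a normalized 0..1 signal range. Banding appears
# when adjacent steps differ enough to be visible, which is most obvious in
# slow, dark gradients.

for bits in (6, 8, 10, 12):
    levels = 2 ** bits              # distinct values per channel
    step = 1.0 / (levels - 1)       # size of one step in normalized signal
    print(f"{bits:>2}-bit: {levels:>4} levels, step = {step:.5f}")

# Expected output:
#  6-bit:   64 levels, step = 0.01587   (banding very likely without FRC/dithering)
#  8-bit:  256 levels, step = 0.00392
# 10-bit: 1024 levels, step = 0.00098   (4x finer steps than 8-bit)
# 12-bit: 4096 levels, step = 0.00024   (Dolby Vision's signal depth)
```

In short: HDR10 is a 10-bit signal format, but a display can accept it and still quantize down to an 8-bit panel internally, so an HDR logo by itself doesn't guarantee the banding disappears; a true 10-bit (or well-dithered 8-bit + FRC) panel is what actually reduces it.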
 
Joined
Jun 3, 2010
Messages
2,540 (0.50/day)
Any idea what the answers to my questions would be? Or should I ask the questions differently, or ask different questions? Let me try one more time.

I recently bought a 1440p 144 Hz FreeSync "gaming" monitor. I was under no impression that it was HDR capable, which it isn't. But I was also not aware that it was 8-bit, nor did I even know why being 8-bit mattered or made any noticeable difference. I came to find out later that it does. I'm extremely satisfied with my monitor other than the color banding, which I'm hearing is due to it being 8-bit, and that 10-bit is the solution. I came from gaming on CRTs and LED TVs, which I had never noticed color banding on. So this was all news to me.

Questions
  • It sounds like HDR would help with, or eliminate, color banding by producing a "wider color gamut". Is this true?
  • If it is true, to what extent is it true?
  • Or, if all HDR10 is effectively 10-bit (which I'm reading it is, or it would be called HDR8, making the fact that the display itself is 8-bit irrelevant), is it equivalent to SDR 10-bit in terms of helping with or eliminating color banding?
  • Basically, what I'm asking in a nutshell is: if I hate color banding, do I need HDR or SDR 10-bit to get rid of it? Either one? Neither one? Dolby Vision (which in fact is 12-bit capable)?
The thing is, 8-bit is the desktop limit; 10-bit is activated with Dolby Vision. If you do have an issue with banding, though, you could try lowering gamma, or in other words screen brightness.
 
Joined
Feb 2, 2015
Messages
2,707 (0.81/day)
Location
On The Highway To Hell \m/
...you could try lowering gamma, or in other words screen brightness.
Thanks. But I'd rather not have to try anything. It looks perfect to me with the factory default monitor settings and standard Windows settings. Everything I've done to mess with them ended up looking worse. And, like I said, I'm actually very pleased with how it looks, except the goddamn color banding in low-light/dark areas of games that's driving me fricken crazy! As I'd imagine it would do to anybody. I don't know why they don't take special care to make sure it never happens. Man... I just want to go back to CRT. Nothing beats CRT image quality... PERIOD. It's just screen size and/or resolution that's poor with them. Other than that they can't be beat. All this LED, LCD, OLED, QLED, Micro LED, VA, TN, IPS bullshit is just that. BULLSHIT!!!
 
Last edited:
Joined
Oct 1, 2006
Messages
4,884 (0.76/day)
Location
Hong Kong
6-bit + FRC is typically low-end TN. There are many true 8-bit TN panels (mostly 1440p), and there are also a lot of true 8-bit high refresh rate IPS/AHVA panels, plus some 8-bit + FRC.
These 8-bit 1440p TN panels have way better image quality than those cheap 6-bit + FRC 1080p TN panels.
I am mainly talking about the color reproduction and static contrast.
I had the MG278Q, so I know what an actual 8-bit TN looks like; it's okay, but not nearly as good as the CHG70.
Monitors are usually rated at their max brightness to cheat the contrast ratio. In the case of the Samsung VAs, even the cheaper 1080p CFG70 managed 2800:1 of the rated 3000:1 contrast at just 130 nits (stock is 300 nits).
Some of the other monitors, especially the IPS ones, only get anywhere near their rated contrast ratio at eye-blinding brightness.

HDR and wide color gamut signals take longer to process, so in general HDR monitors have higher input lag.
The best e-sports monitors are from BenQ, which manage 9 ms true input lag, but of course those look as bad as cheap TN, and they intentionally induce flicker with backlight strobing to give you better motion clarity.
The PG27UQ, on the other hand, has over 20 ms true input lag, so there is that.

Now, on the issues of the CHG70: there is overshoot / reverse ghosting when pixel overdrive is on.
This is especially true on older firmware for this monitor, so try updating the firmware if you haven't already done so.
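To put those contrast figures in perspective: contrast ratio is just white luminance divided by black luminance, so the implied black level at any brightness setting falls out directly. A minimal sketch (Python; the first two inputs are the numbers quoted above, the third is an assumed, typical ~1000:1 IPS figure for comparison):

```python
# Contrast ratio = white luminance / black luminance, so for a measured ratio
# and the white level you actually run, the implied black level is:

def black_level(white_nits: float, contrast_ratio: float) -> float:
    """Implied black luminance in nits."""
    return white_nits / contrast_ratio

# Illustrative inputs (first two from the post above, last one assumed):
print(black_level(130, 2800))   # ~0.046 nits: VA panel dimmed to 130 nits
print(black_level(300, 3000))   # ~0.100 nits: same panel family at its stock 300 nits
print(black_level(350, 1000))   # ~0.350 nits: a typical ~1000:1 IPS at high brightness
```

Which is the point being made: a panel that keeps its contrast at comfortable brightness gives visibly deeper blacks in normal use, while one that only reaches its rated ratio at maximum brightness does not.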
 
Last edited:
Joined
Jun 3, 2010
Messages
2,540 (0.50/day)
Thanks. But I'd rather not have to try anything. It looks perfect to me with the factory default monitor settings and standard Windows settings. Everything I've done to mess with them ended up looking worse. And, like I said, I'm actually very pleased with how it looks, except the goddamn color banding in low-light/dark areas of games that's driving me fricken crazy! As I'd imagine it would do to anybody. I don't know why they don't take special care to make sure it never happens. Man... I just want to go back to CRT. Nothing beats CRT image quality... PERIOD. It's just screen size and/or resolution that's poor with them. Other than that they can't be beat. All this LED, LCD, OLED, QLED, Micro LED, VA, TN, IPS bullshit is just that. BULLSHIT!!!
Try black equalizer, maybe?
 

las

I am mainly talking about the color reproduction and static contrast.
I had the MG278Q, so I know what an actual 8-bit TN looks like; it's okay, but not nearly as good as the CHG70.
Monitors are usually rated at their max brightness to cheat the contrast ratio. In the case of the Samsung VAs, even the cheaper 1080p CFG70 managed 2800:1 of the rated 3000:1 contrast at just 130 nits (stock is 300 nits).
Some of the other monitors, especially the IPS ones, only get anywhere near their rated contrast ratio at eye-blinding brightness.

HDR and wide color gamut signals take longer to process, so in general HDR monitors have higher input lag.
The best e-sports monitors are from BenQ, which manage 9 ms true input lag, but of course those look as bad as cheap TN, and they intentionally induce flicker with backlight strobing to give you better motion clarity.
The PG27UQ, on the other hand, has over 20 ms true input lag, so there is that.

Now, on the issues of the CHG70: there is overshoot / reverse ghosting when pixel overdrive is on.
This is especially true on older firmware for this monitor, so try updating the firmware if you haven't already done so.

The monitor is brand new and came with the newest firmware.

I'm not saying it's a bad monitor. It's just that the HDR is not impressive and it's not suitable for fast-paced games (if you ask me - this is true for all VA monitors tho).

Both Asus and Dell have monitors with sub-5 ms total input lag. The PG278QR and Dell S2716DG (and the new S2719DGF) are all very low.
Even the PG279Q (AHVA/IPS) I'm using now has ~5 ms on avg (low 4.1 and high 6.5) - http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg279q.htm#response_times

This is why VA is a problem for fast-paced games: http://www.tftcentral.co.uk/reviews/samsung_c32hg70.htm#response
The highs are 40-50 and sometimes 60 ms.

Conclusion: "Some slow response times resulting in dark smearing on some content. Fairly typical of VA panels"


You are right about HDR increasing total input lag. But if the monitor runs in SDR mode, it should be the same. I don't know of any fast-paced MP games that support HDR on PC, so you should get the "low" input lag.

I have not tried the PG27UQ because I have no interest in 2160p/4K at only 27". IMO you need 32 inches for this resolution.
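As a rough sanity check on those tftcentral figures, it helps to compare pixel response times against the frame time at a given refresh rate: once a transition takes longer than one frame, it smears across several frames. A minimal sketch (Python; the response-time values are illustrative, loosely based on the numbers quoted above):

```python
# Compare pixel response times with the frame time of a given refresh rate.
# A transition slower than one frame time is still changing while the next
# frame is drawn, which is what shows up as smearing/ghosting.

def frames_of_smear(response_ms: float, refresh_hz: float) -> float:
    frame_time_ms = 1000.0 / refresh_hz
    return response_ms / frame_time_ms

REFRESH_HZ = 144  # frame time is ~6.9 ms

for label, response_ms in [("fast TN/IPS transition", 5.0),
                           ("typical VA transition", 20.0),
                           ("worst-case dark VA transition", 50.0)]:
    print(f"{label}: {frames_of_smear(response_ms, REFRESH_HZ):.1f} frames at {REFRESH_HZ} Hz")

# fast TN/IPS transition: 0.7 frames at 144 Hz
# typical VA transition: 2.9 frames at 144 Hz
# worst-case dark VA transition: 7.2 frames at 144 Hz
```

A 40-60 ms dark transition is therefore smeared across roughly six to nine frames at 144 Hz, which lines up with the "dark smearing" conclusion quoted above.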
 
Last edited:
Joined
Jun 3, 2010
Messages
2,540 (0.50/day)
How is HDR increasing lag? I think we discussed this: AMD GPU vs Nvidia GPU.
 

las

Joined
Jun 3, 2010
Messages
2,540 (0.50/day)
More processing etc.
Nope, computerbase checked it.
  • These aren't affected.
    • Assassin's Creed Origins – 3,840 × 2,160: FPS, average
    • Battlefield 1 – 3,840 × 2,160: FPS, average
    • F1 2017 – 3,840 × 2,160: FPS, average
    • Final Fantasy XV – 3,840 × 2,160: FPS, average
    • Mass Effect: Andromeda – 3,840 × 2,160: FPS, average
    • Shadow Warrior 2: FPS, average
    • Star Wars: Battlefront 2 – 3,840 × 2,160: FPS, average
  • These are affected by >1%.
    • Call of Duty: WWII – 3,840 × 2,160: FPS, average
    • Destiny 2 – 3,840 × 2,160: FPS, average
    • Far Cry 5 – 3,840 × 2,160: FPS, average
    • Middle-earth: Shadow of War – 3,840 × 2,160: FPS, average
    • Resident Evil 7 – Biohazard: FPS, average
 

las

Nope, computerbase checked it.
  • These aren't affected.
    • Assassin's Creed Origins – 3,840 × 2,160: FPS, average
    • Battlefield 1 – 3,840 × 2,160: FPS, average
    • F1 2017 – 3,840 × 2,160: FPS, average
    • Final Fantasy XV – 3,840 × 2,160: FPS, average
    • Mass Effect: Andromeda – 3,840 × 2,160: FPS, average
    • Shadow Warrior 2: FPS, average
    • Star Wars: Battlefront 2 – 3,840 × 2,160: FPS, average
  • These are affected by >1%.
    • Call of Duty: WWII – 3,840 × 2,160: FPS, average
    • Destiny 2 – 3,840 × 2,160: FPS, average
    • Far Cry 5 – 3,840 × 2,160: FPS, average
    • Middle-earth: Shadow of War – 3,840 × 2,160: FPS, average
    • Resident Evil 7 – Biohazard: FPS, average

We're talking about input lag, not fps.
 

las

The processing is done on the GPU.

No.

Enabling HDR on a TV or monitor increases total input lag. It has nothing to do with framerate.
 
