
AMD to Rename "FreeSync 2" To "FreeSync 2 HDR", Increase Minimum HDR Requirement

Raevenlord

News Editor
Joined
Aug 12, 2016
Messages
3,755 (1.35/day)
Location
Portugal
System Name The Ryzening
Processor AMD Ryzen 9 5900X
Motherboard MSI X570 MAG TOMAHAWK
Cooling Lian Li Galahad 360mm AIO
Memory 32 GB G.Skill Trident Z F4-3733 (4x 8 GB)
Video Card(s) Gigabyte RTX 3070 Ti
Storage Boot: Transcend MTE220S 2TB, Kingston A2000 1TB, Seagate Firewolf Pro 14 TB
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case Lian Li O11DX Dynamic White
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
The guys over at PC Perspective conducted an interesting interview with AMD, during which a company representative talked about impending changes to AMD's FreeSync program. Essentially, the company found that there is some consumer confusion regarding exactly what features FreeSync 2 delivers over its first-gen counterpart. As such, AMD feels that renaming the technology to FreeSync 2 HDR better conveys the focus of the new feature set: LFC (Low Framerate Compensation) and the FreeSync 2 HDR fast lane for tone-mapping improvements.

The AMD representative further clarified which specs are required for a monitor to receive FreeSync 2 HDR certification: support for at least DisplayHDR 600, coverage of 99 percent of the BT.709 color space, and coverage of 90 percent of DCI-P3. Also mentioned was a minimum response time, though the exact value remains unknown. An interesting point that can be gleaned from AMD's change is that it is more than just cosmetic: AMD's first FreeSync 2 certification program only required displays to adhere to DisplayHDR 400. There are examples of announced FreeSync 2 monitors that only support that standard (and others that don't even support that, but were certified all the same), instead of the aforementioned DisplayHDR 600 the company will apparently start enforcing alongside the renewed "FreeSync 2 HDR" program. Here's hoping for a stricter certification program from AMD in this regard: DisplayHDR 400 was already a stretch to call true HDR (it isn't), and FreeSync 2 now has all the market support and recognition it needs to start raising its quality requirements instead of chasing quantity.
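
To make the reported thresholds concrete, here's a minimal Python sketch of such a certification check. The DisplaySpec structure, the reduction of "at least DisplayHDR 600" to a 600-nit peak, and the optional response-time parameter (the exact limit is still unknown) are assumptions for illustration, not AMD's actual test procedure.

```python
# Sketch of the reported FreeSync 2 HDR requirements as a simple spec check.
# The thresholds come from the article above; the DisplaySpec fields, the
# simplification of "DisplayHDR 600" to a 600-nit peak, and the response-time
# handling (exact limit unknown) are assumptions for illustration only.
from dataclasses import dataclass
from typing import Optional


@dataclass
class DisplaySpec:
    peak_nits: float                 # peak luminance in cd/m^2 (nits)
    bt709_coverage: float            # fraction of BT.709 covered, 0.0-1.0
    dci_p3_coverage: float           # fraction of DCI-P3 covered, 0.0-1.0
    response_time_ms: Optional[float] = None


def meets_freesync2_hdr(spec: DisplaySpec,
                        max_response_ms: Optional[float] = None) -> bool:
    """Check the requirements reported in the interview: at least DisplayHDR 600
    (reduced here to >= 600 nits), 99% BT.709, 90% DCI-P3, and an unspecified
    maximum response time (pass max_response_ms once AMD publishes it)."""
    if spec.peak_nits < 600:
        return False
    if spec.bt709_coverage < 0.99 or spec.dci_p3_coverage < 0.90:
        return False
    if max_response_ms is not None:
        if spec.response_time_ms is None or spec.response_time_ms > max_response_ms:
            return False
    return True


# A DisplayHDR 400-class panel would no longer qualify under the new rules:
print(meets_freesync2_hdr(DisplaySpec(400, 0.99, 0.90)))   # False
print(meets_freesync2_hdr(DisplaySpec(650, 0.99, 0.92)))   # True
```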



 
Joined
Jul 10, 2011
Messages
788 (0.17/day)
Processor Intel
Motherboard MSI
Cooling Cooler Master
Memory Corsair
Video Card(s) Nvidia
Storage Samsung/Western Digital/ADATA
Display(s) Samsung
Case Thermaltake
Audio Device(s) On Board
Power Supply Seasonic
Mouse A4TECH
Keyboard UniKey
Software Windows 10 x64
So AMD lied. Let's see if AdorkedTV will add this to the "AMD - Anti-Competitive, Anti-Consumer, Anti-Technology" list.
 
Joined
Jan 8, 2017
Messages
8,860 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C

AlienIsGOD

Vanguard Beta Tester
Joined
Aug 9, 2008
Messages
5,111 (0.89/day)
Location
Kingston, Ontario Canada
System Name Aliens Ryzen Rig | 2nd Hand Omen
Processor Ryzen R5 5600 | Ryzen R5 3600
Motherboard Gigabyte B450 Aorus Elite (F61 BIOS) | B450 matx
Cooling DeepCool Castle EX V2 240mm AIO| stock for now
Memory 8GB X 2 DDR4 3000mhz Team Group Vulcan | 16GB DDR4
Video Card(s) Sapphire Pulse RX 5700 8GB | GTX 1650 4GB
Storage Adata XPG 8200 PRO 512GB SSD OS / 240 SSD + 2TB M.2 SSD Games / 1000 GB Data | SSD + HDD
Display(s) Acer ED273 27" VA 144hz Freesync |TCL 32" 1080P w/ HDR
Case NZXT H500 Black | HP Omen Obelisk
Audio Device(s) Onboard Realtek | Onboard Realtek
Power Supply EVGA SuperNOVA G3 650w 80+ Gold | 500w
Mouse Steelseries Rival 500 15 button mouse w/ Razor Goliathus Chroma XL mousemat | Logitech G502
Keyboard Logitech G910 Orion Spark RGB w/ Romer G tactile keys | Logitech G513 Carbon w/ Romer G tactile keys
Software Windows 10 Pro | Windows 10 Pro
So AMD lied. Let's see if AdorkedTV will add this to the "AMD - Anti-Competitive, Anti-Consumer, Anti-Technology" list.
lol
 
Joined
Jul 10, 2011
Messages
788 (0.17/day)
Processor Intel
Motherboard MSI
Cooling Cooler Master
Memory Corsair
Video Card(s) Nvidia
Storage Samsung/Western Digital/ADATA
Display(s) Samsung
Case Thermaltake
Audio Device(s) On Board
Power Supply Seasonic
Mouse A4TECH
Keyboard UniKey
Software Windows 10 x64

Raevenlord

News Editor
Joined
Aug 12, 2016
Messages
3,755 (1.35/day)
Location
Portugal
System Name The Ryzening
Processor AMD Ryzen 9 5900X
Motherboard MSI X570 MAG TOMAHAWK
Cooling Lian Li Galahad 360mm AIO
Memory 32 GB G.Skill Trident Z F4-3733 (4x 8 GB)
Video Card(s) Gigabyte RTX 3070 Ti
Storage Boot: Transcend MTE220S 2TB, Kingston A2000 1TB, Seagate Firewolf Pro 14 TB
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case Lian Li O11DX Dynamic White
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
You need 1000 nits for real HDR.



AMD didn't lie, HDR400 and HDR600 are both VESA standards. Yes, 1000 nits is the ideal sweet spot for HDR performance, but VESA themselves established these HDR-compliant tiers (much like the HD Ready televisions of old, if you ask me, which accepted the inputs but weren't able to properly display them). AMD simply seems to be raising the bar for awarding their FreeSync 2 HDR badge.
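
For reference, here's a small sketch of the three VESA DisplayHDR tiers mentioned, reduced to their headline peak-luminance targets; the full certifications also cover black level, color gamut and bit depth, which are omitted here.

```python
# The three VESA DisplayHDR tiers discussed above, reduced to their headline
# peak-luminance targets in cd/m^2 (the full certifications also specify black
# level, color gamut and bit depth, which are left out of this sketch).
DISPLAYHDR_TIERS = {
    "DisplayHDR 400": 400,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
}


def highest_tier(peak_nits: float) -> str:
    """Return the highest tier a given peak luminance could target, or 'none'."""
    qualified = [name for name, nits in DISPLAYHDR_TIERS.items() if peak_nits >= nits]
    return max(qualified, key=DISPLAYHDR_TIERS.get) if qualified else "none"


print(highest_tier(350))    # 'none' - below even DisplayHDR 400
print(highest_tier(650))    # 'DisplayHDR 600'
print(highest_tier(1000))   # 'DisplayHDR 1000'
```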
 
Joined
Jan 8, 2017
Messages
8,860 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
You need 1000 nits for real HDR.

Says who? You, subjectively? VESA has 3 distinct HDR certifications, all equally true, for 400, 600 and 1000 nits. And you still didn't explain how exactly AMD lied.
 

bug

Joined
May 22, 2015
Messages
13,161 (4.07/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Well, I have to post this again: HDR400 is for OLEDs, which can't shine as bright as LCDs but can have much deeper blacks (thus offering a very high dynamic range). Anyone sticking an HDR400 label on an LCD monitor should be taken behind the barn and shot.
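
To put rough numbers on that: contrast (dynamic range) is peak luminance divided by black level, so a 400-nit OLED still dwarfs a 400-nit LCD without local dimming. The luminance values in this quick sketch are illustrative assumptions, not measurements of any particular panel.

```python
# Back-of-the-envelope contrast ratios: dynamic range depends on black level as
# much as on peak brightness. The sample luminance values are illustrative
# assumptions, not measured figures for any specific panel.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits


lcd = contrast_ratio(peak_nits=400, black_nits=0.13)       # typical LCD without local dimming
oled = contrast_ratio(peak_nits=400, black_nits=0.0005)    # OLED blacks are near zero

print(f"LCD  ~{lcd:,.0f}:1")    # roughly 3,000:1
print(f"OLED ~{oled:,.0f}:1")   # roughly 800,000:1
```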
 
Joined
Jul 10, 2011
Messages
788 (0.17/day)
Processor Intel
Motherboard MSI
Cooling Cooler Master
Memory Corsair
Video Card(s) Nvidia
Storage Samsung/Western Digital/ADATA
Display(s) Samsung
Case Thermaltake
Audio Device(s) On Board
Power Supply Seasonic
Mouse A4TECH
Keyboard UniKey
Software Windows 10 x64
AMD didn't lie, HDR400 and HDR600 are both VESA standards. Yes, 1000 nits is the ideal sweet spot for HDR performance, but VESA themselves established these HDR-compliant tiers (much like the HD Ready televisions of old, if you ask me, which accepted the inputs but weren't able to properly display them). AMD simply seems to be raising the bar for awarding their FreeSync 2 HDR badge.

I'm not saying FreeSync 2 HDR monitors are fake HDR. I'm talking about, for example, the Samsung C32HG70 and its 350 nits.

Says who? You, subjectively? VESA has 3 distinct HDR certifications, all equally true, for 400, 600 and 1000 nits. And you still didn't explain how exactly AMD lied.

Watch the video (0:21), but I doubt it will help since you take AMD's word above anything else.

"AMD's first FreeSync 2 certification program required displays to only be able to adhere to HDR400. There are some examples of announced, FreeSync 2 monitors that only support that standard (and others that don't support even that but were certified all the same)"

AMD promised that FreeSync 2 comes with HDR by default. But of all the FreeSync 2 monitors, only a few were certified HDR400 or 600 (maybe more, I didn't check). Now they call it FreeSync 2 HDR because only now will those monitors come with DisplayHDR 400, 600 or 1000 certification.
 
Joined
Feb 23, 2008
Messages
1,064 (0.18/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
But of all the FreeSync 2 monitors, only a few were certified HDR400 or 600 (maybe more, I didn't check). Now they call it FreeSync 2 HDR because only now will those monitors come with DisplayHDR 400, 600 or 1000 certification.

Dude, where are those FreeSync 2 monitors that are not HDR400 certified? If Samsung rates the C32HG70 at 600 nits peak and VESA seems to be OK with it, how is it AMD's fault that it only manages sub-400 nits in some situations?
 

AlienIsGOD

Vanguard Beta Tester
Joined
Aug 9, 2008
Messages
5,111 (0.89/day)
Location
Kingston, Ontario Canada
System Name Aliens Ryzen Rig | 2nd Hand Omen
Processor Ryzen R5 5600 | Ryzen R5 3600
Motherboard Gigabyte B450 Aorus Elite (F61 BIOS) | B450 matx
Cooling DeepCool Castle EX V2 240mm AIO| stock for now
Memory 8GB X 2 DDR4 3000mhz Team Group Vulcan | 16GB DDR4
Video Card(s) Sapphire Pulse RX 5700 8GB | GTX 1650 4GB
Storage Adata XPG 8200 PRO 512GB SSD OS / 240 SSD + 2TB M.2 SSD Games / 1000 GB Data | SSD + HDD
Display(s) Acer ED273 27" VA 144hz Freesync |TCL 32" 1080P w/ HDR
Case NZXT H500 Black | HP Omen Obelisk
Audio Device(s) Onboard Realtek | Onboard Realtek
Power Supply EVGA SuperNOVA G3 650w 80+ Gold | 500w
Mouse Steelseries Rival 500 15 button mouse w/ Razor Goliathus Chroma XL mousemat | Logitech G502
Keyboard Logitech G910 Orion Spark RGB w/ Romer G tactile keys | Logitech G513 Carbon w/ Romer G tactile keys
Software Windows 10 Pro | Windows 10 Pro

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
You need 1000 nits for real HDR.
False. Samsung talked about this with their FreeSync 2 certified monitors. They said that HDR 1000 is too bright for monitor use: it is intended for living room TVs, where the viewer sits some distance from the screen, so all that brightness is drowned out by ambient light before it reaches the viewer's eyes. HDR 600 is more sensible (less blinding) for monitor use.

The FreeSync 2 spec was created before VESA put out the DisplayHDR standard. AMD updated the FreeSync 2 spec to use the definitions VESA established. There are only three FreeSync 2 certified monitors right now, and all three exceed DisplayHDR 600.

There will likely be DisplayHDR 1000 FreeSync 2 certified TVs.

I welcome this change. Less confusing for everyone.


FYI, regarding DisplayHDR 400: most monitors are 300-350 nits but lack zoned backlighting (local dimming).
 
Joined
Jul 13, 2016
Messages
2,792 (0.99/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
False. Samsung talked about this with their FreeSync 2 certified monitors. They said that HDR 1000 is too bright for monitor use: it is intended for living room TVs, where the viewer sits some distance from the screen, so all that brightness is drowned out by ambient light before it reaches the viewer's eyes. HDR 600 is more sensible (less blinding) for monitor use.

The FreeSync 2 spec was created before VESA put out the DisplayHDR standard. AMD updated the FreeSync 2 spec to use the definitions VESA established. There are only three FreeSync 2 certified monitors right now, and all three exceed DisplayHDR 600.

There will likely be DisplayHDR 1000 FreeSync 2 certified TVs.

I welcome this change. Less confusing for everyone.


FYI, regarding DisplayHDR 400: most monitors are 300-350 nits but lack zoned backlighting (local dimming).

Yeah, ambient lighting plays a huge role in how much brightness you need. Most people don't know that and just leave it at the default settings, but you can significantly increase image quality simply by tuning brightness. Decreasing the brightness increases the contrast. Brightness is merely a measure of the minimum level of light allowed, so naturally, increasing it means darker colors will no longer be dark. It's like when you turn off all the lights and notice the TV is still visible: you can see the light even though the screen is "black", or in this case, as black as the monitor can emulate. The flipside of that, though, is that if there is too much ambient light and you have your TV set to a low brightness, it will be hard to see much of anything.
 