
AMD RDNA 3 GPUs to Support DisplayPort 2.0 UHBR 20 Standard

AleksandarK
News Editor, Staff member
AMD's upcoming Radeon RX 7000 series of graphics cards, based on the RDNA 3 architecture, is expected to feature next-generation protocols across the board. Today, thanks to a patch committed to the Linux kernel, we have information about the display outputs AMD will offer in its upcoming products. According to Twitter user @Kepler_L2, who discovered the patch, AMD will pair DisplayPort 2.0 with the UHBR 20 transmission mode. UHBR 20 provides a maximum of 80 Gbps of bandwidth, the highest of any display output connector currently available. With this technology, an RDNA 3 GPU could drive a 16K display with Display Stream Compression, a 10K display without compression, or two 8K HDR screens at a 120 Hz refresh rate. All of this will be handled by the Display Controller Next (DCN) media engine.
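To put those figures in perspective, here is a rough back-of-the-envelope calculation. This is a sketch only: it ignores blanking intervals, assumes 10 bits per color channel (30 bits per pixel) for HDR, and derives the usable payload from UHBR 20's 128b/132b channel coding; exact numbers depend on timing details.

```python
# Rough DisplayPort UHBR 20 bandwidth math (sketch: ignores blanking
# intervals; assumes 10-bit-per-channel HDR, i.e. 30 bits per pixel).

def raw_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Uncompressed video bandwidth in Gbps for a given mode."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# UHBR 20: 4 lanes x 20 Gbps = 80 Gbps link rate.
# 128b/132b encoding leaves roughly 80 * 128/132 ~ 77.6 Gbps for payload.
payload_gbps = 80 * 128 / 132

for name, w, h, hz in [("4K 144 Hz", 3840, 2160, 144),
                       ("8K 120 Hz", 7680, 4320, 120),
                       ("2x 8K 120 Hz", 2 * 7680, 4320, 120)]:
    need = raw_gbps(w, h, hz)
    verdict = "fits uncompressed" if need <= payload_gbps else "needs DSC"
    print(f"{name}: {need:.1f} Gbps -> {verdict}")
```

By this estimate a single 4K 144 Hz 10-bit stream (~35.8 Gbps) fits comfortably, while even one uncompressed 8K 120 Hz HDR stream (~119 Gbps) already exceeds the link, which is why the dual-8K scenario relies on Display Stream Compression.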

The availability of DisplayPort 2.0-capable monitors is a story of its own. VESA expected them to arrive at the end of 2021; however, they were delayed by the lack of devices supporting the output. With AMD's RDNA 3 cards joining the products that support these monitors, the market will likely adapt to the demand, and more products should become available as the transition to the latest standard progresses.


View at TechPowerUp Main Site | Source
 
Forget 16K or even 8K monitors - I'm just excited that we'll get 5K ultrawide monitors with >75Hz refresh!
 
I am going to start saving from now. I will definitely be getting RDNA3. If the multi-GPU rumor is true I will definitely be getting one, as I miss CrossFire.
 
Just looking forward to seeing the cards.
 
Forget 16K or even 8K monitors - I'm just excited that we'll get 5K ultrawide monitors with >75Hz refresh!

As an ultrawide monitor owner, there are some situations where it's not better, and I mean beyond cases where it's simply not supported. At times you want that 24-inch screen where everything is in your field of vision. 5K would be nice on one, though!
 
Finally DP 2.0 is coming. I've been waiting for 4K 144Hz with HDR and Chroma 4:4:4. Stream compression is fine, but you lose colors in the process.
 
Finally DP 2.0 is coming. I've been waiting for 4K 144Hz with HDR and Chroma 4:4:4. Stream compression is fine, but you lose colors in the process.
I hope that LG makes a 27-inch 4K variant of the new Nano IPS displays with DP 2.0 by the end of the year.
 
I just want a 27" 1440p 180 Hz OLED panel, and I want my RDNA3 to be able to pump the full 180 frames in AAA games at that res in Cyberpunk 2077, etc. Higher resolutions don't interest me; I want that smooth-as-butter high refresh combined with OLED infinite blacks.
 
I just want a 27" 1440p 180 Hz OLED panel, and I want my RDNA3 to be able to pump the full 180 frames in AAA games at that res in Cyberpunk 2077, etc. Higher resolutions don't interest me; I want that smooth-as-butter high refresh combined with OLED infinite blacks.
The problem with OLED is PWM. Very few displays (that I know of) go for DC dimming instead, which introduces problems of its own.

I for one prefer non-PWM, even at the price of a non-infinite contrast ratio.
 
You set your standards pretty low. If everyone was like this we'd all still be at 480p. :D
Laughs in 1080p...

The drive for pixels went from porn to mainstream media, and curved monitors, while gaining traction, are still far from mainstream, so expect it to take a while.
 
Someone just make a 16:10 monitor. It's great to see laptops moving away from the god-awful 16:9 garbage, but desktop screens are still stuck in the past. I want a 4096 x 2400 32" OLED monitor with HDR10+, 120 Hz, 100% AdobeRGB, 80% Rec. 2020, DP 2.0, HDMI 2.1, and USB-C.
 
Someone just make a 16:10 monitor. It's great to see laptops moving away from the god-awful 16:9 garbage, but desktop screens are still stuck in the past. I want a 4096 x 2400 32" OLED monitor with HDR10+, 120 Hz, 100% AdobeRGB, 80% Rec. 2020, DP 2.0, HDMI 2.1, and USB-C.
Pretty sure Sony and Philips do.
 
38.6" 5120x2160 144 Hz 144 ppi curved would be a good start.

43.4" 7680x3240 240 Hz 192 ppi curved should be the stretch goal for a common standard.
This sounds great, only until you realize that most people are still rocking 1080p and 1440p monitors. The problem with pushing for higher resolutions and refresh rates is the steep cost: you need to pay for high-end hardware to realize the potential if you want to game smoothly. Next-gen GPUs may push us closer to high-resolution, high-refresh-rate gaming, again until you realize that game engines are getting more and more demanding as well. Just look at the Unreal Engine 5 demo and you get a sense of it.
 
The problem with OLED is PWM. Very few displays (that I know of) go for DC dimming instead, which introduces problems of its own.

I for one prefer non-PWM, even at the price of a non-infinite contrast ratio.

I thought PWM was the flickering of a backlight? My old 27" 1440p QNIX monitor had that... and I felt like it gave me headaches, but I was never 100% sure.

How would OLED have PWM when each pixel can light itself?
 
I thought PWM was the flickering of a backlight? My old 27" 1440p QNIX monitor had that... and I felt like it gave me headaches, but I was never 100% sure.

How would OLED have PWM when each pixel can light itself?
From rtings review of C1:
[attached: RTINGS flicker measurement chart for the LG C1]
 
Hmm, I wonder if I will have to avoid OLED... I know PWM in the past did give me a headache... this is... horrible news.
 
You set your standards pretty low. If everyone was like this we'd all still be at 480p. :D

8K monitors are completely useless, so don't even bother with 16K.
 
I thought PWM was the flickering of a backlight? My old 27" 1440p QNIX monitor had that... and I felt like it gave me headaches, but I was never 100% sure.

How would OLED have PWM when each pixel can light itself?
PWM is used for brightness regulation. There are lots of resources on the web describing it. Here is one example: https://www.nintendolife.com/news/2...-keep-the-switch-oleds-screen-nice-and-bright
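For anyone unfamiliar with how PWM dimming regulates brightness, here is a toy illustration (the 240 Hz rate and 30% duty cycle are made-up example numbers, not measurements of any real panel): the light source is switched fully on and off at a fixed frequency, and perceived brightness scales with the fraction of each cycle spent on.

```python
# Toy PWM dimming model (illustrative numbers, not real panel specs).
# The light source toggles at pwm_hz; perceived brightness is roughly
# proportional to the duty cycle (fraction of each period spent "on").

def pwm_on_time_ms(pwm_hz, duty_cycle):
    """Milliseconds the light is on during each PWM period."""
    period_ms = 1000 / pwm_hz
    return period_ms * duty_cycle

# At a 240 Hz PWM rate and 30% brightness, each ~4.17 ms period
# contains only ~1.25 ms of light; the rest is dark time that
# flicker-sensitive eyes may pick up.
print(pwm_on_time_ms(240, 0.30))
```

The lower the brightness setting, the shorter the on-pulse, which is why PWM flicker tends to bother sensitive users most at dim settings.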

8K monitors are completely useless, so don't even bother with 16K.
My 8K telly begs to differ. Not to mention the ability to fit more information on the same screen size.

Hmm, I wonder if I will have to avoid OLED... I know PWM in the past did give me a headache... this is... horrible news.
I too am very sensitive to PWM flicker. For TVs and monitors there are plenty of choices, but things get bad when it comes to phones.
 
Hmm, I wonder if I will have to avoid OLED... I know PWM in the past did give me a headache... this is... horrible news.
According to RTINGS it's not PWM; it's brightness fluctuation at an 8 ms interval, the same as the panel refresh rate.
 
38.6" 5120x2160 144 Hz 144 ppi curved would be a good start.

43.4" 7680x3240 240 Hz 192 ppi curved should be the stretch goal for a common standard.
5K 29" with 120 Hz+ and 12-bit color is what I'm looking forward to. The combination of higher resolution and color depth at a very high PPI would be excellent. 5K downscaled already looks really good on a 27" 1440p 10-bit panel, so the added color depth on a native panel would only make it look that much clearer and sharper.
 
It should be 28.943" for the ideal pixel pitch. The problem is that it's going to be way too small: the display would be only 28.6 cm tall, versus the 33.6 cm of a 16:9 27", which is already small in my experience. A 5K 21:9 panel at 144 ppi would be 38.6 cm tall. An 8K 21:9 panel at 192 ppi would be 42.9 cm tall and much wider than 16:9. Everything in front of you. That's what you can actually call a desktop.
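The pitch and height figures being traded in this thread can be sanity-checked with basic geometry (a quick sketch; `ppi` and `height_cm` are illustrative helper names, not from any post):

```python
import math

# Pixel density from resolution and diagonal size (inches),
# and physical panel height (cm) from vertical pixels and density.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

def height_cm(height_px, density_ppi):
    return height_px / density_ppi * 2.54

# A 38.6" 5120x2160 panel works out to ~144 ppi, as quoted above.
print(round(ppi(5120, 2160, 38.6)))        # ~144

# A 28.943" 5120x2160 panel lands on ~192 ppi and is ~28.6 cm tall.
d = ppi(5120, 2160, 28.943)
print(round(d), round(height_cm(2160, d), 1))
```

The same two functions reproduce the other figures in the thread, e.g. a 43.4" 7680x3240 panel also comes out at roughly 192 ppi.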
 
I am going to start saving from now. I will definitely be getting RDNA3. If the multi-GPU rumor is true I will definitely be getting one, as I miss CrossFire.
The return of multi-GPU is going to be a rocky one. Unless it is baked into the API with zero effort on the devs' end, support is going to be abysmal.
 