
AMD RDNA 2 "Big Navi" to Feature 12 GB and 16 GB VRAM Configurations

2019 LG OLEDs won't work with Big Navi

2019 LG OLEDs support HDMI VRR, which Big Navi is highly unlikely NOT to support. Please don't spread FUD, user who registered today just to post that strange message.
 
If you think a $700 card that's on par with a 2080 is mid-range, then you're going to have to wait until the aliens visit to get high end.

The price tag itself doesn't tell the whole story. In this case, it's super overpriced because of several factors, but yes, on 7nm technology, that is mid-range performance.
RTX 2080 is 16nm/12nm.
 
I don't care about the language; personal blogs, yours or otherwise, are not sources, they are opinions. The blog itself appears to be lower quality than WCCFtech...
LG has been pretty clear on this for a while now. The G-sync implementation on these TVs was a bespoke implementation, not FS-over-HDMI, and does not work on AMD cards. They have also confirmed publicly that FreeSync support will not be added to them in the future. What isn't quite clear is whether this means that future HDMI 2.1-equipped AMD cards will also be out of luck, or if these are compatible through HDMI VRR - because it isn't clear whether these TVs fully support HDMI VRR. It could go either way depending on how interested LG is in providing this service to existing users, given that their new 2020 models are advertised as working with GPUs from both vendors.
 
Thanks Valantar. Sorry to disappoint you, Patriot and medi01.
 
LG has been pretty clear on this for a while now. The G-sync implementation on these TVs was a bespoke implementation, not FS-over-HDMI, and does not work on AMD cards. They have also confirmed publicly that FreeSync support will not be added to them in the future. What isn't quite clear is whether this means that future HDMI 2.1-equipped AMD cards will also be out of luck, or if these are compatible through HDMI VRR - because it isn't clear whether these TVs fully support HDMI VRR. It could go either way depending on how interested LG is in providing this service to existing users, given that their new 2020 models are advertised as working with GPUs from both vendors.
Ah more TVs that claim specs that they don't actually support, cool cool.

Sorry for jumping on you, new German blog poster; I read a quoted message that was truncated from your original, which greatly changed the meaning. "Will not function" is different from "will not support VRR universally".
Please use real sources that cite things, not blogs...

https://www.thefpsreview.com/2020/08/10/lgs-2019-oled-tvs-arent-getting-amd-freesync/ Notice how they link their source?
 
Looking at the 250 mm² 5700/XT vs the 2070/2060 and Supers, seriously?
Are you really going to use die size to equate two products on completely different process nodes? Like, really?
/Facepalm.

Navi10 is great, but it's all about being a cheap, good-enough part for AMD. At 10.3 billion transistors, its closest Nvidia relative is the original 2070 (full-fat TU106), which has 10.8 billion transistors.

They trade blows, but I reckon the 5700XT wins more than it loses and is about 10% ahead. That sounds about right: if you look at the various reviews around the web, the 5700XT is a little bit faster than the 2070, and how much faster depends on the game selection tested.

Here's the thing(s) though:
  • AMD has the process node advantage; 7nm vs 12nm
  • AMD has the clock frequency advantage; ~1905MHz vs 1620MHz
  • AMD has the shader count advantage; 2560 vs 2304
  • AMD needs 30% more power, despite the more efficient node; 225W vs 175W
  • AMD uses all 10.3bn transistors without tensor cores or raytracing support; TU106's 10.8bn transistors include all that.

So yeah, Nvidia has the architectural advantage. If you took the exact same specs that Navi10 has and made a 7nm TU106 part with 2560 CUDA cores and let it use 225W, it would stomp all over the 5700XT. Oh, and it would still have DLSS and hardware raytracing support that Navi10 lacks.
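Just to put rough numbers on that list, here's a quick back-of-the-envelope sketch. The figures are the official spec-sheet numbers from the bullets above; the ~10% performance lead is my own estimate from reviews, so treat the perf/W line as illustrative only:

```python
# Quick comparison of the official spec-sheet numbers quoted above
# (RX 5700 XT vs. RTX 2070 / full TU106).
specs = {
    "RX 5700 XT": {"boost_mhz": 1905, "shaders": 2560, "board_power_w": 225},
    "RTX 2070":   {"boost_mhz": 1620, "shaders": 2304, "board_power_w": 175},
}
amd, nv = specs["RX 5700 XT"], specs["RTX 2070"]

print(f"Clock advantage (AMD):   {amd['boost_mhz'] / nv['boost_mhz'] - 1:+.1%}")   # ~+17.6%
print(f"Shader advantage (AMD):  {amd['shaders'] / nv['shaders'] - 1:+.1%}")       # ~+11.1%
print(f"Extra board power (AMD): {amd['board_power_w'] / nv['board_power_w'] - 1:+.1%}")  # ~+28.6%

# Assuming the ~10% average performance lead for the 5700 XT mentioned above,
# relative performance-per-watt works out roughly like this:
rel_perf = 1.10
perf_per_watt_ratio = (rel_perf / amd["board_power_w"]) / (1.0 / nv["board_power_w"])
print(f"Perf/W, 5700 XT vs 2070: {perf_per_watt_ratio:.2f}x")  # ~0.86x
```

Even with the node, clock and shader advantages, the 5700XT lands at roughly 0.86x the 2070's performance per watt under that assumption, which is the architectural gap in a nutshell.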
 
I thought that every RTX 2080 Ti card can hit 2000-2100 MHz.
Where did I mention a 2080Ti? You need to read, then think, then comment ;)

The big video of 2070 vs 5700XT was the first clue, but also all the specs in that bullet-point list are 2070 specs. There are three mentions of TU106 and two mentions of 2070.

I picked the 2070 because it's the closest price/transistor count match for navi10 and is also the fully enabled silicon, not a chopped down variant like the 5600XT or 2060S.
 
2080 Ti Owners, what are your overclock settings? (Core clock/memory clock)

Pretty much every card hits 2000-2100 MHz overclocked. So just raise your power limit to whatever maximum your card allows, add anywhere from 100-200 MHz or so depending on your card to reach the aforementioned range, and you're good to go.
Pretty much every card also takes roughly the same memory overclock, anywhere from +500 to +800 MHz.
FE cards just need the fan RPM turned up a bit higher, but they reach the same clocks as the rest.
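If you want to sanity-check that the card actually holds that range under load, here's a minimal monitoring sketch, assuming the nvidia-ml-py/pynvml Python bindings are installed. It only reads clocks and power; the offsets themselves still get applied in Afterburner or whatever OC tool you use, and the target range is just the one mentioned above:

```python
# Minimal monitoring sketch (assumes pynvml / nvidia-ml-py is installed).
# Reads the current graphics clock, power draw and power limit of GPU 0.
import time
import pynvml

TARGET_RANGE_MHZ = (2000, 2100)  # the range mentioned in the post above

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    for _ in range(10):  # sample for ~10 seconds while a game/benchmark runs
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)  # MHz
        power = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0                    # W
        limit = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000.0          # W
        in_range = TARGET_RANGE_MHZ[0] <= clock <= TARGET_RANGE_MHZ[1]
        print(f"core {clock} MHz ({'in' if in_range else 'out of'} target range), "
              f"power {power:.0f}/{limit:.0f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```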


Where did I mention a 2080Ti? You need to read, then think, then comment ;)

If the largest chip with the highest power consumption can hit over 2 GHz with ease, then the smaller chips should manage at least the same.
 
If the largest chip with the highest power consumption can hit over 2 GHz with ease, then the smaller chips should manage at least the same.
What?
 
The price tag itself doesn't tell the whole story. In this case, it's super overpriced because of several factors, but yes, on 7nm technology, that is mid-range performance.
RTX 2080 is 16nm/12nm.

Okay, that makes much more sense in that context, though the 5000 series did not exist until the Radeon VII was discontinued. It was high end for what it was: Vega at 7nm.
 
Unless you like to play at 4K.
Or lots of mip-map levels or lots of texture varieties in the same scene. Photogrammetry is the new mega texture.
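As a rough illustration of how quickly unique texture sets add up (my own example numbers, not from any actual game): a full mip chain costs about a third extra on top of the base level, and photogrammetry-style assets mean very few textures are shared between materials.

```python
# Rough texture-memory estimate (illustrative numbers only).
# A full mip chain adds roughly 1/3 on top of the base level: 1 + 1/4 + 1/16 + ... -> 4/3.

def texture_bytes(width, height, bytes_per_texel, mip_chain=True):
    """Approximate VRAM footprint of one texture in bytes."""
    base = width * height * bytes_per_texel
    return base * 4 / 3 if mip_chain else base

unique_materials = 200   # hypothetical scene with 200 unique material sets
maps_per_material = 3    # e.g. albedo + normal + roughness/metalness
per_map = texture_bytes(4096, 4096, 1)  # 4K texture, ~1 byte/texel block-compressed

total_gib = unique_materials * maps_per_material * per_map / 2**30
print(f"~{total_gib:.1f} GiB of texture data for this hypothetical scene")  # ~12.5 GiB
```

Numbers like that are exactly why the 12 GB and 16 GB configurations in the article don't look so silly for texture-heavy scenes.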
 
Or lots of mip-map levels or lots of texture varieties in the same scene. Photogrammetry is the new mega texture.
Exactly why I bought a 1440p monitor for gaming. The jump in picture quality to 4K is not as drastic as going from 1080p to 1440p, but 4K eats GPUs for breakfast and lunch.
 
I thought that every RTX 2080 Ti card can hit 2000-2100 MHz.
It probably can, but WTF does that have to do with my point? I wasn't even replying to you - you've just successfully trolled the discussion with your inability to understand English.

If the largest chip with the highest power consumption can hit over 2 GHz with ease, then the smaller chips should manage at least the same.
What?
Do you even have a clue what you're talking about? Are you suggesting that a 10900K can reach 5.3GHz so the i3-10100 should be able to as well?
Are you sober right now, even?
 
It probably can, but WTF does that have to do with my point? I wasn't even replying to you - you've just successfully trolled the discussion with your inability to understand English.


What?
Do you even have a clue what you're talking about? Are you suggesting that a 10900K can reach 5.3GHz so the i3-10100 should be able to as well?
Are you sober right now, even?

You are wrong from every angle with regard to your claim that Radeons achieve higher clocks. That is some very serious bending of reality into imaginary "facts". :D
 
LG has been pretty clear on this for a while now. The G-sync implementation on these TVs was a bespoke implementation, not FS-over-HDMI, and does not work on AMD cards. They have also confirmed publicly that FreeSync support will not be added to them in the future. What isn't quite clear is whether this means that future HDMI 2.1-equipped AMD cards will also be out of luck, or if these are compatible through HDMI VRR - because it isn't clear whether these TVs fully support HDMI VRR. It could go either way depending on how interested LG is in providing this service to existing users, given that their new 2020 models are advertised as working with GPUs from both vendors.
LG OLED TVs do not have a bespoke G-Sync implementation. These have HDMI 2.1 and its VRR, which is a pretty standard thing (and not compatible with the bespoke FS-over-HDMI). Although no cards have HDMI 2.1 ports, Nvidia added support for some HDMI 2.1 features - in this context namely VRR - to some of their cards with HDMI 2.0. Nothing really prevents AMD from doing the same. FS-over-HDMI will not be added, but AMD can add VRR support the same way Nvidia did. And it will probably be branded as FreeSync something or other.

Not confirmed but I am willing to bet both next-gen GPUs will have HDMI 2.1 ports and VRR support.
 
You are wrong from every angle with regard to your claim that Radeons achieve higher clocks. That is some very serious bending of reality into imaginary "facts". :D
My claim? It's literally the official specs.

Sure, GeForce boost is more dynamic than AMD's; there are plenty of videos from mainstream channels like Jayz/GN/HW Unboxed reviewing 5700XT AIB cards with 2GHz+ game clocks at stock, though, so the point you're trying to make falls apart even as you're pushing it. So what? Clock speeds were only one of my points, and if you're going to argue with the official specs then your argument is with Nvidia, not me. You might want to take up all the AIB cards on their power consumption figures too, if you're in that sort of mood.

You've set up a straw man by introducing a 2080Ti, for no reason, into a 2070/5700XT discussion, and I'm not buying it.
 
Clock speeds? TPU has reviews.

Reference RX 5700XT - average 1887MHz. The best cards average around 2000MHz; the ASUS Strix is the only one that averages above that at 2007MHz, but a couple of others are very close.
RTX 2070 Founders Edition - average 1862MHz. There are fewer reviews, and average speeds mostly end up somewhere in the 19xx MHz range. The best card is Zotac's AMP Extreme with an average of 1987MHz.
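Quick arithmetic on those averages, just to show how small the gap is (numbers as quoted above):

```python
# Average in-game clocks as quoted above (MHz).
rx_5700xt_ref = 1887   # reference RX 5700 XT
rtx_2070_fe   = 1862   # RTX 2070 Founders Edition
best_5700xt   = 2007   # ASUS Strix
best_2070     = 1987   # Zotac AMP Extreme

print(f"Reference cards: {rx_5700xt_ref / rtx_2070_fe - 1:+.1%}")  # ~+1.3% for the 5700 XT
print(f"Best AIB cards:  {best_5700xt / best_2070 - 1:+.1%}")      # ~+1.0% for the 5700 XT
```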

Pretty even overall.
Performance and power consumption seem to be pretty much at the same level as well.

It may be worth noting that non-Super Turings are clocked relatively modestly to keep them in a power-efficient range.

In terms of shader units and other resources, the RX 5700XT is equal to the RTX 2070 Super, but the latter uses a bigger cut-down chip.
Similarly, in terms of shaders and resources the RX 5700 (non-XT) is really equal to the RTX 2070 (non-Super), but this time it's the RX 5700 that uses a bigger (more shader units and so on) cut-down chip.
 
I still think that, as a rule of thumb, the GeForces always achieve higher clocks.
This is my overall observation and general impression.

See 2.16 GHz on the RTX 2070 FE.

[Attached screenshot: RTX 2070 FE at 2.16 GHz]

 
Unless you like to play at 4K.
Even at 4K it's useless. Only one or two games can consume 11GB+ of VRAM; most modern games use around 4-8GB according to TechPowerUp's reviews.
Besides that, no AMD GPU can reach 4K 60 in modern games.
So RDNA 2's large VRAM is nonsense unless the people buying it are creators.
 