Sunday, October 18th 2020

AMD Navi 21 XT Seemingly Confirmed to Run at ~2.3, 2.4 GHz Clock, 250 W+

AMD's RDNA2-based cards are just around the corner, with the company's full debut of the secrecy-shrouded cards set for October 28th. Rumors of high clocks on AMD's new architecture - unsubstantiated until now - have seemingly been confirmed, with Patrick Schur posting on Twitter some specifications for the upcoming RDNA2-based Navi 21 XT. Navi 21 XT is based on the Big Navi chip, but likely isn't the top performer from AMD - the company is allegedly working on a Navi 21 XTX solution, which ought to be exclusive to its reference designs, with higher clocks and possibly more CUs.

The specs outed by Patrick are promising, to say the least; that AMD's Big Navi can reach clocks in excess of 2.4 GHz with a 250 W+ TGP (quoted at around 255 W) is certainly good news. The 2.4 GHz (game clock) speeds are being associated with AIB cards; AMD's own reference designs should run at a more conservative 2.3 GHz. A memory pool of 16 GB of GDDR6 has also been confirmed. AMD's assault on the NVIDIA 30-series lineup should comprise three models carved from the Navi 21 chip - the higher-performance, AMD-exclusive XTX, the XT, and the lower-performance Navi 21 XL. All of these are expected to ship with the same 256-bit bus and 16 GB of GDDR6 memory, whilst taking advantage of AMD's (rumored, for now) Infinity Cache to make up for the lower memory speeds and narrower bus. Hold on to your hats; the hype train is going full speed ahead, hopefully coming to a smooth stop come October 28th.
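For context on why a 256-bit bus needs compensating, peak GDDR6 bandwidth is simply bus width times per-pin data rate; here is a minimal sketch, where the data rates are assumptions for illustration rather than confirmed specs:

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8) * data rate in Gbps per pin.
def mem_bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

# Assumed data rates, for illustration only:
print(mem_bandwidth_gbs(256, 16))  # 512.0 GB/s for a 256-bit GDDR6 setup
print(mem_bandwidth_gbs(320, 19))  # 760.0 GB/s for the 3080's 320-bit GDDR6X
```

That gap is the shortfall the rumored Infinity Cache would need to hide.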
Sources: Patrick Schur @ Twitter, via Videocardz

229 Comments on AMD Navi 21 XT Seemingly Confirmed to Run at ~2.3, 2.4 GHz Clock, 250 W+

#201
BoboOOZ
EarthDogYou may want to do the math on that... at least looking at TPUs results with Metro and Control, those results do not speak to your point (assuming my math was correct). I'd stop digging the hole you're in if I was you. :p
Meh, we're discussing; I discuss to learn things, and being wrong is a natural part of learning. I'm an adult, I don't need to be right all the time to feel good.
But I'll look again at those benchmarks and see if the numbers I remember are wrong.

Edit:

Just looked at this: the performance hit of RT+DLSS is the same on the 2080 Ti and the 3080 at 4K, as I remembered:
tpucdn.com/review/evga-geforce-rtx-3080-ftw3-ultra/images/control-rtx-3840-2160.png
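For clarity, the "performance hit" being compared here is just the relative FPS drop with a feature enabled versus disabled, so two cards can show the same hit at very different absolute frame rates; a minimal sketch with placeholder numbers (not TPU's actual results):

```python
# Relative performance hit from enabling a feature (RT, or RT+DLSS):
# the fractional FPS drop, independent of each card's absolute FPS.
def perf_hit(fps_off, fps_on):
    """Return the relative frame-rate drop, e.g. 0.40 means a 40% hit."""
    return 1 - fps_on / fps_off

# Placeholder numbers, for illustration only:
print(perf_hit(90, 54))  # 0.40 -> 40% hit on a faster card
print(perf_hit(60, 36))  # 0.40 -> same 40% hit on a slower card
```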
Posted on Reply
#202
EarthDog
BoboOOZMeh, we're discussing; I discuss to learn things, and being wrong is a natural part of learning. I'm an adult, I don't need to be right all the time to feel good.
But I'll look again at those benchmarks and see if the numbers I remember are wrong.
It is a discussion indeed. I was simply noting your record in this thread isn't stellar this morning. We all do it (are wrong sometimes)... but the goal is to get the correct information out so users can make their own decisions based on facts. Don't take it personally. :)
Posted on Reply
#203
BoboOOZ
EarthDogIt is a discussion indeed. I was simply noting your record in this thread isn't stellar this morning. We all do it (are wrong sometimes)... but the goal is to get the correct information out so users can make their own decisions based on facts. Don't take it personally. :)
I don't; I know some need to be right more than others, we're all different, that's fine.

The fact is that most next-gen games will be developed, tested and optimized for RDNA2 GPUs, and it would be naive to imagine this will have no influence.
Posted on Reply
#204
EarthDog
BoboOOZI don't; I know some need to be right more than others, we're all different, that's fine.

The fact is that most next-gen games will be developed, tested and optimized for RDNA2 GPUs, and it would be naive to imagine this will have no influence.
I'll ignore the passive aggressive barb there....:)

I used the 3080 reference review.... and just RT since you never mentioned DLSS until now. DLSS is different than RT, bud. Tough to hit a moving target. ;)
BoboOOZFrankly, if you look at the benchmark results with RT on and off between Turing and Ampere, the performance hit seems to be the same or worse for Ampere. The only relative performance increase seems to come for fully path-traced games, which are games with very light graphical requirements. For normal AAA games, the gains seem to come from raw performance increase.
Anyway, I digress. The point(s) has(have) been made and users can do what they please with the information out there. ;)
Posted on Reply
#205
BoboOOZ
EarthDogI'll ignore the passive aggressive barb there....:)
It wasn't directed at you in particular. It's most important for people who are upgrading in the next few months, and I'm one of them.
EarthDogI used the 3080 reference review.... and just RT since you never mentioned DLSS until now. DLSS is different than RT, bud. Tough to hit a moving target. ;)
I used this review because it has the latest drivers, since you had pointed out that my info was stale. Also, reviewers turn DLSS and RT on together because otherwise framerates are still too low, so it's normal to look at them like that. Anyway, are you trying to imply that, in fact, DLSS performance has dropped while RT performance has increased with Ampere? Or do you have numbers that show a different performance hit in Control than what I pointed out? Please share, and don't worry about statistics on how many times somebody was right or wrong this morning; we might just learn something new in the process.
Posted on Reply
#206
Zach_01
I’m not going to spend time and space on the specifics, but from what I've seen so far from reviews, the Ampere architecture is improved over Turing in all aspects. But most of the improvements are marginal. It's mostly the larger number of FP processing units that makes the 3080 at best ~32% faster than the 2080 Ti.

What bothers me is that nVidia's advertisement and presentation make it look like a leap forward, when it certainly is not.
Posted on Reply
#207
EarthDog
BoboOOZIt wasn't directed at you in particular. It's most important for people who are upgrading in the next few months, and I'm one of them.

I used this review because it has the latest drivers, since you had pointed out that my info was stale. Also, reviewers turn DLSS and RT on together because otherwise framerates are still too low, so it's normal to look at them like that. Anyway, are you trying to imply that, in fact, DLSS performance has dropped while RT performance has increased with Ampere? Or do you have numbers that show a different performance hit in Control than what I pointed out? Please share, and don't worry about statistics on how many times somebody was right or wrong this morning; we might just learn something new in the process.
I just answered your question as you asked it. Moving the goalposts from there was on you. A 3080 runs over 100 FPS in Metro and almost 60 FPS in Control without DLSS (1440p)... just saying. If you go up to 4K, it is needed in some titles, but who's talking about 4K and the 2% of users that game at it (according to Steam)? Not me. I specifically mentioned 1440p (a good balance between number of users and a GPU-heavy resolution).

I'm not implying, nor did I infer (or straight up say), anything about DLSS. That was you. Please feel free to look at the results in the TPU 3080/3090 review of your choice... again, DLSS was never mentioned by me. Please, take the time to do that. I did the math and came up with a different answer... others support that assertion. The onus isn't on me; the math is there in the review I mentioned. I don't see it any different in the latest review (so long as you stay in the lane of RT/1440p/no DLSS), considering all 3080 reviews here at TPU were on 9/16 or 9/17 (according to Google). Again, DLSS isn't RT, so I'm not sure why that is even being brought up at this point.

PS - It holds true in 4K on your review. ;)
Posted on Reply
#208
Blueberries
TPU... where news articles are based entirely on a since-deleted tweet from some random nobody with 100 followers, posting random specs with no evidence or explanation of where the numbers came from, followed by 10 pages of people arguing about literally nothing
Posted on Reply
#209
r9
Zach_01This is where you're wrong: thinking that the 80 CU GPU is 2x 5700XT in some kind of CrossFire. Understand that RDNA2 is a different architecture. An improved one... It's not how you think it is. The architecture will have better IPC, better performance/watt and at least 10% higher boost clocks. In raw numbers its performance is higher than 2x 5700XT. In real life it will be around 2x 5700XT, and that places it matching the 3080.


And you are wrong in your assumptions. Because when nVidia doesn't use something, it doesn't mean that it's without value. nVidia is not the ruler of tech nor the god of GPUs. nVidia can't use big caches right now because the architecture in use is compute-oriented, not game-oriented. It happens to do well in games, particularly at 4K. If you look, the gains over the 2080 Ti at 1080p/1440p are smaller.

AMD and nVidia are choosing different paths. That does not make one or the other better or worse. It will come down to users to choose what is best for them and what they want in the end.
Let's hope I'm wrong.
BoboOOZAMD does raytracing the way it will be done in all coming next-gen games, excepting a handful of games paid by Nvidia to do it RTX style. That's more than enough for me. It's raytracing that will actually work in most games.
It will be impossible to compare apples to apples with Nvidia's implementation. I hope enabling it on AMD won't cut the FPS in half like on Nvidia GPUs. Fun times ahead... can't wait for the reviews to come out! Usually the reviews put an end to this type of discussion, but this time I think it will be just the beginning. Lol
Posted on Reply
#210
BoboOOZ
r9It will be impossible to compare apples to apples with Nvidia's implementation. I hope enabling it on AMD won't cut the FPS in half like on Nvidia GPUs. Fun times ahead... can't wait for the reviews to come out! Usually the reviews put an end to this type of discussion, but this time I think it will be just the beginning. Lol
Discussions never end :p
Posted on Reply
#211
r9
EarthDog...you're inferring RT doesn't work with Nvidia? I'm confused as to what your point is here.

Don't they both use MS DXR?
That's a good point; if RT is enabled via DX we might get an apples-to-apples comparison after all.
Posted on Reply
#212
MarcingMarshmallow
cuemanInteresting. Looks like it can't beat the RTX 3080, so AMD dropped its TDP down. Those clocks sound high, very high.

Under 260 W and RTX 3080 speed? No way. Not 4K speed.
Keep in mind that this is only the GPU wattage; the whole card is estimated to be 300 watts, so the gap is much smaller.
Posted on Reply
#214
medi01
MarcingMarshmallowKeep in mind that this is only the GPU wattage; the whole card is estimated to be 300 watts, so the gap is much smaller.
Joined: Oct 19, 2020
Messages: 1 (1.00/day)

Thanks for registering to say that. Very helpful indeed.
Posted on Reply
#215
Totally
nguyenIt was benchmarked with a 3950X test rig with PCIe 4.0.
The 5700XT gains a few % with PCIe 4.0 and the 2070S loses a few % with the 3950X vs the 10900K; some games even run very poorly with the 3950X.

Also, HUB refuses to include DLSS 2.0 results, which RTX users will gladly use in any title that supports it.
Kinda unfair to use a feature available on one card (PCIe 4.0) but refuse to do the same with the other (DLSS).

Now that Cyberpunk 2077 is about to release, I wonder how 5700XT owners are gonna feel, though; imminent upgrade to Big Navi? The CP2077 hype train is going stronger and stronger every day :D
You make it sound as if the game only runs at 4k60 on max settings and is unplayable with anything less than that. :wtf:

Max(IT)a lot of driver related issues (the situation is better now, but not completely solved) and far from stellar sales.
And the lack of RT that made it "old" since the beginning...
As pointed out by many in this thread, a very small portion of users were actually affected, and after the fact many retracted their claims once they found out it was something else causing the issue, not the card or drivers; others were just hopping on the bandwagon and exaggerating things. The latter group is YOU.
launch drivers? Drivers were terrible until last May, and even today they are far from being very stable.
I'm not dismissing that there are issues but problems of others aren't mine.
I don't want to have an argument with a fanboy, which clearly you are, so I will cut it short. I have NO brand loyalty at all.
I'm a fanboy for disagreeing and pointing out your gross exaggeration? If that's what it takes to be a fanboy then I'm a fanboy. Vote me.
I have NO brand loyalty at all.
No one asked, no one cared, and how does that preclude you from being biased as hell? Sus, if you ask me.
I have a PC with a Ryzen 3900X and another PC with a 5700XT, and the VGA is terrible. Performance and price were good, but the drivers made me mad for a whole year. Even today, when the situation has improved, I sometimes experience black screens and freezes.

I am going to give AMD another chance with the new GPU, if it is a good product, but they have to make it better than the crappy 5700XT...
Says he had a crappy experience with a trash product, sticks with the same brand, and is accusing others of being a fanboy. I'm gonna vote fanboy.
Posted on Reply
#216
Max(IT)
TotallyYou make it sound as if the game only runs at 4k60 on max settings and is unplayable with anything less than that. :wtf:




As pointed out by many in this thread, a very small portion of users were actually affected, and after the fact many retracted their claims once they found out it was something else causing the issue, not the card or drivers; others were just hopping on the bandwagon and exaggerating things. The latter group is YOU.



I'm not dismissing that there are issues but problems of others aren't mine.



I'm a fanboy for disagreeing and pointing out your gross exaggeration? If that's what it takes to be a fanboy then I'm a fanboy. Vote me.



No one asked, no one cared, and how does that preclude you from being biased as hell? Sus, if you ask me.



Says he had a crappy experience with a trash product, sticks with the same brand, and is accusing others of being a fanboy. I'm gonna vote fanboy.
A small portion of users ... lol ...

As I said: a waste of time.
Posted on Reply
#217
DeeJay1001
BMfan80That would be awesome, I had a 4770 and when overclocked it was a beast of a card.
I have a screenshot of it back in the day with a FurMark score that is higher than a GTX 470's.

I'm waiting to see what the new cards can do, I would like to replace my 1080 Ti with one.
I also had a 4770; it was one of the first 4770s Newegg shipped out. It was my first real GPU and I've been hooked ever since. I put over 900 hours into COD MW2 on that card.

That thing still runs like a champ; my little brother plays Minecraft on it nearly every day.

That little 4770 has been pumping out frames and generating joy for over 10 years.
Posted on Reply
#218
r9
I don't know how you guys pick which one to buy, but the performance summary from Wizz's reviews has been my bible, cross-referenced with price/availability. And with AMD/ATi being the underdog throughout history, they've always undercut Nvidia on price, so I end up mostly with AMD/ATi cards.
Posted on Reply
#219
Totally
Max(IT)A small portion of users ... lol ...

As I said: a waste of time.
Sounds like you have actual concrete data on the number of cards sold vs. affected; care to share?
r9I don't know how you guys pick which one to buy, but the performance summary from Wizz's reviews has been my bible, cross-referenced with price/availability. And with AMD/ATi being the underdog throughout history, they've always undercut Nvidia on price, so I end up mostly with AMD/ATi cards.
same
Posted on Reply
#220
TheoneandonlyMrK
TotallySounds like you have actual concrete data on the number of cards sold vs. affected; care to share?



same
Seconded. Data, please.
Posted on Reply
#221
r9
theoneandonlymrkSeconded. Data, please.
Posted on Reply
#222
Aquinus
Resident Wat-man
RaevenlordThe specs outed by Patrick are promising, to say the least; that AMD's Big Navi can reach clocks in excess of 2.4 GHz with a 250 W+ TGP (quoted at around 255 W) is certainly good news.
I predict that "Big Navi" is going to be a 20-30% smaller die than GA102 if these power and performance figures are accurate. I guess we'll only have to wait a little more than a week to find out.
Posted on Reply
#223
Zach_01
AquinusI predict that "Big Navi" is going to be a 20-30% smaller die than GA102 if these power and performance figures are accurate. I guess we'll only have to wait a little more than a week to find out.
There is already a leaked picture of the big die, and it was estimated at around 536 mm². That is smaller than GA102 (628 mm²). Still, if all this info is true, the actual RDNA2 GPU will draw more power than GA102.
The two rival GPU cards may have the same total board power (~320 W), but let's not forget that the 3080 uses GDDR6X, which draws a hefty ~70 W versus the 20-30 W that "plain" GDDR6 draws.
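To put that memory-power point in numbers, here is a minimal back-of-the-envelope sketch; the ~30 W allowance for VRM losses, fans and other board components is my own assumption, and the card figures are rumored rather than official:

```python
# Rough split of total board power (TBP) into GPU die power vs. the rest.
def gpu_die_power(tbp_w, mem_w, board_overhead_w=30):
    """Estimate GPU die power as TBP minus memory power and an assumed
    allowance for VRM losses, fans and other board components."""
    return tbp_w - mem_w - board_overhead_w

# RTX 3080: ~320 W TBP with GDDR6X quoted at ~70 W.
# Navi 21 XT: ~320 W TBP (rumored) with GDDR6 quoted at ~20-30 W.
print(gpu_die_power(320, 70))  # ~220 W left for the GA102 die
print(gpu_die_power(320, 25))  # ~265 W left for the Navi 21 die
```

Under those assumptions the Navi 21 figure lands near the ~255 W TGP quoted in the article, which is the point about the gap being smaller than the raw board power suggests.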
Posted on Reply
#224
r9
Zach_01There is already a leaked picture of the big die, and it was estimated at around 536 mm². That is smaller than GA102 (628 mm²). Still, if all this info is true, the actual RDNA2 GPU will draw more power than GA102.
The two rival GPU cards may have the same total board power (~320 W), but let's not forget that the 3080 uses GDDR6X, which draws a hefty ~70 W versus the 20-30 W that "plain" GDDR6 draws.
Just to put something in perspective: a Ryzen 3800X, with a 74 mm² 7 nm die, sells for $399 and doesn't come with VRAM, a PCB or VRMs.
Big Navi, at 536 mm², will sell for ~$699 and will need a PCB, VRAM and VRMs. To that cost, add the waaay lower yield due to the muuuch higher die size.
In case people wonder why AMD is putting so much more effort into the CPU business.
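As a rough illustration of that yield point, here is a minimal sketch using the simple Poisson yield model Y = exp(-A * D0); the defect density below is an assumed, illustrative value, not a published foundry figure:

```python
from math import exp

# Poisson yield model: fraction of defect-free dies for a given die area.
def poisson_yield(area_mm2, d0_per_cm2=0.1):  # d0 is an assumed value
    return exp(-(area_mm2 / 100.0) * d0_per_cm2)

print(poisson_yield(74))   # ~0.93 -> roughly 93% good dies (Ryzen-sized die)
print(poisson_yield(536))  # ~0.59 -> roughly 59% good dies (Big Navi-sized die)
```

So the bigger die not only costs more wafer area per chip, it also loses a larger share of that area to defective dies.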
Posted on Reply
#225
lesovers
Max(IT)Yes, I have nothing to complain about regarding image quality (AMD/ATi are usually better on that).
I found the most stable drivers for me to be 20.4.2 (IIRC).
The black screen/freeze bug is very subtle: I can play the same game for weeks (literally) without any issue and then... bang! Three crashes in a row on the same day!
It is driving me crazy... :cry:

The situation was terrible in December through March, when the freezes were quite frequent, but now it is much better (it happens mostly in Warzone, but not very often).
I'm afraid to upgrade to the latest 20.9.2: I read about big performance improvements but also about some crashes.
Thanks, the 20.2.2 driver still gives me the best results so far, so this will be my final driver before purchasing the new Navi 6800XT or 6900XT.
Hope the new hardware just works like the RX 580 and has no driver bugs, please!
Posted on Reply