Sunday, October 18th 2020

AMD Navi 21 XT Seemingly Confirmed to Run at ~2.3, 2.4 GHz Clock, 250 W+

AMD's RDNA2-based cards are just around the corner, with the company's full debut of the secrecy-shrouded cards set for October 28th. Rumors of high clocks on AMD's new architecture - unsubstantiated up to now - have seemingly been confirmed, with Patrick Schur posting on Twitter some specifications for the upcoming RDNA2-based Navi 21 XT. Navi 21 XT is carved from the Big Navi chip, but likely isn't the top performer from AMD - the company is allegedly working on a Navi 21 XTX solution, which ought to be exclusive to their reference designs, with higher clocks and possibly more CUs.

The specs outed by Patrick are promising, to say the least; that AMD's Big Navi can reach clocks in excess of 2.4 GHz with a 250 W+ TGP (quoted at around 255 W) is certainly good news. The 2.4 GHz (game clock) speeds are being associated with AIB cards; AMD's own reference designs should run at a more conservative 2.3 GHz. A memory pool of 16 GB GDDR6 has also been confirmed. AMD's assault on the NVIDIA 30-series lineup should comprise three models carved from the Navi 21 chip - the higher-performance, AMD-exclusive XTX, the XT, and the lower-performance Navi 21 XL. All of these are expected to ship with the same 256-bit bus and 16 GB of GDDR6 memory, whilst taking advantage of AMD's (rumored, for now) Infinity Cache to make up for the lower memory speed and narrower bus. Hold on to your hats; the hype train is going full speed ahead, hopefully coming to a smooth stop on October 28th.
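For a rough sense of the math behind that claim, here is a back-of-the-envelope sketch; the per-pin speed and cache hit rate below are illustrative assumptions, not confirmed specs:

```python
# Back-of-the-envelope only: assumed GDDR6 per-pin speed and a
# hypothetical on-die cache hit rate, not confirmed AMD figures.
bus_width_bits = 256
data_rate_gbps = 16                      # assumed GDDR6 speed per pin

raw_gbs = bus_width_bits * data_rate_gbps / 8
print(f"raw bandwidth: {raw_gbs:.0f} GB/s")          # 512 GB/s

# If a large on-die cache serviced half of all memory requests, DRAM
# would only see the misses, roughly doubling the bandwidth the GPU
# effectively "feels".
hit_rate = 0.5                           # hypothetical
effective_gbs = raw_gbs / (1 - hit_rate)
print(f"effective bandwidth: {effective_gbs:.0f} GB/s")  # 1024 GB/s
```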
Sources: Patrick Schur @ Twitter, via Videocardz

229 Comments on AMD Navi 21 XT Seemingly Confirmed to Run at ~2.3, 2.4 GHz Clock, 250 W+

#126
nguyen
INSTG8R: Funny you try to cast off FidelityFX as so simple - it's a little bit more than that - but guess what, it looks good compared to your precious DLSS, and looks better doing it, for all your crude description of what it's doing; your tech is rendering at a lower res then upscaling, with that fancy AI working hard on the fly to hide what would otherwise look like crap. Apparently I just need to make a custom res and mine looks just as good and performs as well or better, and apparently I can apply it to any game I want... Sounds to me like FidelityFX should be added to more games, being pretty easy to do. See, the weakness of DLSS is that it's constantly having to keep the illusion up on the fly, and you can see where it struggles to keep up with fast changes; FidelityFX shows no such visual "artifacts".
I could already tell which one was FidelityFX 5 seconds into your vid without looking at the title bar; it looks like shit :D. See the big rock?
And this is with YT compression, IRL the difference is probably massive.

I would rather just lower the detail settings than use FidelityFX tbh, it looks too blurry.
Posted on Reply
#127
INSTG8R
Vanguard Beta Tester
nguyen: I could already tell which one was FidelityFX 5 seconds into your vid without looking at the title bar; it looks like shit :D. See the big rock?
And this is with YT compression, IRL the difference is probably massive.
Well, you say it's "simple", but it's good enough to compare to your fancy system, which can equally go shitty when it can't keep up with fast-changing scenes and textures. I'm not gonna dig through the videos to find the one with literally shimmering lines in the air and defenders saying it's just part of the game. If FidelityFX can look more than half as good as DLSS with just "custom res and GPU scaling", that kinda makes your precious DLSS a lot of high-tech trickery, when AMD can do much the same thing with just "simple settings". Oh, and you also said I could do it with any game. Score another point for AMD's simple solution.
Leave it to Nvidia to bring a backhoe to a job when all AMD needs to bring is a shovel. Might take a little longer, but it's still gonna dig the same hole.
Posted on Reply
#128
nguyen
INSTG8R: Well, you say it's "simple", but it's good enough to compare to your fancy system, which can equally go shitty when it can't keep up with fast-changing scenes and textures. I'm not gonna dig through the videos to find the one with literally shimmering lines in the air and defenders saying it's just part of the game. If FidelityFX can look more than half as good as DLSS with just "custom res and GPU scaling", that kinda makes your precious DLSS a lot of high-tech trickery, when AMD can do much the same thing with just "simple settings". Oh, and you also said I could do it with any game. Score another point for AMD's simple solution.
Yeah, sure, a solution that can be discerned within 5 seconds in a blind test is "good enough".
I don't really agree with that.
That is like saying to play with low settings, which I would rather do instead of using FidelityFX if I didn't have DLSS and needed more FPS.
Posted on Reply
#129
INSTG8R
Vanguard Beta Tester
nguyen: Yeah, sure, a solution that can be discerned within 5 seconds in a blind test is "good enough".
I don't really agree with that.
That is like saying to play with low settings, which I would rather do instead of using FidelityFX if I didn't have DLSS and needed more FPS.
LOL, the entire basis of DLSS is "low settings" upscaled and cleverly trying to cover it up... you have way too much of a hard-on for basically playing games rendered at 720p made to look like 4K... At least I played Death Stranding somewhere close to my native res; I didn't in fact use any AA, and despite your sensitive eyes it looked fantastic maxed out and ran fantastic as well. I even got HDR.
Posted on Reply
#130
nguyen
INSTG8R: LOL, the entire basis of DLSS is "low settings" upscaled and cleverly trying to cover it up... you have way too much of a hard-on for basically playing games rendered at 720p made to look like 4K... At least I played Death Stranding somewhere close to my native res; I didn't in fact use any AA, and despite your sensitive eyes it looked fantastic maxed out and ran fantastic as well. I even got HDR.
Idk why you are so against innovation.
DLSS is not an upscaling technique; it is an image reconstruction technique. Basically, Nvidia trains the AI network on what objects look like at 16K; in-game, the Tensor cores recognize those low-res objects and reconstruct them to imitate 16K images.
It is the "work smarter, not harder" motto.

If the end results are equal, why do you care so much how it is done?
Do you care if a race car only has a V6 engine but is just as fast as those with a V8 or V12?
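To make the point of contention concrete, here is a toy sketch of the render-low-then-reconstruct idea; this is my illustration only, and the "network" is a nearest-neighbour stand-in, nothing like NVIDIA's actual model:

```python
import numpy as np

def render(height, width, seed=0):
    # Stand-in for the game renderer: produces an RGB frame.
    rng = np.random.default_rng(seed)
    return rng.random((height, width, 3), dtype=np.float32)

def learned_upscale(frame, scale):
    # Stand-in for the trained network. Here it is plain nearest-neighbour
    # repetition; the real model would instead infer plausible high-res
    # detail learned from very-high-resolution training images.
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

low = render(720, 1280)          # render cheaply at 720p...
high = learned_upscale(low, 3)   # ...and produce a 2160p frame
print(high.shape)                # (2160, 3840, 3)
```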
Posted on Reply
#131
INSTG8R
Vanguard Beta Tester
It's lowering the resolution to a more playable level, then upscaling it to your desired resolution, with the AI just working hard to keep it looking good. I think it's quite clever, but DLSS 1 made the technique pretty obvious; to me it's meant for that poor guy who wants to game on his crappy laptop at a decent frame rate. And frankly, I'm not sure why DLSS got added to Death Stranding; I was running it maxed out at 1440p getting 120-144 FPS most of the time, and it ran and looked amazing. My favourite game I've played this year. Maybe PCIe 4.0 was giving me an advantage ;)
But yes, I much prefer pure GPU grunt to get my FPS. While I did have FidelityFX on, I'm not sure how much difference it made for performance.
Posted on Reply
#133
nguyen
INSTG8R: It's lowering the resolution to a more playable level, then upscaling it to your desired resolution, with the AI just working hard to keep it looking good. I think it's quite clever, but DLSS 1 made the technique pretty obvious; to me it's meant for that poor guy who wants to game on his crappy laptop at a decent frame rate. And frankly, I'm not sure why DLSS got added to Death Stranding; I was running it maxed out at 1440p getting 120-144 FPS most of the time, and it ran and looked amazing. My favourite game I've played this year. Maybe PCIe 4.0 was giving me an advantage ;)
But yes, I much prefer pure GPU grunt to get my FPS. While I did have FidelityFX on, I'm not sure how much difference it made for performance.
Yes, sure, so my crappy laptop with a 2070 Super Max-Q can enjoy higher IQ and FPS than your desktop with a 5700 XT; sounds good to me :D.
Anyways, we can talk again after the CP2077 release :roll:
Posted on Reply
#134
INSTG8R
Vanguard Beta Tester
Bought it almost a year and a half ago now. I have a brand-new, empty 1 TB NVMe drive just waiting for CP2077.
Posted on Reply
#135
nguyen
Same, bought it on GOG too, so more money for CDPR.
Posted on Reply
#136
INSTG8R
Vanguard Beta Tester
nguyen: Same, bought it on GOG too, so more money for CDPR.
Yeah, I even got a 40% Displate discount, so I grabbed a cool Star Wars poster with it. Could have taken a better pic...
Posted on Reply
#137
jigar2speed
Flanker: If RDNA2 can do what the HD 4xxx series did at the time, I think that will be good news for consumers.
The HD 4xxx series was an uppercut Nvidia never saw coming, and it forced a series of re-releases of the same SoC.
Posted on Reply
#138
renz496
INSTG8R: Well, this is definitely their chance, with Nvidia's supply issues and inflated prices. If Big Navi is truly competitive and available, they have a chance to really take some market share back.
They probably will gain some if they don't have supply issues themselves, but this problem is only temporary, most likely one quarter at most. When things get back to normal, Nvidia will probably start gaining their market share back.
Posted on Reply
#139
INSTG8R
Vanguard Beta Tester
renz496: They probably will gain some if they don't have supply issues themselves, but this problem is only temporary, most likely one quarter at most. When things get back to normal, Nvidia will probably start gaining their market share back.
Exactly. They have one shot at this; they had better be ready with a great product and the supply to back it up while Nvidia fumbles to fill literally thousands of orders.
Posted on Reply
#140
Max(IT)
Fabio: Why do you say the 5700 XT was a fiasco?
A lot of driver-related issues (the situation is better now, but not completely solved) and far from stellar sales.
And the lack of RT made it "old" from the beginning...
INSTG8R: Well, it did have a rough launch driver-wise, so the first month or so wasn't great, but fiasco is a bit of an exaggeration.
Launch drivers? Drivers were terrible until last May, and even today they are far from stable.
Totally: Fiasco? Can you help my memory?

Did the cards crash to desktop frequently when gaming or pushed hard?
Were the cards sold out day 1, hour 1, minute 1 because of extremely limited supply, or non-existent supply in some places?
Were they very power hungry?
I don't want to argue with a fanboy, which you clearly are, so I will cut it short.
I have NO brand loyalty at all.
I have a PC with a Ryzen 3900X and another PC with a 5700 XT, and the graphics card is terrible. Performance and price were good, but the drivers made me mad for a whole year. Even today, with the situation improved, I sometimes experience black screens and freezes.

I am going to give AMD another chance with the new GPU, if it turns out to be a good product, but they have to make it better than the crappy 5700 XT...
Posted on Reply
#141
INSTG8R
Vanguard Beta Tester
Max(IT): A lot of driver-related issues (the situation is better now, but not completely solved) and far from stellar sales.
And the lack of RT made it "old" from the beginning...

Launch drivers? Drivers were terrible until last May, and even today they are far from stable.
I would tend to disagree, considering what I do on the side. Any specific issues you'd like to raise? If I can confirm or repro it, I can get it looked at. But I literally run every driver and then some, and I can't think of any issue affecting stability. The only "big one" for me is that Enhanced Sync hasn't worked reliably for quite some time and little has been done to fix it. When combined with FreeSync it was a perfect combo.
Posted on Reply
#142
Max(IT)
INSTG8R: I would tend to disagree, considering what I do on the side. Any specific issues you'd like to raise? If I can confirm or repro it, I can get it looked at. But I literally run every driver and then some, and I can't think of any issue affecting stability. The only "big one" for me is that Enhanced Sync hasn't worked reliably for quite some time and little has been done to fix it. When combined with FreeSync it was a perfect combo.
The web is literally FULL of people complaining about 5700 XT freezes, and if you deny it I will automatically put you on the "fanboy list". In my long experience I have learned that AMD fanboys are the worst on the web, by far. They are in complete denial. No point in arguing with them.

To be crystal clear, I have been a big AMD supporter since the beginning (since the AMD K6 200), and I loved a lot of the ATI cards I bought in the past (the 9700 Pro and 9800 Pro being my all-time favorites), but that doesn't tie me to a brand no matter what. My main PC is Ryzen-based and I'm building another Ryzen-based one for my little son too.
I bought a 5700 XT because it had a good price (~160€ less than the 2070 Super at the time) and promised good performance. But it gave me a lot of issues; I even returned one (hoping it was defective) and restored the PC several times trying to solve them, with no success.
Now I'm waiting for this new generation to get rid of this crappy card and buy something else.

If your experience was better, I'm happy for you. But I'm certainly not alone on the web regarding 5700 XT issues...
Posted on Reply
#143
DonKnotts
ZoneDymo: Like, if Nvidia and AMD both had a card that performed identically and had the same power consumption etc., then personally, atm I would probably go for Nvidia purely for that well-done NVENC, which AMD has no answer for as of yet.
Same thing with CUDA. A lot of the things I use my GPU for besides gaming will definitely benefit more from an Nvidia card than an AMD card, so AMD needs to add value by lowering their price.
Posted on Reply
#144
medi01
INSTG8R: Well, that's usually what AMD does historically. But if they really have a 3080 contender, who knows how pricing will go. It's a pretty conventional card, no exotic HBM etc.
There is absolutely no reason for AMD to price a 3080-performance chip below $699, as the 3080 is nothing but a placeholder with no availability for months to come (and, I suspect, until a new arch comes from NV).
Max(IT): the web is literally FULL of people complaining
Chuckle.
Posted on Reply
#145
Vya Domus
nguyen: I could already tell which one was FidelityFX 5 seconds into your vid without looking at the title bar; it looks like shit :D. See the big rock?
Suuuuuuuure.
nguyen: DLSS is not an upscaling technique; it is an image reconstruction technique.
You are wrong; it's an upscaling algorithm. Reconstruction happens when the image is missing information, or portions of it are marked as unusable (for instance due to noise), and you fill in those gaps; that's not what DLSS does. DLSS takes a fully rendered image and upscales it. Checkerboarding, for instance, is a reconstruction algorithm because the initial image is missing half of its pixel columns.
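A toy numpy illustration of that distinction (my own example, not any vendor's code): upscaling starts from a complete lower-resolution image, while reconstruction fills in pixels that were never rendered at all.

```python
import numpy as np

full = np.arange(16, dtype=np.float32).reshape(4, 4)   # a tiny "frame"

# Upscaling: start from a complete but smaller image and enlarge it.
small = full[::2, ::2]                         # fully rendered, half size
upscaled = small.repeat(2, axis=0).repeat(2, axis=1)

# Checkerboard reconstruction: the image is full size, but half the
# pixels were never rendered; fill each gap from a rendered neighbour.
checker = full.copy()
missing = (np.indices(checker.shape).sum(axis=0) % 2).astype(bool)
checker[missing] = np.nan                      # the never-rendered half
rows, cols = np.where(np.isnan(checker))
checker[rows, cols] = checker[rows, (cols + 1) % checker.shape[1]]

print(upscaled)   # complete input, stretched
print(checker)    # gaps filled in after the fact
```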
Posted on Reply
#146
medi01
renz496: AMD should understand that waging a price war with Nvidia for a decade will only reduce their profit margin further rather than steal more of Nvidia's market share.
Yeah, AMD, which is used to 40% (and below) margins, should be afraid of NV, which is used to 60%+.
How could AMD possibly take on Huang, lol.
nguyen: Also HUB refuses to include DLSS 2.0 results
How dare they not run the game at a lower resolution and then claim the upscaled one.

I've seen Digital Totally Not Shills Getting Exclusive Favourable Previews For Some Strange Reason Foundry telling me that if I looked at my monitor from another room, I would barely notice it is upscaled! :D
Posted on Reply
#147
nguyen
Vya Domus: Suuuuuuuure.
Need more circles?
Some people just have better visual acuity, you know :D
Posted on Reply
#148
Vya Domus
nguyen: Need more circles?
Yes, and a static image please. Are you seriously screenshotting frames from a compressed YouTube video that aren't even in sync? :roll:

You're something else, man.
Posted on Reply
#149
medi01
nguyen: DLSS is not an upscaling technique; it is an image reconstruction technique. Basically, Nvidia trains the AI network on what objects look like at 16K; in-game, the Tensor cores recognize those low-res objects and reconstruct them to imitate 16K images.
Oh, you sweet summer child, is there anything about this tech that you got right...
Posted on Reply
#150
Zach_01
medi01: Yeah, AMD, which is used to 40% (and below) margins, should be afraid of NV, which is used to 60%+.
How could AMD possibly take on Huang, lol.
The 3080 10 GB does not have a 60% margin... it's far less, and that is why Nvidia used it as a placeholder and marketing, never intending to sell it widely.
Just wait; you will see the real cards shortly...
Posted on Reply