
AMD Radeon Navi 21 XTXH Variant Spotted, Another Flagship Graphics Card Incoming?

Joined
Oct 12, 2019
Messages
128 (0.08/day)
My strongest candidate is a laptop version, heavily cut down (NVIDIA is doing the same).

I strongly doubt any memory-related change unless it was engineered in from the start and already in the design. The reason is that they deliberately opted for GDDR6 (not the 'X' variant, not HBM of any kind) and built that large on-chip cache to compensate. More memory means more cache or a different bus, so different silicon. A different memory type would be more expensive, and the whole cache approach would then be of questionable value (I'd guess they tested this extensively already, so it's possible but not probable).
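
Some rough arithmetic behind that bus/capacity coupling (a sketch of my own, assuming standard 32-bit GDDR6 channels and 2GB chips, the densest commonly available at the time):

```python
# Why capacity is tied to bus width: each GDDR6 chip occupies one
# 32-bit channel. Assumed figures: 2 GB per chip (the common maximum
# density in early 2021); clamshell mode doubles the chips per channel
# without widening the bus.

def capacity_gb(bus_bits, gb_per_chip=2, clamshell=False):
    chips = bus_bits // 32 * (2 if clamshell else 1)
    return chips * gb_per_chip

print(capacity_gb(256))                  # 16 GB - Navi 21 as shipped
print(capacity_gb(256, clamshell=True))  # 32 GB - same die, twice the chips
print(capacity_gb(384))                  # 24 GB - wider bus, i.e. different silicon
```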

A related guess is that the top of the line is mostly finished; there may be better binning or something like that, plus a price correction (badly needed), while the whole RDNA2 train heads toward the (desperately) needed mid/low range, and APUs too. I don't know whether a lithography tweak like Zen to Zen+ is easy to do; if it is, then maybe that - but not NOW, maybe in Q4 2021...

Ampere and RDNA2 are new products with unfinished lineups; both companies will surely look to recoup their expensive R&D investments, so there is no chance anyone goes crazy and launches a new generation this year. AMD got what they wanted - similar to Zen 1 vs. Intel, not besting them in all scenarios, but being competitive and a bit cheaper. NVIDIA still has Jensen's stupid 'fastest overall' crown (they invested huge money in the past in similarly stupid cards that nobody sane bought), and what I dub the 'equally idiotic RT crown' (being a first adopter among buyers means either willing enthusiasm, supporting new tech because someone has to, OR being too rich or too uninformed, OR being desperate to play CP2077 or whichever four other games support it).

[Yeah, I was connected with or worked in that (rendering) field for three decades. I still read the news from professional/dedicated forums even now. It's a hobby these days, but my knowledge is still well above average, if I may say so.]

Like 70% of games have exactly ZERO need for any RT - f00k, look at all those pixel-art games (the worst possible example) people love to play so much. Strategy games. Simulations (except MS FS and the like). Let's face it, it's mainly FPS technology. What about cartoonish graphics? 2D? Both may look *somewhat* better with RT lights/shadows, but how big is the difference, and how much is it worth to the average (budget) player? Rasterized lights/shadows aren't catastrophically bad, and the low/mid market perhaps doesn't need it that much...

My (previously posted) initial opinion was that GOOD RT needs at least five more years. But some guy from the gaming industry said that real-time PHOTOREALISTIC gaming is ten years away; I guess he knows much better than I do. Also, RT is not equal to photorealism at all.

Also, who wants ALL games to be photorealistic? Not all players, for sure.

[I've skipped all the tech details. I wanted to write an easy-to-understand, general article about RT - but time...]

Back to the topic - NVIDIA has what they want now, and AMD does too. My opinion is that high-end improvements will come with better lithography - 6nm, 5nm, or smaller, and perhaps MCM - for both (and perhaps Intel, hahahaha).

It's not as if any GPU producer will invest soooo much THIS year - probably not next year either.

So, perhaps binning and relatively small improvements this year, with larger ones next year or even later... There. My opinion. For both.

Oh, and NVIDIA knows much more about RT (and photorealism) than they advertise right now. There is a good ebook about it on NVIDIA's site - yup, I've read it, and nothing in it disproves what I said here or before. F00k, they have top-level people in the field - it would be weird otherwise; RT has existed for longer than I've been in it... True photorealism likely requires VR, because you need to track eyeball movement to follow the focus, to name just one technical detail...
 
Joined
Oct 22, 2014
Messages
13,210 (3.81/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512GB NVMe
Display(s) AOC 24" FreeSync 1ms 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
XT - more power
XH - extra Hot ;)
 
Joined
Aug 12, 2020
Messages
23 (0.02/day)
Location
Germany
System Name Spirit
Processor i9-9900k@5GHz
Motherboard ASUS Maximus Hero XI (WiFi)
Cooling Arctic Liquid Freezer II 420
Memory G.Skill Trident Z RGB 3600MHz CL14 4x8GB
Video Card(s) RTX 3090 FE
Storage WD Black SN750 500GB / 2x WD Blue SN550 2TB / WD Blue 4TB (external) / DS220j (2x WD RED 4TB@Raid0)
Display(s) Asus VG27AQ / LG C9 55"
Case Be Quiet! Dark Base 700
Audio Device(s) Soundblaster X-FI Titanium @Toslink / DT990 Pro 600 Ohm / Kenwood 6400D / Teufel Kompakt 30 5.1
Power Supply Be Quiet! Dark Power Pro 11 850W
Mouse Steelseries Rival 500
Keyboard Steelseries Apex 7 brown switches
Software Win10 Pro
Three weeks ago I got a 6900 XT. After two weeks I gave up. I bought new cables and used DDU and CRU several times; it never worked properly. The card did not work with my LG C9: only 4K@60 worked. FreeSync didn't, VRR didn't, HDR didn't. 4K@120Hz@8bit+dithering worked for one evening, with black screens every 5 seconds and microstuttering at its finest. The next day the TV showed "No Signal" until I dropped down to 60Hz.

When I was playing on the TV, my monitor showed some weird flickering - maybe it's what's called FreeSync flickering. It is a Dell S3220DGF with FreeSync 2 HDR. FreeSync only worked in windowed mode; in fullscreen there was so much flickering that it was unusable. I was so hyped for AMD, and so disappointed with Nvidia - the paper launch, the customer politics (Hardware Unboxed), the CUDA-core "lying" - but with Nvidia everything works properly right from the start.

Perhaps the HDMI 2.1 port on the AMD card is the reason for this mess, because it is only 40Gbps while the LG C9 has the full 48Gbps. On the other hand, that shouldn't affect the S3220DGF over a DP 1.4 cable (which I had to buy to get 165Hz out of it, because Dell wanted to save 10 cents and ships only a DP 1.2 cable with this monitor).
Now I'm waiting for the 3080 Ti.
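
A back-of-the-envelope check of that bandwidth theory (my own arithmetic, with assumed CTA-861 4K120 timing and HDMI FRL's 16b/18b coding; DSC, if negotiated, changes the picture): plain 8-bit 4K120 fits a 40Gbps link, so raw bandwidth alone shouldn't explain the 8-bit black screens, while 10-bit RGB really does just overflow it.

```python
# Back-of-the-envelope HDMI 2.1 bandwidth check (my own arithmetic,
# not an official spec calculation). Assumes the standard CTA-861
# 4K120 timing of 4400 x 2250 total pixels including blanking;
# DSC compression is ignored.

FRL_CODING = 16 / 18  # usable payload fraction of the raw FRL link rate

def required_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Uncompressed video bit rate in Gbps for a given timing."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

def fits(link_gbps, needed_gbps):
    return needed_gbps <= link_gbps * FRL_CODING

for bpc, label in [(8, "8-bit RGB"), (10, "10-bit RGB")]:
    need = required_gbps(4400, 2250, 120, bpc * 3)
    print(f"4K120 {label}: ~{need:.1f} Gbps needed | "
          f"40G port: {fits(40, need)} | 48G port: {fits(48, need)}")
```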
 
Joined
Dec 28, 2012
Messages
3,475 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
The Navi 21 XTXHash edition, for all the miners out there!
 
Joined
Jan 8, 2009
Messages
548 (0.10/day)
System Name AMD RyZen PC
Processor AMD RyZen 5950x
Motherboard ASUS Crosshair VIII Hero 570x WIFI
Cooling Custom Loop
Memory 64GB G.Skill Trident Z DDR4 3200 MHz 14C x4
Video Card(s) Evga 3080 TI
Storage Seagate 8TB + 3TB + 4TB + 2TB external + 512 Samsung 980
Display(s) LG 4K 144Hz 27GN950-B
Case Thermaltake CA-1F8-00M1WN-02 Core X71 Tempered Glass Edition Black
Audio Device(s) XI-FI 8.1
Power Supply EVGA 700W
Mouse Microsoft
Keyboard Microsoft
Software Windows 10 x64 Pro
I have yet to find current-gen AMD offerings in either the CPU or GPU segment available to order at rated prices or close to them. It has just been a paper-launch year. From all companies. Very disappointed.
Yeah, it's so bad. I finally gave in and bought a 5950X for 1099, but I'm hoping I can get a 3080 Ti or Super for the actual price.
 
Joined
Dec 13, 2019
Messages
47 (0.03/day)
Either an HBM version, or the full-fat CU version with 7300+ cores? Hope so... good thing the scalpers emptied all that stock and stopped me from making the move on a new GPU... Great news
The Radeon™ RX 6900 XT already has the full Navi 21 chip.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360 EK extreme rad + 360 EK slim, all push; CPU EK Supremacy, GPU full-cover, all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung U28E850R 4K FreeSync / Dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
I'm expecting a higher-binned, hydro-cooled 6900 XT.
 
Joined
Jun 21, 2011
Messages
165 (0.04/day)
The H does not stand for Mobile when it comes to AMD. Their mobile lineup always has an M prefix when it comes to the GPU, so this could be an HBM 2.0 card.
For some reason a reply I gave has not shown up. Must've been a client-side issue, or a PEBKAC.

As I suggested in my first comment, I'm referring to the H-series CPUs. My deduction is that XTXH would be an XTX-class GPU coupled with an H-series CPU (i.e. high-performance mobile). For mobile parts AMD does add the "M" suffix to the retail branding, but XTX is not a retail name (it was, once, a long time ago, but not any more); XTX denotes the die family it belongs to.

At 27W I can't say it's a 3090-killer, as this looks like the power envelope for a mobile GPU.
 
Joined
Jan 29, 2016
Messages
128 (0.04/day)
System Name Ryzen 5800X-PC / RyzenITX (2nd system 5800X stock)
Processor AMD Ryzen 7 5800X (atx) / 5800X itx (soon one pc getting 5800X3D upgrade! ;)
Motherboard Gigabyte X570 AORUS MASTER (ATX) / X570 I Aorus Pro WiFi (ITX)
Cooling AMD Wraith Prism Cooler / Alpenföhn Black Ridge (ITX)
Memory OLOY 4000Mhz 16GB x 2 (32GB) DDR4 4000 Mhz CL18, (22,22,22,42) 1.40v AT & ITX PC's (2000 Fclk)
Video Card(s) AMD Radeon RX 6800 XT (ATX) /// AMD Radeon RX 6700 XT 12GB GDDR6 (ITX)
Storage (Sys)Sammy 970EVO 500GB & SabrentRocket 4.0+ 2TB (ATX) | SabrentRocket4.0+ 1TB NVMe (ITX)
Display(s) 30" Ultra-Wide 21:9 200Hz/AMD FREESYNC 200hz/144hz LED LCD Montior Connected Via Display Port (x2)
Case Lian Li Lancool II Mesh (ATX) / Velkase Velka 7 (ITX)
Audio Device(s) Realtek HD ALC1220 codec / Onboard HD Audio* (BOTH) w/ EQ settings
Power Supply 850w (Antec High-Current Gamer) HC-850 PSU (80+ gold certified) ATX) /650Watt Thermaltake SFX (ITX)
Mouse Logitech USB Wireless KB & MOUSE (Both Systems)
Keyboard Logitech USB Wireless KB & MOUSE (Both Systems)
VR HMD Oculus Quest 2 - 128GB - Standalone + Oculus link PC
Software Windows 10 Home x64bit 2400 /BOTH SYSTEMS
Benchmark Scores CPUZ - ATX-5800X (ST:670) - (MT: 6836.3 ) CPUZ - ITX -5800X (ST:680.2) - (MT: 7015.2) ??? same CPU?
Pretty sure Micron and Nvidia teamed up for GDDR6X. Not sure if it's open to others.
Pretty anti-competitive bullshit if you ask me. Nvidia/Micron should be PENALIZED BY THE FTC for this anti-competitive behavior. This is bullshit!! They can't keep this from other tech companies like AMD; this is anti-competitive behavior!!!

For some reason a reply I gave has not shown up. Must've been a client-side issue, or a PEBKAC.

As I suggested in my first comment, I'm referring to the H-series CPUs. My deduction is that XTXH would be an XTX-class GPU coupled with an H-series CPU (i.e. high-performance mobile). For mobile parts AMD does add the "M" suffix to the retail branding, but XTX is not a retail name (it was, once, a long time ago, but not any more); XTX denotes the die family it belongs to.

At 27W I can't say it's a 3090-killer, as this looks like the power envelope for a mobile GPU.
Besides, THE TOP XTX CHIP WOULDN'T BE USED FOR A MOBILE PLATFORM ANYWAY. NO WAY IN HELL.
 
Joined
Jul 13, 2016
Messages
2,828 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Pretty anti-competitive bullshit if you ask me. Nvidia/Micron should be PENALIZED BY THE FTC for this anti-competitive behavior. This is bullshit!! They can't keep this from other tech companies like AMD; this is anti-competitive behavior!!!


Besides, THE TOP XTX CHIP WOULDN'T BE USED FOR A MOBILE PLATFORM ANYWAY. NO WAY IN HELL.

It's pretty interesting, as Micron has been working on PAM4 signaling (the basis behind GDDR6X) since 2007. Most likely all Nvidia provided was help with the practical implementation and a promise to buy a ton of GDDR6X chips.

But hey, this is Micron we are talking about here. They've been caught being anti-competitive in the past. I very much doubt they care about screwing AMD over if Nvidia has promised to buy a ton of their chips.
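
For anyone who hasn't seen PAM4 explained: instead of the two voltage levels of the NRZ signaling regular GDDR6 uses (1 bit per symbol), it uses four levels to carry 2 bits per symbol, doubling per-pin throughput at the same symbol rate. A toy sketch of my own (real GDDR6X line coding is far more involved):

```python
# Toy PAM4 encoder/decoder: 2 bits per symbol mapped to 4 levels.
# This only illustrates why PAM4 doubles throughput per pin versus
# NRZ at the same symbol rate; it is not Micron's actual scheme.

# Gray-coded mapping so adjacent voltage levels differ by one bit.
GRAY_MAP = {(0, 0): 0, (0, 1): 1, (1, 1): 2, (1, 0): 3}
GRAY_UNMAP = {v: k for k, v in GRAY_MAP.items()}

def pam4_encode(bits):
    """Pack a flat bit list into PAM4 symbols (levels 0..3)."""
    assert len(bits) % 2 == 0
    return [GRAY_MAP[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def pam4_decode(symbols):
    return [b for s in symbols for b in GRAY_UNMAP[s]]

bits = [1, 0, 0, 1, 1, 1, 0, 0]
symbols = pam4_encode(bits)
print(symbols)                      # 4 symbols carry all 8 bits
assert pam4_decode(symbols) == bits
```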
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Like 70% of games have exactly ZERO need for any RT - f00k, look at all those pixel-art games (the worst possible example) people love to play so much. Strategy games. Simulations (except MS FS and the like). Let's face it, it's mainly FPS technology. What about cartoonish graphics? 2D? Both may look *somewhat* better with RT lights/shadows, but how big is the difference, and how much is it worth to the average (budget) player? Rasterized lights/shadows aren't catastrophically bad, and the low/mid market perhaps doesn't need it that much...

There are three major points missed by people hyped about NV's RT.

First, this is how things look on a 7870-level GPU (PS4, non-Pro):


Reflections, shadows, lighting effects - you name it, it's all in there.


Second, this is how things look in the demo of the latest version of the most popular game engine on the planet. Oh, and it uses none of the "hardware RT", even though it is present on the respective platform and even though hardware RT is supported by an older version of the same engine:



Last but not least, people do not seem to know what "hardware RT" actually is and what it isn't.
If we trust DF's "deep dive" (in many parts, honestly, pathetic, but there aren't many reviews of that kind to choose from), there are multiple steps involved in what is regarded as RT:

1) Building the acceleration structure
2) Checking rays for intersections
3) Doing something with the results (denoising, temporal tricks for reflections, etc.)

In this list, ONLY STEP 2 is hardware accelerated (see the sketch below). This is why, while AMD likely beats NV in raw RT power (especially with that uber Infinity Cache), most green-sponsored titles run poorly on AMD GPUs: steps 1 and 3, while having little to do with RT itself, are largely optimized for NV GPUs.

This also shows that "AMD is behind on RT" doesn't say which part of RT it is about (and why AMD's GPUs wipe the floor with green GPUs in, say, Dirt 5's RT).
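
A minimal sketch of those three steps (my own illustration, not any vendor's actual pipeline; the toy "acceleration structure" here is just a flat list of boxes):

```python
# Only the intersection test in step 2 is what RT cores accelerate;
# steps 1 and 3 run as ordinary compute/shader work.

# Step 1: "build" a toy acceleration structure - a list of AABBs.
boxes = [((0, 0, 0), (1, 1, 1)), ((2, 0, 0), (3, 1, 1))]

def ray_hits_aabb(origin, direction, lo, hi):
    """Step 2: slab-method ray/AABB test (the hardware-accelerated part)."""
    tmin, tmax = 0.0, float("inf")
    for o, d, l, h in zip(origin, direction, lo, hi):
        if abs(d) < 1e-12:
            if o < l or o > h:
                return False
            continue
        t1, t2 = (l - o) / d, (h - o) / d
        tmin, tmax = max(tmin, min(t1, t2)), min(tmax, max(t1, t2))
    return tmin <= tmax

origin, direction = (-1.0, 0.5, 0.5), (1.0, 0.0, 0.0)
hits = [b for b in boxes if ray_hits_aabb(origin, direction, *b)]

# Step 3: consume the hits (in a real renderer: shading, denoising,
# temporal accumulation - again regular shader work, not RT-core work).
print(f"ray hit {len(hits)} of {len(boxes)} boxes")
```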


Ultimately, the tech fails to deliver on... pretty much any front:

Promise 1: "yet unseen graphics"
There are NO uber effects that we have not seen already, and to make it more insulting, even GoW on PS4 has 90%+ of them, despite running on a pathetic GPU.

Promise 2: "mkay, we had those effects, but now it's so much easier to implement"
It is exactly the opposite: there is a lot of tinkering, most of it GPU-manufacturer specific, AND it still has a major negative impact on performance.

CP2077 is an interesting example of this, with many "RT on" scenes looking WORSE than "RT off".

Oh, and then there is "DLSS: The Hype For Brain Dead"... :D
 
Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Either an HBM version, or the full-fat CU version with 7300+ cores? Hope so... good thing the scalpers emptied all that stock and stopped me from making the move on a new GPU... Great news

Ahh, HBM - and now we are back to 256-bit GDDR already... hence the Fury!
 

GnarlySpookXT

New Member
Joined
Jun 5, 2021
Messages
1 (0.00/day)
Sapphire Nitro RX 6900 XT SE

[Attached images: Techpowerup.gif, 2745 clock pic.gif - screenshots showing the card at a 2745 MHz clock]
 
Joined
Feb 20, 2019
Messages
7,289 (3.87/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Ultimately, the [RT] tech fails to deliver on... pretty much any front:

Promise 1: "yet unseen graphics"
There are NO uber effects that we have not seen already, and to make it more insulting, even GoW on PS4 has 90%+ of them, despite running on a pathetic GPU.

Promise 2: "mkay, we had those effects, but now it's so much easier to implement"
It is exactly the opposite: there is a lot of tinkering, most of it GPU-manufacturer specific, AND it still has a major negative impact on performance.

CP2077 is an interesting example of this, with many "RT on" scenes looking WORSE than "RT off".

Oh, and then there is "DLSS: The Hype For Brain Dead"... :D
I jumped on Turing at launch (didn't give a damn about RTX, just needed more raw performance) and, having seen most, if not all, of the demos, benchmarks, and games with my own eyes, I am 100% with you on RTX being overhyped garbage.

Too many of the RTX on/off comparisons aren't fair comparisons; they're RTX vs no effort at all - not RTX vs the alternative shadow/reflection/illumination/transparency methods we've seen since DX11 engines became popular. Gimmicks like Quake II RTX or Minecraft are interesting tech demos, but that's not to say you couldn't get very close facsimiles of their appearance at far higher framerates if developers put in the effort to import those assets into a modern game engine that supports SSAO, SSR, dynamic shadow maps, etc.

IMO, realtime raytracing may be the gold-standard for accuracy but it's also the least-efficient, dumbest, brute-force method that ends up being the least elegant solution with the worst possible performance of all the potential methods to render a scene's lighting.

DLSS and FidelityFX SR should not be dragged down with the flaws of raytracing, though. Sure, each is a crutch that can go some way towards mitigating the inefficiencies of realtime raytracing, but that's shining praise for the technology: it can single-handedly undo most of the damage caused by the huge performance penalties of RT, and alongside VRS, I believe it is the technology that will let us continue increasing resolution without needing exponentially more GPU power to keep up. I have a 4K120 TV, and AAA games simply aren't going to run at 4K120 without help. 4K displays have been mainstream for a decade, and mass 8K adoption is looming on the horizon, no matter how pointless it may seem.
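
Some quick pixel-count arithmetic behind that (my own numbers; real frame cost doesn't scale purely with pixel count, so treat these as rough ratios):

```python
# Rough pixel-count arithmetic behind upscaling. Shading cost scales
# roughly with pixels shaded, so rendering internally at a lower
# resolution and upscaling trims the dominant per-pixel work.

RES = {"1080p": (1920, 1080), "1440p": (2560, 1440),
       "4K": (3840, 2160), "8K": (7680, 4320)}

def pixels(name):
    w, h = RES[name]
    return w * h

# Native 4K vs rendering internally at 1440p and upscaling to 4K
# (scaling factors vary per implementation; 1440p -> 4K is a common one).
native, internal = pixels("4K"), pixels("1440p")
print(f"4K has {pixels('4K') / pixels('1080p'):.1f}x the pixels of 1080p")
print(f"8K has {pixels('8K') / pixels('4K'):.1f}x the pixels of 4K")
print(f"1440p internal rendering shades only {internal / native:.0%} "
      f"of native 4K pixels")
```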
 