
AMD Radeon RX 6800 XT

Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
Please educate yourself about game development a tiny bit. It's perfectly possible to make both games require a lot less VRAM - it's just that their developers went overboard when they had access to the 2080 Ti.

Of course you can make something use less VRAM. You just use inferior (smaller) textures.

You can also shrink the geometry bottleneck by using lower-quality models and lower tessellation settings, and you can lessen the impact on the L2 caches (for Ampere) if the game's very design compresses easily using the earlier DCC parameters from Pascal.
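
Rough numbers to illustrate the texture part - a quick sketch assuming uncompressed RGBA8 textures with full mip chains (real games use block compression, which shrinks everything by roughly 4-8x, but the scaling is the same):

```python
# Rough VRAM cost of a texture set at different resolutions.
# Assumes uncompressed RGBA8 (4 bytes/texel) and a full mip chain
# (~1.33x the base level); block compression (BC/DXT) would cut
# these figures by roughly 4-8x.

def texture_vram_mib(size, bytes_per_texel=4, mip_chain=True):
    base = size * size * bytes_per_texel
    total = base * 4 / 3 if mip_chain else base
    return total / (1024 ** 2)

for size in (4096, 2048, 1024):
    per_tex = texture_vram_mib(size)
    print(f"{size}x{size}: {per_tex:6.1f} MiB each, "
          f"{per_tex * 500 / 1024:5.1f} GiB for 500 unique textures")
```

Halving the texture resolution cuts the footprint to a quarter - which is exactly the "inferior (smaller) textures" trade-off.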

But why lower the settings? Isn't Ultra supposed to be decadent? If anything, we need games to have more demanding Ultra settings like we had in 2004 or 2007. A return to games being completely unplayable on Ultra settings like in the past would be so welcome to real tech enthusiasts. For the rest there is the "High" preset.
 
Joined
Jun 11, 2020
Messages
560 (0.40/day)
Location
Florida
Processor 5800x3d
Motherboard MSI Tomahawk x570
Cooling Thermalright
Memory 32 gb 3200mhz E die
Video Card(s) 3080
Storage 2tb nvme
Display(s) 165hz 1440p
Case Fractal Define R5
Power Supply Toughpower 850 platium
Mouse HyperX Hyperfire Pulse
Keyboard EVGA Z15
4K performance in Microsoft Flight Simulator is extremely disappointing :( I was hoping RDNA2's higher frequency and 16 GB of VRAM would make a difference, but the game obviously prefers more shaders and faster memory. Well, 3080 for me after all I guess.
WAIT FOR TI if you can!
 
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
WAIT FOR TI if you can!
The 3080 Ti will for sure be slightly slower than the 3090. Less bandwidth. So the dude, if he really cares for MSFS so much... well that is his choice.
 
Joined
Apr 18, 2013
Messages
1,260 (0.32/day)
Location
Artem S. Tashkinov
Of course you can make something use less VRAM. You just use inferior (smaller) textures.

And this is patently false. At the very minimum you can use texture streaming or make levels smaller. There are other methods as well.
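
To illustrate the idea behind texture streaming - a toy sketch, not any real engine's API; all names and numbers here are made up for the example:

```python
# Toy sketch of mip-based texture streaming: keep only the mip levels
# a surface actually needs resident in VRAM, stream the rest from disk.
import math

MIP_COUNT = 12  # 4096x4096 base level down to 2x2

def required_mip(distance, full_res_distance=5.0):
    """Surfaces farther from the camera can drop to smaller mips."""
    if distance <= full_res_distance:
        return 0
    return min(MIP_COUNT - 1, int(math.log2(distance / full_res_distance)))

def resident_mib(base_size, first_mip, bytes_per_texel=4):
    """VRAM held when mips finer than `first_mip` are streamed out."""
    total = 0
    for mip in range(first_mip, MIP_COUNT):
        size = max(1, base_size >> mip)
        total += size * size * bytes_per_texel
    return total / (1024 ** 2)

print(f"everything resident: {resident_mib(4096, 0):.1f} MiB")
print(f"object 80 m away:    {resident_mib(4096, required_mip(80.0)):.2f} MiB")
```

The distant object only needs a fraction of a megabyte resident instead of the full ~85 MiB chain, which is why streaming lets the same assets fit into far less VRAM.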

Speaking of the RT performance in Watch Dogs: Legion for AMD cards:



I'm not sure they are comparable yet. Something is definitely missing. :cool:
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,944 (2.61/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Razer Huntsman Tournament Edition
Software Windows 11 Pro 64-Bit
Gamers need this GPU, here are the reviews.


Better than the RTX 3080 with lower power consumption, and it's cheaper.
At 1080p it's better than the RTX 3090.


Good job AMD for beating NVIDIA. :clap:

(Some NVIDIA fans are not happy, lol.)

Imagine buying any of these cards for 1080p....
 
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
And this is patently false. At the very minimum you can use texture streaming or make levels smaller. There are other methods as well.

id Tech 7 already does that. The only optimization it is missing (with regard to textures) is sampler feedback, or to be precise - a Vulkan equivalent of it.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,800 (4.58/day)
Location
Kepler-186f
Imagine buying any of these cards for 1080p....


So what if some people do? If they intend to upgrade their monitor later on, and most do, most can't buy everything at once.

That being said, some people do enjoy 240 Hz 1080p maxed out in AAA games, and even the RTX 3090 can't manage that yet for games like Valhalla, etc. I don't care about frame rates that high personally... 144-165 is the sweet spot... and 60 at 4K.
 
Joined
Feb 21, 2006
Messages
1,972 (0.30/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5003 AM4 AGESA V2 PI 1.2.0.B
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) AMD Radeon RX 7900 XTX 24GB (24.3.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 14TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c
You think RDNA2-based cards will not improve their performance over time? I know the "future potential uplift" dank memes about AMD and GCN on consoles, but do you really think the massive gains for games we saw on Zen 3 chips were just a coincidence?

When it comes to the longevity of cards, history shows AMD has NV beat. There are numerous examples of this in the past. NV drops driver support for previous-gen cards much faster than AMD does.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,944 (2.61/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Razer Huntsman Tournament Edition
Software Windows 11 Pro 64-Bit
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
So what if some people do? If they intend to upgrade their monitor later on, and most do, most can't buy everything at once.

That being said, some people do enjoy 240 Hz 1080p maxed out in AAA games, and even the RTX 3090 can't manage that yet for games like Valhalla, etc. I don't care about frame rates that high personally... 144-165 is the sweet spot... and 60 at 4K.

Once you witness 4K 100-120 fps on an OLED HDR TV, you will never again want to go back to LCDs. Ever.
This is a bigger upgrade than upgrading a CPU or GPU. A far bigger one. I would honestly say that (were it to have HDMI 2.1) a 5700 XT with such a TV/screen and medium settings is a noticeably superior experience to an RTX 3080/6800 XT at Ultra settings on an inferior display.
 
Joined
Feb 21, 2006
Messages
1,972 (0.30/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5003 AM4 AGESA V2 PI 1.2.0.B
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) AMD Radeon RX 7900 XTX 24GB (24.3.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 14TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c
Then you really have to wonder what AMD is gonna do when Nvidia moves to 5nm TSMC. I don't think AMD is looking forward to that.

You mean the 5nm TSMC node that Apple is using for the M1 processor? Guess who is going to buy up all the fab capacity for that. Both AMD and NV will be fighting for scraps there. And why do you assume AMD is going to sit on 7nm on the GPU side while NV goes to 5nm?

You do know there is already an RDNA 3 on the roadmap? And guess which node it's going to be using?
 
Joined
Jun 11, 2020
Messages
560 (0.40/day)
Location
Florida
Processor 5800x3d
Motherboard MSI Tomahawk x570
Cooling Thermalright
Memory 32 gb 3200mhz E die
Video Card(s) 3080
Storage 2tb nvme
Display(s) 165hz 1440p
Case Fractal Define R5
Power Supply Toughpower 850 platium
Mouse HyperX Hyperfire Pulse
Keyboard EVGA Z15
Hardware Unboxed and Gamers Nexus came to similar conclusions as TechPowerUp (1080p being the only real difference):
- RDNA2 has better performance per watt
- Offers more VRAM
- The 6800 XT is on par with the 3080 in standard rasterization (better at 1080p, roughly on par at 1440p, worse at 4K)
- Worse in RT (roughly on par with the 2080 Ti)
- Lacks AI supersampling ("DLSS")

It all comes down to what features you want. The price/performance ratio is about the same. The 6800 is faster and has more VRAM than the 3070, but costs 16% more. The 6800 XT is on par with the 3080 and has more VRAM, BUT has worse RT and no AI SS, and costs 7% less. All AMD brings to the table is more options to choose from, but it isn't necessarily better value. Now, a 6800 at $499 and a 6800 XT at $599 - those would be true Ampere killers. AMD has clearly chosen profit margins over market share gain. That's their decision to make. I personally still hate Ngreedia because of Turing, but I've fallen out of love with Team Red too, since they've decided to raise profit margins. I'll just buy what suits my needs best for as cheap as I can get it.
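
Quick check of those price gaps, using the launch MSRPs ($499 RTX 3070, $579 RX 6800, $699 RTX 3080, $649 RX 6800 XT):

```python
# Quick check of the quoted price gaps, using launch MSRPs.
msrp = {"RTX 3070": 499, "RX 6800": 579, "RTX 3080": 699, "RX 6800 XT": 649}

print(f"RX 6800 vs RTX 3070:    {msrp['RX 6800'] / msrp['RTX 3070'] - 1:+.0%}")     # ~ +16%
print(f"RX 6800 XT vs RTX 3080: {msrp['RX 6800 XT'] / msrp['RTX 3080'] - 1:+.0%}")  # ~ -7%
```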

It's sad, but they just spent 30 billion acquiring a company; they quite literally can't afford to give too much value, especially considering supply. Let's hope they'll throw something special at the mainstream segment when they reveal Navi 22...
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,800 (4.58/day)
Location
Kepler-186f
Once you witness 4K 100-120 fps on an OLED HDR TV, you will never again want to go back to LCDs. Ever.
This is a bigger upgrade than upgrading a CPU or GPU. A far bigger one. I would honestly say that (were it to have HDMI 2.1) a 5700 XT with such a TV/screen and medium settings is a noticeably superior experience to an RTX 3080/6800 XT at Ultra settings on an inferior display.

I will never be that rich, so more power to you lol
 
Joined
Jul 8, 2019
Messages
169 (0.10/day)
So what if some people do? If they intend to upgrade their monitor later on, and most do, most can't buy everything at once.

That being said, some people do enjoy 240 Hz 1080p maxed out in AAA games, and even the RTX 3090 can't manage that yet for games like Valhalla, etc. I don't care about frame rates that high personally... 144-165 is the sweet spot... and 60 at 4K.

I find pushing AAA games to higher frame rates like 144 Hz futile, because you often get into frame pacing (stutter) issues on these mostly unoptimized, console-port type games. They are first and foremost optimized for 30 fps, then for 60 fps, and anything higher is just a luxury. But you are right, some people enjoy higher frame rates at lower resolutions.
 
Joined
Jan 21, 2020
Messages
109 (0.07/day)
Speaking of the RT performance in Watch Dogs: Legion for AMD cards:



I'm not sure they are comparable yet. Something is definitely missing. :cool:
This is exactly why I asked for a review of the actual raytracing output on both Nvidia and AMD cards.
 
Joined
Dec 30, 2010
Messages
2,082 (0.43/day)
The 6800 XT's average gaming power consumption is around 218 W, vs. 300 W and above for the 3080. GG AMD!
 
Joined
Nov 4, 2005
Messages
11,655 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Well, with all due respect, it is you who seem to have the "special" eyes, because the numbers you mention are "modified" in AMD's favor.

It's mostly a 5% difference, and the power consumption difference is closer to 50 watts.

Also the "other tweaks" part is a lot better for the Green team nowadays... bios flashing on 3000 series is easy as pie and brings great performance jumps with it especially combined with watercooling.

Not to mention the software side of things with DLSS, ray tracing, etc.

If you think 5% is insignificant, go tell that to high-end NVMe drives lol

210 W vs. 303 W is a 93 W difference.



The 3080 is 6% faster, but the 6800 XT can overclock 10% out of the box, vs. 4% for the 3080. So a net equal.
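
Quick back-of-the-envelope check of that "net equal" claim, using those same percentages and assuming (optimistically) that clock speed gains translate 1:1 into frame rate, which they never fully do:

```python
# Back-of-the-envelope check of the "net equal" claim above.
stock_3080_lead = 1.06   # 3080 ~6% faster at stock
oc_6800xt       = 1.10   # ~10% overclock headroom on the 6800 XT
oc_3080         = 1.04   # ~4% overclock headroom on the 3080

after_oc = stock_3080_lead * oc_3080 / oc_6800xt
print(f"3080 lead with both cards overclocked: {after_oc - 1:+.1%}")  # ~ +0.2%
```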

Ray tracing is in 10 games currently; if those 10 games are a make-or-break deal, go for the green. But for 90% of gamers, we know ray tracing will be like tessellation: it will take a few generations to implement and match up performance and quality, and by that time these cards will be obsolete and the difference will be 15 FPS vs. 21 FPS at 4K in those new games.

DLSS I'm not sold on; it takes special profiles from Nvidia. Do they have support for every game? Does it matter if AMD does the same?

And about overclocking and flashing the BIOS: if 10% out of the box at a whopping 1 V is any indication for AMD on the node they are on, watercooling and 1.2-1.3 V should give a 6800 XT 25% more clock speed, meaning it will be 25% faster than the 3080 while still being cheaper. So it's still the better buy for 90% of gamers, games and those who want to play the silicon lottery and tweak. Samsung's node is crap and a poor choice by Nvidia, made to save a few $$$.
 
Joined
Apr 18, 2013
Messages
1,260 (0.32/day)
Location
Artem S. Tashkinov
When it comes to the longevity of cards, history shows AMD has NV beat. There are numerous examples of this in the past. NV drops driver support for previous-gen cards much faster than AMD does.

This myth has been debunked by many reputable websites, including TPU. NVIDIA does not drop driver support for previous gen cards [faster]. If anything, they support their cards a lot longer than AMD. For instance, NVIDIA still fully supports Kepler-generation cards, which were released 8 years ago.

NVIDIA, however, stops tweaking drivers for previous-generation cards because it's just not worth it from a financial standpoint - the performance is not there anyway: any extracted gains would not bring older cards to the level where they can run new, heavy games. Imagine optimizing drivers for the GTX 680. Why would you do that? The card is absolutely insufficient for modern games.
 
Joined
Jan 21, 2020
Messages
109 (0.07/day)
DLSS I'm not sold on; it takes special profiles from Nvidia. Do they have support for every game? Does it matter if AMD does the same?
That is no longer the case (it was in DLSS 1.0). Since DLSS 2.0 there are no per-game profiles anymore. No game-specific training is required. The experience is now streamlined, meaning that game engines like Unreal Engine and Unity can support it out of the box. That in turn means a lot more adoption for DLSS going forward.
 
Joined
Jul 8, 2019
Messages
169 (0.10/day)
This myth has been debunked by many reputable websites, including TPU. NVIDIA does not drop driver support for previous gen cards [faster]. If anything, they support their cards a lot longer than AMD. For instance, NVIDIA still fully supports Kepler-generation cards, which were released 8 years ago.

NVIDIA, however, stops tweaking drivers for previous-generation cards because it's just not worth it from a financial standpoint - the performance is not there anyway: any extracted gains would not bring older cards to the level where they can run new, heavy games. Imagine optimizing drivers for the GTX 680. Why would you do that? The card is absolutely insufficient for modern games.

Adding to that, NVIDIA usually offers almost full performance upfront, while AMD just hones their drivers over a longer time, thus making it appear as if AMD performance improves as their GPUs age.
 
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
Adding to that, NVIDIA usually offers almost full performance upfront, while AMD just hones their drivers over a longer time, thus making it appear as if AMD performance improves as their GPUs age.

This isn't entirely fair either. Nvidia does tweak performance over time too. They are engineers, not posthuman, genetically engineered entities with future-seeing capabilities. They still need time.

As for AMD - they have a smaller team, so things like long-term performance... it does need more work, for sure. This is where the Fine Wine thing came from.
It is good, though. As long as AMD prices products on their performance at release, it is A-OK to have drivers improve them further. It means you paid a fair price once and get better performance long term. That is my thought process, and I used it for Turing too (which really did improve nicely over time).
 
Joined
Jan 3, 2015
Messages
2,873 (0.85/day)
System Name The beast and the little runt.
Processor Ryzen 5 5600X - Ryzen 9 5950X
Motherboard ASUS ROG STRIX B550-I GAMING - ASUS ROG Crosshair VIII Dark Hero X570
Cooling Noctua NH-L9x65 SE-AM4a - NH-D15 chromax.black with IPPC Industrial 3000 RPM 120/140 MM fans.
Memory G.SKILL TRIDENT Z ROYAL GOLD/SILVER 32 GB (2 x 16 GB and 4 x 8 GB) 3600 MHz CL14-15-15-35 1.45 volts
Video Card(s) GIGABYTE RTX 4060 OC LOW PROFILE - GIGABYTE RTX 4090 GAMING OC
Storage Samsung 980 PRO 1 TB + 2 TB - Samsung 870 EVO 4 TB - 2 x WD RED PRO 16 GB + WD ULTRASTAR 22 TB
Display(s) Asus 27" TUF VG27AQL1A and a Dell 24" for dual setup
Case Phanteks Enthoo 719/LUXE 2 BLACK
Audio Device(s) Onboard on both boards
Power Supply Phanteks Revolt X 1200W
Mouse Logitech G903 Lightspeed Wireless Gaming Mouse
Keyboard Logitech G910 Orion Spectrum
Software WINDOWS 10 PRO 64 BITS on both systems
Benchmark Scores Se more about my 2 in 1 system here: kortlink.dk/2ca4x
I'm not sure where I stand right now, frankly. AMD's cards really have been a good thing, for sure. Nvidia now has serious competition.

But here comes my concern: while I have made the choice to move to Zen 3, I'm not so sure about Big Navi yet. First of all, I have only had Nvidia since, like, ever. I have never owned an AMD card in my entire life.

Big Navi offers more VRAM and is a little bit cheaper. That is great. But will drivers be the Radeon 5000 series all over again, with bugs and problems that were still an issue long after launch? The last thing I want is drivers with issues and problems. So far, AMD will have to convince me away from Nvidia; they will have to show that not only the hardware is good, but also their software side. Also, it can clearly be seen that ray tracing on Big Navi is in its infancy and not on par with Nvidia just yet.

So I guess my next GPU choice will be between the 6900 XT and the rumored RTX 3080 Ti, depending on performance, driver experience and optimization. Nvidia also has some other features like streaming and AI software. In short, I am not totally convinced to go Big Navi yet. Drivers first of all have to be a good experience; bugs and errors will only piss me off. Then ray tracing will have to mature as well. I might end up going Nvidia again, so AMD, convince me to go Radeon.
 
Joined
Jan 8, 2017
Messages
8,863 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Adding to that, NVIDIA usually offers almost full performance upfront, while AMD just hones their drivers over a longer time, thus making it appear as if AMD performance improves as their GPUs age.

The dilemma of the Nvidia fanboy.

It used to be that they denied the existence of "Fine Wine"; now, apparently, their drivers do improve performance over time. Man, this is so strange.
 