
AMD Radeon RX 6800 XT

Joined
Jan 23, 2016
Messages
72 (0.04/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 3600 I Core i7 6700K
Video Card(s) RX 5700 XT Nitro+ I RTX 2070 Super Zotac Mini
Mouse Logitech G Pro
Keyboard Wooting Two Lekker Edition
Please educate yourself about game development a tiny bit. It's perfectly possible to make both games require a lot less VRAM - it's just that their developers went overboard when they had access to the 2080 Ti.

Of course you can make something use less VRAM. You just use inferior (smaller) textures.

You can also shrink the geometry bottleneck by using lower-quality models and lighter tessellation settings, and you can lessen the impact on the L2 caches (for Ampere) if the game's very design compresses easily with the earlier DCC parameters from Pascal.

But why lower the settings? Isn't Ultra supposed to be decadent? If anything, we need games to have more demanding Ultra settings like we had in 2004 or 2007. A return to games being completely unplayable on Ultra settings, like in the past, would be so welcome to real tech enthusiasts. For the rest there is the "High" preset.
 
Joined
Jan 23, 2016
Messages
72 (0.04/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 3600 I Core i7 6700K
Video Card(s) RX 5700 XT Nitro+ I RTX 2070 Super Zotac Mini
Mouse Logitech G Pro
Keyboard Wooting Two Lekker Edition
WAIT FOR TI if you can!
The 3080 Ti will for sure be slightly slower than the 3090 - less bandwidth. So the dude, if he really cares about MSFS so much... well, that is his choice.
 
Joined
Apr 18, 2013
Messages
973 (0.33/day)
Of course you can make something use less VRAM. You just use inferior (smaller) textures.

And this is patently false. At the very minimum you can use texture streaming or make levels smaller. There are other methods as well.
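To make that concrete, here's a rough sketch of the idea behind mip-based texture streaming. All numbers and function names below are made up for illustration - this is not any real engine's API, just the basic "drop to lower mips until the VRAM budget fits" logic:

```python
# Illustrative texture streaming budget: pick the highest mip level
# (0 = full resolution) at which a scene's textures fit in a VRAM budget.

def mip_size_bytes(width, height, bytes_per_texel=4):
    """Size of one mip level, assuming uncompressed RGBA8."""
    return width * height * bytes_per_texel

def bytes_needed(textures, top_mip):
    """Total bytes if every texture streams in starting at `top_mip`
    (each step halves width and height)."""
    total = 0
    for w, h in textures:
        total += mip_size_bytes(max(w >> top_mip, 1), max(h >> top_mip, 1))
    return total

def pick_top_mip(textures, budget_bytes, max_drop=4):
    """Smallest mip drop that fits the budget (capped at max_drop)."""
    for drop in range(max_drop + 1):
        if bytes_needed(textures, drop) <= budget_bytes:
            return drop
    return max_drop

# Ten 4096x4096 textures: ~671 MB at full res, ~168 MB one mip down.
scene = [(4096, 4096)] * 10
print(pick_top_mip(scene, budget_bytes=256 * 1024 * 1024))
```

The point being: the same assets can ship at wildly different VRAM footprints depending on what the engine decides to keep resident.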

Speaking of the RT performance in Watch Dogs: Legion for AMD cards:



I'm not sure they are comparable yet. Something is definitely missing. :cool:
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,519 (3.22/day)
Location
Longmont, CO
System Name Please god I need a GPU!
Processor Intel Core i7 8700k @ 4.8GHz 1.28v
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling 2x EK PE360 | EK CPU and GPU WB | Full hard line tubing | Singularity Resonance Single res/pump
Memory Corsair Vengeance Pro RGB 32GB 3200 14-14-14-34
Video Card(s) MSI GTX1070 Gaming X -> Asus TUF RTX3080 or Evga XC3 RTX3080
Storage 1TB Samsung 970 EVO 2TB Samsung 970 Evo Plus
Display(s) Dell S3220DGF 32" 1440p Freesync 2 (G-Sync) HDR 165Hz | 2x Asus VP249QGR 144Hz IPS
Case Lian Li PC-011D
Audio Device(s) Realtek 1220 w/ Sennheiser Game Ones
Power Supply Seasonic Flagship Prime Ultra Platinum 850
Mouse Razer Viper
Keyboard Razer Huntsman Tournament Edition
Software Windows 10 Pro 64-Bit
Gamers need this GPU, here are the reviews.


Better than the RTX 3080 with lower power consumption, and it's cheaper.
At 1080p it's better than the RTX 3090.

Good job AMD for beating NVIDIA. :clap:

(Some NVIDIA fans are not happy, lol.)

Imagine buying any of these cards for 1080p....
 
Joined
Jan 23, 2016
Messages
72 (0.04/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 3600 I Core i7 6700K
Video Card(s) RX 5700 XT Nitro+ I RTX 2070 Super Zotac Mini
Mouse Logitech G Pro
Keyboard Wooting Two Lekker Edition
And this is patently false. At the very minimum you can use texture streaming or make levels smaller. There are other methods as well.

id Tech 7 already does that. The only optimization it is missing (with regard to textures) is sampler feedback - or, to be precise, a Vulkan equivalent of it.
 
Joined
Oct 17, 2014
Messages
6,168 (2.60/day)
Location
USA
System Name Paladius Tacet
Processor Ryzen 5600x
Motherboard MSI X570 Tomahawk
Cooling Arctic Freezer 34 DUO (custom aggressive fan curve)
Memory G.Skill 2x16 3600 14-14-14-34 Dual Rank
Video Card(s) Navi 6800 + Rage Mode + OC
Display(s) Acer Nitro XF243Y 23.8" 0.5ms IPS 165hz 1080p
Case Corsair 110Q Silent + NZXT Aer-P exhaust fan
Power Supply EVGA 700w Gold
Mouse Razer Naga X (2021 Edition)
Imagine buying any of these cards for 1080p....


So what if some people do? If they intend to upgrade their monitor later on - and most do - that's fine; most can't buy everything at once.

That being said, some people do enjoy 240 Hz 1080p maxed out in AAA games, and even the RTX 3090 can't manage that yet in games like Valhalla, etc. I don't care about frames that high personally... 144-165 is the sweet spot... and 60 at 4K.
 
Joined
Feb 21, 2006
Messages
1,058 (0.19/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X
Motherboard Asus Prime X570-Pro BIOS 3603 AM4 AGESA V2 PI 1.2.0.1 Patch A
Cooling Corsair H150i Pro
Memory 16GB Gskill Trident RGB DDR4-3200 14-14-14-34-1T
Video Card(s) GIGABYTE Radeon RX 580 GAMING 8GB
Storage Corsair MP600 1TB PCIe 4 / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 12TB
Display(s) HP ZR24w + LG 24MB35 on Neo-Flex® Dual Monitor Lift Stand
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB
Keyboard Logitech G810
Software Windows 10 Pro x64 20H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/1bigrn
You think RDNA2-based cards will not improve their performance over time? I know the "future potential uplift" dank memes about AMD and GCN on consoles, but do you really think the massive gains for games we saw on Zen 3 chips were just a coincidence?

When it comes to longevity of cards, history shows AMD has NV beat. There are numerous examples of this in the past. NV drops driver support for previous-gen cards much faster than AMD does.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,519 (3.22/day)
Location
Longmont, CO
System Name Please god I need a GPU!
Processor Intel Core i7 8700k @ 4.8GHz 1.28v
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling 2x EK PE360 | EK CPU and GPU WB | Full hard line tubing | Singularity Resonance Single res/pump
Memory Corsair Vengeance Pro RGB 32GB 3200 14-14-14-34
Video Card(s) MSI GTX1070 Gaming X -> Asus TUF RTX3080 or Evga XC3 RTX3080
Storage 1TB Samsung 970 EVO 2TB Samsung 970 Evo Plus
Display(s) Dell S3220DGF 32" 1440p Freesync 2 (G-Sync) HDR 165Hz | 2x Asus VP249QGR 144Hz IPS
Case Lian Li PC-011D
Audio Device(s) Realtek 1220 w/ Sennheiser Game Ones
Power Supply Seasonic Flagship Prime Ultra Platinum 850
Mouse Razer Viper
Keyboard Razer Huntsman Tournament Edition
Software Windows 10 Pro 64-Bit
Joined
Jan 23, 2016
Messages
72 (0.04/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 3600 I Core i7 6700K
Video Card(s) RX 5700 XT Nitro+ I RTX 2070 Super Zotac Mini
Mouse Logitech G Pro
Keyboard Wooting Two Lekker Edition
So what if some people do? If they intend to upgrade their monitor later on - and most do - that's fine; most can't buy everything at once.

That being said, some people do enjoy 240 Hz 1080p maxed out in AAA games, and even the RTX 3090 can't manage that yet in games like Valhalla, etc. I don't care about frames that high personally... 144-165 is the sweet spot... and 60 at 4K.

Once you witness 4K 100-120 fps on an OLED HDR TV, you will never again want to go back to LCDs. Ever.
This is a bigger upgrade than upgrading a CPU or GPU. A far bigger one. I would honestly say that (were it to have HDMI 2.1) a 5700 XT with such a TV/screen at medium settings is a noticeably superior experience to an RTX 3080/6800 XT at Ultra settings on an inferior display.
 
Joined
Feb 21, 2006
Messages
1,058 (0.19/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X
Motherboard Asus Prime X570-Pro BIOS 3603 AM4 AGESA V2 PI 1.2.0.1 Patch A
Cooling Corsair H150i Pro
Memory 16GB Gskill Trident RGB DDR4-3200 14-14-14-34-1T
Video Card(s) GIGABYTE Radeon RX 580 GAMING 8GB
Storage Corsair MP600 1TB PCIe 4 / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 12TB
Display(s) HP ZR24w + LG 24MB35 on Neo-Flex® Dual Monitor Lift Stand
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB
Keyboard Logitech G810
Software Windows 10 Pro x64 20H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/1bigrn
Then you really have to wonder what AMD is gonna do when Nvidia moves to 5nm TSMC. I don't think AMD is looking forward to that.

You mean the 5nm TSMC node that Apple is using for the M1 processor? Guess who is going to buy up all the fab capacity for that. Both AMD and NV will be fighting for scraps there. And why do you assume AMD is going to sit on 7nm on the GPU side while NV goes to 5nm?

You do know there is an RDNA 3 already on the roadmap? And guess what node it's going to be using?
 
Joined
Jun 11, 2020
Messages
152 (0.49/day)
Location
Florida
Hardware Unboxed and Gamers Nexus came to similar conclusions as TechPowerUp (1080p being the only real difference):
- RDNA2 has better performance per watt
- Offers more VRAM
- 6800 XT is on par with the 3080 in standard rasterization (better at 1080p, roughly on par at 1440p, worse at 4K)
- worse in RT (roughly on par with the 2080 Ti)
- lacks AI supersampling ("DLSS")

It all comes down to what features you want. The price/performance ratio is about the same. The 6800 is faster with more VRAM than the 3070, but costs 16% more. The 6800 XT is on par with the 3080 and has more VRAM, BUT has worse RT and no AI SS, and costs 7% less. All AMD brings to the table is more options to choose from, but it isn't necessarily better value. Now, a 6800 at $499 and a 6800 XT at $599 would be true Ampere killers. AMD has clearly chosen profit margins over market share gain. That's their decision to make. I personally still hate Ngreedia because of Turing, but I've fallen out of love with Team Red too, since they've decided to raise profit margins. I'll just buy what suits my needs best for as cheap as I can get it.

It's sad, but they just spent 30 billion acquiring a company; they quite literally can't afford to give too much value, especially considering supply. Let's hope they'll throw something special at the mainstream segment when they reveal Navi 22...
 
Joined
Oct 17, 2014
Messages
6,168 (2.60/day)
Location
USA
System Name Paladius Tacet
Processor Ryzen 5600x
Motherboard MSI X570 Tomahawk
Cooling Arctic Freezer 34 DUO (custom aggressive fan curve)
Memory G.Skill 2x16 3600 14-14-14-34 Dual Rank
Video Card(s) Navi 6800 + Rage Mode + OC
Display(s) Acer Nitro XF243Y 23.8" 0.5ms IPS 165hz 1080p
Case Corsair 110Q Silent + NZXT Aer-P exhaust fan
Power Supply EVGA 700w Gold
Mouse Razer Naga X (2021 Edition)
Once you witness 4K 100-120 fps on an OLED HDR TV, you will never again want to go back to LCDs. Ever.
This is a bigger upgrade than upgrading a CPU or GPU. A far bigger one. I would honestly say that (were it to have HDMI 2.1) a 5700 XT with such a TV/screen at medium settings is a noticeably superior experience to an RTX 3080/6800 XT at Ultra settings on an inferior display.

I will never be that rich, so more power to you lol
 
Joined
Jul 8, 2019
Messages
142 (0.22/day)
So what if some people do? If they intend to upgrade their monitor later on - and most do - that's fine; most can't buy everything at once.

That being said, some people do enjoy 240 Hz 1080p maxed out in AAA games, and even the RTX 3090 can't manage that yet in games like Valhalla, etc. I don't care about frames that high personally... 144-165 is the sweet spot... and 60 at 4K.

I find pushing AAA games to higher frame rates like 144 Hz futile, because you often run into frame pacing [stutter] issues in these mostly unoptimized, console-port type games. They are first and foremost optimized for 30 fps, then for 60 fps, and anything higher is just luxury. But you are right, some people enjoy higher frames at lower resolutions.
 
Joined
Jan 21, 2020
Messages
109 (0.24/day)
Speaking of the RT performance in Watch Dogs: Legion for AMD cards:



I'm not sure they are comparable yet. Something is definitely missing. :cool:
This is exactly why I asked for a review of the actual raytracing output on both Nvidia and AMD cards.
 
Joined
Dec 30, 2010
Messages
1,211 (0.32/day)
The 6800 XT's average gaming power consumption is around 218 watts, vs 300 watts and above for the 3080. GG AMD!
 
Joined
Nov 4, 2005
Messages
10,745 (1.90/day)
System Name MoFo 2
Processor AMD PhenomII 1100T @ 4.2Ghz
Motherboard Asus Crosshair IV
Cooling Swiftec 655 pump, Apogee GT,, MCR360mm Rad, 1/2 loop.
Memory 8GB DDR3-2133 @ 1900 8.9.9.24 1T
Video Card(s) HD7970 1250/1750
Storage Agility 3 SSD 6TB RAID 0 on RAID Card
Display(s) 46" 1080P Toshiba LCD
Case Rosewill R6A34-BK modded (thanks to MKmods)
Audio Device(s) ATI HDMI
Power Supply 750W PC Power & Cooling modded (thanks to MKmods)
Software A lot.
Benchmark Scores Its fast. Enough.
Well, with all due respect, it is you who seem to have the "special" eyes, because the numbers you mention are "modified" in AMD's favor.

It's mostly a 5% difference, and the power consumption difference is closer to 50 watts.

Also, the "other tweaks" part is a lot better for the green team nowadays... BIOS flashing on the 3000 series is easy as pie and brings great performance jumps with it, especially combined with watercooling.

Not to mention the software side of things with DLSS, raytracing, etc.

If you think 5% is insignificant, go tell that to high-end NVMe drives lol

210 vs 303 is a 93 W difference.



The 3080 is 6% faster, but the 6800 XT overclocks 10% out of the box vs 4% for the 3080. So a net equal.

Raytracing is in 10 games currently. If those 10 games are a make-or-break deal, go for the green, but to 90% of gamers we know raytracing will be like tessellation: it will take a few generations to implement and match up performance and quality, and by that time these cards will be obsolete and the difference will be 15 FPS at 4K in those new games vs 21 FPS.

DLSS I'm not sold on; it takes special profiles from Nvidia. Do they have support for every game? Does it matter if AMD does the same?

And about overclocking and flashing BIOS: if 10% out of the box at a whopping 1 V is any indication for AMD, then on the node they are on, watercooling and 1.2-1.3 V should give a 6800 XT 25% more clock speed, meaning it would be 25% faster than the 3080 while still being cheaper. So still the better buy for 90% of gamers, games, and those who want to play the silicon lottery and tweak. Samsung's node is crap, and it was a poor choice on Nvidia's part to use them to save a few $$$.
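Just to sanity-check the "net equal" arithmetic in my own post above (the percentages are the rough figures quoted in this thread, not measured data):

```python
# Back-of-envelope check: 3080 ~6% faster at stock, but the 6800 XT
# gains ~10% from overclocking vs ~4% for the 3080.

stock_6800xt = 1.00
stock_3080 = 1.06               # relative to 6800 XT stock

oc_6800xt = stock_6800xt * 1.10  # +10% OC headroom
oc_3080 = stock_3080 * 1.04      # +4% OC headroom

print(round(oc_6800xt, 3))  # 1.1
print(round(oc_3080, 3))    # 1.102
# After overclocking both, the gap shrinks to ~0.2% -- effectively a wash.
```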
 
Joined
Apr 18, 2013
Messages
973 (0.33/day)
When it comes to longevity of cards, history shows AMD has NV beat. There are numerous examples of this in the past. NV drops driver support for previous-gen cards much faster than AMD does.

This myth has been debunked by many reputable websites, including TPU. NVIDIA does not drop driver support for previous-gen cards [faster]. If anything, they support their cards a lot longer than AMD. For instance, NVIDIA still fully supports Kepler-generation cards, which were released 8 years ago.

NVIDIA does, however, stop tweaking drivers for previous-generation cards because it's just not worth it from a financial standpoint - the performance is not there anyway: any extracted gains would not bring older cards to a level where they can run new heavy games. Imagine optimizing drivers for the GTX 680. Why would you do that? The card is absolutely insufficient for modern games.
 
Last edited:
Joined
Jan 21, 2020
Messages
109 (0.24/day)
DLSS, I'm not sold on, it takes special profiles from nvidia, do they have support for every game? Does it matter if AMD does the same?
That is no longer the case (it was in DLSS 1.0). Since DLSS 2.0 there are no per-game profiles anymore. No game-specific training is required. The experience is now streamlined meaning that game engines like Unreal Engine and Unity can support it out of the box. That in turn means a lot more adoption for DLSS going forward.
 
Joined
Jul 8, 2019
Messages
142 (0.22/day)
This myth has been debunked by many reputable websites, including TPU. NVIDIA does not drop driver support for previous-gen cards [faster]. If anything, they support their cards a lot longer than AMD. For instance, NVIDIA still fully supports Kepler-generation cards, which were released 8 years ago.

NVIDIA does, however, stop tweaking drivers for previous-generation cards because it's just not worth it from a financial standpoint - the performance is not there anyway: any extracted gains would not bring older cards to a level where they can run new heavy games. Imagine optimizing drivers for the GTX 680. Why would you do that? The card is absolutely insufficient for modern games.

Adding to that, NVIDIA usually offers almost full performance upfront, while AMD hones their drivers over a longer time, thus making it appear as if AMD performance improves as their GPUs age.
 
Joined
Jan 23, 2016
Messages
72 (0.04/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 3600 I Core i7 6700K
Video Card(s) RX 5700 XT Nitro+ I RTX 2070 Super Zotac Mini
Mouse Logitech G Pro
Keyboard Wooting Two Lekker Edition
Adding to that, NVIDIA usually offers almost full performance upfront, while AMD hones their drivers over a longer time, thus making it appear as if AMD performance improves as their GPUs age.

This isn't entirely fair either. Nvidia does tweak performance over time too. They are engineers, not posthuman genetically engineered entities with future-seeing capabilities. They still need time.

As for AMD - they have a smaller team, so things like long-term performance do need more work, for sure. This is where the Fine Wine thing came from.
It is good, though. As long as AMD prices products on their performance at release, it is A-OK to have drivers improve them further. It means you paid a fair price once and get better performance long term. That is my thought process, and I used it for Turing too (which really did improve nicely over time).
 
Joined
Jan 3, 2015
Messages
1,879 (0.82/day)
System Name Suffering from random access memory loss.
Processor Intel Core i7 980X @ 4.42 GHz (when benching, up to 4.77 GHz)
Motherboard ASUS P6X58D Premium
Cooling Noctua NH-D14 CPU cooler with 3 x noctua nf-f12 industrialppc-3000 pwm 120 MM fans
Memory 12 GB DDR3 CORSAIR 1600 MHz TIMINGS 9-9-9-24
Video Card(s) EVGA GeForce GTX 1080 Ti SC2 GAMING
Storage 2 x Samsung 950 PRO 256 GB M.2 NVME SSD, Crucial MX300 2 TB SSD, WD RED 4 TB + WD AV-GP 2 TB HDD.
Display(s) 24" SAMSUNG
Case ANTEC TWELVE HUNDRED
Audio Device(s) ONBOARD realtek sound card.
Power Supply THERMALTAKE TOUGHPOWER 1500 WATT
Mouse Logitech G903 Lightspeed Wireless Gaming Mouse
Keyboard LOGITECH G19s
Software WINDOWS 10 PRO 64 BITS
Benchmark Scores https://www.3dmark.com/fs/13514527 https://www.3dmark.com/spy/2333033
I'm not sure where I stand right now, frankly. AMD's cards really have been a good thing, for sure. Nvidia now has serious competition.

But here comes my concern. While I have made the choice to move to Zen 3, I'm not so sure on Big Navi yet. First of all, I have only had Nvidia since, like, ever. I have never owned an AMD card in my entire life.

Big Navi offers more VRAM and is a little bit cheaper. That is great. But will the drivers be the Radeon 5000 series all over again, with bugs and problems that were still an issue long after launch? The last thing I want is drivers with issues and problems. So far, AMD will have to convince me away from Nvidia; they will have to show that not only the hardware is good, but also their software side. Also, it can clearly be seen that ray tracing on Big Navi is in its infancy and not on par with Nvidia just yet.

So I guess my next GPU choice will be between the 6900 XT and the rumored RTX 3080 Ti, also depending on performance and on driver experience and optimization. Nvidia also has some other features, like streaming and AI software. In short, I am not totally convinced to go Big Navi yet. Drivers first of all have to be a good experience; bugs and errors will only piss me off. Then ray tracing will have to mature as well. I might end up going Nvidia again, so AMD, convince me to go Radeon.
 
Joined
Jan 8, 2017
Messages
6,601 (4.23/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Deepcool Gammaxx L240 V2
Memory 16GB - Corsair Vengeance LPX - 3333 Mhz CL16
Video Card(s) OEM Dell GTX 1080 with Kraken G12 + Water 3.0 Performer C
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Deepcool Matrexx 70
Power Supply GPS-750C
Adding to that, NVIDIA usually offers almost full performance upfront, while AMD hones their drivers over a longer time, thus making it appear as if AMD performance improves as their GPUs age.

The dilemma of the Nvidia fanboy.

It used to be that they denied the existence of "Fine Wine"; now apparently their drivers do improve performance over time. Man, this is so strange.
 