
AMD Radeon RX 6800 XT

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
21,966 (3.55/day)
Processor Core i7-8700K
Memory 32 GB
Video Card(s) RTX 3080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
"Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. GTX 1060 will not drive 4K, no matter if it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, we're talking 8 GB here, as a large portion of the console's total memory is used for the OS, game and game data. I think I'd definitely be interested in a RX 6700 Series with just 8 GB VRAM, at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product, and has twice the VRAM at the same time, so really no reason to complain."

Wizzard, no offence, but this part is wrong.
There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K and in Wolfenstein II with manually enabled maximum settings (above Mein Leben) at 1440p.

For people who mod, the actual PCMR users, not just the ones who think opening an ini file every 5 years makes them hardcore, this does matter too.

More VRAM is always good. Personally, 8 GB is obsolete for me at 1440p and 4K. 10 GB is not, but it's close, since I can break it if I want to. Still, just because you never used something does not mean that it isn't important.
Yeah, no doubt, you can always make games run badly by changing settings or replacing textures. But these are very much edge cases, maybe 1,000 gamers out of 100 million? (Making up random numbers.)

More = good, but more = $, so more != good ;)
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,494 (0.33/day)
Location
Pittsburgh, PA
System Name Custom AMD Rig
Processor AMD Ryzen™ 7 3800X
Motherboard ASUS TUF GAMING X570-PLUS (WI-FI)
Cooling EVGA CLC 280mm AIO Liquid Cooler
Memory G.SKILL TridentZ 32GB (8GBx4) F4-3200C16-8GTZR
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA GAMING 10GB
Storage 250GB Samsung 970 EVO NVMe, 2TB Inland Premium NVMe, 1TB Crucial MX500 SATA, 4TB WD Blue SATA
Display(s) Acer Nitro XV340CK Pbmiipphzx 34" UWQHD 1440p, LG 27GN850-B UltraGear 27" 1440p 144 Hz
Case NZXT H510i Matte White
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Blue Yeti
Power Supply Corsair RMx Series RM750x 750W
Mouse Kingston HyperX Pulsefire Haste
Keyboard Kingston HyperX Alloy Origins Core
Software Windows 10 Pro 64-bit 20H2
More VRAM is always more expensive. Especially for NVIDIA, which uses GDDR6X that is exclusive to it.

This is probably the reason why they stuck with a 320-bit bus and limited the RTX 3080 to just 10 GB. Having more exclusive/proprietary/expensive GDDR6X chips would've pushed up the price.
 
Joined
Jul 23, 2019
Messages
52 (0.08/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf @ 5.0ghz
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling NZXT Kraken X62
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Gigabyte Geforce RTX 2070 Super Gaming OC 8g
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
More VRAM is always more expensive. Especially for NVIDIA, which uses GDDR6X that is exclusive to it.

FTFY.

And these two games are outliers which, I presume, could have been fixed had they been released more recently. Lastly, good luck finding any visual differences between Uber and Ultra textures on your 4K monitor.
The thing is, benchmarks are usually run with nothing else on the test bench. I ran into the 8 GB VRAM limit when using the high-res pack in WD:L while W1zz did not, because I usually have a bunch of things running on my other monitor, and some of them (a YouTube video playing, for example) do add to VRAM usage. So, while I could do without the high-res pack, it does not seem very future-proof if I can already potentially hit the limit.
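To put rough numbers on that, here is the budget math. Every figure below is a made-up illustration of the scenario, not a measurement:

```python
# Rough VRAM headroom sketch. All usage figures are hypothetical
# illustrations, not measured values.
TOTAL_VRAM_GB = 8.0  # e.g. an 8 GB card like a 2070 Super

# Assumed background consumers on a typical desktop (illustrative)
background_gb = {
    "desktop compositor": 0.4,
    "browser with a YouTube video": 0.5,
}
game_gb = 7.2  # assumed: a game close to the 8 GB budget on its own

used = game_gb + sum(background_gb.values())
headroom = TOTAL_VRAM_GB - used
print(f"Used: {used:.1f} GB, headroom: {headroom:+.1f} GB")
# Negative headroom means spilling past the VRAM budget, even though
# the game alone would have fit.
```

The point is just that a game fitting in 8 GB on a clean test bench can still spill over once everyday background load is added.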
 

Kallan

New Member
Joined
Oct 25, 2020
Messages
5 (0.03/day)
Fanboyism at its finest. There's almost zero difference between the 10900K and the 5950X at resolutions above 1080p, and the 10900K is as fast as or faster than the 5950X at 1080p.
That might not be as true as you think, as SAM will benefit AMD CPU owners; here is one example (starts at 16:57):

 
Joined
Jan 21, 2020
Messages
109 (0.24/day)
Edit: for the record, I will quite probably recommend an AMD card to people around me who are not interested in RT.
That would be a mistake. AMD cards do not have anything similar to DLSS. DLSS adds a significant boost to the longevity of the cards: similar to how you can play 4K games natively today on an RTX 3080 (DLSS off), you will be able to play new games four years from now with DLSS on. You will not be able to do that on AMD's RX 6000 series.
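For context on why DLSS stretches a card's lifespan, the win comes from shading far fewer pixels. The per-axis scale factors below are the commonly cited DLSS 2 presets; treat them as approximate:

```python
# Back-of-envelope: how many pixels DLSS actually shades at a 4K
# output target. Per-axis scale factors are the commonly cited
# DLSS 2 presets (approximate).
TARGET = (3840, 2160)

presets = {
    "Native":      1.0,
    "Quality":     2 / 3,
    "Balanced":    0.58,
    "Performance": 0.5,
}

native_px = TARGET[0] * TARGET[1]
for name, s in presets.items():
    w, h = int(TARGET[0] * s), int(TARGET[1] * s)
    print(f"{name:>12}: {w}x{h} = {w * h / native_px:.0%} of native pixels")
```

Performance mode renders internally at 1920x1080, a quarter of the native 4K pixel count, which is where most of the frame-rate headroom comes from.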
 
Joined
Feb 23, 2019
Messages
2,340 (2.98/day)
Location
Poland
Processor Ryzen 7 3700X
Motherboard Gigabyte X570 Aorus Elite
Cooling BeQuiet Dark Rock 4
Memory 2x8 GB Crucial Ballistix Sport LT 3200 CL16 @ 3600 CL16
Video Card(s) EVGA 1060 6GB SSC
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) Acer XB273GP
Case SilverStone Primera PM01 RGB
Power Supply SeaSonic Focus Plus Gold 750W
Mouse SteelSeries Rival 300
Keyboard MK Typist (Kailh Box White)
Joined
Jan 21, 2020
Messages
109 (0.24/day)
No, they're both crap. RT is still not ready for prime time.
That's interesting :D Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing its own raytracing implementation (even though it is inferior to Nvidia's)? Why do the consoles now support raytracing? :D
 
Joined
Dec 28, 2012
Messages
1,533 (0.51/day)
Yeah, as much as I love competition, and as fast as the card might be, I'm not comfortable with prices climbing past $500 over the last couple of years.
$500 was a long time ago, when we didn't need expensive GDDR6/X buses, more expensive tracing, or more complicated coolers to deal with hotspots from tiny nodes, and when new nodes were regularly coming out. And of course you can't forget inflation: with two massive stimulus packages (in the US) in the last 10 years and tanking interest rates for most of that time, inflation is going to occur.

Also, remember that the 8800 Ultra was over $800 in 2007.


These high prices are nothing new.
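As a rough check on the inflation point (CPI values below are approximate annual-average US CPI-U figures, and $829 is the commonly cited 8800 Ultra launch price; treat both as ballpark assumptions):

```python
# Inflation-adjusting the 8800 Ultra's 2007 price into 2020 dollars.
# CPI values are approximate annual-average US CPI-U figures.
CPI_2007 = 207.3
CPI_2020 = 258.8

msrp_2007 = 829  # commonly cited 8800 Ultra launch price
adjusted = msrp_2007 * CPI_2020 / CPI_2007
print(f"${msrp_2007} in 2007 is roughly ${adjusted:.0f} in 2020 dollars")
```

That works out to roughly $1,035 in 2020 dollars, i.e. comfortably above today's $649-$699 flagship MSRPs.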
 
Joined
Jul 23, 2019
Messages
52 (0.08/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf @ 5.0ghz
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling NZXT Kraken X62
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Gigabyte Geforce RTX 2070 Super Gaming OC 8g
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
That would be a mistake. AMD cards do not have anything similar to DLSS. DLSS adds a significant boost to the longevity of the cards: similar to how you can play 4K games natively today on an RTX 3080 (DLSS off), you will be able to play new games four years from now with DLSS on. You will not be able to do that on AMD's RX 6000 series.
It will only really be true if all new games support DLSS (which is currently not the case; look at Valhalla, for example) and if the equivalent solution that AMD announced isn't up to the task. It's not really an easy choice, as it depends on a lot of things that you can't really predict.
 
Joined
Dec 28, 2012
Messages
1,533 (0.51/day)
"Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. GTX 1060 will not drive 4K, no matter if it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, we're talking 8 GB here, as a large portion of the console's total memory is used for the OS, game and game data. I think I'd definitely be interested in a RX 6700 Series with just 8 GB VRAM, at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product, and has twice the VRAM at the same time, so really no reason to complain."

Wizzard, no offence, but this part is wrong.
There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K and in Wolfenstein II with manually enabled maximum settings (above Mein Leben) at 1440p.

For people who mod, the actual PCMR users, not just the ones who think opening an ini file every 5 years makes them hardcore, this does matter too.

More VRAM is always good. Personally, 8 GB is obsolete for me at 1440p and 4K. 10 GB is not, but it's close, since I can break it if I want to. Still, just because you never used something does not mean that it isn't important.

Wolfenstein: The New Order was one of the two games to expose the limitations of the 2 GB framebuffer on the 680/770 cards way back when, the other being Forza 4. But most other games ran perfectly fine, and by the time the 2 GB limit actually became significant, the 680's performance class was in the range of 4 GB 960s and 770 usage was nearly nonexistent.

I'm not saying VRAM limits can't happen, but the doom and gloom over Nvidia's 10 GB buffer is way overhyped. A handful of games with manual settings that obliterate VRAM usage does not mean 10 GB isn't enough for 99% of PC gamers, even at the high end.
 
Joined
Jul 8, 2019
Messages
142 (0.22/day)
It will only really be true if all new games support DLSS (which is currently not the case; look at Valhalla, for example) and if the equivalent solution that AMD announced isn't up to the task. It's not really an easy choice, as it depends on a lot of things that you can't really predict.

It will only matter if demanding games, i.e. most AAA titles, get DLSS. For the rest (indie or AA games) you don't need it, as they will run at 4K native.
 
Joined
Apr 6, 2015
Messages
184 (0.08/day)
Location
Japan
System Name ChronicleScienceWorkStation
Processor AMD Threadripper 1950X
Motherboard Asrock X399 Taichi
Cooling Noctua U14S-TR4
Memory G.Skill DDR4 3200 C14 16GB*4
Video Card(s) AMD Radeon VII
Storage Samsung 970 Pro*1, Kingston A2000 1TB*2 RAID 0, HGST 8TB*5 RAID 6
Case Lian Li PC-A75X
Power Supply Corsair AX1600i
Software Proxmox 6.2
$500 was a long time ago, when we didn't need expensive GDDR6/X buses, more expensive tracing, or more complicated coolers to deal with hotspots from tiny nodes, and when new nodes were regularly coming out. And of course you can't forget inflation: with two massive stimulus packages (in the US) in the last 10 years and tanking interest rates for most of that time, inflation is going to occur.

Also, remember that the 8800 Ultra was over $800 in 2007.


These high prices are nothing new.
LOL
Back then I was a proud 8800 GTS 320 MB owner, until a month later they dropped the 8800 GT with hardware HD decoding.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,494 (0.33/day)
Location
Pittsburgh, PA
System Name Custom AMD Rig
Processor AMD Ryzen™ 7 3800X
Motherboard ASUS TUF GAMING X570-PLUS (WI-FI)
Cooling EVGA CLC 280mm AIO Liquid Cooler
Memory G.SKILL TridentZ 32GB (8GBx4) F4-3200C16-8GTZR
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA GAMING 10GB
Storage 250GB Samsung 970 EVO NVMe, 2TB Inland Premium NVMe, 1TB Crucial MX500 SATA, 4TB WD Blue SATA
Display(s) Acer Nitro XV340CK Pbmiipphzx 34" UWQHD 1440p, LG 27GN850-B UltraGear 27" 1440p 144 Hz
Case NZXT H510i Matte White
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Blue Yeti
Power Supply Corsair RMx Series RM750x 750W
Mouse Kingston HyperX Pulsefire Haste
Keyboard Kingston HyperX Alloy Origins Core
Software Windows 10 Pro 64-bit 20H2
That's interesting :D Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing its own raytracing implementation (even though it is inferior to Nvidia's)? Why do the consoles now support raytracing? :D

While RT is being implemented now, it is still not at an acceptable performance level in currently released games. If enabling RT only cost a 10% to 20% performance hit (like 8x AA did in previous years) while giving a noticeable uplift in graphics fidelity, it would be acceptable. But a drop of 50% or more? No.

Also, you can safely assume that games on consoles will not be using RT at their max, but only implement subtle visual improvements (like light mirror effects and such).
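To make that comparison concrete (100 FPS is just an illustrative baseline, not a measurement from any particular game):

```python
# What those percentage hits mean in frame rate and frame time,
# starting from an illustrative 100 FPS baseline.
baseline_fps = 100.0

for hit in (0.10, 0.20, 0.50):
    fps = baseline_fps * (1 - hit)
    frametime_ms = 1000 / fps
    print(f"{hit:.0%} hit -> {fps:.0f} FPS ({frametime_ms:.1f} ms/frame)")
```

A 20% hit leaves 80 FPS (12.5 ms/frame), while a 50% hit doubles the frame time to 20 ms, which is why the latter is so much harder to accept.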
 
Joined
Jan 21, 2020
Messages
109 (0.24/day)
That might not be as true as you think, as SAM will benefit AMD CPU owners; here is one example (starts at 16:57):

You responded to a post about how "fanboyism is bad" with the biggest AMD fanboy page on the internet, HardwareUnboxed? Oh come on. You can't trust their results at all. Their results have been significantly tilted in favor of AMD compared to the rest of the internet for their entire existence.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,519 (3.22/day)
Location
Longmont, CO
System Name Please god I need a GPU!
Processor Intel Core i7 8700k @ 4.8GHz 1.28v
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling 2x EK PE360 | EK CPU and GPU WB | Full hard line tubing | Singularity Resonance Single res/pump
Memory Corsair Vengeance Pro RGB 32GB 3200 14-14-14-34
Video Card(s) MSI GTX1070 Gaming X -> Asus TUF RTX3080 or Evga XC3 RTX3080
Storage 1TB Samsung 970 EVO 2TB Samsung 970 Evo Plus
Display(s) Dell S3220DGF 32" 1440p Freesync 2 (G-Sync) HDR 165Hz | 2x Asus VP249QGR 144Hz IPS
Case Lian Li PC-011D
Audio Device(s) Realtek 1220 w/ Sennheiser Game Ones
Power Supply Seasonic Flagship Prime Ultra Platinum 850
Mouse Razer Viper
Keyboard Razer Huntsman Tournament Edition
Software Windows 10 Pro 64-Bit
Thanks for the review.


RT is the antithesis of DLSS. With one you increase image quality; with the other you decrease resolution, and image quality, for a performance boost.


I imagine AMD will make the same move as soon as the node is available. Do you have insider information that TSMC plans on barring AMD from using the node to make Nvidia look better :eek:?

So far TSMC is barring everyone from using 5nm except for Apple...
 
Joined
Oct 17, 2014
Messages
6,168 (2.60/day)
Location
USA
System Name Paladius Tacet
Processor Ryzen 5600x
Motherboard MSI X570 Tomahawk
Cooling Arctic Freezer 34 DUO (custom aggressive fan curve)
Memory G.Skill 2x16 3600 14-14-14-34 Dual Rank
Video Card(s) Navi 6800 + Rage Mode + OC
Display(s) Acer Nitro XF243Y 23.8" 0.5ms IPS 165hz 1080p
Case Corsair 110Q Silent + NZXT Aer-P exhaust fan
Power Supply EVGA 700w Gold
Mouse Razer Naga X (2021 Edition)
Honestly, I am glad I got the 6800 non-XT. I imagine with my super-tuned RAM at 3600 CL14-14-14 (which I already have stable on my 5600X), Rage Mode enabled, a medium OC, and Smart Access Memory enabled, I will be nipping at the heels of a 3080 even with the non-XT.

But mainly, since I game at 1080p 165 Hz or 1440p 144 Hz, the 6800 with all that stuff mentioned above maxes out the frame rate anyway... so yeah, I'm set, and I saved $80 on top of that. I would have liked a 6800 XT for sure, but I am just thankful I got what I got.

Also love the title... "Nvidia is in trouble", haha. Indeed, glorious times.
 
Joined
Jan 21, 2020
Messages
109 (0.24/day)
It will only be really true if all new games support DLSS (which is not the case actually, look at valhalla for example) and if the equivalent solution that AMD announced isn't up for the task. It's not really an easy choice as it depend on a lot of things that you can't really predict.
You can't expect an AMD sponsored game to support DLSS. That is a very bad (and minority) example.
 
Joined
Dec 28, 2012
Messages
1,533 (0.51/day)
RDNA2 is shaping up to be the next Evergreen series. It wouldn't surprise me to see AMD ripping a significant chunk of market share back from Nvidia.

The AIB 6800 XTs are going to be awesome with beefier power delivery and higher limits. And given how slowly the fans spin at stock, there is plenty of thermal headroom as well.

Now I really want to see what the 6900 XT is capable of, with the 6800 XT OC already tickling the 3090 in Nvidia's golden samples.
 
Joined
Jan 21, 2020
Messages
109 (0.24/day)
While RT is being implemented now, it is still not at an acceptable performance level in currently released games. If having RT enabled only gave like a 10% to 20% performance hit (like doing 8x AA in the previous years) while giving a noticeable uplift in graphics fidelity, then it would be acceptable. But a drop of 50% or more? No.

Also, you can safely assume that games on consoles will not be using RT at their max, but only implement subtle visual improvements (like light mirror effects and such).
I can play Control in 4K with full raytracing right now and get a massive quality increase out of it. Now add Minecraft and Cyberpunk 2077, also with massive quality improvements gained from raytracing.
 
Joined
Jul 23, 2019
Messages
52 (0.08/day)
Location
France
System Name Computer
Processor Intel Core i9-9900kf @ 5.0ghz
Motherboard MSI MPG Z390 Gaming Pro Carbon
Cooling NZXT Kraken X62
Memory 32GB G.Skill Trident Z DDR4-3600 CL16-19-19-39
Video Card(s) Gigabyte Geforce RTX 2070 Super Gaming OC 8g
Storage A Bunch of Sata SSD and some HD for storage
Display(s) Asus MG278q (main screen)
You can't expect an AMD sponsored game to support DLSS. That is a very bad (and minority) example.
There will always be AAA games sponsored by AMD; my point was just that you can't count on DLSS for every demanding game that comes out in the future. So while I agree with you that it is a nice edge for Nvidia to take into account, it's not a very reliable one.
 
Joined
Oct 17, 2014
Messages
6,168 (2.60/day)
Location
USA
System Name Paladius Tacet
Processor Ryzen 5600x
Motherboard MSI X570 Tomahawk
Cooling Arctic Freezer 34 DUO (custom aggressive fan curve)
Memory G.Skill 2x16 3600 14-14-14-34 Dual Rank
Video Card(s) Navi 6800 + Rage Mode + OC
Display(s) Acer Nitro XF243Y 23.8" 0.5ms IPS 165hz 1080p
Case Corsair 110Q Silent + NZXT Aer-P exhaust fan
Power Supply EVGA 700w Gold
Mouse Razer Naga X (2021 Edition)
Your own review shows the 6800 XT as 5% slower in raster vs. the 3080 and getting stomped in DXR.

how is Nvidia in trouble over $50 price difference?

Nvidia cards don't OC, for one thing; the new ones don't. And AMD OCs very well, surpassing the 3080 really across the board with both OC'd.

Also, competition is just great for the PC gaming industry... so just be happy and move on with life?
 
Joined
Jan 21, 2020
Messages
109 (0.24/day)
There will always be AAA games sponsored by AMD; my point was just that you can't count on DLSS for every demanding game that comes out in the future. So while I agree with you that it is a nice edge for Nvidia to take into account, it's not a very reliable one.
AMD-sponsored games are a small percentage of all the games coming out. So recommending Nvidia is actually very reliable for GPU longevity.
 
Joined
Dec 28, 2012
Messages
1,533 (0.51/day)
Your own review shows the 6800 XT as 5% slower in raster vs. the 3080 and getting stomped in DXR.

how is Nvidia in trouble over $50 price difference?
Nvidia has practically zero OC headroom. AMD has decent headroom and is restricted by power limits; AIB products with higher limits and more memory OC will expand that further. With OC factored in, the 6800 XT totally closes the gap with Nvidia.

Raytracing continues to be a selling point only for a tiny minority of users who love the game Control; outside of that game, raytracing is practically vaporware. A $50 difference means AMD gets more attention; that's enough to convince people with the cards being so close, and Nvidia doesn't have a lot of headroom to cut prices on a GA die it can't seem to make in any large numbers.
 