
AMD Radeon RX 6800 XT

"Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. GTX 1060 will not drive 4K, no matter if it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, we're talking 8 GB here, as a large portion of the console's total memory is used for the OS, game and game data. I think I'd definitely be interested in a RX 6700 Series with just 8 GB VRAM, at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product, and has twice the VRAM at the same time, so really no reason to complain."

W1zzard, no offence, but this part is wrong.
There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K, and in Wolfenstein 2 with manually enabled maximum settings (above Mein Leben) at 1440p.

For people who mod (the actual PCMR users, not just the ones who think opening an ini file every five years makes them hardcore), this matters too.

More VRAM is always good. Personally, 8 GB is obsolete for me at 1440p and 4K. 10 GB is not obsolete, but it's close, since I can break it if I want to. Still, just because you never used something does not mean that it isn't important.
Yeah, no doubt, you can always make games run badly by changing settings or replacing textures. But these are very much edge cases; maybe 1,000 gamers out of 100 million? (making up random numbers).
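To put rough numbers on why modded texture packs blow past the usual budgets so quickly: a single uncompressed 4096x4096 RGBA8 texture is about 85 MiB with its mip chain, versus roughly 21 MiB when block-compressed. A back-of-the-envelope sketch (the texture counts are made up purely for illustration):

```python
# Back-of-the-envelope VRAM cost of a modded texture pack (illustrative numbers only).

def texture_mib(width, height, bytes_per_texel, with_mips=True):
    """Approximate size of one texture in MiB; a full mip chain adds ~33%."""
    base = width * height * bytes_per_texel / (1024 ** 2)
    return base * (4 / 3 if with_mips else 1)

uncompressed = texture_mib(4096, 4096, 4)  # RGBA8, no compression: ~85 MiB
compressed   = texture_mib(4096, 4096, 1)  # BC7 block compression:  ~21 MiB

print(f"200 uncompressed 4K textures:   ~{200 * uncompressed / 1024:.1f} GiB")
print(f"200 BC7-compressed 4K textures: ~{200 * compressed / 1024:.1f} GiB")
```

Swap uncompressed mod textures in for the shipped compressed ones and an 8-10 GB card runs out of room very fast, while the same content block-compressed fits comfortably.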

More = good, but more = $, so more != good ;)
 
More VRAM is always more expensive, especially for NVIDIA, which uses GDDR6X that's exclusive to it.

This is probably the reason why they stuck with a 320-bit bus and limited the RTX 3080 to just 10 GB. Having more exclusive/proprietary/expensive GDDR6X chips would've pushed up the price.
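For anyone curious how the bus width and capacity tie together: each GDDR6/6X chip sits on its own 32-bit channel, so a 320-bit bus means ten chips, and with the 1 GB GDDR6X chips available at launch that lands at exactly 10 GB. A minimal sketch of the arithmetic (the chip densities are the publicly known launch configurations, stated here as assumptions):

```python
# VRAM capacity implied by memory bus width and per-chip density.
# Each GDDR6/GDDR6X chip uses a 32-bit interface; densities below assume
# the launch configurations (1 GB GDDR6X chips, 2 GB GDDR6 chips).

def vram_capacity_gb(bus_width_bits: int, gb_per_chip: int) -> int:
    chips = bus_width_bits // 32   # one 32-bit channel per memory chip
    return chips * gb_per_chip

print("RTX 3080:  ", vram_capacity_gb(320, 1), "GB")   # 10 chips x 1 GB = 10 GB
print("RX 6800 XT:", vram_capacity_gb(256, 2), "GB")   #  8 chips x 2 GB = 16 GB
```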
 
More VRAM is always more expensive, especially for NVIDIA, which uses GDDR6X that's exclusive to it.

FTFY.

And these two games are outliers, and I presume the issue would have been fixed had they been released more recently. Lastly, good luck finding any visual differences between Uber and Ultra textures on your 4K monitor.
The thing is, benchmarks are usually run with nothing else on the test bench. I did run into the 8 GB VRAM limit when using the high-res pack in WD:L while W1zz did not, because I usually have a bunch of things running on my other monitor, and some of them (a YouTube video playing, for example) do add to VRAM usage. So, while I could do without the high-res pack, it does not seem very future-friendly if I can already potentially hit the limit.
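If you want to see how much of the budget background apps (browser with a video, second monitor, etc.) already eat before a game even launches, a minimal sketch like the one below works on NVIDIA cards via NVML using the third-party pynvml package; AMD users would need a different tool, such as the driver's own overlay.

```python
# Quick check of used vs. total VRAM on an NVIDIA GPU via NVML.
# Requires the third-party 'pynvml' package; run it with your usual
# desktop clutter (browser, video, second monitor) open to see the baseline.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)       # values reported in bytes
print(f"Used:  {mem.used  / 1024**3:.2f} GiB")
print(f"Free:  {mem.free  / 1024**3:.2f} GiB")
print(f"Total: {mem.total / 1024**3:.2f} GiB")
pynvml.nvmlShutdown()
```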
 
Fanboyism at its finest. There's almost zero difference between the 10900K and the 5950X at resolutions above 1080p, and the 10900K is as fast or faster than the 5950X at 1080p.
That might not be as true as you think, as SAM will benefit AMD CPU owners. Here is one example from Hardware Unboxed (starts at 16:57).

 
Edit: for the record, I will quite probably recommend an AMD card to people in my entourage who are not interested in RT.
That would be a mistake. AMD cards do not have anything similar to DLSS. DLSS adds a significant boost to the longevity of the cards: just as you can play 4K games natively today on an RTX 3080 (DLSS off), you will be able to play new games four years from now with DLSS on. You will not be able to do that on AMD's RX 6000 series.
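The longevity argument is essentially a pixel-count argument: the fewer pixels the GPU shades internally, the more headroom is left for heavier future games. A rough sketch using the commonly cited DLSS 2.x per-axis render scales (approximate figures, treated here as assumptions rather than anything from this thread):

```python
# Pixel-count savings from DLSS-style upscaling at a 3840x2160 output.
# Per-axis render scales approximate the commonly cited DLSS 2.x modes
# (Quality ~0.667, Balanced ~0.58, Performance 0.5).
OUTPUT = (3840, 2160)
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

out_pixels = OUTPUT[0] * OUTPUT[1]
for mode, scale in MODES.items():
    w, h = int(OUTPUT[0] * scale), int(OUTPUT[1] * scale)
    print(f"{mode:12s} renders {w}x{h} -> {out_pixels / (w * h):.2f}x fewer pixels shaded")
```

Performance mode at a 4K output, for example, shades roughly a quarter of the pixels of native 4K, which is where the "play it again in four years" headroom comes from.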
 
No, they're both crap. RT is still not ready for prime time.
That's interesting :D Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing their own raytracing implementation (even though inferior to Nvidia's)? Why do the consoles now support raytracing? :D
 
Yeah, as much as I love competition, and as fast as the card might be, I'm not comfy with the price hike past $500 over the last couple of years.
$500 was a long time ago, back when we didn't need expensive GDDR6/6X memory buses, more expensive ray tracing hardware, or more complicated coolers to deal with hotspots from tiny nodes, and when new nodes were regularly coming out. And of course you can't forget inflation: with two massive stimulus packages (in the US) in the last 10 years and tanking interest rates for most of that time, inflation is going to occur.

Also, remember that the 8800 Ultra was over $800 in 2007.


These high prices are nothing new.
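On the inflation point specifically, a quick back-of-the-envelope adjustment makes the comparison fairer. Assuming a rough 2% average annual US inflation rate (an assumption for illustration, not exact CPI data):

```python
# Rough inflation adjustment from 2007 to 2020, assuming ~2% average annual
# US inflation (illustrative assumption, not exact CPI figures).
def adjust(price, years, rate=0.02):
    return price * (1 + rate) ** years

for label, price in [("$500 card (2007)", 500), ("8800 Ultra (~$800, 2007)", 800)]:
    print(f"{label}: ~${adjust(price, 2020 - 2007):.0f} in 2020 dollars")
```

Under that assumption a $500 card from 2007 is roughly $650 in 2020 money, and the 8800 Ultra lands north of $1,000.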
 
That would be a mistake. AMD cards do not have anything similar to DLSS. DLSS adds a significant boost to the longevity of the cards: just as you can play 4K games natively today on an RTX 3080 (DLSS off), you will be able to play new games four years from now with DLSS on. You will not be able to do that on AMD's RX 6000 series.
It will only really be true if all new games support DLSS (which is not actually the case, look at Valhalla for example) and if the equivalent solution that AMD announced isn't up to the task. It's not really an easy choice, as it depends on a lot of things that you can't really predict.
 
"Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. GTX 1060 will not drive 4K, no matter if it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, we're talking 8 GB here, as a large portion of the console's total memory is used for the OS, game and game data. I think I'd definitely be interested in a RX 6700 Series with just 8 GB VRAM, at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product, and has twice the VRAM at the same time, so really no reason to complain."

W1zzard, no offence, but this part is wrong.
There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K, and in Wolfenstein 2 with manually enabled maximum settings (above Mein Leben) at 1440p.

For people who mod (the actual PCMR users, not just the ones who think opening an ini file every five years makes them hardcore), this matters too.

More VRAM is always good. Personally, 8 GB is obsolete for me at 1440p and 4K. 10 GB is not obsolete, but it's close, since I can break it if I want to. Still, just because you never used something does not mean that it isn't important.

Wolfenstein: The New Order was one of the two games to expose the limitations of the 2 GB framebuffer on the 680/770 cards way back when, the other being Forza 4. But most other games ran perfectly fine, and by the time the 2 GB limit actually became a significant one, the 680 performance class was in the range of 4 GB 560s and 770 usage was nearly non-existent.

Not saying VRAM limits can't happen, but the doom and gloom over NVIDIA's 10 GB buffer is way overhyped. A handful of games with manual settings that obliterate VRAM usage != 10 GB not being enough for 99% of PC gamers, even on the high end.
 
It will only really be true if all new games support DLSS (which is not actually the case, look at Valhalla for example) and if the equivalent solution that AMD announced isn't up to the task. It's not really an easy choice, as it depends on a lot of things that you can't really predict.

It will only matter if demanding games, i.e. most AAA titles, get DLSS. For the rest (indie or AA games) you don't need it, as they will run at 4K natively.
 
$500 was a long time ago, back when we didn't need expensive GDDR6/6X memory buses, more expensive ray tracing hardware, or more complicated coolers to deal with hotspots from tiny nodes, and when new nodes were regularly coming out. And of course you can't forget inflation: with two massive stimulus packages (in the US) in the last 10 years and tanking interest rates for most of that time, inflation is going to occur.

Also, remember that the 8800 Ultra was over $800 in 2007.


These high prices are nothing new.
LOL
I was a proud 8800 GTS 320 MB owner back then, until a month later they dropped the 8800 GT with hardware HD decoding.
 
That's interesting :D Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing their own raytracing implementation (even though inferior to Nvidia's)? Why do the consoles now support raytracing? :D

While RT is being implemented now, it is still not at an acceptable performance level in currently released games. If having RT enabled only gave like a 10% to 20% performance hit (like doing 8x AA in the previous years) while giving a noticeable uplift in graphics fidelity, then it would be acceptable. But a drop of 50% or more? No.

Also, you can safely assume that games on consoles will not be using RT at their max, but only implement subtle visual improvements (like light mirror effects and such).
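To put the performance-hit comparison above in concrete numbers (the 100 fps baseline is arbitrary, chosen purely for illustration):

```python
# What a given percentage performance hit means in frame rate and frame time,
# using an arbitrary 100 fps baseline for illustration.
baseline_fps = 100
for hit in (0.10, 0.20, 0.50):
    fps = baseline_fps * (1 - hit)
    print(f"{hit:.0%} hit -> {fps:.0f} fps ({1000 / fps:.1f} ms per frame)")
```

A 10-20% hit keeps you in high-refresh territory; a 50% hit turns 100 fps into 50 fps and doubles the frame time.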
 
That might not be as true as you think, as SAM will benefit AMD CPU owners. Here is one example from Hardware Unboxed (starts at 16:57).

You responded to a post about how "fanboyism is bad" with the biggest AMD fanboy page on the internet - HardwareUnboxed? Oh come on. You can't trust their results at all. Their results have been significantly tilted in favor of AMD compared to the rest of the internet for their entire existence.
 
Thanks for the review.


RT is the antithesis of DLSS. You increase image quality only to decrease resolution, and IQ, for a performance boost.


I imagine AMD will make the same move as soon as the node is available. Do you have insider information that TSMC plans on barring AMD from using the node to make Nvidia look better :eek:?

So far TSMC is barring everyone from using 5nm except for Apple...
 
Honestly, I am glad I got the 6800 non-XT. I imagine with my super-tuned RAM at 3600 CAS 14-14-14, which I already have stable on my 5600X, plus Rage Mode enabled, a medium OC, and Smart Access Memory enabled, I will be nipping at the heels of a 3080 even with the non-XT.

But mainly, since I game at 1080p 165 Hz or 1440p 144 Hz, the 6800 with all that stuff mentioned above maxes out the frame rate anyway... so yeah, I'm set, and I saved $80 on top of that. I would have liked a 6800 XT for sure, but I am just thankful I got what I got.

Also love the title... "NVIDIA is in trouble". Haha, indeed, glorious times.
 
It will only really be true if all new games support DLSS (which is not actually the case, look at Valhalla for example) and if the equivalent solution that AMD announced isn't up to the task. It's not really an easy choice, as it depends on a lot of things that you can't really predict.
You can't expect an AMD-sponsored game to support DLSS. That is a very bad (and minority) example.
 
RDNA 2 is shaping up to be the next Evergreen series. Wouldn't surprise me to see AMD ripping a significant chunk of market share back from Nvidia.

The AIB 6800 XTs are going to be awesome with beefier power delivery and higher power limits. And given how slowly the fans spin at stock, there is plenty of thermal headroom as well.

Now I really want to see what the 6900 XT is capable of, with the 6800 XT OC tickling the 3090 in Nvidia's golden examples.
 
While RT is being implemented now, it is still not at an acceptable performance level in currently released games. If having RT enabled only gave like a 10% to 20% performance hit (like doing 8x AA in the previous years) while giving a noticeable uplift in graphics fidelity, then it would be acceptable. But a drop of 50% or more? No.

Also, you can safely assume that games on consoles will not be using RT at their max, but only implement subtle visual improvements (like light mirror effects and such).
I can play Control in 4K with full ray tracing right now and get a massive quality increase out of it. Now add Minecraft and Cyberpunk 2077, also with massive quality improvements gained from ray tracing.
 
You can't expect an AMD-sponsored game to support DLSS. That is a very bad (and minority) example.
There will always be AAA games sponsored by AMD; my point was just that you can't count on DLSS for every demanding game that comes out in the future. So while I agree with you that it is a nice edge for Nvidia to take into account, it's not a very reliable one.
 
Your own review shows the 6800 XT as 5% slower in raster vs. the 3080 and getting stomped in DXR.

How is Nvidia in trouble over a $50 price difference?

Nvidia cards don't OC, for one thing, at least the new ones don't, and AMD OCs very well, surpassing the 3080 across the board even with both OC'd.

Also, competition is just great for the PC gaming industry... so just be happy and move on with life?
 
There will always be AAA games sponsored by AMD; my point was just that you can't count on DLSS for every demanding game that comes out in the future. So while I agree with you that it is a nice edge for Nvidia to take into account, it's not a very reliable one.
AMD-sponsored games are a small percentage of all the games coming out. So recommending Nvidia is actually very reliable for GPU longevity.
 
Your own review shows the 6800 XT as 5% slower in raster vs. the 3080 and getting stomped in DXR.

How is Nvidia in trouble over a $50 price difference?
Nvidia has practically zero OC headroom. AMD has decent headroom and is restricted by power limits; AIB products with higher limits and more memory OCing will expand that further. With OC factored in, the 6800 XT totally closes the gap with Nvidia.

Ray tracing continues to be a selling point only for a tiny minority of users who love the game Control; outside of that game, ray tracing is hilariously close to vaporware. A $50 difference means AMD getting more attention, and that's enough to convince people with the cards being so close, while Nvidia doesn't have a lot of headroom to cut prices on a GA die they can't seem to make in any large numbers.
 