
AMD Radeon RX 6800 XT

Thanks for the review.

I'm glad AMD has once again returned to competing at the high end of rasterization performance. The ray-tracing performance is unusable at all but the lowest resolutions, however, and without a DLSS competitor I won't be purchasing an RX 6000 card. Guess it's back to waiting for NV to make 3000-series cards available in meaningful quantity.
RT is the antithesis of DLSS: you increase image quality, only to then decrease resolution (and IQ) for a performance boost.

Then you really have to wonder what AMD is gonna do when Nvidia moves to 5nm TSMC. I don't think AMD is looking forward to that.
I imagine AMD will make the same move as soon as the node is available. Do you have insider information that TSMC plans on barring AMD from using the node to make Nvidia look better :eek:?
 
Split them up and have RT Off only and RT On only?

Or if not, use some color scheme that makes it easy to grasp when RT is on and when it's not.
Reload. How's that?
 
right now their gun is DLSS/RT.

What's funny is that almost everyone lamented those back then, even Nvidia fanboys.

I don't know about the 3070, but the 3080 can go from 320W to 220W just by undervolting. Performance will decrease by 5% tops. This means Nvidia runs these chips well past their sweet spot.

I am sure the 6800 XT undervolts as well, but I guess we'll pretend that's not the case, right?
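For anyone who wants rough numbers on that undervolting claim, here's a quick back-of-the-envelope sketch in Python. The 320 W / 220 W / ~5% figures are the ones quoted above, taken at face value, not measurements of mine:

# Efficiency math for the undervolting claim above (assumed figures).
stock_power = 320.0  # W, stock power draw
uv_power = 220.0     # W, after undervolt
uv_perf = 0.95       # fraction of stock performance retained (~5% loss)

power_cut = 1 - uv_power / stock_power
perf_per_watt_gain = uv_perf * stock_power / uv_power - 1

print(f"Power reduction: {power_cut:.0%}")           # ~31%
print(f"Perf/watt gain:  {perf_per_watt_gain:.0%}")  # ~38%

If those numbers hold, that's roughly a 38% perf/watt gain for a 5% performance hit, which does suggest the stock tuning sits well past the sweet spot.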
 
How can it be such a power hog when playing simple videos but eat 100W less when gaming?
 
Reload. How's that?
Definitely preferred; might be good enough that the graphs don't have to be split.
 
Reload. How's that?
It's immediately noticeable.
 
How can it be such a power hog when playing simple videos but eat 100W less when gaming?
Memory runs at full speed during media playback; see the table on page 32

Previous architectures didn't do that, so I think it'll be fixed soon
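If anyone wants to check this behavior themselves on Linux, the amdgpu driver exposes the memory clock states through sysfs. A minimal Python sketch, assuming the GPU is card0 (adjust the index for multi-GPU systems):

# Poll the active memory clock state while a video plays.
# pp_dpm_mclk lists states like "0: 96Mhz", with a trailing '*'
# marking the one currently in use.
from pathlib import Path
import time

MCLK = Path("/sys/class/drm/card0/device/pp_dpm_mclk")

for _ in range(10):
    lines = MCLK.read_text().splitlines()
    active = next(line for line in lines if line.endswith("*"))
    print(active)
    time.sleep(1)

If the card behaves as described, the active state stays pinned at the top memory clock during playback instead of dropping back to the idle state.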
 
This would have been an impressive card at $500.
 
The MSRP doesn't really apply outside of the US though, because, reasons.

AMD's recommended prices were specifically 6990 and 6249 respectively. Those were the numbers they gave for PR purposes, and they aren't just US MSRP + VAT.
 
This would have been an impressive card at $500.
Yeah, as much as I love competition, and as fast as the card might be, I'm not comfortable with how prices have climbed past $500 over the last couple of years.
 
How can it be such a power hog when playing simple videos but eat 100W less when gaming?

I can hardly call that a power hog, but there are almost always issues with media power consumption on new cards, and they usually get fixed sooner or later.

As someone said - horrible hardware encoder.

Man, is Nvidia's damage control task force comprised of just you? Rough times to be an Nvidia fan, huh?
 
Thanks for the review.


RT is the antithesis of DLSS: you increase image quality, only to then decrease resolution (and IQ) for a performance boost.


I imagine AMD will make the same move as soon as the node is available. Do you have insider information that TSMC plans on barring AMD from using the node to make Nvidia look better :eek:?
Oh hello??? :D Is anybody home? The RTX 3000 series is on the 8nm Samsung node, which is just an improved 10nm. The RX 6000 cards from AMD are on the bestest and latest 7nm node from TSMC - a true 7nm node. So right now AMD has the node advantage, but they still trail behind Nvidia in rasterization, and significantly so in raytracing. If both AMD and Nvidia move to 5nm TSMC, AMD's node advantage is gone. Do you now understand AMD's situation?
 
Then link actual 4K results where AMD beats top-of-the-line Intel CPUs in gaming. Or even the 9900K.
Why should I link 4K results? As I mentioned, there is no real difference between CPUs that you can experience in real life above FHD, or even at FHD with a non-flagship GPU. But you were the ones who were screaming about the same differences at FHD when Intel was in the lead. :D I'm sorry you can't understand it.
 
I am sure the 6800XT undervolted as well but I guess we'll pretend like that not the case, right ?

I'd actually like to see this. I undervolted my 3080 to a locked 0.825 V and only lost around 3 FPS in Cold War (Zombies mode, DLSS off) and PUBG, and my UPS reported a nice 85W reduction from 355W. It'd be nice to see the 6800 XT running below 180W at stock performance.

How can it be such a power hog when playing simple videos but eat 100W less when gaming?

Drivers.
 
Vaporware again. Sigh.

I don't know about the 3070, but the 3080 can go from 320W to 220W just by undervolting. Performance will decrease by 5% tops. This means Nvidia runs these chips well past their sweet spot.

Said every Vega owner.
 
"RTRT will be implemented in most triple-A titles from now on. "

Sorry, is this a fact? Can you quote it from somewhere/someone?
I'm not pro-Nvidia or pro-AMD; I was really interested in the results of this new card generation, but I have to say that the RT performance is quite disappointing.
I enjoyed RT in Control, I enjoyed RT in Legion, and I look forward to RT in Minecraft and maybe Cyberpunk 2077 if I like the game. But even if the rasterization performance were vastly superior to my 2070 Super, I would feel bad spending that much money on a 6800/6800 XT to get worse performance when it comes to RTX+DLSS, even if it's only in a few games. In the case of Legion, I think my 2070 Super without RT but with DLSS may even get better results than the 6800, although with a little bit of quality loss.

I understand that not everyone is into RT, and the new AMD cards are indeed very good in that case, but if you have any interest in RT, they are definitely not up to the task at the moment.
 
Why should I link 4K results? As I mentioned, there is no real difference between CPUs that you can experience in real life above FHD, or even at FHD with a non-flagship GPU. But you were the ones who were screaming about the same differences at FHD when Intel was in the lead. :D I'm sorry you can't understand it.
Me? Where?
 
I'm excited for Big Navi because of how it's going to affect Nvidia. I'm not going to rush out and buy an AMD card (my g-sync monitor won't support AMD cards with g-sync on and I value that far more than better performance). It's the same excitement when I see AMD's CPUs coming out, though I'm more likely to try AMD with my next build in a few years than I am to try AMD GPUs.

I'm still going to seek out a 3080 when stock is no longer an issue and it can be had at close to MSRP, but it's exciting to see competition in the GPU realm at the high end.
 
AMD's recommended prices were specifically 6990 and 6249 respectively. Those were the numbers they gave for PR purposes, and they aren't just US MSRP + VAT.
Ah, ok, I had no idea they'd released local MSRP pricing, that's not very common. Yeah, that's not really going to cut it.
 
RTRT perf on Metro is decent, but on Control it's crap, hmm...
I'm curious what the RTRT perf will be on newer titles, ones that are built not only with Nvidia in mind.
I feel like there is more to the RTRT story than what we're seeing here.
 
Do you now understand AMD's situation?

Do you?

AMD is making do with a narrower bus and slower memory; they can always bump those up and gain a considerable chunk of performance just by doing that and nothing more (rough math below). Also, caches scale really well with newer nodes in terms of power, so there is a very good chance AMD will trash Nvidia in performance/watt even harder when both get to the next node.

Also, you do realize AMD is probably already working on a future 5nm design by now, while Nvidia, I'm led to believe, is figuring out how to bring Ampere to 7nm? Do I have to explain how things aren't exactly going too well for them if that's the case?

Seems like Nvidia has just done an Intel-style upsy-daisy by screwing up their node situation.
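To put rough numbers on the bus/memory point above (public spec-sheet figures, so treat this as a sketch rather than anything authoritative):

# Peak memory bandwidth = bus width in bytes * per-pin data rate.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 16))  # RX 6800 XT, 16 Gbps GDDR6  -> 512.0 GB/s
print(bandwidth_gb_s(320, 19))  # RTX 3080,   19 Gbps GDDR6X -> 760.0 GB/s

So on raw bandwidth alone there's plenty of headroom left in a wider bus or faster memory, which is presumably the gap the big on-die cache is there to cover.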
 