
AMD Radeon RX 6800 XT

Joined
Feb 21, 2006
Messages
1,982 (0.30/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Ca.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) AMD Radeon RX 7900 XTX 24GB (24.3.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 14TB
Display(s) LG 34GP83A-B 34 Inch 21:9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c
You responded to a post about how "fanboyism is bad" with the biggest AMD fanboy page on the internet - HardwareUnboxed? Oh come on. You can't trust their results at all. Their results have been consistently and significantly tilted in favor of AMD compared to the rest of the internet for their entire existence.

Trust their results more than your post.

You are in an RX 6000 review thread, pro-Nvidia, and don't intend to buy the hardware.

That applies to the other NV fanboys in this thread. Why are you here, other than thread crapping?

AMD-sponsored games are a small percentage of all the games coming out. So recommending Nvidia is actually very reliable for GPU longevity.

Both consoles are RDNA 2 which means you will see more games supporting it from the ground up.
 
Joined
Jul 8, 2019
Messages
169 (0.10/day)
This isn't fully fair either. Nvidia does tweak performance over time too. They are engineers, not posthuman genetically engineered entities with future-seeing capabilities. They still need time.

As for AMD - they have a smaller team, so their long-term performance does need more work, for sure. This is where the "Fine Wine" thing came from.
It is a good thing, though. As long as AMD prices products on their performance at release, it is A-OK to have drivers improve them further. It means you paid a fair price once and get better performance long term. That is my thought process, and I applied it to Turing too (which really did improve nicely over time).

The AMD driver team also has a different philosophy - there are different teams working on different subsequent drivers, or even on different matters. I don't know how good the communication and management there is, but I find it weird. One team can squash a bug in one driver revision while the other does not, and it resurfaces. Why use alternating teams for driver releases? Why not just one team with different departments, each assigned its own set of bugs?
 
Joined
Mar 20, 2012
Messages
270 (0.06/day)
Anyone else find it ironically hypocritical that a person shitposting all over AMD is calling other people fanboys, while at the same time being a stalwart defender of everything Intel and Nvidia?
 
Joined
Jul 8, 2019
Messages
169 (0.10/day)
The dilemma of the Nvidia fanboy.

It used to be that they denied the existence of "Fine Wine"; now, apparently, their drivers do improve performance over time too. Man, this is so strange.

No, AMD just uncovers performance that is already there. They don't magically create additional performance. That's also why AMD usually had higher raw power on paper but failed to deliver. Imagine a situation where you have two cars, both with 120 HP, but two different drivers. One is a quick learner, so he basically uses the car's full potential from the start; the other is a slow learner and delivers the same performance only after a longer time.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360 EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
No, AMD just uncovers performance that is already there. They don't magically create additional performance. That's also why AMD usually had higher raw power on paper but failed to deliver. Imagine a situation where you have two cars, both with 120 HP, but two different drivers. One is a quick learner, so he basically uses the car's full potential from the start; the other is a slow learner and delivers the same performance only after a longer time.
Comedy double-down. I wouldn't mention power draw; some never learn!
 
Joined
Aug 15, 2017
Messages
18 (0.01/day)
I'll tell you why: because some insist a 50% performance hit is so much better than a 60% hit.

No, they're both crap. RT is still not ready for prime time.

I don't understand people like you who misrepresent facts to prove a point.


In the worst-case scenario, the 3080 drops 43%. Once you account for the improvements from DLSS, the penalty is nowhere near as bad as you suggest.
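For anyone who wants to see what those percentages actually mean side by side, here is a minimal Python sketch; the 100 FPS RT-off baseline is a made-up number purely for illustration:

```python
# Convert a percentage RT performance hit into a remaining frame rate.
# The 100 FPS baseline is hypothetical, chosen only to make the
# percentages easy to compare.
def fps_after_hit(base_fps: float, hit_percent: float) -> float:
    return base_fps * (1 - hit_percent / 100)

base = 100.0  # hypothetical RT-off frame rate
for hit in (43, 50, 60):
    print(f"{hit}% hit -> {fps_after_hit(base, hit):.0f} FPS")
# 43% -> 57 FPS, 50% -> 50 FPS, 60% -> 40 FPS: at the same baseline,
# the argument is over the difference between 57 and 40 FPS.
```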
 
Joined
Jan 8, 2017
Messages
8,940 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I don't understand people like you who misrepresent facts to prove a point.


In the worst-case scenario, the 3080 drops 43%. Once you account for the improvements from DLSS, the penalty is nowhere near as bad as you suggest.

You are right sir, excuse me.

43%, now that sounds absolutely amazing compared to 50 or 60. Losing almost half the performance is pretty good; what can I say, you have completely changed my mind, I'm sold.
 
Joined
Aug 15, 2017
Messages
18 (0.01/day)
That's interesting :D Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing their own raytracing implementation (even though it's inferior to Nvidia's)? Why do the consoles now support raytracing? :D

It's an ignorant take. Console launch titles are already implementing raytracing, and somehow it is not "ready for prime time"?

Most PC games will either be console ports or co-developed. I think raytracing performance is going to matter a lot in next gen games.

There's also this little title called Cyberpunk which will utilize raytracing...
 
Joined
Apr 19, 2011
Messages
2,198 (0.46/day)
Location
So. Cal.
I see this as RTG's "Zen" moment. Is this XT "above" the Nvidia RTX 3080? ... no. That said, this is much more than just "nipping" at the heels; this is stride for stride. This is "competition"! And what we always looked for! AMD/RTG can make a marked play on the number of sales and begin to rival Nvidia. At this point, RTG has found the momentum and only needs to focus hard and run their race.

Looking at the "supposed issues" with Nvidia and their bigger GA102 on Samsung 8nm, and whether those are inducing their issues with yield/supply: we can't say for certain, but if it continues it doesn't bode well for Nvidia. While sure, RTG has their own struggles, I don't see this initial release as a long-term problem. AMD/RTG juggling their demands at TSMC has its own challenges, but direct supplier yields are probably not one of them. AMD/RTG has probably got the "Zen 3" CPU channel loaded, and for the last couple of weeks has been loading "Navi 21" parts for reference boards and AIBs. I say RTG is set well to grab the after-Christmas funds.

Either side... either card, if you can find one in your cart... lucky you! :toast:
 
Joined
Jul 8, 2019
Messages
169 (0.10/day)
I really hope that the Samsung Nvidia node isn't as crappy as people make it out to be, because otherwise, if Nvidia moves to a smaller TSMC process, AMD will be spanked again. I hope that's not the case, and that with RDNA3 they will be on par with Nvidia in both performance and features in all fields. People think I'm an Nvidia fanboy - I currently own both an AMD CPU and an AMD GPU. I'm just jaded after the 5700 gave me headaches for months. I don't approach AMD with rose-tinted glasses, and before this I had Nvidia GPUs almost exclusively, bar one HD-series GPU. I know both sides of the equation.
 
Joined
Mar 27, 2017
Messages
6 (0.00/day)
Fixed, thanks! GPU-Z has the wrong value, too

I seriously believe something has gone wrong with the benchmarks here, which don't seem to agree with the majority of the major YouTube channels that benched the GPU...

In some games even the 6800 takes the lead over the 6800 XT, while the general difference between the two seems to be 2 FPS, and on top of that the 3070 (a GPU that trades blows with the 2080 Ti) comes out on top of even the RX 6800 XT...

You should seriously check whether something is wrong with your test bench hardware-wise (e.g. is dual channel enabled? is the RAM more than 16GB when testing the 6800 XT? is the CPU a flagship one?) or with the drivers on Windows and on the card itself...
 
Joined
Oct 26, 2019
Messages
137 (0.08/day)
I literally have 8 GB cards and this game. Your source is wrong or incompetent. 8GB does indeed have issues in the game at 4K.
Sure, you are very competent. It scales pretty much proportionally. So your source is grossly incompetent.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360 EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
No, AMD just uncovers performance that is already there. They don't magically create additional performance. That's also why AMD usually had higher raw power on paper but failed to deliver. Imagine a situation where you have two cars, both with 120 HP, but two different drivers. One is a quick learner, so he basically uses the car's full potential from the start; the other is a slow learner and delivers the same performance only after a longer time.


Have you literally taken my "raw power" comment as meaning higher wattage? Are you smoothbrained? What I wanted to say is that AMD GPUs often had better specs on paper - more ROPs, SMs, higher core clocks, etc. - but somehow were slower than or only on par with Nvidia. Now do you understand?
No, you're so wrong it's not right. In the right application AMD showed their performance, like Folding@home, mining, Doom.
Nvidia usually had the higher-sounding core counts, more ROPs and higher boost clocks. Were you on another planet this last decade?
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.57/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GB SSD 2x2TB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
RT-off performance is pretty good: with the higher OC headroom (8% vs 4%) and a speed deficit of around 5% while being $50 cheaper, it's right where it should be. But RT performance is terrible, and with no DLSS 2.0 equivalent that's a big disadvantage. However, all those games have been optimized for Nvidia, so we'll have to see how future titles and future game updates do. Personally, I'm not convinced that 8GB is a big disadvantage, as I haven't seen any proof - VRAM utilization doesn't prove anything, I'm talking FPS proof. That said, a 6800 with 8GB VRAM for $480 or a 6800 XT for $550 would be a great value/performance product. Unfortunately it will never happen.
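To put rough numbers on that value argument, here is a minimal Python sketch using the figures from this post (~5% raster deficit, $50 lower price) and the $699/$649 launch MSRPs; the "perf" values are normalized assumptions, not measurements:

```python
# Performance-per-dollar for the numbers quoted above. Performance is
# normalized to the RTX 3080 = 100; the ~5% deficit comes from the post,
# and $699/$649 are the launch MSRPs.
cards = {
    "RTX 3080":   {"price": 699, "perf": 100.0},
    "RX 6800 XT": {"price": 649, "perf": 95.0},
}
for name, c in cards.items():
    print(f"{name}: {100 * c['perf'] / c['price']:.2f} perf per $100")
# RTX 3080: ~14.31, RX 6800 XT: ~14.64 -- the $50 discount slightly more
# than offsets the ~5% deficit in perf/$ terms, RT and DLSS aside.
```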
 
Joined
Jul 8, 2019
Messages
169 (0.10/day)



No, you're so wrong it's not right. In the right application AMD showed their performance, like Folding@home, mining, Doom.
Nvidia usually had the higher-sounding core counts, more ROPs and higher boost clocks. Were you on another planet this last decade?

Remember GTX 1060 vs RX 480: at launch the GTX 1060 was winning despite its lower TFLOP figure, and then, as GCN/Polaris matured, the RX 480's performance improved and drew level with the GTX 1060. That's what I'm talking about.
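The TFLOP gap is easy to check from the paper specs. A quick Python sketch, using the shader counts and reference boost clocks (peak FP32 = 2 FLOPs per FMA x shaders x clock; sustained clocks vary in practice):

```python
# Peak FP32 throughput from paper specs: 2 FLOPs per FMA per shader
# per cycle, times shader count, times clock in GHz.
def tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000

print(f"GTX 1060: {tflops(1280, 1.708):.1f} TFLOPS")  # ~4.4
print(f"RX 480:   {tflops(2304, 1.266):.1f} TFLOPS")  # ~5.8
# The RX 480 has roughly a third more throughput on paper, yet lost at
# launch -- exactly the gap that driver maturity later closed.
```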
 
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
Sure, you are very competent. It scales pretty much proportionally. So your source is grossly incompetent.


I literally have the game. IDK where Wizzard is testing, but any and all late-game levels perform worse on 8GB cards.

I personally test on Urdak since it's a heavy and awesome map (perhaps the third best in the game).
My 2080 and 5700 XT both choke here. The otherwise inferior (at lower resolutions) 1080 Ti outperforms them. It loses if you use lower texture settings. It's therefore VRAM.

I'd love it if people actually owned the hardware and games before talking bull :p
 
Joined
Sep 17, 2020
Messages
21 (0.02/day)
The experience is now streamlined, meaning that game engines like Unreal Engine and Unity can support it out of the box. That in turn means a lot more adoption for DLSS going forward.

This isn't quite accurate. Yes, you can enable it as you develop and try it out, but if you want to ship a game with it then you need Nvidia's explicit blessing. That's why, even though it's easy to implement, you don't see it get widespread adoption. It's the same for every other RTX/GameWorks feature out there.
 
Joined
Feb 23, 2019
Messages
5,637 (2.99/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
And this is patently false. At the very minimum you can use texture streaming or make levels smaller. There are other methods as well.
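For what texture streaming looks like in principle, here is a deliberately tiny Python sketch; the texture names, sizes and budget are all made up for illustration, not taken from any engine:

```python
# Mip-based texture streaming in miniature: while the working set exceeds
# the VRAM budget, drop one mip level (quartering the footprint) from the
# largest texture. Names, sizes (MB) and the budget are illustration values.
def fit_to_budget(sizes_mb: dict, budget_mb: float) -> dict:
    sizes = dict(sizes_mb)
    while sum(sizes.values()) > budget_mb:
        biggest = max(sizes, key=sizes.get)
        sizes[biggest] /= 4  # one mip level down
    return sizes

textures = {"rock_albedo": 64, "npc_skin": 128, "hero_armor": 256}
print(fit_to_budget(textures, budget_mb=200))
# {'rock_albedo': 64, 'npc_skin': 32.0, 'hero_armor': 64.0} -- the scene
# still renders, just with some textures one or two mips lower.
```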

Speaking of the RT performance in Watch Dogs: Legion for AMD cards:



I'm not sure they are comparable yet. Something is definitely missing. :cool:

Better comparison.
 
Joined
Jul 8, 2019
Messages
169 (0.10/day)
Yeah, either something fishy is going on in Watch Dogs: Legion, or the difference between AMD RT and Nvidia RT is night and day. AMD probably decreased details to avoid completely hammering the frame rate. I wonder if these quality differences affect other RT games. Have reviewers compared screenshots from both the AMD and Nvidia RT implementations under close scrutiny? Maybe there are differences like these in other games too, not just Watch Dogs: Legion?
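A screenshot comparison like that is easy to do yourself if you have captures from both cards. A minimal Python sketch (the file names are placeholders; both frames need the same resolution, settings and camera position to mean anything):

```python
# Per-pixel difference between two RT captures of the same scene.
# "rt_amd.png" and "rt_nvidia.png" are placeholder file names.
from PIL import Image
import numpy as np

amd = np.asarray(Image.open("rt_amd.png").convert("RGB"), dtype=np.int16)
nv = np.asarray(Image.open("rt_nvidia.png").convert("RGB"), dtype=np.int16)
assert amd.shape == nv.shape, "captures must match in resolution"

diff = np.abs(amd - nv)
print("mean abs difference per channel:", diff.mean(axis=(0, 1)))
Image.fromarray(diff.clip(0, 255).astype(np.uint8)).save("rt_diff.png")
# Bright areas in rt_diff.png mark where the two implementations diverge,
# e.g. missing or lower-detail reflections.
```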
 
Joined
Aug 25, 2017
Messages
32 (0.01/day)
Some are a bit too emotional, like they have something at stake.
I'm not even sure what you are arguing about anymore; the fact of the matter is both GPU vendors will sell everything they make for months, because demand outstrips supply.

Defending their scalper-priced 3080 purchase, maybe? :)

Saying DLSS made your Nvidia purchase future-proof is, like, I don't know, dumb? And I'm an RTX 2080 user. If I have to rely on DLSS/lower quality to play future games, I think it's time to upgrade. DLSS future-proof my ####
 
Joined
Nov 4, 2020
Messages
38 (0.03/day)
System Name Ancient PC Gaming
Processor Intel i7 2600K @4.5GHz
Motherboard Gigabyte GA-Z77X-D3H
Cooling noctua U12p SE2
Memory Team Vulkan PC2400 DDR3 32GB
Video Card(s) NVIDIA GTX 1080 Founder Edition
Storage Adata xpg sx8200 pro 512GB (boot with Z77 NVME Bios Mod)
Display(s) LG Prototype BFGD Monitor
Case Enermax Makashi
Audio Device(s) VIA VT2021
Power Supply Seasonic Focus Plus GOLD FX-850
Mouse Logitech MI85
Keyboard Ducky Shine 2
Software Windows 10 2004
Woww... nice! Great job! kudos to AMD! :rockout:

I hope that after reading this article, many people who pre-ordered RTX 3000 cards will change their mind, cancel their pre-order and turn to an AMD card :D
Since I'm 41st in the queue now, I hope that way I'll get my new RTX 3080 before this Christmas :clap: ... LOL!!!



ps. just kidding!:roll:
 