
AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

Joined
Oct 17, 2014
Messages
5,344 (2.39/day)
Location
USA
System Name Paladius Tacet
Processor Ryzen 5600x @ Ryzen Master @ Auto OC
Motherboard MSI X570 Tomahawk
Cooling Arctic Freezer 34 DUO
Memory G.Skill 2x16 3600 14-14-14-34 Dual Rank
Video Card(s) Navi 6800 + Smart Access Memory Enabled
Storage 2TB SSD
Display(s) Acer Nitro XF243Y 23.8" 0.5ms IPS 165hz 1080p
Case Corsair 110Q Silent + NZXT Aer-P exhaust fan
Power Supply EVGA 700w Gold
Mouse Logitech G502 Hero SE
Keyboard Logitech Cherry Mx Red
Benchmark Scores Cinebench R20 single-core: 618; 3DMark Time Spy: 15,385 graphics score; Fire Strike: 37k; RAM latency: 57.2 ns
What the hell are you on about ?
I don't think he understands that it's easy to get high refresh rates these days. I'm playing Dragon's Dogma maxed out at 144 FPS, 1440p, with a GTX 1070... lol
 
Joined
Jan 8, 2017
Messages
6,112 (4.30/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Deepcool Gammaxx L240 V2
Memory 16GB - Corsair Vengeance LPX - 3333 Mhz CL16
Video Card(s) OEM Dell GTX 1080 with Kraken G12 + Water 3.0 Performer C
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Deepcool Matrexx 70
Power Supply GPS-750C
It means that to get >60 FPS in a single-player game with your GPU, you are mostly playing at Low settings at 4K.
Or you never actually play any games and just decided that everyone needs high FPS.
I am going to have to gather a team of researchers to try and figure out what it is that you are saying.

Why say that 60 FPS is enough for single-player games? There is absolutely no logic to that. You think everyone who has a higher-refresh display will think to themselves that they need the highest possible frame rate in a multiplayer title, but when switching to a single-player one they suddenly no longer need more than precisely 60? How the hell does that work? Truly mind-boggling.
 
Joined
Nov 11, 2016
Messages
709 (0.48/day)
System Name The de-ploughminator
Processor I7 8700K @ 5.1Ghz
Motherboard Gigabyte Z370 Gaming 5
Cooling Custom Watercooling
Memory 4x8GB G.Skill Trident Neo 3600mhz 15-15-15-30
Video Card(s) RTX 2080 Ti + Heatkiller IV wb
Storage Plextor 512GB nvme SSD
Display(s) LG 34GN850-B
Case Lian Li 011D Dynamic
Audio Device(s) Creative AE-5
Power Supply Corsair RM1000
I am going to have to gather a team of researchers to try and figure out what it is that you are saying.

Why say that 60 FPS is enough for single-player games? There is absolutely no logic to that. You think someone who has a higher-refresh display will think to themselves that they need the highest possible frame rate in a multiplayer title, but when switching to a single-player one they suddenly no longer need more than precisely 60? How the hell does that work? Truly mind-boggling.
60 FPS is enough to enjoy a single-player game at the best visual fidelity you can get.
I bought the very first 144 Hz 1440p display, but I played The Witcher 3 at 60 FPS with the best visuals; no point in downgrading visuals just to get >60 FPS.
Same with Metro Exodus and Control: why would I sacrifice visuals when I have ~60 FPS already?

Now switching to competitive games like PUBG, Modern Warfare, and Overwatch, I lower every setting just to get the highest FPS. Why wouldn't I?
 
Joined
Jan 8, 2017
Messages
6,112 (4.30/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Deepcool Gammaxx L240 V2
Memory 16GB - Corsair Vengeance LPX - 3333 Mhz CL16
Video Card(s) OEM Dell GTX 1080 with Kraken G12 + Water 3.0 Performer C
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Deepcool Matrexx 70
Power Supply GPS-750C
no point in downgrading visuals just to get >60 FPS.
Still can't figure out that this is a purely subjective conclusion?
 
Joined
Nov 11, 2016
Messages
709 (0.48/day)
System Name The de-ploughminator
Processor I7 8700K @ 5.1Ghz
Motherboard Gigabyte Z370 Gaming 5
Cooling Custom Watercooling
Memory 4x8GB G.Skill Trident Neo 3600mhz 15-15-15-30
Video Card(s) RTX 2080 Ti + Heatkiller IV wb
Storage Plextor 512GB nvme SSD
Display(s) LG 34GN850-B
Case Lian Li 011D Dynamic
Audio Device(s) Creative AE-5
Power Supply Corsair RM1000
Still can't figure out that this is a purely subjective conclusion?
Tell me which publication would recommend downgrading visuals to get >60 FPS? You?
Because I can give you many publications that target 60 FPS gaming.
 
Joined
Aug 25, 2015
Messages
81 (0.04/day)
Location
Denmark
System Name Red Bandit
Processor Ryzen 7 3800X 4.425 1.325v FLCK 1900
Motherboard ROG Strix X570-F Gaming
Cooling Noctua NH-U12A
Memory G.SKILL Ripjaws 3800 CL16
Video Card(s) RTX 2070 SUPER DUAL EVO2
Storage Adata SX8200 PRO 2TB / Samsung 950Pro 512GB
Display(s) ROG STRIX XG32VQR VA Freesync/Gsync Compatible
Case Phanteks P400A
Audio Device(s) Logitech G533
Power Supply Corsair RM850 (2019)
Mouse Logitech G502
Keyboard Logitech G710+ Blue
Software W10 Pro
What is it with people and ray tracing suddenly? Since it was announced, and until AMD showed the new cards, NOBODY was talking about ray tracing, and now?

Just be glad for the fricking competition; it's a win for all of us customers. Nvidia and AMD really don't care about us, they just want to make as much money as possible.
 

mystera

New Member
Joined
Aug 24, 2020
Messages
2 (0.02/day)
How about AMD releases DXR benchmark numbers for COD: Modern Warfare, BF5, and SOTR then? That would make it easier to gauge the RX 6000's RT capability.
That's like asking: How do AMD cards perform in games that make heavy use of ®HairWorks???

The RTX you refer to is Nvidia's proprietary implementation of Microsoft's DirectX Raytracing. How (and why) do you expect AMD hardware to perform better than Nvidia hardware in ®RTX games? It makes no sense for AMD to even attempt that. Especially since, going forward, most games will implement AMD's (proprietary or open) version of ray tracing, and only "some" of them will ship with ®RTX support alongside the console version of ray tracing.

And based on how well AMD's ray tracing looks and performs, Nvidia's ®RTX may (in time) become a niche feature for select sponsored games... To keep ®RTX relevant, Nvidia will have to invest more than it's worth in hardware and software (game dev partner) development, and given Nvidia's new focus on enterprise, it may be a hard sell (to investors).
 
Joined
Jan 8, 2017
Messages
6,112 (4.30/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Deepcool Gammaxx L240 V2
Memory 16GB - Corsair Vengeance LPX - 3333 Mhz CL16
Video Card(s) OEM Dell GTX 1080 with Kraken G12 + Water 3.0 Performer C
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Deepcool Matrexx 70
Power Supply GPS-750C
Tell me which publication would recommend downgrading visuals to get >60 FPS? You?
I really don't know what world you live in, but almost everyone recommends turning down visuals to hit your monitor's native refresh rate. Not that it matters, because that's still a subjective recommendation.

However, buying a high-refresh monitor and then trying to convince yourself or others that you should actually play at 60 because "there is no point" sounds like a really intelligent conclusion, I gotta say. Because that's why most people buy a high-refresh monitor, to then play at 60 Hz, right?
 
Joined
Jun 25, 2013
Messages
37 (0.01/day)
System Name Rayzen
Processor 2700x
Motherboard asus prime x370-pro
Cooling NH-U12SSE-AM4
Memory G.SKILL TRIDENTZ F4-3200c14D-32G @3000
Video Card(s) RTX 2080 TI
Storage Force MP510
Display(s) SAMSUNG 40" TV
Case CORSAIR CARBIDE series 100R
Power Supply CORSAIR RM 650x
Now switching to competitive games like PUBG, Modern Warfare, and Overwatch, I lower every setting just to get the highest FPS. Why wouldn't I?
Tell me which publication would recommend downgrading visuals to get >60 FPS? You?
I am honestly very confused as to what you want or are trying to imply.


Different players have different preferences when gaming.
 
Joined
Sep 1, 2020
Messages
75 (0.85/day)
Location
Bulgaria
There is real RT (MS DXR) and there is Nvidia RT. Real RT will be in 100% of games; Nvidia RT in no more than 10% of that 100%. I think there will be no games that exclusively support Nvidia RT.
 
Joined
Jun 13, 2012
Messages
1,211 (0.39/day)
System Name desktop
Processor i7-4770k
Motherboard Asus z87-plus
Cooling Corsair h80
Memory 32gb G.Skill Ares @ 2400mhz
Video Card(s) EVGA GeForce GTX 1080 SC (ACX 3.0)
Storage 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB271HU 27inch IPS G-Sync 165hz
Audio Device(s) Sound Blaster x-FI Platium, Turtle beach Elite pro 2 + superamp.
Power Supply OCZ Z Series 850W (10 years strong)
Mouse Logitech G502 hero
Keyboard Logitech G710+
That's like asking: How do AMD cards perform in games that make heavy use of ®HairWorks???

The RTX you refer to is Nvidia's proprietary implementation of Microsoft's DirectX Raytracing. How (and why) do you expect AMD hardware to perform better than Nvidia hardware in ®RTX games? It makes no sense for AMD to even attempt that. Especially since, going forward, most games will implement AMD's (proprietary or open) version of ray tracing, and only "some" of them will ship with ®RTX support alongside the console version of ray tracing.

And based on how well AMD's ray tracing looks and performs, Nvidia's ®RTX may (in time) become a niche feature for select sponsored games... To keep ®RTX relevant, Nvidia will have to invest more than it's worth in hardware and software (game dev partner) development, and given Nvidia's new focus on enterprise, it may be a hard sell (to investors).
This sounds like the typical AMD excuse for why their card sucks at what was/is a standard.
 
Joined
Feb 11, 2009
Messages
2,870 (0.67/day)
System Name Cyberline
Processor Intel Core i7 2600k
Motherboard Asus P8P67 LE Rev 3.0
Cooling Tuniq Tower 120
Memory Corsair (4x2) 8gb 1600mhz
Video Card(s) AMD RX480
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb
Display(s) Philips 32inch LPF5605H (television)
Case antec 600
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Good day everyone, I'm the person genuinely excited about ray tracing. At least a third of the people I play games with are excited too. Why? Because when implemented like in Control or Quake 2, it's legitimately the only eye-candy tech that looks fresh to my eyes and makes me want to spend $700 on a goddamn video card. There are shite examples (Watch Dogs, WoW: Shadowlands, Dirt 5, BFV, etc.) where there are just some reflections or just some shadows and it doesn't make a difference, but when it's all-out or close to it, man, it looks amazing!
Hey, I'm also excited about the 6800 XT. I wanna know if it can do what I want it to do. Absolutely tired of people shouting that nobody cares. Where's your market research, mate?
I don't agree; I think the reflections in Watch Dogs look pretty dang impressive. Sad it's all so heavy, so a true ray-traced future is still several gens out for sure, but look at Digital Foundry's latest vid on it.
 
Joined
Nov 11, 2016
Messages
709 (0.48/day)
System Name The de-ploughminator
Processor I7 8700K @ 5.1Ghz
Motherboard Gigabyte Z370 Gaming 5
Cooling Custom Watercooling
Memory 4x8GB G.Skill Trident Neo 3600mhz 15-15-15-30
Video Card(s) RTX 2080 Ti + Heatkiller IV wb
Storage Plextor 512GB nvme SSD
Display(s) LG 34GN850-B
Case Lian Li 011D Dynamic
Audio Device(s) Creative AE-5
Power Supply Corsair RM1000
I really don't know what world you live in, but almost everyone recommends turning down visuals to hit your monitor's native refresh rate. Not that it matters, because that's still a subjective recommendation.

However, buying a high-refresh monitor and then trying to convince yourself or others that you should actually play at 60 because "there is no point" sounds like a really intelligent conclusion, I gotta say. Because that's why most people buy a high-refresh monitor, to then play at 60 Hz, right?
Sure, just tell me which games you play exactly? CSGO? YouTube videos?
Almost everyone recommends turning down visuals in AAA games to hit 144 Hz at 1440p? Yeah, I really need some confirmation on that. No one would want to play AAA games at Low settings just to hit 144 Hz, that I'm sure of.

I didn't say anyone should play at 60 FPS. If you have already maxed out all the graphical settings and can still get >60 FPS, then play at >60 FPS, although capping the frame rate really helps with input latency with Nvidia Low Latency and AMD Anti-Lag in certain games.
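The frame-capping idea is easy to sketch in code. This is a generic illustration, not anything from the thread: `run_capped` and its parameters are invented for the example, and it assumes a simple single-threaded game loop where the limiter sleeps away the unused portion of each frame slot.

```python
import time

def run_capped(render_frame, fps_cap=60, frames=3):
    """Run a simple game loop capped at fps_cap frames per second.

    The cap works by sleeping away whatever time is left in each
    frame slot, so every frame starts (and input would be sampled)
    on a steady cadence instead of frames queuing up ahead of the
    display.
    """
    frame_time = 1.0 / fps_cap
    timestamps = []
    next_deadline = time.perf_counter()
    for i in range(frames):
        timestamps.append(time.perf_counter())  # frame start
        render_frame(i)                         # simulate one frame of work
        next_deadline += frame_time
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:                       # only sleep if we're ahead
            time.sleep(sleep_for)
    return timestamps

# Example: a "frame" that takes ~1 ms to render, capped at 60 FPS;
# the gaps between frame starts come out near 1/60 s.
stamps = run_capped(lambda i: time.sleep(0.001), fps_cap=60, frames=3)
deltas = [b - a for a, b in zip(stamps, stamps[1:])]
```

Because the GPU never runs ahead of the cap, input read at the top of a frame is at most one frame time stale, which is roughly the effect driver-level limiters aim for.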
 
Joined
Feb 11, 2009
Messages
2,870 (0.67/day)
System Name Cyberline
Processor Intel Core i7 2600k
Motherboard Asus P8P67 LE Rev 3.0
Cooling Tuniq Tower 120
Memory Corsair (4x2) 8gb 1600mhz
Video Card(s) AMD RX480
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb
Display(s) Philips 32inch LPF5605H (television)
Case antec 600
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
This sounds like the typical AMD excuse for why their card sucks at what was/is a standard.
I think you do not know what "standard" means or how it's applied.
 
Joined
Dec 31, 2009
Messages
18,984 (4.76/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
dragons dogma
This doesn't explain it to you? The title? Monkeys with crayons can draw the scenes fast enough, lol.

From 2016: "Given its old-gen nature, Dragon’s Dogma: Dark Arisen is not really a demanding title."


Just saying. ;)


That's like asking: How do AMD cards perform in games that make heavy use of ®HairWorks???
It's nothing like that, really. AMD, like NV, uses DXR. They're both using the same API for RT.


And based on how well AMD's raytracing looks and performs
Was anything official released on AMD RT performance?

RTX is hardware on the card. NV cards use DXR API for RT just as AMD will.
 
Joined
Aug 5, 2019
Messages
475 (0.99/day)
System Name Neon Master
Processor AMD Ryzen 3900x
Motherboard x570 Aorus Master F30
Cooling H115i RGB Platinum 280mm/Case Fans NF-A14 x4
Memory G.Skill Neo 32GB @3600 MT/s C16
Storage MP600 2TB
Display(s) LG 27GL850-b
Case Phanteks Evolv X
Audio Device(s) DT 770 Pro
Power Supply Seasonic Prime Ultra Titanium 1000w
Mouse Scimitar Pro
Keyboard K95 Platinum
Good day everyone, I'm the person genuinely excited about ray tracing. At least a third of the people I play games with are excited too. Why? Because when implemented like in Control or Quake 2, it's legitimately the only eye-candy tech that looks fresh to my eyes and makes me want to spend $700 on a goddamn video card. There are shite examples (Watch Dogs, WoW: Shadowlands, Dirt 5, BFV, etc.) where there are just some reflections or just some shadows and it doesn't make a difference, but when it's all-out or close to it, man, it looks amazing!
Hey, I'm also excited about the 6800 XT. I wanna know if it can do what I want it to do. Absolutely tired of people shouting that nobody cares. Where's your market research, mate?
I am similarly excited about RT.
It's just fanboys shouting at fanboys at this point. These same people would have been the ones mocking Geforce 256 back in the day about HWT&L, pay them no heed.

We're just in that awkward phase now where DXR is still an unknown for most people, and we still don't know for sure if this year's cycle or maybe the next is the one that will bring mainstream acceptance/performance to RT. I personally am not aware of any non-DXR games, though I do believe the Nvidia-developed ones like Quake II RTX are probably going to be Nvidia hardware only. I doubt games like Control aren't going to work on AMD; I suspect it's just AMD's software side of things not being ready yet. I would expect a lot of growing pains for the first half of 2021 and AMD DXR. Hopefully I'm wrong, but they are going into this with a two-year handicap.

New games will have cross-brand hardware to work with soon, and as someone with a 2070 Super, all I can say is the DXR game library is veeery small still, and it's only going to really grow now with the new consoles, since more or less all cross-platform AAA titles will be coming with some form of RT once this first cross-platform year of releases is over (and already some of those cross-plats are coming with RT anyway). So it bodes well overall for us mid to long term, regardless of hardware brand choices or... god forbid, loyalties.
 
Joined
Dec 31, 2009
Messages
18,984 (4.76/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
You mean the DXR standard as opposed to Nvidia's proprietary RT
Whose proprietary RT? Nvidia uses DXR as well...

When DXR is enabled by a Game Ready Driver, targeted for April (2019), the supported GeForce GTX graphics cards will work without game updates because ray-traced games are built on DirectX 12’s DirectX Raytracing API, DXR. This industry standard API uses compute-like ray tracing workloads that are compatible with both dedicated hardware units, such as the RT Cores, and the GPU’s general purpose shader cores.
 
Joined
Apr 10, 2020
Messages
182 (0.78/day)
I must admit the 3080 and 3090 look impressive performance- and price-wise. But the 6800 disappoints big time. Not because of its performance, but because of its price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
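As a quick sanity check on the arithmetic in that post, a few lines of script confirm the "slightly worse price/performance" claim. The MSRPs are the ones quoted above; the 1.10-1.15x relative performance range is the poster's estimate, not a measured figure.

```python
# Price/performance comparison: RX 6800 ($579) vs RTX 3070 ($499),
# assuming the 6800 is 10-15% faster in rasterization as claimed.
price_3070, price_6800 = 499, 579

price_ratio = price_6800 / price_3070       # ~1.16x the money
perf_per_dollar_low  = 1.10 / price_ratio   # if the 6800 is 10% faster
perf_per_dollar_high = 1.15 / price_ratio   # if the 6800 is 15% faster

print(f"6800 costs {price_ratio:.0%} of a 3070's price")
print(f"6800 perf/$ relative to 3070: "
      f"{perf_per_dollar_low:.2f}x to {perf_per_dollar_high:.2f}x")
```

Both ends of the range come out just under 1.0x, so under these assumptions the 6800 does deliver marginally less performance per dollar than the 3070.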
 
Joined
Jun 2, 2017
Messages
3,115 (2.44/day)
System Name Best AMD Computer
Processor AMD TR4 1920X
Motherboard MSI X399 SLI Plus
Cooling Alphacool Eisbaer 420 x2 Noctua XPX Pro TR4 block
Memory Gskill RIpjaws 4 3000MHZ 48GB
Video Card(s) Sapphire Vega 64 Nitro, Gigabyte Vega 64 Gaming OC
Storage 6 x NVME 480 GB, 2 x SSD 2TB, 5TB HDD, 2 TB HDD, 2x 2TB SSHD
Display(s) Acer 49BQ0k 4K monitor
Case Thermaltake Core X9
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Corsair HX1200!
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 10 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 24955 Time Spy: 13500
60FPS is enough to enjoy single player game at the best visual fidelity you can get.
I bought the very first 144hz 1440p display, but I played The Witcher 3 at 60fps with the best visuals, no point in downgrading visual just to get >60FPS.
Same with Metro Exodus, Control, why would I sacrifice visual when I have ~60FPS already ?

Now switching to competitive games like PUBG, Modern Warfare, Overwatch and I lower every setting just to get the highest FPS, why wouldn't I ?
I can say with confidence that The Division 2 feels much smoother at 120+ FPS and looks beautiful. One of the differences between 60 FPS and 120+ FPS in that game (and I suspect a few more) is that it lets you aim more easily. To keep it simple: in The Division 2, with an automatic rifle (900+ RPM) you can get headshot kills versus having the shots go all over the place. I love how, from reading most of what you post, you are undeniably in favor of Nvidia GPUs. It is kind of foolish, though, that after you yourself bragged about the 3090 being unassailable for AMD, you now make the ridiculous argument that 60 FPS is enough in games, period, given the benchmarks that AMD has released. I suppose you will now remind me of the joy of DLSS (which the 6000 series does not need for high FPS) and RTX ray tracing, which may go the way of Beta. You see, Beta was better than VHS, but VHS is what the consumer market adopted; after about five years you could not find Beta anywhere in popular culture. Which brings me to my last point: you don't have to downgrade anything to enjoy those same games you mentioned at high FPS using 6000-series GPUs. I am objective enough to say that Nvidia's 3000 series are nice cards, but the way Nvidia is so relentless in trying to control mind share is desultory.
 
Joined
Nov 11, 2016
Messages
709 (0.48/day)
System Name The de-ploughminator
Processor I7 8700K @ 5.1Ghz
Motherboard Gigabyte Z370 Gaming 5
Cooling Custom Watercooling
Memory 4x8GB G.Skill Trident Neo 3600mhz 15-15-15-30
Video Card(s) RTX 2080 Ti + Heatkiller IV wb
Storage Plextor 512GB nvme SSD
Display(s) LG 34GN850-B
Case Lian Li 011D Dynamic
Audio Device(s) Creative AE-5
Power Supply Corsair RM1000
I can say with confidence that The Division 2 feels much smoother at 120+ FPS and looks beautiful. One of the differences between 60 FPS and 120+ FPS in that game (and I suspect a few more) is that it lets you aim more easily. To keep it simple: in The Division 2, with an automatic rifle (900+ RPM) you can get headshot kills versus having the shots go all over the place. I love how, from reading most of what you post, you are undeniably in favor of Nvidia GPUs. It is kind of foolish, though, that after you yourself bragged about the 3090 being unassailable for AMD, you now make the ridiculous argument that 60 FPS is enough in games, period, given the benchmarks that AMD has released. I suppose you will now remind me of the joy of DLSS (which the 6000 series does not need for high FPS) and RTX ray tracing, which may go the way of Beta. You see, Beta was better than VHS, but VHS is what the consumer market adopted; after about five years you could not find Beta anywhere in popular culture. Which brings me to my last point: you don't have to downgrade anything to enjoy those same games you mentioned at high FPS using 6000-series GPUs. I am objective enough to say that Nvidia's 3000 series are nice cards, but the way Nvidia is so relentless in trying to control mind share is desultory.
Do you play the single-player or multiplayer side of The Division 2?
Like I said, for competitive games, like the multiplayer side of Div 2, I would use Low settings to get the highest FPS I can.

Now tell me which you would prefer with your current GPU:
RDR2: High settings at ~60 FPS, or 144 FPS with Low settings
AC O: High settings at ~60 FPS, or 144 FPS with Low settings
Horizon Zero Dawn: High settings at ~60 FPS, or 144 FPS with Low settings

Well, to be clear, when I said 60 FPS, I meant the minimum FPS.

Yeah, sure, if you count auto-overclocking and a proprietary feature (SAM) as what makes the 6900 XT equal to the 3090; see the hypocrisy there? Also, I can find higher benchmark numbers for the 3080/3090 online, so take AMD's numbers with a grain of salt.
 
Joined
Mar 9, 2020
Messages
51 (0.19/day)
I must admit the 3080 and 3090 look impressive performance- and price-wise. But the 6800 disappoints big time. Not because of its performance, but because of its price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
The 3070 is, at the moment, unicorn breath, like the rest of the Ampere lineup. What you call "impressive" regarding the 3090 becomes idiotic when a $1,500 card only beats an $800 card by 10%.
Oh! And CUDA is no good for gaming, whilst ray tracing kills performance without resorting to DLSS.
Ray tracing is today's equivalent of HairWorks or PhysX.
The leather jacket openly lied to Nvidia's consumer base, claiming the 3090 was "Titan-like" when it clearly isn't, and promising plenty of stock for buyers. The reality is that abysmal yields are the reason the Ampere series is almost impossible to come by.
 
Joined
Dec 26, 2006
Messages
435 (0.09/day)
Location
Northern Ontario Canada
System Name Just another PC
Processor Ryzen 1700
Motherboard Gigabyte GA-AX370-K3
Cooling Noctua NH-C12P SE14
Memory DDR4-2133 2x16GB
Video Card(s) XFX RX480 8GB
Storage Samy 960 EVO 500GB m.2, 500GB SSD & a 2TB spinner
Display(s) LG 27UL550-W
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220
Power Supply EVGA Supernova G2 550W
Mouse Mionix Naos 8200
Keyboard Corsair with browns
Software W10 Pro x64 v2004
Benchmark Scores Wife says it's fast
The vanilla 6800 is actually looking really strong in the first few of those benchmarks.

It's great that yesterday's $1200 performance is now half price, but what the overwhelming majority have needed for two years is yesterday's $600 performance for $300.
I wonder if they will release a cheaper 8GB version of the 6800?
 
Joined
Jun 18, 2015
Messages
415 (0.21/day)
I must admit the 3080 and 3090 look impressive performance- and price-wise. But the 6800 disappoints big time. Not because of its performance, but because of its price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
Except the 3070's real price is not $499. Since the RTX 2000 series, Nvidia has been selling their cards at much higher prices than the announced ones.
This is fraud, and reviewer sites and channels should warn people and condemn Nvidia for it, but very few do.
 