
AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

Joined
Apr 12, 2013
Messages
6,728 (1.68/day)
This is why you buy aftermarket cards: depending on the model, not only do you get higher clocks but also (much) better cooling.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,800 (4.58/day)
Location
Kepler-186f
This is why you buy aftermarket cards: depending on the model, not only do you get higher clocks but also (much) better cooling.

That's not true for the RTX 3080, though, and AMD seemed like they'd learned their lesson from last time... but I guess not. Only the high-end Asus cards cool better than the stock 3080.
 
Joined
Apr 12, 2013
Messages
6,728 (1.68/day)
The 3xxx series launch is mostly vaporware. I don't know if Nvidia intended it to be that way, but they clearly knew where the RDNA2 cards would land and just wanted some early/cheap publicity and sales. This botched launch could hurt them hard; unlike AMD, they don't have many customer-facing avenues where they can redeem themselves, and yeah, as some others have said, this could be a Ryzen moment for AMD's GPU department!
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,800 (4.58/day)
Location
Kepler-186f
The 3xxx series launch is mostly vaporware. I don't know if Nvidia intended it to be that way, but they clearly knew where the RDNA2 cards would land and just wanted some early/cheap publicity and sales. This botched launch could hurt them hard; unlike AMD, they don't have many customer-facing avenues where they can redeem themselves, and yeah, as some others have said, this could be a Ryzen moment for AMD's GPU department!

especially since the GPUs do even better with an X570 board and a Zen 3 CPU. I'm really hoping I can get my hands on a 5600X and a 6800 or 6800 XT. Should be my final build for many years, and I'm hoping I can sell my GTX 1070 laptop to at least cover some of the purchase... bleh
 
Joined
Mar 24, 2012
Messages
528 (0.12/day)
And that really is the only awesome release from the Navi 21 lineup. Big lost opportunity. A 6800 XT priced at $599 and a 6800 at $449 would obliterate Nvidia's entire GA104/GA102 lineup. It looks like AMD is content with ~20% discrete GPU market share; that's why there's no price war with Nvidia :( The 6900 XT is an awesome deal for the 0.1% of the gaming market that actually buys $1K GPUs.

Look at what happened over the last 10 years. Did price wars really help AMD gain more market share?
 
Joined
Nov 4, 2019
Messages
234 (0.15/day)
Uh, WTF?
My graph? Uh... No. I didn't make that graph.
You do know where that graph's from, right?

Ah wait, I think I know what you're saying.
That graph they posted with SAM. Yeah, we don't have Zen 3 CPUs to compare against yet. Unlike the 6800 XT graph, which was an apples-to-apples comparison, the 6800 graph isn't fair: it uses the amalgamated Zen 3 + Infinity Cache + SAM results to make the card look better. Some of those results are pretty impressive, but they're deliberately skewed to promote the whole three-part Ryzen 5000 + X570 + RX 6800 solution. We won't know the actual GPU-only results until November; I was just going off Lisa Su's actual words when she said it "matches a 2080 Ti at just $579".

No, I meant your markings on the graph make no sense: you're comparing to the 2080 Ti/3070, but the RX 6800 looks to be 18 percent faster than the 2080 Ti at 1440p, so it should be 118 on the top of your fraction...
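For anyone following along, the arithmetic in dispute is just this, normalizing the 2080 Ti to 100 as AMD's slide does (the 118 is the thread's own number, not a verified benchmark):

$$\frac{P_{\text{RX 6800}}}{P_{\text{2080 Ti}}} = \frac{118}{100} = 1.18$$

so an 18% lead means 118, not 100, goes on top of the fraction.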
 
Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Once reviews and RT performance
In which games?
Laughable World of Warcraft effects barely anyone notices (bar framerate dip)?

Note how, despite DXR being a standard DirectX API, Jensen Huang still managed to shit all over the place and make "Nvidia sponsored" games (at least) use green proprietary extensions.
I guess he doesn't remember how he killed OpenGL.
 

SLK

Joined
Sep 2, 2019
Messages
30 (0.02/day)
IMO, AMD is still a generation behind Nvidia.

Pros for AMD :
- Competitive in rasterization now
- Can compete on price

Cons for AMD:
- 1st-gen RT, so it's slower than Ampere's RT
- No DLSS, which gives a huge performance advantage in the second most popular game in the world (Fortnite)
- NVENC is far superior to AMD's implementation. (OBS Streaming software has native support for Nvidia but not AMD)
- Nvidia Broadcast is a huge asset to content creators, no equivalent from AMD
- Way behind on professional applications due to Nvidia's broad CUDA support.

My choice is clear.
 
Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
According to Digital Foundry, PS5's raytracing performance is similar to RTX 2060 Super.
According to Digital Totally Not Shills But Do It Like Shills Just By Coincidence Foundry, XSeX is merely a 2070 at raster perf.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,800 (4.58/day)
Location
Kepler-186f
IMO, AMD is still a generation behind Nvidia.

Pros for AMD :
- Competitive in rasterization now
- Can compete on price

Cons for AMD:
- 1st-gen RT, so it's slower than Ampere's RT
- No DLSS, which gives a huge performance advantage in the second most popular game in the world (Fortnite)
- NVENC is far superior to AMD's implementation. (OBS Streaming software has native support for Nvidia but not AMD)
- Nvidia Broadcast is a huge asset to content creators, no equivalent from AMD
- Way behind on professional applications due to Nvidia's broad CUDA support.

My choice is clear.


Your choice doesn't matter; none of this will be in stock for any of us for 6 months still, lol, if then. COVID has brought in a ton of people to compete with, not just the scammers. I'm going to try my best to get whatever comes in stock first within budget, lol.

Also, for me: I don't care about RT or DLSS (though I do think it's nice, not enough games really use it), I don't care about streaming, don't care about Broadcast for the same reason, and I don't care about professional stuff.

Just a gamer guy who likes to game at high refresh rates at 1440p. AMD is 100% for me if I can find it in stock on launch day. Big if, heh.
 
Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Speculation about what exactly AMD's reason is for the higher price of the 6800 vs. the 3070 has been interesting to read over the past couple of hours.

Some speculated that the ~$80 difference was a "feature tax", SAM + Rage Mode allowing it to eke ahead of a 3070;
Is this some sort of joke?
The 6800 is said to be 18% faster than the 2080 Ti.
It has twice the RAM of the 3070.
 
Joined
Sep 26, 2012
Messages
856 (0.20/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling Noctua NH-D15S, 7 x Noctua NF-A14 industrialPPC IP67 2000RPM
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 Strix
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Alienware AW3821DW, Wacom Cintiq Pro 15
Case Fractal Design Torrent
Audio Device(s) Topping A90/D90 MQA, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply ASUS THOR 1600T
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + OpenSUSE MicroOS
Cons for AMD:
- 1st-gen RT, so it's slower than Ampere's RT
- No DLSS, which gives a huge performance advantage in the second most popular game in the world (Fortnite)
- NVENC is far superior to AMD's implementation. (OBS Streaming software has native support for Nvidia but not AMD)
- Nvidia Broadcast is a huge asset to content creators, no equivalent from AMD
- Way behind on professional applications due to Nvidia's broad CUDA support.

Just to break this down.

1\ You don't know the RT performance numbers. It's likely weaker, but we literally have nfi atm. Also, with RT still being all hybrid... it's a feature, not a requirement.
2\ AMD mentioned its DLSS alternative; it hasn't detailed it yet, however.
3\ No info has been provided about the encoder except what it can support, and it supports all current standards.
4\ If Nvidia Broadcast is a value add for you, that's great. The majority of us just want to play games, not broadcast our warts to the world. See above regarding features.
5\ CUDA is somewhat of a chicken-and-egg situation; it became a de facto standard due to initially better support. However, 'massively better' is an overstatement, and it really depends on which professional task you are doing, as more things look to ditch CUDA (a shift that has picked up pace over the last year).

tl;dr making definitive statements is crass.
 
Joined
Dec 30, 2010
Messages
2,082 (0.43/day)
This is why you buy aftermarket cards: depending on the model, not only do you get higher clocks but also (much) better cooling.

These days there's hardly any benefit to going aftermarket vs. reference. Apart from AMD now delivering a reference card with 3 fans, probably with idle fan-stop and a custom fan profile, in the past even blowers could be tuned and/or adapted the way you wanted. It's just that both AMD and Nvidia select a default fan profile that ramps up pretty late, waiting until the card reaches 80 degrees or so before going berserk. That could all be solved by setting the fan profile at approximately 40 to 60% of its duty cycle, something like the sketch below, and you'd never have an issue.
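For the sake of illustration, here's a minimal sketch of the kind of piecewise-linear fan curve you'd set in a tuning tool. The function name and every breakpoint are made up for the example; they are roughly the "hold 40-60% before it gets hot" profile suggested above, not AMD's or Nvidia's actual defaults:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Piecewise-linear fan curve: GPU temperature (degrees C) -> duty cycle (%).
// Breakpoints are illustrative placeholders, not any vendor's defaults.
int fanDuty(double tempC) {
    static const std::vector<std::pair<double, int>> curve = {
        {40.0, 40},   // warm idle: already at 40% instead of near-silent
        {70.0, 60},   // gaming load: 60% duty well before the 80-degree panic
        {85.0, 100},  // safety ceiling: full speed
    };
    if (tempC <= curve.front().first) return curve.front().second;
    if (tempC >= curve.back().first)  return curve.back().second;
    for (std::size_t i = 1; i < curve.size(); ++i) {
        if (tempC <= curve[i].first) {
            // Interpolate linearly between the two surrounding breakpoints.
            double t = (tempC - curve[i - 1].first) /
                       (curve[i].first - curve[i - 1].first);
            return static_cast<int>(curve[i - 1].second +
                                    t * (curve[i].second - curve[i - 1].second));
        }
    }
    return curve.back().second; // not reached
}
```

At 55 degrees this yields 50% duty: noisier at idle than the stock profile, but no sudden ramp at 80 degrees.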

Reference cards these days have the perfect VRM for most users, whether that's water, air, or even LN2; there's hardly any condition where you need a larger VRM than the reference design already provides. You won't be overclocking so hard that you need the VRM at full capacity anyway; the process node or the limits of the silicon itself won't allow it. And knowing AMD, there's probably a current limit built into those chips as well, to prevent degradation.

It's why you see game/boost clocks; similar to Zen CPUs, they allow higher clocks under light workloads and drop down when a heavy workload is applied. All of this is polled every 25 ms at the hardware level to give you the best clocks possible relative to temperature and power consumption, roughly like the loop sketched below. This is obviously a blessing for consumers, but it takes some of the fun out of OC'ing if you plan to run on water. AMD is already maxing out the silicon for you; you have to go subzero to get more out of it.
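Roughly, that behaviour works like the sketch below. This is an illustration only: readSensors and applyClock are hypothetical stubs standing in for hardware access, and the temperature/power limits and step size are invented; only the 2015/2250 MHz game/boost clocks are the 6800 XT's published specs.

```cpp
#include <algorithm>
#include <chrono>
#include <thread>

struct Telemetry { double tempC; double watts; };

// Hypothetical stubs standing in for hardware readback/writeback.
Telemetry readSensors() { return {70.0, 250.0}; }
void applyClock(int mhz) { (void)mhz; }

// Illustrative boost controller: every 25 ms, nudge the clock up while
// there is thermal and power headroom, back it off otherwise.
void boostLoop() {
    const double kTempLimitC = 90.0, kPowerLimitW = 300.0; // made-up limits
    const int kGameClock = 2015, kBoostClock = 2250, kStepMHz = 10;
    int clock = kGameClock;
    for (;;) {
        Telemetry t = readSensors();
        if (t.tempC < kTempLimitC && t.watts < kPowerLimitW)
            clock = std::min(clock + kStepMHz, kBoostClock); // headroom: boost
        else
            clock = std::max(clock - kStepMHz, kGameClock);  // constrained: back off
        applyClock(clock);
        std::this_thread::sleep_for(std::chrono::milliseconds(25));
    }
}
```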

And if that doesn't meet your noise needs, slap a water cooler on and you're done. Water cooling always has better efficiency than air, to be honest, and because of that you can hold the boost clocks longer. I'm glad AMD came back with not one but two generational leaps and a card that competes with the 3090, at $999 versus $1,500. 24 GB of VRAM isn't a real reason to pay $500 more, considering Nvidia pays something like $4 per extra memory chip. The margins are huge.
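Taking the post's ~$4-per-chip figure at face value (a guess, not a confirmed BOM number), the 3090's 24 memory chips work out to:

$$24 \times \$4 = \$96 \ll \$500$$

so even a generous memory-cost estimate covers only a fraction of the price gap.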
 
Joined
May 10, 2020
Messages
733 (0.52/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
I was honestly expecting more, especially from the 6800, which was my target. I mean, 16 GB of VRAM is highly unnecessary (10 would have been perfect), and the price, probably because of that amount of VRAM, is $50-60 higher than the sweet spot, and definitely too close to the 6800 XT.
We know nothing about RT performance, so we should wait for the reviews before drawing any conclusions.

especially since the GPUs do even better with an X570 board and a Zen 3 CPU. I'm really hoping I can get my hands on a 5600X and a 6800 or 6800 XT. Should be my final build for many years, and I'm hoping I can sell my GTX 1070 laptop to at least cover some of the purchase... bleh
When did they mention X570???
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.21/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700MHz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
I was honestly expecting more, especially from the 6800, which was my target. I mean, 16 GB of VRAM is highly unnecessary (10 would have been perfect), and the price, probably because of that amount of VRAM, is $50-60 higher than the sweet spot, and definitely too close to the 6800 XT.
We know nothing about RT performance, so we should wait for the reviews before drawing any conclusions.


When did they mention X570???

There's a new feature where, if it's used with B550 or X570, you get a performance boost. Seems like that boost was included in the benchmark results, although we don't know much yet.
 
Joined
May 10, 2020
Messages
733 (0.52/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
Speculation about what exactly AMD's reason is for the higher price of the 6800 vs. the 3070 has been interesting to read over the past couple of hours.

Some speculated that the ~$80 difference was a "feature tax", SAM + Rage Mode allowing it to eke ahead of a 3070, assuming an all-AMD 5000/500/6000 ecosystem. GamersNexus' pinned YT post seemed to think similarly: that the ability to use Rage Mode + SAM was the reasoning behind the increase (not that they agreed or disagreed with it).

But others speculated that it was really to push people towards the 6800 XT ("well, if I'm going to fork out $580, I may as well fork out a bit more and get the 6800 XT for an extra $70"); the same reason the 6900 XT is still priced a lot higher despite there likely being no room to squeeze a "6850 XT" or non-XT "6900" in between the 6800 XT and 6900 XT.

And still others speculate it was simply so they could make some extra $$$ before dropping the price when a replacement 3070 Ti/S comes out, suddenly wiping out any reason to buy an upcoming 3060 Ti/S or plain 3070 when you can have 2080 Ti/3070 performance for cheaper.

And the last one I've seen echoed a few times: the price was simply a placeholder, since they didn't yet know how much the 3070 would sell for at the time of filming the show, and they could match or lower the price closer to release.
Speculation? It has 16 GB of VRAM vs. 8 GB of VRAM. Price difference explained.

The problem is: are those 16 GB actually useful at the target 1440p resolution? Hardly...

There's a new feature where, if it's used with B550 or X570, you get a performance boost. Seems like that boost was included in the benchmark results, although we don't know much yet.
Ah, OK, so not only X570...
 
Joined
Sep 26, 2012
Messages
856 (0.20/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling Noctua NH-D15S, 7 x Noctua NF-A14 industrialPPC IP67 2000RPM
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 Strix
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Alienware AW3821DW, Wacom Cintiq Pro 15
Case Fractal Design Torrent
Audio Device(s) Topping A90/D90 MQA, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply ASUS THOR 1600T
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + OpenSUSE MicroOS
Speculation? It has 16 GB of VRAM vs. 8 GB of VRAM. Price difference explained.

All three are 16GB, but I wouldn't be surprised if there is an 8GB 6800 variant.
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.21/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700MHz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Ah, OK, so not only X570...
My guess is some form of PCIe 4.0 caching, using available system RAM to speed things up. Maybe that's why they went 16 GB as well: slower memory than the competition, but if the data's already loaded/cached, it won't matter.
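To illustrate the general idea being guessed at here (not AMD's actual mechanism, which hadn't been detailed at this point), a read-through cache only pays the slow bus transfer on a miss. The class and fetch function below are invented for the sketch:

```cpp
#include <cstdint>
#include <unordered_map>
#include <utility>
#include <vector>

// Toy read-through asset cache: once data is resident, the slow fetch
// never happens again. Purely illustrative of the "if the data's already
// loaded/cached, bus speed won't matter" argument above.
class AssetCache {
public:
    const std::vector<std::uint8_t>& get(std::uint64_t id) {
        auto it = resident_.find(id);
        if (it != resident_.end())
            return it->second;                              // hit: no transfer
        std::vector<std::uint8_t> data = fetchOverBus(id);  // miss: slow path
        return resident_.emplace(id, std::move(data)).first->second;
    }

private:
    static std::vector<std::uint8_t> fetchOverBus(std::uint64_t id) {
        // Stub standing in for a PCIe read from system RAM or storage.
        return std::vector<std::uint8_t>(1024, static_cast<std::uint8_t>(id));
    }
    std::unordered_map<std::uint64_t, std::vector<std::uint8_t>> resident_;
};
```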
 
Joined
May 10, 2020
Messages
733 (0.52/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
My guess is some form of PCIe 4.0 caching, using available system RAM to speed things up. Maybe that's why they went 16 GB as well: slower memory than the competition, but if the data's already loaded/cached, it won't matter.
Understood, but since I have a B550 motherboard, I was wondering whether that feature is supported on every Ryzen 5000 board. AMD didn't mention X570, just Ryzen 5000.
 

SLK

Joined
Sep 2, 2019
Messages
30 (0.02/day)
Just to break this down.

1\ You don't know the RT performance numbers. It's likely weaker, but we literally have nfi atm. Also, with RT still being all hybrid... it's a feature, not a requirement.
2\ AMD mentioned its DLSS alternative; it hasn't detailed it yet, however.
3\ No info has been provided about the encoder except what it can support, and it supports all current standards.
4\ If Nvidia Broadcast is a value add for you, that's great. The majority of us just want to play games, not broadcast our warts to the world. See above regarding features.
5\ CUDA is somewhat of a chicken-and-egg situation; it became a de facto standard due to initially better support. However, 'massively better' is an overstatement, and it really depends on which professional task you are doing, as more things look to ditch CUDA (a shift that has picked up pace over the last year).

tl;dr making definitive statements is crass.

1) AMD's DLSS alternative is an unknown; I don't pay for unknowns. DLSS 2.0 is here and it "just works".
2) Again, AMD's encoder is an unknown. I can only go by recent history, and Nvidia's solution is proven superior at this moment.
3) RT can only be appreciated by someone who values realistic graphics; for the rest, it's just a feature.

Even if I were a pure gamer, there'd be no reason for me to go AMD. When I'm paying almost $600-700 for a GPU, a $50 difference is nothing for these added features, not to mention AMD has to regain some credibility after its driver problems over the last year.
 
Joined
Sep 26, 2012
Messages
856 (0.20/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling Noctua NH-D15S, 7 x Noctua NF-A14 industrialPPC IP67 2000RPM
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 Strix
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Alienware AW3821DW, Wacom Cintiq Pro 15
Case Fractal Design Torrent
Audio Device(s) Topping A90/D90 MQA, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply ASUS THOR 1600T
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + OpenSUSE MicroOS
is an unknown

And Nvidia availability doesn't exist. You keep dealing in absolutes about a product that has only been announced in a 20-minute clip and a press deck. But sure, go buy a 3080/3090, since you deal in certainty, and you certainly won't find one > <.

And seriously, AMD driver problems? When I got my 2080 Ti I was BSODing, for fuck's sake. Both vendors have had, and continue to have, issues at times.
 
Joined
Dec 26, 2016
Messages
281 (0.11/day)
Processor Ryzen 3900x
Motherboard B550M Steel Legend
Cooling XPX (custom loop)
Memory 32GB 3200MHz cl16
Video Card(s) 3080 with Bykski block (custom loop)
Storage 980 Pro
Case Fractal 804
Power Supply Focus Plus Gold 750FX
Mouse G603
Keyboard G610 brown
Software yes, lots!
1) AMD's DLSS alternative is an unknown; I don't pay for unknowns. DLSS 2.0 is here and it "just works".
2) Again, AMD's encoder is an unknown. I can only go by recent history, and Nvidia's solution is proven superior at this moment.
3) RT can only be appreciated by someone who values realistic graphics; for the rest, it's just a feature.

Even if I were a pure gamer, there'd be no reason for me to go AMD. When I'm paying almost $600-700 for a GPU, a $50 difference is nothing for these added features, not to mention AMD has to regain some credibility after its driver problems over the last year.
Well, everything said about RT performance on Radeon is pure speculation at this point, so don't judge it before there's any evidence! Anyway, it remains to be seen how RT will be adopted in the future; right now it's a goodie used in about 20 games (none of which interest me). The future may bring broader adoption of RT, since the consoles can do it. But even then, the consoles are slower than the 6800 XT, and games will be optimized for consoles (as they always have been), so RT will probably run great on the AMD cards too.

I have tried all the different hardware encoders out there, and none of them was good. To be honest, they're all shit! Sorry, but if you want quality, you have to do it in software, so there's no reason for me to go green there. The only reason to go for NVENC would be if you had a tiny 4- or 6-core CPU that couldn't handle software encoding, but if you only had such a low-tier CPU, you wouldn't be buying a 3080/6800 XT anyway.

DLSS is absolutely overhyped. First of all, only a handful of games support it. Then, the difference from native rendering is absolutely visible even with 2.0; it's just a blur. And last but not least, it only makes sense for absolute hardcore esports guys, because the 3080 and 6800 XT seem to have no problem rendering at 4K/60 fps or 1440p/120 fps, so it's really only a thing for competitive 4K/120fps+ gamers.
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
What uhh, what titles out there support AMD's ray tracing right now?
Most of them, most likely. I don't know of any RTX-enabled games that don't use DXR, which is a DX12_2 feature that RDNA 2 explicitly supports. Games will likely need some tweaks and minor updates to work out any kinks in the specific implementation, but as long as the GPUs support DXR, they should work out of the box.
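You can see this from the API side too: DXR support is queried through standard D3D12 feature detection, with nothing vendor-specific involved. A minimal sketch, assuming a Windows box with the D3D12 headers, error handling trimmed:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

// Query the DXR tier through the standard D3D12 feature-support path.
// Any driver reporting TIER_1_0 or better runs DXR titles, whatever the vendor.
bool supportsDXR() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return false;
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

A game written against DXR calls through this same interface, which is why the GPU vendor shouldn't matter unless the game also uses proprietary extensions.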
It just occurred to me that the 6800 XT shroud design has no vents on the side with the I/O... so all that heat is getting shot out the top of the card, directly onto the CPU... Nvidia has the better reference design here: the hottest air gets shot outside the case next to the I/O and the less hot air is shot out into the case at the rear of the card... hmm, it will be interesting to see CPU temps with this GPU... that seems like a really bad design flaw... the air is literally going to hit the CPU straight as it leaves the card, especially if you use an air cooler, which will just suck that hot air right in before it has a chance to escape.
GPUs with vertically oriented fins like that are quite common, and they typically perform well. Like all open-air/axial-fan cooling setups it requires sufficient case airflow, but it really shouldn't be an issue in terms of heating up your CPU. Nvidia's design, with a large fan literally blowing hot air into your CPU cooler's intake, is much more directly troublesome there, and even that works just fine.


This thread though... damn. I mean, even though we don't have third-party review numbers yet, so we can't trust the given numbers 100%, we are still seeing AMD promise a bigger jump in performance/W than Nvidia delivered with Maxwell. (And given the risk of shareholder lawsuits, promises need to be reasonably accurate.) Does nobody else understand just how insane that is? The GPU industry hasn't seen anything like this in a decade or more. And they aren't even doing a best-case vs. worst-case comparison: they compare the (admittedly worst-case) 5700 XT against the seemingly representative 6800 XT. If they were going for a skewed benchmark, the ultra-binned 6900 XT at the same power would have been the obvious choice.
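For scale, using the two cards' published board power (225 W for the 5700 XT, 300 W for the 6800 XT) and AMD's roughly 2x performance claim (their numbers, not independently verified):

$$\frac{P_{6800\,XT}/300\,\text{W}}{P_{5700\,XT}/225\,\text{W}} = \frac{2.0 \times 225}{300} = 1.5$$

i.e. the claimed ~50% perf/W uplift in a single generation.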

Yes, the prices could be lower (and given the use of cheaper memory technology and a narrower bus than Nvidia, AMD probably has the margins to do that if competition necessitates it, even with 2x the memory). For now they look to deliver either a bit less performance for 2/3 the price, matching or slightly better performance for $50 less, or somewhat better performance for $79 more. From a purely competitive standpoint, that sounds pretty okay to me. From an "I want the best GPU I can afford" standpoint it would obviously be better if they went into a full price war, but that is highly unlikely in an industry like this. This nonetheless means that, for the first time in a long time, we'll have a competitive GPU market across the entire price range.

I'll be mighty interested in a 6800 XT (after seeing third-party reviews, obviously), but just as interested in seeing how this pans out across the rest of the lineup. I'm predicting a very interesting year for ~$200-400 GPUs after this: given that $500-600 GPUs now match the value of previous-gen $400 GPUs, we should get a noticeable bump in perf/$ in that price range.
 
Joined
Dec 26, 2016
Messages
281 (0.11/day)
Processor Ryzen 3900x
Motherboard B550M Steel Legend
Cooling XPX (custom loop)
Memory 32GB 3200MHz cl16
Video Card(s) 3080 with Bykski block (custom loop)
Storage 980 Pro
Case Fractal 804
Power Supply Focus Plus Gold 750FX
Mouse G603
Keyboard G610 brown
Software yes, lots!
For all the people complaining about the prices, remember that the 3080 for $700 is absolute nonsense! Has anyone actually gotten an FE 3080 for 700 bucks? No, nobody has! The cheapest AIB cards are about $850, and winning the lottery is more likely than getting a 3080 for $850 at the moment!

It remains to be seen whether there will be reasonable quantities of the 6800 XT, of course. But AMD's move to build in that cache instead of using exotic, hardly-available GDDR6X memory, plus using TSMC's 7nm, of which AMD seems to have good availability, points to better supply at launch. We'll see.
 
Joined
Aug 12, 2019
Messages
1,689 (1.00/day)
Location
LV-426
System Name Custom
Processor i9 9900k
Motherboard Gigabyte Z390 Aorus Master
Cooling corsair h150i
Memory 4x8 3200mhz corsair
Video Card(s) Galax RTX 3090 EX Gamer White OC
Storage 500gb Samsung 970 Evo PLus
Display(s) MSi MAG341CQ
Case Lian Li Pc-011 Dynamic
Audio Device(s) Arctis Pro Wireless
Power Supply 850w Seasonic Focus Platinum
Mouse Logitech G403
Keyboard Logitech G110
Very happy that AMD released new GPUs in the higher-tier segment that compete with Nvidia.
Hrmmmm, now the hard choice... AMD or Nvidia???

The 6800 (non-XT) looks good, and I think Nvidia will answer it with an RTX 3080 LE :p
 