Wednesday, October 28th 2020

AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

AMD (NASDAQ: AMD) today unveiled the AMD Radeon RX 6000 Series graphics cards, delivering powerhouse performance, incredibly life-like visuals, and must-have features that set a new standard for enthusiast-class PC gaming experiences. Representing the forefront of extreme engineering and design, the highly anticipated AMD Radeon RX 6000 Series includes the AMD Radeon RX 6800 and Radeon RX 6800 XT graphics cards, as well as the new flagship Radeon RX 6900 XT - the fastest AMD gaming graphics card ever developed.

AMD Radeon RX 6000 Series graphics cards are built upon groundbreaking AMD RDNA 2 gaming architecture, a new foundation for next-generation consoles, PCs, laptops and mobile devices, designed to deliver the optimal combination of performance and power efficiency. AMD RDNA 2 gaming architecture provides up to 2X higher performance in select titles with the AMD Radeon RX 6900 XT graphics card compared to the AMD Radeon RX 5700 XT graphics card built on AMD RDNA architecture, and up to 54 percent more performance-per-watt when comparing the AMD Radeon RX 6800 XT graphics card to the AMD Radeon RX 5700 XT graphics card using the same 7 nm process technology.
AMD RDNA 2 offers a number of innovations, including applying advanced power saving techniques to high-performance compute units to improve energy efficiency by up to 30 percent per cycle per compute unit, and leveraging high-speed design methodologies to provide up to a 30 percent frequency boost at the same power level. It also includes new AMD Infinity Cache technology that offers up to 2.4X greater bandwidth-per-watt compared to GDDR6-only AMD RDNA-based architectural designs.
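Performance-per-watt comparisons like the ones above are simple ratios of delivered frame rate to board power. As a back-of-the-envelope illustration of the arithmetic (the fps and wattage figures below are invented for the example, not AMD's measured test data):

```python
# Illustrative only: hypothetical fps and board-power numbers, not AMD's data.
def perf_per_watt(fps: float, watts: float) -> float:
    """Average frames per second delivered per watt of board power."""
    return fps / watts

# Two hypothetical cards measured at identical settings.
old = perf_per_watt(fps=60.0, watts=225.0)    # e.g. an RDNA-class card
new = perf_per_watt(fps=100.0, watts=243.0)   # e.g. an RDNA 2-class card

uplift = new / old - 1.0                      # fractional perf/W improvement
print(f"perf/W uplift: {uplift:.0%}")         # prints "perf/W uplift: 54%"
```

With these made-up numbers the newer card lands at a 54 percent perf/W uplift, which shows how a figure like AMD's "up to 54 percent" headline is derived, not what any specific card achieves.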

"Today's announcement is the culmination of years of R&D focused on bringing the best of AMD Radeon graphics to the enthusiast and ultra-enthusiast gaming markets, and represents a major evolution in PC gaming," said Scott Herkelman, corporate vice president and general manager, Graphics Business Unit at AMD. "The new AMD Radeon RX 6800, RX 6800 XT and RX 6900 XT graphics cards deliver world class 4K and 1440p performance in major AAA titles, new levels of immersion with breathtaking life-like visuals, and must-have features that provide the ultimate gaming experiences. I can't wait for gamers to get these incredible new graphics cards in their hands."

Powerhouse Performance, Vivid Visuals & Incredible Gaming Experiences
AMD Radeon RX 6000 Series graphics cards support high-bandwidth PCIe 4.0 technology and feature 16 GB of GDDR6 memory to power the most demanding 4K workloads today and in the future. Key features and capabilities include:

Powerhouse Performance
  • AMD Infinity Cache - A high-performance, last-level data cache suitable for 4K and 1440p gaming with the highest level of detail enabled. 128 MB of on-die cache dramatically reduces latency and power consumption, delivering higher overall gaming performance than traditional architectural designs.
  • AMD Smart Access Memory - An exclusive feature of systems with AMD Ryzen 5000 Series processors, AMD B550 and X570 motherboards and Radeon RX 6000 Series graphics cards. It gives AMD Ryzen processors greater access to the high-speed GDDR6 graphics memory, accelerating CPU processing and providing up to a 13-percent performance increase on an AMD Radeon RX 6800 XT graphics card in Forza Horizon 4 at 4K when combined with the new Rage Mode one-click overclocking setting.
  • Built for Standard Chassis - With a length of 267 mm and two standard 8-pin power connectors, and designed to operate with existing enthusiast-class 650 W-750 W power supplies, gamers can easily upgrade PCs ranging from small to large form factors without additional cost.
True to Life, High-Fidelity Visuals
  • DirectX 12 Ultimate Support - Provides a powerful blend of raytracing, compute, and rasterized effects, such as DirectX Raytracing (DXR) and Variable Rate Shading, to elevate games to a new level of realism.
  • DirectX Raytracing (DXR) - Adding a high performance, fixed-function Ray Accelerator engine to each compute unit, AMD RDNA 2-based graphics cards are optimized to deliver real-time lighting, shadow and reflection realism with DXR. When paired with AMD FidelityFX, which enables hybrid rendering, developers can combine rasterized and ray-traced effects to ensure an optimal combination of image quality and performance.
  • AMD FidelityFX - An open-source toolkit for game developers available on AMD GPUOpen. It features a collection of lighting, shadow and reflection effects that make it easier for developers to add high-quality post-process effects that make games look beautiful while offering the optimal balance of visual fidelity and performance.
  • Variable Rate Shading (VRS) - Dynamically reduces the shading rate for different areas of a frame that do not require a high level of visual detail, delivering higher levels of overall performance with little to no perceptible change in image quality.
Elevated Gaming Experience
  • Microsoft DirectStorage Support - Future support for the DirectStorage API enables lightning-fast load times and high-quality textures by eliminating storage API-related bottlenecks and limiting CPU involvement.
  • Radeon Software Performance Tuning Presets - Simple one-click presets in Radeon Software help gamers easily extract the most from their graphics card. The presets include the new Rage Mode stable overclocking setting that takes advantage of extra available headroom to deliver higher gaming performance.
  • Radeon Anti-Lag - Significantly decreases input-to-display response times and offers a competitive edge in gameplay.
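Of the features above, Variable Rate Shading is the most mechanical: the GPU shades low-detail regions of the frame at a coarser rate. A minimal sketch of that idea (illustrative Python, not the DirectX 12 VRS API; the tile metric, threshold and two-rate scheme are simplifying assumptions):

```python
# Illustrative sketch of the VRS idea: coarsen shading where detail is low.
# Not the DirectX 12 API - just the underlying rate-selection logic.

def pick_shading_rate(tile_contrast: float, threshold: float = 0.1) -> str:
    """Return a coarser shading rate for low-contrast (low-detail) tiles."""
    if tile_contrast >= threshold:
        return "1x1"   # full rate: one shader invocation per pixel
    return "2x2"       # quarter rate: one invocation per 2x2 pixel block

# Hypothetical per-tile contrast metrics for one frame.
tiles = [0.30, 0.02, 0.05, 0.25, 0.01, 0.40]
rates = [pick_shading_rate(c) for c in tiles]

# Shading cost relative to always-full-rate: 1.0 for 1x1, 0.25 for 2x2.
cost = sum(1.0 if r == "1x1" else 0.25 for r in rates) / len(rates)
print(rates)         # ['1x1', '2x2', '2x2', '1x1', '2x2', '1x1']
print(f"{cost:.0%}") # 62% of the full-rate shading work
```

Half the tiles in this toy frame drop to quarter rate, cutting shading work to roughly 62 percent, which is the "higher performance with little perceptible change" trade the bullet above describes.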
AMD Radeon RX 6000 Series Product Family
Robust Gaming Ecosystem and Partnerships
In the coming weeks, AMD will release a series of videos from its ISV partners showcasing the incredible gaming experiences enabled by AMD Radeon RX 6000 Series graphics cards in some of this year's most anticipated games. These videos can be viewed on the AMD website.
  • DIRT 5 - October 29
  • Godfall - November 2
  • World of Warcraft: Shadowlands - November 10
  • The Riftbreaker - November 12
  • Far Cry 6 - November 17
Pricing and Availability
  • AMD Radeon RX 6800 and Radeon RX 6800 XT graphics cards are expected to be available from global etailers/retailers and on AMD.com beginning November 18, 2020, for $579 USD SEP and $649 USD SEP, respectively. The AMD Radeon RX 6900 XT is expected to be available December 8, 2020, for $999 USD SEP.
  • AMD Radeon RX 6800 and RX 6800 XT graphics cards are also expected to be available from AMD board partners, including ASRock, ASUS, Gigabyte, MSI, PowerColor, SAPPHIRE and XFX, beginning in November 2020.

394 Comments on AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

#151
lynx29
R0H1T
This is why you buy aftermarket cards, depending on the model not only do you get higher clocks but also (much) better cooling.
that's not true for the rtx 3080 though, and it seemed like AMD learned their lesson from last time... but I guess not. only the high-end Asus cools better than the stock 3080
Posted on Reply
#152
R0H1T
The 3xxx series launch is mostly vaporware. I don't know if Nvidia intended for it to be as such, but they clearly knew where RDNA2 cards would fit & just wanted some early/cheap publicity & sales. This botch job could hurt them hard, unlike AMD they don't have too many customer facing avenues where they can redeem themselves & yeah as some others have said this could be a Ryzen moment for their (AMD) GPU department!
Posted on Reply
#153
lynx29
R0H1T
The 3xxx series launch is mostly vaporware. I don't know if Nvidia intended for it to be as such, but they clearly knew where RDNA2 cards would fit & just wanted some early/cheap publicity & sales. This botch job could hurt them hard, unlike AMD they don't have too many customer facing avenues where they can redeem themselves & yeah as some others have said this could be a Ryzen moment for their (AMD) GPU department!
especially since the gpu's do even better with x570 and zen 3 cpu. i'm really hoping i can get my hands on a 5600x and 6800 or 6800 xt. should be my final build for many years, and hoping i can sell my gtx 1070 laptop to at least cover the some of the purchase... bleh
Posted on Reply
#154
renz496
RedelZaVedno
And that really is the only awesome release from the Navi 21 lineup. Big lost opportunity. A 6800 XT priced at $599 and a 6800 at $449 would obliterate the entire Nvidia GA104/GA102 lineup. It looks like AMD is content with 20% discrete GPU market share; that's why there's no price war with Nvidia :( The 6900 XT is an awesome deal for the 0.1% of the gaming market that actually buys $1K GPUs.
see what happened over the last 10 years. did price wars really help AMD gain more market share?
Posted on Reply
#155
Searing
Chrispy_
Uh, WTF?
My graph? Uh... No. I didn't make that graph.
You do know where that graph's from, right?

Ah wait, I think I know what you're saying.
That graph they posted with SAM. Yeah, we don't have Zen3 CPUs to compare to yet. Unlike the 6800XT which was an apples-to-apples comparison, the 6800 graph isn't a fair comparison and is using the Zen3 + Infinity Cache + SAM amalgamated results to make it look better. Some of those results are pretty impressive but they're deliberately unfair to promote the whole 3-part solution of Ryzen5000+X570+RX6800. We won't know until November what the actual GPU-only results are, and I was just going off Lisa Su's actual words when she said "matches a 2080Ti at just $579"
no, i meant your markings on the graph make no sense because you are comparing to the 2080 Ti / 3070, but the RX 6800 looks to be 18 percent faster at 1440p vs the 2080 Ti, so it should be 118 on the top of your fraction...
Posted on Reply
#156
medi01
Nephilim666
Once reviews and RT performance
In which games?
Laughable World of Warcraft effects barely anyone notices (bar framerate dip)?

Note how despite DXR being a standard DirectX API, Jensen Huang still managed to shit all over the place and make "nvidia sponsored" games (at least) use green proprietary extensions.
I guess he doesn't remember how he killed OpenGL.
Posted on Reply
#157
SLK
IMO, AMD is still a generation behind Nvidia.

Pros for AMD :
- Competitive in rasterization now
- Can compete on price

Cons for AMD:
- 1st gen RT, so it's slower than Ampere's RT
- No DLSS, which gives huge performance advantage in the second most popular game in the world (Fortnite)
- NVENC is far superior to AMD's implementation. (OBS Streaming software has native support for Nvidia but not AMD)
- Nvidia Broadcast is a huge asset to content creators, no equivalent from AMD
- Way behind on professional applications due to Nvidia's broad CUDA support.

My choice is clear.
Posted on Reply
#158
medi01
rvalencia
According to Digital Foundry, PS5's raytracing performance is similar to RTX 2060 Super.
According to Digital Totally Not Shills But Do It Like Shills Just By Coincidence Foundry, XSeX is merely a 2070 at raster perf.
Posted on Reply
#159
lynx29
SLK
IMO, AMD is still a generation behind Nvidia.

Pros for AMD :
- Competitive in rasterization now
- Can compete on price

Cons for AMD:
- 1st gen RT, so it's slower than Ampere's RT
- No DLSS, which gives huge performance advantage in the second most popular game in the world (Fortnite)
- NVENC is far superior to AMD's implementation. (OBS Streaming software has native support for Nvidia but not AMD)
- Nvidia Broadcast is a huge asset to content creators, no equivalent from AMD
- Way behind on professional applications due to Nvidia's broad CUDA support.

My choice is clear.
your choice doesn't matter, none of this will be in stock for any of us for 6 months still, lol if then. covid has brought in a ton of people to compete with, not just scalpers. im going to try my best to get whatever comes in stock first within budget. lol

also, for me i don't care about RT or DLSS (though I do think it is nice, not enough games really use it), I don't care about streaming, don't care about broadcast for the same reason, and I don't care about professional stuff.

just a gamer guy that likes to game at high refresh rates at 1440p. AMD is 100% for me if i can find it in stock on launch day, big if. heh.
Posted on Reply
#160
medi01
TechLurker
Speculation for what exactly is AMD's reason for the higher price of the 6800 vs the 3070 has been interesting to read in the past couple of hours.

Some speculated that the ~80 dollar difference was "feature tax", SAM + RAGE Mode allowing it to eke ahead of a 3070;
Is it some sort of a joke?
6800 is said to be 18% faster than 2080Ti.
It has twice the RAM of 3070.
Posted on Reply
#161
Camm
SLK
Cons for AMD:
- 1st gen RT, so it's slower than Ampere's RT
- No DLSS, which gives huge performance advantage in the second most popular game in the world (Fortnite)
- NVENC is far superior to AMD's implementation. (OBS Streaming software has native support for Nvidia but not AMD)
- Nvidia Broadcast is a huge asset to content creators, no equivalent from AMD
- Way behind on professional applications due to Nvidia's broad CUDA support.
Just to break this down.

1\ You don't know RT performance numbers. It's likely weaker, but we literally have nfi atm. Also, with RT still all being hybrid... it's a feature, not a requirement.
2\ AMD mentioned its DLSS alternative; it hasn't detailed it yet, however.
3\ No info has been provided about the encoder except what it can support, and it supports all current standards.
4\ If Nvidia Broadcast is a value-add for you, that's great. The majority of us just want to play games, not broadcast our warts to the world. See above regarding feature.
5\ CUDA is somewhat of a chicken-and-egg: it became a de facto standard due to initially better support. However, 'massively better' is a misnomer, and it is really dependent on what professional task you are doing, as more things look to ditch CUDA (which has started to pick up pace over the last year).

tl;dr making definitive statements is crass.
Posted on Reply
#162
Jism
R0H1T
This is why you buy aftermarket cards, depending on the model not only do you get higher clocks but also (much) better cooling.
These days there's hardly any benefit from going aftermarket vs reference, apart from AMD delivering a reference card with 3 fans, probably with idle fan-stop and a custom fan profile. In the past even blowers could be tuned and/or adapted the way you wanted. It's just that both AMD and Nvidia select a default fan profile that ramps up pretty late, letting the card reach 80 degrees or so before it goes berserk. All of that could be solved by setting the fan profile at approx 40 to 60% of its duty cycle, and you'd never have any issue with it.

Reference cards these days have the perfect VRM for most users, whether that's water, air or even LN2; there is hardly any condition these days where you need a larger VRM than the reference design already provides. You won't be overclocking so much that you need the VRM working at its full capacity; the process node or the limitation of the silicon itself won't allow it either. And knowing AMD, there's probably a current limit on those chips as well to prevent degrading them.

It's why you see game/boost clocks; similar to Zen CPUs, they allow higher clocks under light workloads and go down when a heavy workload is applied. All this is polled every 25 ms at the hardware level to present you the best clocks possible in relation to temperature and power consumption. This is obviously a bliss for consumers, but it takes the fun out of OC'ing a bit if you plan to run on water: AMD is already maxing out the silicon for you, so you have to go subzero to get more out of it.

And if it's not what you need in terms of noise, slap a watercooler on and you're done. Watercooling always has better efficiency compared to air, to be honest, and because of that you can hold the boost clocks longer. I'm glad AMD brought not one but two generations' leap forward with a card that competes with the 3090, which costs $1,500 compared to $999. 24 GB of VRAM isn't a real reason to pay $500 more, considering Nvidia pays something like $4 per extra memory chip. The margins are huge.
Posted on Reply
#163
Max(IT)
I was honestly expecting more, especially from the 6800 that was my target. I mean, 16 GB of VRAM is highly unnecessary (10 would have been perfect) and the price, probably because of that amount of VRAM, is $50-60 higher than the sweet spot, and definitely too close to the 6800 XT.
We know nothing about RT performance, so we should wait for the reviews before drawing any conclusions.
lynx29
especially since the gpu's do even better with x570 and zen 3 cpu. i'm really hoping i can get my hands on a 5600x and 6800 or 6800 xt. should be my final build for many years, and hoping i can sell my gtx 1070 laptop to at least cover the some of the purchase... bleh
When did they speak about X570 ???
Posted on Reply
#164
Mussels
Moderprator
Max(IT)
I was honestly expecting more, especially from the 6800 that was my target. I mean, 16 GB of VRAM is highly unnecessary (10 would have been perfect) and the price, probably because of that amount of VRAM, is $50-60 higher than the sweet spot, and definitely too close to the 6800 XT.
We know nothing about RT performance, so we should wait for the reviews before drawing any conclusions.


When did they speak about X570 ???
there's a new feature that, if used with B550 or X570, gives you a performance boost. seems like that boost was included in the benchmark results, although we don't know much yet.
Posted on Reply
#165
Max(IT)
TechLurker
Speculation for what exactly is AMD's reason for the higher price of the 6800 vs the 3070 has been interesting to read in the past couple of hours.

Some speculated that the ~80 dollar difference was "feature tax", SAM + RAGE Mode allowing it to eke ahead of a 3070; assuming an All-AMD 5000/500/6000 ecosystem. GamersNexus' pinned YT post seemed to think similar; that the ability to use RAGE Mode + SAM was the reasoning behind the increase (not that they agreed or disagreed with it).

But others speculated that it was really to push people towards the 6800 XT ("well, if I'm going to fork out $580; may as well just fork out a bit more and get the 6800 XT instead for an extra $70"); same reason the 6900XT is also still priced a lot higher despite no likely way to squeeze in a "6850 XT" or "6900" (non-XT) between the 6800 XT and 6900 XT.

And still others speculate it was simply so they could make some extra $$$ before dropping the price when the replacement 3070 Ti/S comes out, suddenly wiping out any reason to buy an upcoming 3060 Ti/S or plain 3070 when you can have 2080Ti/3070 performance for cheaper.

And last one I've seen echoed a few times; the price was simply a placeholder, since they didn't yet know how much the 3070 would sell for at the time of filming the show, and could match or lower the price closer to release.
Speculations? It has 16 GB of VRAM vs 8 GB of VRAM. Price difference explained.

The problem is: are those 16 GB somehow useful at the target 1440p resolution? Hardly...
Mussels
theres a new feature that if used with B550 or X570, you get a performance boost. seems like that boost was included in the benchmark results, although we dont know much yet.
Ah ok, so not only X570...
Posted on Reply
#166
Camm
Max(IT)
Speculations? It has 16 GB of VRAM vs 8 GB of VRAM. Price difference explained.
All three are 16GB, but I wouldn't be surprised if there is an 8GB 6800 variant.
Posted on Reply
#167
Mussels
Moderprator
Max(IT)
Ah ok, so not only X570...
My guess is some form of PCIe 4.0 caching, using available system RAM to speed things up. Maybe that's why they went 16 GB as well: slower than the competition, but if the data's already loaded/cached, it won't matter
Posted on Reply
#168
Max(IT)
Mussels
My guess is some form of PCI-E 4.0 caching, using available system ram to speed things up. Maybe thats why they went 16GB as well, slower than the competition but if the datas already loaded/cached, it wont matter
Understood, but since I have a B550 motherboard I was wondering whether that feature will be supported on every Ryzen 5000 board. AMD didn't mention X570, just Ryzen 5000.
Posted on Reply
#169
SLK
Camm
Just to break this down.

1\ You don't know RT performance numbers. It's likely weaker, but we literally have nfi atm. Also, with RT still all being hybrid... it's a feature, not a requirement.
2\ AMD mentioned its DLSS alternative; it hasn't detailed it yet, however.
3\ No info has been provided about the encoder except what it can support, and it supports all current standards.
4\ If Nvidia Broadcast is a value-add for you, that's great. The majority of us just want to play games, not broadcast our warts to the world. See above regarding feature.
5\ CUDA is somewhat of a chicken-and-egg: it became a de facto standard due to initially better support. However, 'massively better' is a misnomer, and it is really dependent on what professional task you are doing, as more things look to ditch CUDA (which has started to pick up pace over the last year).

tl;dr making definitive statements is crass.
1) AMD's DLSS alternative is an unknown; I don't pay for unknowns. DLSS 2.0 is here and it "just works".
2) Again, AMD's encoder is an unknown; I can only go on recent history, and Nvidia's solution is proven superior at this moment.
3) RT can only be appreciated by someone who values realistic graphics; for the rest, it's just a feature.

Even if I were a pure gamer, there would be no reason for me to go AMD. When I pay almost $600-$700 for a GPU, a $50 difference is nothing for these added features; not to mention, AMD has to regain some credibility after its driver problems in the last year.
Posted on Reply
#170
Camm
SLK
is an unknown
And Nvidia availability doesn't exist. You keep dealing in absolutes over a product that has only been announced in a 20 minute clip and press deck. But sure, go buy a 3080/3090 since you deal in certainty, and you certainly won't find one > <.

And seriously, AMD driver problems? When I got my 2080 Ti I was BSODing for fucks sake. Both vendors have and continue to have issues at times.
Posted on Reply
#171
ebivan
SLK
1) AMD DLSS alternative is an unknown, I don't pay for unknown. DLSS 2.0 is here and it "just works".
2) Again AMD's encoder is an unknown, I can only base on recent history and Nvidia's solution is proven to be superior at this moment.
3) RT can only be appreciated by someone who values realistic graphics, for the rest, it's just a feature.

Even if I were a pure gamer, there is no reason for me to go AMD. When I pay almost $600-$700 for a GPU, $50 difference is nothing for these added features, not to mention, AMD has to regain some credibility with its driver problems in the last year.
Well, everything said about RT performance on Radeon is pure speculation at this point, so don't judge it before there is any evidence! Anyway, it remains to be seen how RT will be adopted in the future; right now it's a goodie used in about 20 games (none of which interests me). The future may bring broader adoption of RT, as the consoles can do it. But even then, the consoles are slower than the 6800 XT, and games will be optimized for consoles (as it's always been), so RT will probably run great on the AMD cards too.

I have tried all the different hardware encoders out there, and none of them was good. To be honest, they're all shit! Sorry, if you want quality, you have to do it in software. So no reason to go green for me there. The only reason to go for NVENC would be if you only had a tiny 4 or 6 core CPU that couldn't handle software encoding. But if you had such a low-tier CPU, you wouldn't buy a 3080/6800 XT anyway.

DLSS is absolutely overhyped. First of all, only a handful of games support it. Then the difference to native rendering is absolutely visible even with 2.0; it's just a blur. And, last but not least, it only makes sense for absolute hardcore esports guys, because the 3080 and 6800 XT seem to have no problems rendering at 4K/60 fps or 1440p/120 fps, so it's really only a thing for competitive 4K/120fps+ gamers.
Posted on Reply
#172
Valantar
Mussels
What uhh, what titles out there support AMD's ray tracing right now?
Most of them, most likely. I don't know of any RTX-enabled games that don't use DXR, which is a DX12_2 feature that RDNA 2 explicitly supports. It's likely that games will need some tweaks and minor updates to work out any kinks of the specific implementation, but as long as the GPUs support DXR they should work out of the box.
lynx29
It just occurred to me, the 6800 XT shroud design has no vents on the side with the I/O... so all that heat is getting shot out the top of the card directly onto the CPU... Nvidia has the better reference design here: the hottest air gets shot outside the case next to the I/O and the less hot air is shot out into the case at the rear of the card... hmm, it will be interesting to see CPU temps with this GPU... that seems like a really bad design flaw... the air is literally going to hit the CPU straight as it leaves the card, especially if you use an air cooler, and it will just suck that hot air right in before it has a chance to escape.
GPUs with vertically oriented fins like that are quite common, and typically perform well. Like all open-air/axial fan cooling setups it requires sufficient case airflow, but it really shouldn't be an issue in terms of heating up your CPU. Nvidia's design with a large fan literally blowing hot air into your CPU cooler's intake is much more directly troublesome there, and even that works just fine.


This thread though ... damn. I mean, even though we don't have third-party review numbers yet, so we can't trust the numbers given 100%, we are still seeing AMD promise a bigger jump in performance/W than Nvidia delivered with Maxwell. (And given the risk of shareholder lawsuits, promises need to be reasonably accurate.) Does nobody else understand just how insane that is? The GPU industry hasn't seen anything like this for a decade or more. And they aren't even doing a "best case vs. worst case" comparison, but comparing the (admittedly worst case) 5700 XT against the seemingly overall representative 6800 XT - if they were going for a skewed benchmark, the ultra-binned 6900 XT at the same power would have been the obvious choice.

Yes, the prices could be lower (and given the use of cheaper memory technology and a narrower bus than Nvidia, AMD probably has the margins to do that if competition necessitates it, even with 2x the memory). For now they look to either deliver a bit less performance for 2/3 the price, matching or slightly better performance for $50 less, or somewhat better performance for $79 more. From a purely competitive standpoint, that sounds pretty okay to me. From a "I want the best GPU I can afford" standpoint it would obviously be better if they went into a full price war, but that is highly unlikely in an industry like this. This nonetheless means we for the first time in a long time will have a competitive GPU market across the entire price range.

I'll be mighty interested in a 6800 XT - after seeing 3rd party reviews, obviously - but just as interested in seeing how this pans out across the rest of the lineup. I'm predicting a very interesting year for $200-400 GPUs after this; given that $500-600 GPUs are now matching the value of previous-gen $400 GPUs, we should get a noticeable bump in perf/$ in that price range.
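That perf/$ framing is easy to make concrete. A hedged sketch using the announced SEPs with placeholder relative-performance indices (the perf numbers are illustrative stand-ins, not review data):

```python
# Perf-per-dollar comparison. Prices are the announced SEPs; the relative
# performance indices are illustrative placeholders, not review results.
cards = {
    "RX 6800":    {"price": 579, "perf": 100},   # baseline index (assumed)
    "RX 6800 XT": {"price": 649, "perf": 115},   # hypothetical +15%
    "RX 6900 XT": {"price": 999, "perf": 125},   # hypothetical +25%
}

for name, c in cards.items():
    value = c["perf"] / c["price"]  # performance points per dollar
    print(f"{name}: {value:.3f} perf/$")  # ≈ 0.173 / 0.177 / 0.125
```

Even with these assumed indices, the pattern the thread keeps circling appears: the 6800 XT edges out the 6800 on value, while the 6900 XT trails badly, the usual flagship tax.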
Posted on Reply
#173
ebivan
For all the people complaining about the prices, remember that the 3080 for 700 is absolute nonsense! Or has anyone actually gotten an FE 3080 for 700 bucks? No, nobody has! The cheapest AIB cards are about 850, and winning the lottery is more likely than getting a 3080 for 850 at the moment!

It remains to be seen if there will be reasonable quantities of the 6800 XT, of course. But AMD's move to build in that cache instead of using exotic, hardly available GDDR6X memory, and its use of TSMC's 7 nm (of which AMD seems to have good availability), seem to point to better availability at launch. We will see.
Posted on Reply
#174
Hyderz
Very happy that AMD released new GPUs in the higher-tier segment that compete with Nvidia.
Hrmmmm now the hard choices... AMD or Nvidia???

the 6800 (non-XT) looks good and i think nvidia will answer it with an RTX 3080 LE :P
Posted on Reply
#175
ratirt
jabbadap
uhm, where did they claim such a thing? All I can see is the slide claiming 35 games supporting FidelityFX, which does not mean that those 35 games have or will have any form of DXR or RT. FidelityFX has been around for a while now. And yeah, DirectX 12 Ultimate is more than just RT.
It's 5 games at the start. Double-clicked for some reason and didn't fix it, sorry for that.
SLK
Cons for AMD:
- 1st gen RT, so it's slower than Ampere's RT
- No DLSS, which gives huge performance advantage in the second most popular game in the world (Fortnite)
- NVENC is far superior to AMD's implementation. (OBS Streaming software has native support for Nvidia but not AMD)
- Nvidia Broadcast is a huge asset to content creators, no equivalent from AMD
- Way behind on professional applications due to Nvidia's broad CUDA support.
Where did you get this from? Your thoughts?
There is a DLSS equivalent in the form of Super Resolution. We don't know how it works, but it is there and we will have to wait and see what it brings.
You don't know if it will be slower than Ampere's RT. The 6000 series will use the Microsoft DXR API with a Ray Accelerator in every CU.
Your way of perceiving this confuses me.
CUDA support :) , NVENC :)
It's just the same as if I'd said NV sucks because it doesn't have AMD cores.
Posted on Reply