
AMD Radeon RDNA3 Graphics Launch Event Live-blog: RX 7000 Series, Next-Generation Performance

But I believe most consumers who will pay over $800 for a new graphics card will be looking especially at that parameter.
I won't, and I'm expecting to buy a 7950XTX. I'm still in the dabbling phase with RT, so I'll take a look at a few titles and assume the next generation will again provide some useful leaps.
 
I've just watched the announcement. It's nice to see some beautifully rendered space images instead of the kitchen of some big boss. :p

Jokes aside - very professional, very lean, and no mention of Nvidia, or even of GPU specs whatsoever. We'll have to wait for reviews to know anything for certain, I guess.
 
Really need to see how the final numbers bear out on these cards; RT is a big selling point for me, as is reconstruction.

If the 7900XTX is indeed a hell of a beast at 4K, and it can do RT with an equal or smaller relative performance drop than my 3080, it's a contender for sure, especially at such a better price vs the 4090 (see the quick sketch below for how I'd compare that).

I'm yearning for information on FSR3 frame generation!
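To make that "relative performance drop" comparison concrete, here is a minimal sketch; the FPS figures are made-up placeholders, not benchmark results:

```python
# Quick sketch of the comparison above: the relative performance drop when
# turning RT on, computed from a review's raster and RT FPS numbers.
# The FPS figures below are invented placeholders, not real benchmark data.

def rt_drop(raster_fps, rt_fps):
    """Fraction of performance lost when enabling RT."""
    return 1 - rt_fps / raster_fps

# Hypothetical 4K numbers for one game on two cards:
print(f"card A: {rt_drop(90, 54):.0%} drop")    # 90 -> 54 fps = 40% drop
print(f"card B: {rt_drop(120, 66):.0%} drop")   # 120 -> 66 fps = 45% drop
# Card B is faster in absolute RT FPS here, but card A handles RT more
# gracefully relative to its own raster performance.
```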

Ray tracing is still a fancy doodad you enable to go OH THIS IS COOL in a small handful of triple-A games, then turn off after it slashes FPS or introduces all sorts of nasty motion smoothing, artifacts, and ghosting when DLSS or FSR is enabled.
Couldn't disagree more! No nasty motion smoothing, no artefacts, no ghosting, just the pinnacle of graphical fidelity in games, and now that I've had a taste of it, it's a must-have.

Not for you? Awesome, power to you; always leave it off and shout it from the rooftops, apparently. But for me and many others like me, we love this stuff and it's a must-have.
 
I'm super torn... on one hand, ray tracing is the future and it looks and feels cool to use. On the other hand, MCM dies are the future, and I love the design, power, and size of this thing...

I'm gonna repaste my 3080, play some games, and wait for reviews.
 

I wouldn't worry; these cards will be sold out on day 0 and will remain overpriced from that day onwards.
 
I'm gonna repaste my 3080, play some games, and wait for reviews.
Love my 3080; even 2 years in, I'm continually impressed by how bloody good this card is at 4K heavily undervolted.

Like you, I'm weighing up that big leap to the next card. Currently it feels like only the 4090 or 7900XTX could be that upgrade, but I need independent reviews first.
 
Am I in the minority in thinking RT isn't important? Too few games use it. Most GPUs don't do it well yet. It needs to become far more mainstream and performant on lower-end hardware before I'll care.
 

Real-time ray tracing is a promising technology for gaming, but it's still in the early stages of adoption. RT hardware support really needs to show up broadly in consumer hardware (smartphones, game consoles) for it to be "important."

AI-assisted upscaling technology is more important at this time.

But with each passing week there are more and more games that support ray tracing. Some of the most popular titles now have ray tracing (Fortnite, Minecraft); it's no longer a niche benefit for obscure games.

It's important for AMD to figure out how to do ray tracing well, because the next-generation video game consoles will most likely have the feature, and if AMD wants to contend as the supplier for those chips, it needs a good RT implementation and development environment.
 
The main difference between Nvidia and AMD is that Nvidia seems willing to dedicate a lot of die area to RT and AI, while AMD wants to handle them with its compute units.

The upside for Nvidia is that when RT is used, it gets a very large performance boost and compute performance isn't affected. On the other hand, when no ray tracing is used, those dedicated units sit idle.

The upside for AMD is that its compute units are used all the time, but performance suffers when ray tracing is enabled, because compute units that could have been doing something else have to handle it. (A toy model of this tradeoff is sketched after this post.)

In the past, when we had a similar paradigm (general compute units versus dedicated hardware), the vendor with the dedicated hardware initially won, but in the long run the more flexible architectures won.

That doesn't mean Navi 31 will be faster at ray tracing than AD102 in 5 years, not at all, but in a few architectures' time it's quite possible that a wider RDNA with many more compute units becomes faster than an architecture with dedicated units.

So I see why AMD could be going in that direction right now, even if it means they don't win. They want to perfect it for later.

It's also quite possible that in a few generations the compute performance available for running shaders will be more than enough, and the spare power could be used for ray tracing, or vice versa. Right now there is still too much to do in each compute unit, but we can imagine what a larger, 400 mm²+ RDNA 4x+ on N3 with many more compute units could look like.

But right now, it looks like if you want to turn on RT at 4K, you'd better go green for now.
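A toy back-of-the-envelope model of that dedicated-vs-shared tradeoff might look like the sketch below. All numbers are invented for illustration and do not reflect real RDNA3 or Ada figures; it only mirrors the tradeoff described in the post above.

```python
# Crude frame-time model: time = work / throughput. Everything here is
# made up for illustration; it only mirrors the tradeoff described above.

def frame_time(shading_work, rt_work, shader_units, rt_units=0, rt_penalty=3.0):
    """Return a toy frame time.

    With dedicated rt_units, RT work runs on them while shading runs on the
    shader units (overlapped). Without them, RT falls back to the shader
    units and costs rt_penalty times more there (BVH traversal in compute).
    """
    if rt_units > 0:
        return max(shading_work / shader_units, rt_work / rt_units)
    return (shading_work + rt_penalty * rt_work) / shader_units

# Design A: spend some die area on dedicated RT units.
print(frame_time(100, 30, shader_units=80, rt_units=20))  # RT on  -> 1.50
print(frame_time(100, 0, shader_units=80, rt_units=20))   # RT off -> 1.25 (RT units idle)

# Design B: spend the same area on extra shader units instead.
print(frame_time(100, 30, shader_units=100))              # RT on  -> 1.90
print(frame_time(100, 0, shader_units=100))               # RT off -> 1.00
```

Under these made-up numbers, the dedicated design wins whenever RT is on and loses (through idle silicon) when it is off, which is exactly the tradeoff the post describes.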
 
The upside for Nvidia is that when RT is used, it gets a very large performance boost and compute performance isn't affected.
The 4090 has 40% less performance with RT on versus off. How is that a performance boost, unless they're upscaling too? It applies to either brand, and RT is a generation away from becoming mainstream in games and hardware. For now it's like the tessellated concrete barriers all over again.
 
After the Ampere and RDNA2 series, I thought the need for more rasterisation performance wouldn't exist anymore. The FPS numbers have reached ridiculous and pointless levels.

Personally, I would be happy with a "2080 Ti Super": the same level of raster performance and 5-10x the RT.

All the AAA titles that are coming have RT, and a few have ray-traced GI. And as previously stated, we pay for a new card in order to play the new games. If I want to play Tomb Raider and Star Wars, a 5700 XT is more than enough.
It will be disappointing if they just match a 3080 in RT.
NVIDIA could kill the whole series with a single card.
 
Naaah.

The very first thing people look for is something that looks like a good deal.

This looks like a really good deal. There IS RT performance, let's face it. It will be playable; the gap today isn't even at the point where RT on AMD is a total deal-breaker. So you'll play your five games on it, and that's that.

Let's see the gaming library progress a bit on the RT front before we start calling it a must-have feature. It's really not. Games look pretty without it. Games run like shit with it, in a relative sense. The technology is still subpar in both camps.
At the high end, many will be choosing Nvidia as more future-proof because of its performance in RT. More future-proof with a DisplayPort 1.4a connector, mind you. Nvidia is a master at offering cards with high specs today and a nicely hidden killer for those same cards, whether that's limited memory capacity or bandwidth, or now the video output connectors.

My point is that at the high end, people will be looking at RT performance because they are spending a significant amount of money. In the mid-range and low end (I have a hard time calling cards that will cost over $200 "low end", but anyway), AMD's RT performance will be problematic: the RX 7800 will be slower at RT, the RX 7700 slower still, and the RX 7600 slower again. So if the RX 7600 costs $500 and is worse at RT than Nvidia's RTX 3070, or later Nvidia's RTX 4060, people will keep blindly buying Nvidia cards even if AMD's cards are butchering them in raster.

We see it today. I keep repeating that we already see it everywhere: in the Steam survey, in sales, in market share, in Nvidia's confidence that it can sell a worse card than AMD's equivalent model at a huge premium. Nvidia was doing the same 12 years ago, when it was selling pathetic cards that were much worse in gaming than AMD's offerings, by using PhysX and CUDA, plus the promotion from tech sites, as strong marketing cards.
 
We'll see. Historically there was a lot more going on, and if anything, what it shows is that these pushed technologies tend to just die off. What they did temporarily for market share is another thing, but RT is nowhere near a real presence in the market, despite what the push might want you to believe. The vast majority of implementations are utter crap and offer no benefit.

Still, I get the pull you describe about having a card with full support for everything - I talk about feature parity too, more often than not - but it's still part of a package, and RT is not a substantial part of it compared to price/perf in raster, availability, TDP, perf/W, and overall experience/quality. Nvidia's Ada has good perf/W and an RT lead, but everything else is substantially worse.
 
What we see in the future is, I hope, me being completely, hilariously, wrong.

But another thing I see lately is people buying based on performance today and features promoted today. And AMD is losing in both those categories. AMD is promoting a platform built for longevity, CPUs with equally performing cores, CPUs with high efficiency, and who is winning? Intel, because it offers more cores today, wins benchmarks today, and offers a cheaper platform today. People don't care that most of those Intel cores are efficiency cores. They don't care that the Ryzen 9 7950X will butcher the Intel i9-13900K in applications or games that need more than 8 P-cores in 5 years. They don't care that over those 5 years the AM5 owner will just be swapping CPUs instead of replacing the whole platform. And they don't care about efficiency.

Lately, people care only about TODAY. And in GPUs, today means RT. In 5 years we might have something else, or RT might be dead, or programmers might start using RT in a more efficient way that doesn't kill performance. But today it does sell cards: overpriced cards that are worse than the competition's equivalents, which perform better in raster. Nvidia knows that; that's why it is throwing in fake frames in what is still a beta-testing form.
[image: GeForce RTX 40-series DLSS 3 performance chart]

People only look at bars; they only understand bars. And when they reach the page in a review that compares models on RT performance, they will have the best excuse to go with the much more expensive Nvidia option. I mean, it's already baked into their brains that AMD sucks at drivers, right? That RT performance will be another example of Nvidia being superior. They will totally disregard all those raster performance charts, because there even the slower Nvidia card will be good enough. They will focus on the RT charts, because there the AMD card will just be bad. So they will choose the "good enough at everything" over the "better at most things, but bad at that new feature we heard transforms our games into reality".
Ironically, the higher price of the Nvidia model will be one more "proof" that RT is important and matters. Because "more expensive is better", right? I believe this is genuinely how the average consumer thinks: more expensive equals better, especially when the brand is much stronger.
 

I really don't think RT is any kind of player in this game. This has been going on for 20 years. AMD recovered fine from the initial Turing release, as you can see, but the trend hasn't changed for the last, what, 16-18 years. There is far more going on than just having feature X or Y. One might wonder whether the consumer even has enough influence on these duopolies ;)

The same thing happened with AMD vs Intel.

[attached chart: AMD vs Intel market share over time]
 
I guess stupid prices are here to stay. Not even AMD saved us from them.
 
I really don't think RT is any kind of player in this game. This has been going on for 20 years. AMD recovered fine from the initial Turing release, as you can see, but the trend hasn't changed for the last, what, 16-18 years.

[attached chart: NVIDIA vs AMD GPU market share over time]
Looking at that chart, Nvidia won significant market share two times based on features and one time based on performance.
With the GeForce 8000 series in 2007, it was offering high performance, but mostly new features like CUDA and PhysX. CUDA was important back then for video transcoding, and PhysX was promising amazing physics in games. AMD was offering some marvelous options in the mid-range, like the HD 3850 and later the amazing HD 4870 and HD 4850, but never really recovered. Talk about bad drivers and the lack of CUDA and PhysX, combined with the tech press being friendlier to Nvidia, was the reason for that. Even after introducing the amazing GCN-based HD 7000 series, people were already grumbling about AMD's bad drivers. When AMD introduced the R9 290X and R9 290, it was beating Nvidia, but tech sites were driving readers' focus away from that fact and towards the over-90-Celsius temperatures of the reference models.

The GeForce 900 and GeForce 1000 series are the examples where Nvidia offered value and performance. It was the clear winner there over an AMD that was struggling to remain competitive. Looking at the chart, AMD did manage to close the gap with Polaris, because it offered high raster performance and great value, combined with excellent mining performance, at a time when people were asking for raster and/or mining performance at a great price. But look what happens when Nvidia comes out with the RTX 2000 series and ray tracing: market share jumps in its favor because people rush to buy the new cards with that amazing new feature. The RTX 2080 Ti is obviously an overpriced card, but at the same time it is the perfect marketing vehicle for Nvidia to declare superiority and sell the cheaper models like hot cakes. After that we have another mining period, where wafer capacity and mining performance probably dictate who wins market share.
 
Am I in the minority in thinking RT isn't important? Too few games use it. Most GPUs don't do it well yet. It needs to become far more mainstream and performant on lower-end hardware before I'll care.
You're not in the minority, though opinions differ a lot on the topic. I personally think RT is important because some of the games I like use it, and its adoption is slowly but surely increasing year by year. Also, we already have plenty of raster performance (you can play non-RT games on an Nvidia Pascal GPU from 2016 just fine) - where we're lacking performance is RT without the DLSS/FSR upscaling bullshit.
 
The 4090 has 40% less performance with RT on versus off. How is that a performance boost, unless they're upscaling too? It applies to either brand, and RT is a generation away from becoming mainstream in games and hardware. For now it's like the tessellated concrete barriers all over again.
The wording was not the best, I agree. Let's say they take a smaller performance hit for activating the feature.

I think most games can run some kind of RT right now, and if it's done well in conjunction with other effects, you can even do it on Turing/RDNA2. It all depends how far you go. RT reflections in areas where screen-space reflections can't help already run well on those GPUs (Far Cry 6, for example); a rough sketch of that hybrid idea follows this post.

It's when you want to do everything with RT (like Quake II RTX or Minecraft RTX) that performance becomes an issue.

If a game wants to use just one of the RT effects in a raster pipeline (global illumination, reflections, or shadows), it can get away with it. Among them, I think global illumination has the biggest effect on visuals in most scenarios.

The thing is, game devs have gotten really good at faking effects, to the point that the visual gain from RT is hard to see in a lot of cases. But those cheats are not physically accurate (although some can look better because they were carefully crafted).

For rendering that aims to be photorealistic, RT will be needed.

But not every game has to go that path; there are still plenty of art styles around.
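A minimal, hypothetical sketch of that hybrid idea (cheap screen-space reflections first, full ray trace only where they cannot work) is below. The function name, march logic, and numbers are all illustrative assumptions, not engine code or a real API.

```python
# Toy 2D illustration of hybrid reflections: march a reflection ray in
# screen space first, and only fall back to a full ray trace where the
# screen-space march leaves the screen. Purely illustrative, not engine code.

def reflect_pixel(depth, start, direction, steps=64):
    """Return ("ssr", hit_pixel) if screen-space data suffices, else ("rt", None).

    depth:     2D list of depth values (a stand-in for the G-buffer depth)
    start:     (x, y) pixel where the reflection ray begins
    direction: (dx, dy) screen-space step per iteration
    """
    h, w = len(depth), len(depth[0])
    x, y = start
    ray_depth = 0.0                      # toy ray depth, advanced per step
    for _ in range(steps):
        x, y = x + direction[0], y + direction[1]
        xi, yi = int(round(x)), int(round(y))
        if not (0 <= xi < w and 0 <= yi < h):
            return ("rt", None)          # ray left the screen: SSR can't help
        ray_depth += 0.1
        if depth[yi][xi] < ray_depth:    # a surface is in front of the ray
            return ("ssr", (xi, yi))     # cheap screen-space hit
    return ("rt", None)                  # nothing found on screen either

# Flat far plane (depth 10) with one near object (depth 0.5) at x == 12.
depth = [[10.0] * 16 for _ in range(8)]
for row in depth:
    row[12] = 0.5

print(reflect_pixel(depth, (2, 4), (1, 0)))   # marches into the object -> ("ssr", (12, 4))
print(reflect_pixel(depth, (2, 4), (0, -1)))  # exits the screen        -> ("rt", None)
```

The design point it illustrates is the one made above: the expensive ray-traced path only has to cover the pixels where screen-space information runs out, which is how RT reflections stay affordable on Turing/RDNA2-class hardware.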
 