
AMD Radeon RDNA3 Graphics Launch Event Live-blog: RX 7000 Series, Next-Generation Performance

Looks like we have a team red fanboy here... you should have read my comment more closely: I compared RAW performance (RDNA3 61 TFLOPs vs. AD102 83 TFLOPs), and based on that - and because AMD themselves were rather modest in the announcement ("Fastest GPU below $1k") - it is not hard to assume that the RTX 4090 will keep its crown when it comes to 4K+ performance. It is important to note, though, that NVIDIA might not be able to bring that performance - especially at competitive settings - to the display because of technical (DP 1.4) restrictions.
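The raw-compute gap being argued here is simple arithmetic; a quick sketch using only the TFLOPs figures quoted in the post (peak theoretical FP32, not measured gaming performance):

```python
# Raw FP32 comparison from the figures quoted above (peak theoretical
# throughput only -- real gaming performance does not scale 1:1 with TFLOPs).
rdna3_tflops = 61.0  # RDNA3 flagship figure from the post
ad102_tflops = 83.0  # AD102 (RTX 4090) figure from the post

deficit = 1.0 - rdna3_tflops / ad102_tflops
print(f"RDNA3 raw FP32 deficit vs AD102: {deficit:.0%}")  # -> 27%
```

A ~27% raw-compute deficit is consistent with guessing the 4090 keeps the 4K crown, though TFLOPs alone never tell the whole story.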
There is one little hope here for the 7000 series. With AMD staying at lower frequencies than most of us expected, probably to achieve their "over 50%" performance-per-watt goal, we could see custom cards from AIBs hitting a 20-40% overclock in the near future. Those cards will probably draw 400W or more, but thanks to Nvidia, there are plenty of coolers out there to deal with that power consumption.
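To see why a 20-40% overclock lands in 400W-plus territory: dynamic power scales roughly with frequency times voltage squared, so clock bumps that also need extra voltage get expensive fast. A back-of-the-envelope sketch (355W is the announced reference board power; the +20% clock / +5% voltage combination is purely an illustrative assumption):

```python
# Back-of-the-envelope dynamic power scaling: P ~ f * V^2.
# 355W is the announced reference board power; the overclock and
# voltage bump below are illustrative assumptions, not AIB specs.
base_power_w = 355.0
freq_scale = 1.20   # assumed +20% core clock
volt_scale = 1.05   # assumed +5% voltage to sustain it

oc_power_w = base_power_w * freq_scale * volt_scale ** 2
print(f"Estimated overclocked board power: {oc_power_w:.0f}W")  # -> 470W
```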
 
MOST FOLKS DON'T CARE ABOUT RT, FOR THE LOVE OF GOD!!! FPS IS STILL KING. I'll gladly turn up the in-game graphics rather than use some gimmicky crap.
What fps is king? 300? 500? Come on... High-end products SHOULD focus on RT these days. Still pushing for frames beyond hundreds of FPS is stupid.
 
Explain Steam statistics where every card with a meaningful market share is an Nvidia one.
You HAVE NO IDEA WHAT MOST FOLKS CARE ABOUT. And I am not saying that I know better. I'm saying JUST LOOK AT THE DAMN MARKET SHARE NUMBERS AND SALES.
My God WAKE UP.
Everyone has a 4090? No. If it's the pinnacle of RT hardware, why doesn't everyone own a new 4090?

How many of those are included in prebuilt machines, laptops, or from people who buy Nvidia because they have the same mentality that it's the best?

Who are you trying to convince here? Does AMD pay rent for that space in your head, or does Nvidia? Competition is good, a premium card for less than a grand is great. Now go have a juice box and relax
 
Wow, I will gladly pay up to €1,500 for an XTX.
The 4080 launching at $1,199 is DOA then?
Even without the 7900 XTX dropping at $1,000 - yeah, the 4080 16GB at $1,200 was DOA.

What fps is king? 300? 500? Come on... High-end products SHOULD focus on RT these days. Still pushing for frames beyond hundreds of FPS is stupid.
What can you do with that if only a few titles support it? Spend $1,600 to play 5 games? Not to mention everyone has different tastes in games, so even of those 5, or maybe 10, games, not everyone will like them all...
They should develop their tech, but not charge people in advance for something that is not in the majority of games.
 
Even without the 7900 XTX dropping at $1,000 - yeah, the 4080 16GB at $1,200 was DOA.


What can you do with that if only a few titles support it? Spend $1,600 to play 5 games? Not to mention everyone has different tastes in games, so even of those 5, or maybe 10, games, not everyone will like them all...
They should develop their tech, but not charge people in advance for something that is not in the majority of games.

Where do you guys get this stuff?
There are 243 DX12 games (100%).
144 (59%) of those games support some type of ray tracing.
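For what it's worth, the percentage is easy to check from the counts given (the 243/144 figures are the poster's, not independently verified):

```python
# Sanity-check the ray tracing share from the counts quoted above.
dx12_games = 243   # poster's count of DX12 games
rt_games = 144     # poster's count of those with some form of RT

share = rt_games / dx12_games
print(f"{rt_games}/{dx12_games} DX12 games support RT: {share:.0%}")  # -> 59%
```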
 
Everyone has a 4090? No. If it's the pinnacle of RT hardware, why doesn't everyone own a new 4090?

How many of those are included in prebuilt machines, laptops, or from people who buy Nvidia because they have the same mentality that it's the best?
From as far back as I can remember - 2000 and later - people were looking at what the best NEW GPU could offer, and they were rushing to upgrade their old, slow graphics cards to get closer to what that amazing model could offer. If that model had brand X on it, they were rushing to buy brand X models, even the cheapest there was. You and others just completely disregard this parameter.
So, not everyone needs to have a 4090. We can see Nvidia selling sub-$600 cards, still at a premium. Why? Because consumers blindly buy Nvidia. Why does Nvidia get into those prebuilt systems? Because consumers ask for it. Why do we see Nvidia in the most expensive laptops? Because consumers ask for it.
Who are you trying to convince here? Does AMD pay rent for that space in your head, or does Nvidia? Competition is good, a premium card for less than a grand is great. Now go have a juice box and relax
I am trying to convince no one. Just pointing at a different perspective than yours and others'. You think a good price and rasterization will heal a market that today is close to a monopoly, resembling the CPU market when AMD was still selling Bulldozer CPUs? AMD is building very good GPUs, but it is missing the point. It needs to beat Nvidia at its own game, ray tracing, to make its brand look like more than just a second-grade option.

I will avoid going your way with personal attacks. I will only say this. Competition is good when it exists.
 
Looks like we have a team red fanboy here... you should have read my comment more closely: I compared RAW performance (RDNA3 61 TFLOPs vs. AD102 83 TFLOPs), and based on that - and because AMD themselves were rather modest in the announcement ("Fastest GPU below $1k") - it is not hard to assume that the RTX 4090 will keep its crown when it comes to 4K+ performance. It is important to note, though, that NVIDIA might not be able to bring that performance - especially at competitive settings - to the display because of technical (DP 1.4) restrictions.

Oh god this thread is turning autistic.

No, listing the data does not make one a fanboy. I'm going to go touch grass, something I recommend others here do as well.
 
There is one little hope here for the 7000 series. With AMD staying at lower frequencies than most of us expected, probably to achieve their "over 50%" performance-per-watt goal, we could see custom cards from AIBs hitting a 20-40% overclock in the near future. Those cards will probably draw 400W or more, but thanks to Nvidia, there are plenty of coolers out there to deal with that power consumption.

Looks like I have the problem of being a non-native English speaker here. I can't help it, but it seems like every comment of mine on this topic forces someone to defend AMD. In my opinion, RX 7000 is more than OK, but future iterations of the Adrenalin software and FSR 3 are what really matter to me - especially if FSR 3 is not RDNA3-exclusive.

RX 7000 does not have to be the best-performing GPU of its generation to be successful, especially when it comes to pricing and/or power consumption. The RTX 4090 is a HALO product. Its $1.6k+ price tag is absurd. Its power consumption is a joke in a world where - like here in Germany - the price per kWh just doubled this year. But comments like the one quoted above show that people still just look at performance numbers: So sad! - to quote a person I really despise.

Oh god this thread is turning autistic.

No, listing the data does not make one a fanboy. I'm going to go touch grass, something I recommend others here do as well.

Defending a company makes me think you are a fanboy of said company.
 
Sharp pricing

7900 XTX 24GB US$999
7900 XT 20GB US$899

While we still need to wait for third-party reviews, it looks like once again AMD will come out on top in terms of performance-per-watt and performance-per-dollar for conventional 3D rasterization performance.
You can wait all you want; I will be getting a reference XTX, and as soon as I can, I'll be putting a waterblock on it. Here we go! I couldn't care less if it's faster than a 4090 - it will be faster than my 6800 XT, so I will feel it.
 
The single compute die is a little underwhelming. I was hoping (from the rumour mill) that AMD had figured out the magic sauce to make multiple compute dies like on their Ryzen and EPYC CPUs; the idea of a cheap, small, high-yield die that can be doubled or quadrupled as necessary sounded like a solution to the scalability limitations of large monolithic dies.

Perhaps this is just a first step, testing if they can decouple essential parts like the memory controllers and cache, and the next generation will test splitting the compute core up into smaller dies. Given that yields and production costs massively favour smaller dies, there's the potential for huge performance gains if or when AMD figure it out for consumer products.

Raytracing performance is meh. 7900-series RT uplift of 1.5-1.6x pretty much matches the 1.5-1.7x improvements over 6900-series for everything else. Perhaps I'm not understanding the announcement, but it seems like raytracing performance has increased by about zero relative to everything else. IMO AMD needed to increase raytracing performance by 50% more than their raster performance improvements if they want to fully compete with Nvidia this generation.
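The "about zero relative RT progress" reading follows directly from the two ranges quoted: dividing the RT uplift by the raster uplift gives roughly 1x. A quick sketch using midpoints of the announced ranges:

```python
# RT progress relative to raster, using midpoints of the announced ranges.
rt_uplift = (1.5 + 1.6) / 2      # ~1.55x RT over the 6900-series
raster_uplift = (1.5 + 1.7) / 2  # ~1.60x raster over the 6900-series

relative = rt_uplift / raster_uplift
print(f"RT uplift relative to raster: {relative:.2f}x")  # -> 0.97x
```

In other words, RT throughput grew at essentially the same rate as everything else, so the gap to Nvidia stays roughly constant.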

Maybe AMD don't care, just like me. As someone with an RTX 3070 and an RX 6700 on hand, raytracing is still shit anyway. I've borrowed a 3090 to mess around with, and most RT games still failed to deliver compelling graphics or playable framerates on my 4K TV. RT reflections are just about the only aspect worth enabling, because they're quite obviously an improvement over screen-space reflections. Global illumination and raytraced shadows are just horrific performance destroyers for pitifully small differences in the final output.

Perhaps the RTX5000-series will finally have enough raytracing power to keep up with advances in games' graphical requirements but for now the 4090 is basically just about good enough to render 4K60 in 2-year-old titles without using a cheaty upscaler as a crutch. That's impressive, but also depressingly unfit for the 4K120 and 1440p240 monitors of yesteryear.
 
Looks like I have the problem of being a non-native English speaker here. I can't help it, but it seems like every comment of mine on this topic forces someone to defend AMD. In my opinion, RX 7000 is more than OK, but future iterations of the Adrenalin software and FSR 3 are what really matter to me - especially if FSR 3 is not RDNA3-exclusive.

RX 7000 does not have to be the best-performing GPU of its generation to be successful, especially when it comes to pricing and/or power consumption. The RTX 4090 is a HALO product. Its $1.6k+ price tag is absurd. Its power consumption is a joke in a world where - like here in Germany - the price per kWh just doubled this year. But comments like the one quoted above show that people still just look at performance numbers: So sad! - to quote a person I really despise.
TechPowerUp is a huge site with many non-native English speakers.
I think we should apologize to the native English speakers for butchering their language.

AMD doesn't need another Polaris or an RX 5700 to grab people's attention. AMD needs a killer product, because Nvidia has managed to turn the GPU market into its own playground. And the tech press has been supporting them lately. For example, over the last few months, many articles about GPU pricing were focusing entirely on Nvidia cards. AMD was dropping prices across the board, yet most articles were focusing on the price drops of the originally ridiculously overpriced RTX 3090. Nvidia is everywhere in articles and AMD is a footnote. AMD came very close to Nvidia with the RX 6000 series. The RX 7000 series should have been a leap forward, but I don't see it. It's another RX 5700.
 
What can you do with that if only a few titles support it?, spending 1600 to play 5 games?, not to mention everyone has different tastes in games so even of those 5 games, or maybe 10, not everyone will like them all.......
They should develop their tech but not charge people in advance for something that is not in the majority of games.
It depends on what games you play. I love Control and Deliver Us the Moon, and I also like Cyberpunk. They're all RT games. On the other hand, I wouldn't give a damn if CS:GO ran at 500 fps instead of 300, even if I played it.
 
TechPowerUp is a huge site with many non-native English speakers.
I think we should apologize to the native English speakers for butchering their language.

AMD doesn't need another Polaris or an RX 5700 to grab people's attention. AMD needs a killer product, because Nvidia has managed to turn the GPU market into its own playground. And the tech press has been supporting them lately. For example, over the last few months, many articles about GPU pricing were focusing entirely on Nvidia cards. AMD was dropping prices across the board, yet most articles were focusing on the price drops of the originally ridiculously overpriced RTX 3090. Nvidia is everywhere in articles and AMD is a footnote. AMD came very close to Nvidia with the RX 6000 series. The RX 7000 series should have been a leap forward, but I don't see it. It's another RX 5700.
Yeah, it's like stagnation.
 
Perhaps this is just a first step
Think of it as Zen 1.
Yeah, it's like stagnation.
More like they are still focusing on CPUs. So, they are back to "good enough" options for the GPU market. Maybe they don't feel they can fight both Nvidia and Intel, so they are focusing on where they still have an advantage, and that's CPUs - I mean CPUs for servers. In retail, Intel managed to turn the tables with its hybrid design.
 
Ambient occlusion, indirect diffuse lighting, global illumination.
That's what I care about. I don't care if a 4090/7900 XTX spits out 5,000 fps at 16K in CS:GO.

I want RT performance at 1440p.

AMD doesn't need to compete at ultra high end level. I'm more than happy if they deliver a 7900XT with 3090/Ti level of RT performance.
I'm happy with their prices as long as the numbers are there.

Do you know when the review embargo is lifted?
 
It's another RX 5700
How on earth can you compare an RX 5700 XT to a 7900 XTX?
The 5700 XT did not even support RT and was way slower than the 2080 Ti.
 
Why the push for that crap when there are so few games that support it? Let it go... I have a list of almost 70 games bought on Steam, and none of them support RT.
How many games on that list of yours require a brand-new $1k GPU to play? That's not a great argument, my man - you don't buy a new GPU to play all those games that can be played on your old one as well; you buy it to play the few heavy AAA titles. And most of those do actually have RT.
 
How on earth can you compare an RX 5700 XT to a 7900 XTX?
The 5700 XT did not even support RT and was way slower than the 2080 Ti.
It's a step backwards compared to the RX 6000 series vs. the RTX 3000 series, so I see it as AMD going back to the RX 5700 era, where it had a very nice mid-range option, but still mid-range.
Now the 7900 XTX will probably come close to the RTX 4090 in raster, but it will still be losing badly in RT performance. Most people here disagree with me about the significance of RT performance. But I believe most consumers who will pay over $800 for a new graphics card will be looking at that parameter especially. The majority of the tech press and YouTubers will also focus on it, playing Nvidia's marketing game.
I hope I am wrong - most people think I am wrong - but until I see AMD's market share going up and people buying with their brains, meaning choosing an RX 6600 over an RTX 3050 for example, I am afraid I am closer to being correct.
Lastly, let's not forget that Nvidia only needs to reduce prices.
 
From 42 to 62 fps in ray tracing WITH FSR?
They should have understood by now that RT is THE MAIN FEATURE TO FOCUS ON TODAY.
Morons.
Naaah.

The very first thing is something that looks like a good deal.

This looks like a really good deal. There IS RT performance, let's face it. It will be playable; the gap today isn't even at the point where it's a total deal-breaker for RT on AMD. So you'll play your five games on it, and that's that.

Let's see the gaming library progress a bit on the RT front before we start calling it a must-have feature. It's really not. Games look pretty without it. Games run like shit with it, relatively speaking. The technology is still subpar in both camps.
 
How many games in that list of yours require a brand new 1k $ gpu to play? That's not a great argument my man, you don't buy a new GPU to play all those games that can be played in your old one as well, you are buying it to play the few heavy AAA titles. And most of those do actually have RT.
Then I may be in the minority that doesn't play those few heavy AAA games; tbh, I only need 1440p at 144 fps.

Let's see the gaming library progress a bit on the RT front before we start calling it a must-have feature. It's really not. Games look pretty without it.
That's my whole point.
 
It's a step backwards compared to the RX 6000 series vs. the RTX 3000 series, so I see it as AMD going back to the RX 5700 era, where it had a very nice mid-range option, but still mid-range.
Now the 7900 XTX will probably come close to the RTX 4090 in raster, but it will still be losing badly in RT performance. Most people here disagree with me about the significance of RT performance. But I believe most consumers who will pay over $800 for a new graphics card will be looking at that parameter especially. The majority of the tech press and YouTubers will also focus on it, playing Nvidia's marketing game.
I hope I am wrong - most people think I am wrong - but until I see AMD's market share going up and people buying with their brains, meaning choosing an RX 6600 over an RTX 3050 for example, I am afraid I am closer to being correct.
Lastly, let's not forget that Nvidia only needs to reduce prices.
There is indeed going to be a segment of the market in the high end wanting better RT performance. But they can do math. Those with more money than sense can't do math - but they never could.

When you save the cost of a CPU, board, and RAM on your GPU, let's see how that balances out ;) And another thing, for the tech enthusiasts - what is more interesting: the first, fattest chiplet GPU on the planet that can actually game, or a few more FPS showing the same RT effects? We are well into territory where performance in the majority of games is solid down to the upper midrange.

I'm still gaming on a 2016 GPU... No complaints.

Where do you guys get this stuff?
There are 243 DX12 games (100%).
144 (59%) of those games support some type of ray tracing.
'Some type', yes - and how many are worth playing? And on the other side of the coin, how many would run below par on RDNA3? It's going to be a stupendously tiny subset.
 
So I've finally caught up on the AMD official stuff and honestly the silver lining in the data-light and uninspiring performance numbers is the 54% performance/Watt improvements over RDNA2.

RDNA2 is already killing it with performance/Watt and I think there are plenty of people like me who don't want a 300W, 450W, or even 600W graphics card. If I can tolerate a 250W card, is it an AMD card or an Nvidia card that gets me the best experience within that power budget?

Also, there's no sensible way to shoehorn a card with silly-high power draw into a laptop. The mobile 3080 was so utterly castrated by the form factor that a desktop 3060 Ti could match or often beat it. 40-series power consumption has increased by 50%, but the laws of physics still dictate how much copper fin surface area you can squeeze into a laptop and still blow air through it. Don't even get me started on laptop power bricks that weigh more than any laptop I'd consider buying, let alone the fact that you need TWO of them to power some laptops...
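The fixed-power-budget framing can be made concrete: at a capped wattage, a perf/Watt multiplier translates directly into a performance multiplier. A sketch using AMD's claimed 54% uplift and the 250W budget mentioned above:

```python
# At a fixed power budget, performance scales with perf/Watt.
# 1.54x is AMD's claimed RDNA3-over-RDNA2 perf/W uplift; 250W is the
# personal power budget from the post.
power_budget_w = 250.0
rdna2_perf_per_watt = 1.00   # normalized baseline
rdna3_perf_per_watt = 1.54   # claimed "+54%" uplift

gain = (power_budget_w * rdna3_perf_per_watt) / (power_budget_w * rdna2_perf_per_watt)
print(f"Same {power_budget_w:.0f}W budget -> {gain:.2f}x performance")  # -> 1.54x
```

Whether the claimed uplift actually holds at a 250W operating point is exactly what third-party reviews will have to show, since perf/Watt varies along the voltage/frequency curve.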
 
AMD is way behind in RT. Even Intel's Arc looks better, meaning AMD could have a real problem when Intel fixes its problems.
AMD should have targeted close to a 3-4x performance increase in RT. Very expensive cards will NOT sell if they fail at RT compared to the competition. And if the competition wins in the high end, it means they will also be selling low and mid-range cards. See what is happening today: people buy the RTX 3050.
So, what I smoke, bd, is reality. What you smoke is your problem.
Personally, I don't give a rat's behind about RT - only interested in raster performance and price.
 
Let's stick to the topic.
Stop the insulting jabs at each other.
Thread bans, warnings forthcoming for any more of these antics.

Thank You

Take notice: This is a warning post... you have been warned
 
I feel like I can see a problem with the design now: it's bottlenecked by instructions, in my view.
They tripled the shader count and doubled the RT core count.
But they only doubled the per-clock instruction throughput, when it needed to be more than quadrupled.
Normally you double shaders and RT cores and scale up the front end; that usually means quadrupling it.
With tripled shaders, it seems like they'd need the front end quintupled. Also, if the ROPs are only 128, that's a real disappointment too.
 