Monday, May 13th 2024

AMD RDNA 5 a "Clean Sheet" Graphics Architecture, RDNA 4 Merely Corrects a Bug Over RDNA 3

AMD's future RDNA 5 graphics architecture will bear a "clean sheet" design, and may not even carry the RDNA branding, says WJM47196, a source of AMD leaks on ChipHell. Two generations ahead of the current RDNA 3 architecture powering the Radeon RX 7000 series discrete GPUs, RDNA 5 could see AMD reimagine the GPU and its key components, much the same way RDNA did over the older "Vega" architecture, bringing a significant performance/watt jump that AMD then built upon with its successful RDNA 2 powered Radeon RX 6000 series.

Performance per watt is one of the most important metrics by which a generation of GPUs is assessed, and analysts believe that RDNA 3 missed the mark on generational performance/watt gains despite the switch to the advanced 5 nm EUV process from 7 nm DUV. AMD's decision to disaggregate the GPU, with some of its components built on the older 6 nm node, may also have impacted the performance/watt curve. The leaker also makes the sensational claim that "Navi 31" was originally supposed to feature 192 MB of Infinity Cache, which would have meant 32 MB of it per memory cache die (MCD). The company instead went with 16 MB per MCD, or just 96 MB per GPU, a figure that shrinks further as AMD segments the RX 7900 XT and RX 7900 GRE by disabling one or two MCDs.
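To put those cache figures in context, the arithmetic is straightforward. The sketch below is purely illustrative; the enabled-MCD counts per SKU are the commonly reported configurations for each card, assumed here rather than taken from the leak:

```python
# Back-of-the-envelope Infinity Cache arithmetic from the leak.
# Per-MCD figures come from the rumor; the enabled-MCD counts per SKU are
# the commonly reported configurations, used here purely for illustration.
PLANNED_MB_PER_MCD = 32   # what the leak says Navi 31 was designed for
SHIPPED_MB_PER_MCD = 16   # what Navi 31 actually shipped with

skus = {
    "RX 7900 XTX": 6,  # all six MCDs enabled
    "RX 7900 XT": 5,   # one MCD disabled
    "RX 7900 GRE": 4,  # two MCDs disabled
}

for name, mcds in skus.items():
    shipped = mcds * SHIPPED_MB_PER_MCD
    planned = mcds * PLANNED_MB_PER_MCD
    print(f"{name}: {shipped} MB shipped vs. {planned} MB under the rumored 32 MB/MCD design")
```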
The upcoming RDNA 4 architecture will correct some of the glaring component-level problems that caused the performance/watt curve to waver on RDNA 3; the top RDNA 4 part could end up with performance comparable to the current RX 7900 series, while sitting a segment lower and being a smaller GPU overall. In case you missed it, AMD will not make a big GPU to succeed "Navi 31" and "Navi 21" for the RDNA 4 generation, but will rather focus on the performance segment, offering more bang for the buck well under the $800 mark, so it can claw back some market share from NVIDIA in the performance, mid-range, and mainstream product segments. While it remains to be seen whether RDNA 5 gets AMD back into the enthusiast segment, it is expected to bring a significant gain in performance due to the re-architected design.

One rumored aspect of RDNA 4 that even this source agrees with is that AMD is working to significantly improve its ray tracing performance by redesigning the relevant hardware. RDNA 3 builds on the Ray Accelerator component AMD introduced with RDNA 2, with certain optimizations yielding a 50% generational improvement in ray testing and intersection performance; RDNA 4 could see AMD put more of the ray tracing workload through fixed-function accelerators, unburdening the shader engines. This significant improvement in ray tracing performance, architectural-level performance/watt improvements, and the switch to a newer foundry node such as 4 nm or 3 nm are how AMD ends up with a new generation on its hands.

AMD is expected to unveil RDNA 4 this year, and if we're lucky, we might see a teaser at Computex 2024 next month.
Sources: wjm47196 (ChipHell), VideoCardz

150 Comments on AMD RDNA 5 a "Clean Sheet" Graphics Architecture, RDNA 4 Merely Corrects a Bug Over RDNA 3

#126
Chrispy_
Firedrops: Are these cost savings in the room with us right now?
If you have more than 8 CPU cores in your PC, yes. What used to cost thousands and require an HEDT platform is now at consumer-tier pricing in the same socket as the budget/entry-level offerings.
#127
Random_User
oxrufiioxo: The one thing that confuses me: if RDNA 4 is just fixed RDNA3 with better RT, why not do a full lineup? Maybe chiplets were so bad for them that they don't want to make a large die, is all I can think of. Going back to the RDNA 1 gameplan just seems defeatist to me.
There might be a couple of reasons I can guess at:

1. If they do repeat the entire RDNA3 stack with fixes, the very top chips (theoretically assuming a 1:1 RDNA3:RDNA4 mapping) will still be less powerful than their NVIDIA counterparts, and would eventually end up as lower-class cards, e.g. the 7900 XT becoming an 8800 XT, which loses its premium value anyway. Unless, of course, AMD tries to sell it at premium prices anyway...
2. TSMC wafer allocation has become pricey, and the foundry time, at least on the newer nodes, is completely (?) reserved for Apple and NVIDIA's enterprise stuff. So while wafers are fewer, it's more reasonable to make more of the smaller, better-yielding dies from them.

This is just my take, which I've shared, and it may be wrong.
#128
Firedrops
Chrispy_: If you have more than 8 CPU cores in your PC, yes. What used to cost thousands and require an HEDT platform is now at consumer-tier pricing in the same socket as the budget/entry-level offerings.
Oh, TIL this thread was about CPUs, not GPUs. Well done, AMD's honor has been defended.
#129
Neo_Morpheus
SL2: I don't understand why some people expect a price war every now and then. Just look at history; it never happens these days, especially not at launch.
They are being hypocrites.

They want AMD to force Ngreedia into cutting prices just so they can buy cheaper Ngreedia GPUs.

They never had any intention of buying AMD GPUs.
#130
Chrispy_
Firedrops: Oh, TIL this thread was about CPUs, not GPUs. Well done, AMD's honor has been defended.
Nah, it's more that AMD next-gen GPU discussions obviously involve speculation about chiplets, since that was the big deal for RDNA3, and chiplet discussions inevitably lead to the pros and cons of existing chiplet designs, and AMD's experience with them. It's impossible not to bring up CPUs because they're 100% relevant to AMD's experience, sales success, and evolution of products made with chiplets.

We don't know for sure whether chiplets are going to feature in future RDNA4 and RDNA5 products, but they've definitely pushed the core count beyond what monolithic CPUs could ever hope to achieve, and that's exactly what graphics cards need. The 4090 is fast because it has so many shader cores (as well as the bandwidth and power delivery to feed them) but more cores basically means more performance, which is why AMD are likely chasing the dream of splitting compute/shader arrays across multiple dies.
#131
Lt. James Hetfield
Wasteland
  • "AMD will not make a big GPU that succeeds the "Navi 31" and "Navi 21" for the RDNA 4 generation ..."
  • "but rather focus on the performance segment, offering more bang for the buck well under the $800-mark"
Launch MSRP of Navi 31 and Navi 21: $1,000 and $900, respectively.

I sure hope "well under $800" means WAY, WAY under, because otherwise, there isn't much of a distinction here. If that $800 number came from AMD, it sounds like they're hedging bets something fierce.
If AMD tries charging $499+, then AMD might as well cancel this gen altogether!! I'm not buying an "RDNA 1"-style GPU at over $400 USD. Navi 48 GPUs need to start at $399, or just cancel, because I won't be buying anything that matches what I already have (aka an RX 7900 XT 20 GB) and probably has way less RAM too :(

Better RT performance alone is not worth it (I don't play any games that have decent RT/PT implementations; the RE4 remake and TLOU Part 1, and that's it, oh well, plus WD: Legion (garbage)). I do have GP Ultimate and can play many MS games like FS 2020 and Forza etc., but I don't really like those games (I mostly use GP for Xbox 360 games like Skate/Skate 3 on PC and the Ally).

So it has also been proven that RDNA3 had bugs, because RDNA4 is just a "bug fix" generation!!!! Yay! How about it, @AMD... you give me 30% of my money back for the RDNA3 cards I bought, or face a class action lawsuit like with the Bulldozer lies.

It proves AMD KNEW RDNA3 had bugs.
TumbleGeorge: How much should a graphics card cost if its GPU is mid-range? The prices you suggest seem absurd to me.
If N48 is basically just an "RX 5700 XT", then it should only be $399 as an 8700 XT? But also have an 8700 XTX with, say, more RAM (32 GB clamshell), I dunno! Or an 8700 XTX with 24 Gbps GDDR6 but still 16 GB, just way faster and way higher clocked!! It has to have 16 GB of GDDR6 at minimum!!!

I already have an AMD reference 7900 XT and it serves me fine... if it's cheap enough, yeah, I'd love to play with N44/N48, but if it costs too much I'd rather just get another reference 7900 XT or even a 7900 GRE (which should have been the actual 7800 XT 16 GB from the start).

I used to be a big-time AMD fan; I'm not anymore!! They can't make RDNA APUs on AM4 :( only crap Vega 8 :( and the AM5 desktop APUs also fall short (they're the best so far, but still). I want an 8-core/16-thread APU with 40-60+ RDNA 3.5 or RDNA 4 compute units, or 20-30 WGPs ("work group units" sounds better than "work group processors", lol).
#132
AusWolf
Lt. James Hetfield: If AMD tries charging $499+, then AMD might as well cancel this gen altogether!! I'm not buying an "RDNA 1"-style GPU at over $400 USD. Navi 48 GPUs need to start at $399, or just cancel, because I won't be buying anything that matches what I already have (aka an RX 7900 XT 20 GB) and probably has way less RAM too :(

Better RT performance alone is not worth it (I don't play any games that have decent RT/PT implementations; the RE4 remake and TLOU Part 1, and that's it, oh well, plus WD: Legion (garbage)). I do have GP Ultimate and can play many MS games like FS 2020 and Forza etc., but I don't really like those games (I mostly use GP for Xbox 360 games like Skate/Skate 3 on PC and the Ally).

So it has also been proven that RDNA3 had bugs, because RDNA4 is just a "bug fix" generation!!!! Yay! How about it, @AMD... you give me 30% of my money back for the RDNA3 cards I bought, or face a class action lawsuit like with the Bulldozer lies.

It proves AMD KNEW RDNA3 had bugs.

If N48 is basically just an "RX 5700 XT", then it should only be $399 as an 8700 XT? But also have an 8700 XTX with, say, more RAM (32 GB clamshell), I dunno! Or an 8700 XTX with 24 Gbps GDDR6 but still 16 GB, just way faster and way higher clocked!! It has to have 16 GB of GDDR6 at minimum!!!

I already have an AMD reference 7900 XT and it serves me fine... if it's cheap enough, yeah, I'd love to play with N44/N48, but if it costs too much I'd rather just get another reference 7900 XT or even a 7900 GRE (which should have been the actual 7800 XT 16 GB from the start).

I used to be a big-time AMD fan; I'm not anymore!! They can't make RDNA APUs on AM4 :( only crap Vega 8 :( and the AM5 desktop APUs also fall short (they're the best so far, but still). I want an 8-core/16-thread APU with 40-60+ RDNA 3.5 or RDNA 4 compute units, or 20-30 WGPs ("work group units" sounds better than "work group processors", lol).
Why would you even look at an RDNA 4 card if you already have a 7900 XT? Generation-to-generation upgrades have been extremely small recently, so it's totally unwarranted.
It's like saying "I'm not gonna buy a 7600 because it's not faster than my 6650 XT, hmph!" Well, guess what. You're not the target audience anyway. ;)

And I don't want to spoil all the secrets of the world for you, but...
1. Santa Claus isn't real, and
2. No company has ever cared whether you're a fan or not, so you'd better not be, and instead, observe events from a distance and make educated buying decisions.
#133
ARF
AusWolf: Why would you even look at an RDNA 4 card if you already have a 7900 XT? Generation-to-generation upgrades have been extremely small recently, so it's totally unwarranted.
It's like saying "I'm not gonna buy a 7600 because it's not faster than my 6650 XT, hmph!" Well, guess what. You're not the target audience anyway. ;)
The vast majority of sales are under $250, exactly where those 6650s and 7600s fall. If AMD again decides to skip any generational performance uplift, then sales will go downhill.
Speaking of the RX 7900 XT: it is a big, hot, and power-hungry card. If AMD succeeds in making a new card within 5 or 10% of its performance at 50-60% of the power draw, that would be a huge selling point and a real motivation for users to prefer the latter.
#134
AusWolf
ARF: The vast majority of sales are under $250, exactly where those 6650s and 7600s fall. If AMD again decides to skip any generational performance uplift, then sales will go downhill.
I know. But imagine someone upgrading from an RX 480. Then, the 7600 will give you slightly higher efficiency than a 6650 XT would for the same price. But if you have a 6650 XT, there's no point buying a 7600.
ARF: Speaking of the RX 7900 XT: it is a big, hot, and power-hungry card. If AMD succeeds in making a new card within 5 or 10% of its performance at 50-60% of the power draw, that would be a huge selling point and a real motivation for users to prefer the latter.
I agree. But again, not for people who already have a 7900 XT. Rather, if you have a 6800 XT or lower.
#135
kapone32
AusWolf: I know. But imagine someone upgrading from an RX 480. Then, the 7600 will give you slightly higher efficiency than a 6650 XT would for the same price. But if you have a 6650 XT, there's no point buying a 7600.


I agree. But again, not for people who already have a 7900 XT. Rather, if you have a 6800 XT or lower.
Even with the narrative, there are plenty of happy 7900 XT users. The card can run any game at 4K, even TWWH3 with mods, and provide a 100 FPS average. A card that is as fast while drawing less power would be nice, but I want a real performance increase in terms of feel. A 6900 XT is just not as fast as a 7900 XT, so if that is what it means, then great.

In actual fact, it is about monitor performance as well. The 7900 XT is perfect for 4K 144 Hz FreeSync panels. That was the apex, but that spec has since been surpassed by ultrawide monitors with high refresh rates and pixel densities, and by 4K 240 Hz panels. I expect the top-end card will come with the same 20 GB VRAM buffer, but with a much faster chip to push those panels. I wonder if that would change the feel of racing sims that already run at high refresh rates, like AMS2? I know that at launch the narrative on the 7900 XT was not good, but with driver updates and new games to play it is great, and a card that will have you explore your library.
#136
ARF
AusWolf: I know. But imagine someone upgrading from an RX 480. Then, the 7600 will give you slightly higher efficiency than a 6650 XT would for the same price. But if you have a 6650 XT, there's no point buying a 7600.
The RX 480, when it launched in 2016, was in a different performance segment than the RX 7600 is in today. I guess no one will move from an 8GB card to another 8GB card after 8 years of gaming with it.
Also, look at the Steam hardware survey. There is no RX 7600 out there, which only proves how terrible an offer it actually is.
People don't buy because it has become prohibitively expensive.
Users with an RX 480 should look for the RX 7800 XT, or the RX 7700 XT in the worst case.

store.steampowered.com/hwsurvey/videocard/
#137
AusWolf
ARF: The RX 480, when it launched in 2016, was in a different performance segment than the RX 7600 is in today. I guess no one will move from an 8GB card to another 8GB card after 8 years of gaming with it.
Also, look at the Steam hardware survey. There is no RX 7600 out there, which only proves how terrible an offer it actually is.
People don't buy because it has become prohibitively expensive.
Users with an RX 480 should look for the RX 7800 XT, or the RX 7700 XT in the worst case.

store.steampowered.com/hwsurvey/videocard/
When launched, the 4 GB 480 had an MSRP of $199, and the 8 GB version $229. The 7600 is a $269 card. Just because the model name ends with x80, it doesn't mean it was at the same level as current x800 cards. Actually, it was more of a competitor to the 1060 than anything.

The Steam hardware survey only shows the common mindset of "Nvidia=good, AMD=bad", so I wouldn't rely on it.
#138
ARF
AusWolf: When launched, the 4 GB 480 had an MSRP of $199, and the 8 GB version $229. The 7600 is a $269 card. Just because the model name ends with x80, it doesn't mean it was at the same level as current x800 cards. Actually, it was more of a competitor to the 1060 than anything.

The Steam hardware survey only shows the common mindset of "Nvidia=good, AMD=bad", so I wouldn't rely on it.
Nope. Look at the reviews:

www.techpowerup.com/review/amd-rx-480/24.html
www.techpowerup.com/review/amd-radeon-rx-7600/32.html

When launched, the RX 480 8GB was only 38% behind the then-top dog R9 Fury X, and the GTX 1080 was 84% faster.
Today, the RX 7600 is 100% behind the RX 7900 XTX, and the RTX 4090 is 136% faster.
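For reference, "X% behind" and "Y% faster" describe the same gap from different baselines; here is a minimal sketch of the conversion, using an illustrative relative-performance value rather than figures pulled from the linked reviews:

```python
# "X% behind" measures the gap against the faster card; "Y% faster" measures
# it against the slower card. 'rel' is the slower card's performance as a
# fraction of the faster card's. The value below is illustrative only.

def percent_behind(rel: float) -> float:
    return (1.0 - rel) * 100.0

def percent_faster(rel: float) -> float:
    return (1.0 / rel - 1.0) * 100.0

rel = 0.62  # e.g. a card delivering 62% of a flagship's performance
print(f"{percent_behind(rel):.0f}% behind the flagship")      # -> 38% behind
print(f"the flagship is {percent_faster(rel):.0f}% faster")   # -> ~61% faster
```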
#139
AusWolf
ARF: Nope. Look at the reviews:

www.techpowerup.com/review/amd-rx-480/24.html
www.techpowerup.com/review/amd-radeon-rx-7600/32.html

When launched, the RX 480 8GB was only 38% behind the then-top dog R9 Fury X, and the GTX 1080 was 84% faster.
Today, the RX 7600 is 100% behind the RX 7900 XTX, and the RTX 4090 is 136% faster.
The $599 1080 is surely not in the same league, relative to its era, as the $1599 4090, don't you think? Like I said, don't get confused by model names. Look at the prices instead.
#140
ymdhis
AusWolf: When launched, the 4 GB 480 had an MSRP of $199, and the 8 GB version $229. The 7600 is a $269 card. Just because the model name ends with x80, it doesn't mean it was at the same level as current x800 cards. Actually, it was more of a competitor to the 1060 than anything.

The Steam hardware survey only shows the common mindset of "Nvidia=good, AMD=bad", so I wouldn't rely on it.
Don't forget that with the 480, when you bought the 4GB $199 card, it was an 8GB card with a 4GB sticker over the part of the box that said 8GB. It was also by far the most popular AMD card for ages on the Steam hardware survey.
Hardware-wise it was always mid-range at best, something around the level of a GTX 1060. And as far as that goes, the RX 7600 is in the same spot as the RX 480 was, except that in performance per dollar it shows an extremely unimpressive increase considering that 7 or 8 years have since passed.

In fact, the RX 6500 XT and the RX 480 launched at the same price and had almost the same performance, so eight years got you zero performance per dollar increase (and the 6500 was even missing some features the 480 had, in the video decoder as I recall).
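A toy calculation of that point; the performance index values are placeholders standing in for "almost the same performance", not benchmark results, and $199 is assumed for both launch prices per the "same price" premise:

```python
# Toy perf-per-dollar comparison. The performance index values are placeholders
# standing in for "almost the same performance"; $199 is assumed for both
# launch prices per the post's "launched at the same price" premise.
cards = {
    "RX 480 (2016)":     {"msrp": 199, "perf_index": 100},
    "RX 6500 XT (2022)": {"msrp": 199, "perf_index": 100},
}

for name, c in cards.items():
    print(f"{name}: {c['perf_index'] / c['msrp']:.3f} perf per dollar")

# Same price and roughly the same performance -> the ratio doesn't move,
# which is the "zero performance per dollar increase" being described.
```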
#141
Chrispy_
ymdhis: In fact, the RX 6500 XT and the RX 480 launched at the same price and had almost the same performance, so eight years got you zero performance per dollar increase (and the 6500 was even missing some features the 480 had, in the video decoder as I recall).
That's true but don't look at pricing during the ETH mining boom and then COVID-19 shutdown of manufacturing in China.

Yes, the 6500XT was a turd that was also missing features, but it was never even supposed to be a complete desktop GPU and it likely hit the market as a desktop card simply because AMD had these chips destined to be secondary GPUs in laptops, but all the laptop manufacturers were either in lockdown, or had cancelled SKUs that these chips were destined for because of production and supply issues in China at the time. This is the same time when desktop GPUs were selling for 200% their MSRP due to scalping.

When the 6500XT was selling for $250, used RX 5700 with an MSRP of $379 were all over ebay for $1000, and selling so fast you had to be quick if you wanted one!
#142
LabRat 891
Chrispy_: That's true but don't look at pricing during the ETH mining boom and then COVID-19 shutdown of manufacturing in China.

Yes, the 6500XT was a turd that was also missing features, but it was never even supposed to be a complete desktop GPU and it likely hit the market as a desktop card simply because AMD had these chips destined to be secondary GPUs in laptops, but all the laptop manufacturers were either in lockdown, or had cancelled SKUs that these chips were destined for because of production and supply issues in China at the time. This is the same time when desktop GPUs were selling for 200% their MSRP due to scalping.

When the 6500XT was selling for $250, used RX 5700 with an MSRP of $379 were all over ebay for $1000, and selling so fast you had to be quick if you wanted one!
If it wasn't for the 'shortage' at the time, the pricing of the card and 'value' would've been insanely poor. @ the time, it seemed reasonable.

I have one; missing video codec aside, its 4 GB of GDDR6 performs on par with or a smidge better than an RX 580 8GB in most games.

IMO: Navi 24 was a missed opportunity... Gen4x4 mobile-derived GPU, and no one bothered to 'steal the limelight' with an M.2 GPU based on it. (Not even a Riser-Included Kit for SFF builds that the cards were clearly better-suited for)
#143
Chrispy_
LabRat 891: If it wasn't for the 'shortage' at the time, the pricing of the card and 'value' would've been insanely poor. @ the time, it seemed reasonable.
If it hadn't been for the shortage, the 6400 and 6500 series wouldn't have been released as desktop cards at all.

They're power-optimised companion dies designed to hang off a 4x link to the laptop IGP in Cezanne and Rembrandt mobile APUs, which is why they're missing a bunch of video engine features and outputs - those are already in the IGP and there was no point duplicating them. As a 6500M it's a remarkably decent offering at <50 W for slim laptops. Nvidia don't really have anything in that range; you were basically going to buy a chungus plastic 3050 laptop, or pay a price premium to get a 3050/3060 in a thin-and-light. In theory there's a GeForce MX 570 based on Ampere, but I've never seen one on sale; it might be something that hasn't made it to the UK, France, Holland, or Singapore, which are the only countries I buy from for work.
LabRat 891: IMO: Navi 24 was a missed opportunity... Gen4x4 mobile-derived GPU, and no one bothered to 'steal the limelight' with an M.2 GPU based on it. (Not even a Riser-Included Kit for SFF builds that the cards were clearly better-suited for)
Given the size of some Gen5 SSD heatsinks, I think a Navi24 M.2 GPU isn't entirely unreasonable.
#144
kapone32
LabRat 891: If it wasn't for the 'shortage' at the time, the pricing of the card and 'value' would've been insanely poor. @ the time, it seemed reasonable.

I have one; missing video codec aside, its 4 GB of GDDR6 performs on par with or a smidge better than an RX 580 8GB in most games.

IMO: Navi 24 was a missed opportunity... Gen4x4 mobile-derived GPU, and no one bothered to 'steal the limelight' with an M.2 GPU based on it. (Not even a Riser-Included Kit for SFF builds that the cards were clearly better-suited for)
The thing with the 6500 XT is the 2 W idle power draw and the 4K 120 Hz support for smart TVs. The 4 GB frame buffer was the only thing I did not like. I did see the future when I opened that card and saw how small the chip was, which astounded me when it OC'd to 3 GHz with one click.
#145
Launcestonian
Hanging out for this RDNA 4 series; they will be a nice upgrade over my RDNA 2 card. Top of the range is my choice when they arrive in retail channels. Until then, I OC the crap out of my RDNA 2 card! :D
#146
Chrispy_
Launcestonian: Hanging out for this RDNA 4 series; they will be a nice upgrade over my RDNA 2 card. Top of the range is my choice when they arrive in retail channels. Until then, I OC the crap out of my RDNA 2 card! :D
The top of the range this coming gen is rumoured to be midrange (ie, 7800XT-sized, performance probably 7900XT) which is fine by me. That's enough performance for 99% of the market and the people who are truly chasing the top end will probably just pay the $2000 for a 4090 anyway because money is unlikely to be a constraint, and patience to wait for Blackwell is also likely to be in short supply.
#147
AusWolf
Chrispy_: The top of the range this coming gen is rumoured to be midrange (ie, 7800XT-sized, performance probably 7900XT) which is fine by me. That's enough performance for 99% of the market and the people who are truly chasing the top end will probably just pay the $2000 for a 4090 anyway because money is unlikely to be a constraint, and patience to wait for Blackwell is also likely to be in short supply.
Agreed. Besides, we seem to be getting an updated RT engine, which AMD needs more than they need a halo card in my opinion. Add improved idle power, and I'll call it a win.
#148
Chrispy_
AusWolf: Agreed. Besides, we seem to be getting an updated RT engine, which AMD needs more than they need a halo card in my opinion. Add improved idle power, and I'll call it a win.
Yeah, they need better RT performance to keep up appearances, but realistically the RT experience in current titles with even a 4090 is underwhelming.

For me what wrecks RT image quality is the crawling shadows from sample noise and the multiple frames of delay between something appearing in frame and the lighting and shadows looking anything close to "correct" as you'd see in a static screenshot.

Case in point, the crown jewel of ray tracing in 2024: CP2077 2.1 with RR:
How it's supposed to look, after 10-20 frames of temporal filtering without moving the camera view at all, vs how it actually looks in motion that's representative of a real gameplay experience.

Yeah, RT lighting is a dogshit splotchy mess in motion. Don't believe the "best-case-scenario" still screenshots or video where the camera is only moving forwards. Camera tilts and pans, as well as strafing sideways are all going to introduce these really really ugly, unacceptably low-quality temporal artifacts. Even people with RTX hardware seem to be conned by it because whenever they take their hand off the mouse to press the screenshot key, they stabilise the image for half a second which means the captured screenshot looks fine, even though they know it looks wrong and dirty whenever they're actually playing.

You know what, I was so focussed on RT lighting when I made those screenshots that I completely missed how bad DLSS looks in motion, too. Clearly it also needs a few frames to smooth out the image, and in motion the ugliness of 720p really shines through (DLSS Performance is the only way my GPU can get >60 fps with path tracing).

Here's hoping RDNA5 really focuses on image quality in motion and not just temporal nonsense that's only good for carefully curated screenshots and video that looks alright after youtube/twitch compression has taken its pound of flesh...
#149
Launcestonian
Couldn't care less about RT; the thing that concerns me most is performance per watt with RDNA 4 at 3K (21:9 aspect). Yesterday I was playing Starfield and started a new quest from the 10 June patch update, and just talking to an NPC, my PC box was drinking over 450 W of power (Ultra settings, with shadows on High) for this mundane non-action scene! The game is still not well optimised for the PC platform, IMO. It could be the way my RDNA 2 card behaves when OC'd, but I have the power tab disabled in the performance options. All this with the latest Adrenalin drivers (24.5.1), so there's that.
#150
Chrispy_
Launcestonian: Couldn't care less about RT; the thing that concerns me most is performance per watt with RDNA 4 at 3K (21:9 aspect). Yesterday I was playing Starfield and started a new quest from the 10 June patch update, and just talking to an NPC, my PC box was drinking over 450 W of power (Ultra settings, with shadows on High) for this mundane non-action scene! The game is still not well optimised for the PC platform, IMO. It could be the way my RDNA 2 card behaves when OC'd, but I have the power tab disabled in the performance options. All this with the latest Adrenalin drivers (24.5.1), so there's that.
Ah yeah, I had a 6800XT previously and undervolted it down to around 200W to keep my gaming room a little cooler.
If you've OC'd yours I presume you've put the power limit slider all the way to the right and likely have the GPU using ~300W?