
AMD Radeon RX 9060 XT to Roll Out 8 GB GDDR6 Edition, Despite Rumors

The only constant in time is change...
And yet, the more things change the more they stay the same.

Also, in a world where people use terms such as '2K' liberally to denote any resolution within 40% of 2000 horizontal pixels... good luck getting a true definition to last more than a year lol.
Exactly. By the numbers, 2K would be 1080p, because 1920 is closer to 2000 than 2560 is. 2560x1440 is really 2.5K, and yet every dingbat on the net calls it 2K... :rolleyes: Nitwits.
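Just for fun, here's the arithmetic behind that, assuming the loose convention of rounding the horizontal pixel count to the nearest half-thousand:

```python
# Round the horizontal resolution to the nearest 0.5K to see which "K" label
# each common resolution is actually closest to.
def nearest_k(width_px: int) -> float:
    return round(width_px / 1000 * 2) / 2

for name, width in [("1080p", 1920), ("1440p", 2560), ("2160p/UHD", 3840)]:
    print(f"{name}: {width} px wide -> closest to {nearest_k(width)}K")

# 1080p: 1920 px wide -> closest to 2.0K
# 1440p: 2560 px wide -> closest to 2.5K
# 2160p/UHD: 3840 px wide -> closest to 4.0K
```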
 
Then don't buy it (I don't mean to disagree with you, just saying).


Good. Maybe then, AMD (and Nvidia) will learn.

I have a question - which one do you think will be faster - RX 7600 XT 16 GB, or the "new" RX 9060 XT 8 GB?
And another - if the latter proves to be slower, how will AMD explain this generational regression?
 
I have a question - which one do you think will be faster - RX 7600 XT 16 GB, or the "new" RX 9060 XT 8 GB?
And another - if the latter proves to be slower, how will AMD explain this generational regression?
Should that not be compared to the 9060 XT 16 GB? The 8 GB would be like the 7600 8 GB variant.
 
I have a question - which one do you think will be faster - RX 7600 XT 16 GB, or the "new" RX 9060 XT 8 GB?
It'll depend on the situation.
In any case where the 8 GB of VRAM isn't exceeded, or the game engine isn't bothered too much by swapping data with system RAM, the 9060 XT will be faster. In any other case, the 7600 XT, of course.

And another - if the latter proves to be slower, how will AMD explain this generational regression?
I don't care. I'm not gonna buy one, and neither should anyone who's even slightly concerned by the above. Thankfully, the 16 GB variant exists.
 
If you don't understand what's being said, can't help you man. It's pretty clear.


Let me quote myself:

Because the card with more vram will cost more than the card with less

Sure man.
So, in conjunction with this statement: VRAM has nothing to do with the cards being expensive.

You are saying the cards will be expensive regardless of the VRAM amount. So why even have the lower-VRAM option then? Saying it will cost less isn't an argument, because if you can already afford an "expensive" GPU, why would you ever go with the card that has less VRAM but is also "expensive"? Unless, of course, you can't afford an "expensive" GPU and the lower VRAM brings it down to a "non-expensive" price, like with the B570 and B580 GPUs.
 
So, in conjunction with this statement: VRAM has nothing to do with the cards being expensive.

You are saying the cards will be expensive regardless of the VRAM amount. So why even have the lower-VRAM option then? Saying it will cost less isn't an argument, because if you can already afford an "expensive" GPU, why would you ever go with the card that has less VRAM but is also "expensive"? Unless, of course, you can't afford an "expensive" GPU and the lower VRAM brings it down to a "non-expensive" price, like with the B570 and B580 GPUs.
Well, at least you understood what I was saying.

It depends on the price; if it's only $30, then sure, you'll go for the higher-VRAM card. But still - I don't see a reason for the 8 GB variant not to exist at all.
 
A 16 GB version is also coming, but the question remains - why did AMD decide to give both Navi 48 and Navi 44 equal amounts of VRAM?
Couldn't they have designed Navi 48 with 24 GB, and Navi 44 with 16 GB and 12 GB versions?
Is it so difficult for them to follow the realities of the gaming market, instead of artificially plaguing their cards with insufficient VRAM?



 
A 16 GB version is also coming, but the question remains - why did AMD decide to give both Navi 48 and Navi 44 equal amounts of VRAM?
Couldn't they have designed Navi 48 with 24 GB, and Navi 44 with 16 GB and 12 GB versions?
Is it so difficult for them to follow the realities of the gaming market, instead of artificially plaguing their cards with insufficient VRAM?


It's because of the memory controller. To have 24 GB VRAM, you need either a 192-bit bus, which is not enough, or a 384-bit one, which makes the GPU bigger and more expensive. Neither of these is a good solution in the current market. 16 GB on 256 bits is the price-to-performance sweet spot.

Similarly on Navi 44. To have a 12 GB variant, you need to cut down the 128-bit bus to 96 bits. You don't want that.
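As a rough sketch of why those are the only realistic options with standard GDDR6 (assuming 2 GB / 16 Gbit chips, one per 32-bit channel, with clamshell mode doubling the chips per channel):

```python
# Rough GDDR6 capacity math: one 2 GB (16 Gbit) chip per 32-bit channel,
# with clamshell mode putting two chips on each channel.
CHIP_GB = 2

def vram_gb(bus_width_bits: int, clamshell: bool = False) -> int:
    channels = bus_width_bits // 32
    return channels * (2 if clamshell else 1) * CHIP_GB

for bus in (128, 192, 256, 384):
    print(f"{bus}-bit: {vram_gb(bus)} GB, or {vram_gb(bus, clamshell=True)} GB in clamshell")

# 128-bit: 8 GB, or 16 GB in clamshell  (the two 9060 XT variants)
# 192-bit: 12 GB, or 24 GB in clamshell
# 256-bit: 16 GB, or 32 GB in clamshell  (9070 / 9070 XT)
# 384-bit: 24 GB, or 48 GB in clamshell
```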
 
It's because of the memory controller. To have 24 GB VRAM, you need either a 192-bit bus, which is not enough, or a 384-bit one, which makes the GPU bigger and more expensive.

A wider memory bus doesn't mean a larger die. You have enough space on the 360 mm² Navi 48 to widen the bus beyond the current 256-bit.
Also, a 192-bit bus, if cleverly paired with a larger Infinity Cache - 128 MB, or better, 256 MB - could have been a solution.

Similarly on Navi 44. To have a 12 GB variant, you need to cut down the 128-bit bus to 96 bits. You don't want that.

Or to 192-bit, which you definitely want.
 
A wider memory bus doesn't mean a larger die. You have enough space on the 360 mm² Navi 48 to widen the bus beyond the current 256-bit.
Where?
[Attached: Navi 48 die shot]


Also, a 192-bit bus, if cleverly paired with a larger Infinity Cache - 128 MB, or better, 256 MB - could have been a solution.
I disagree. Infinity Cache doesn't seem to give enough of an advantage to substitute for a wider memory bus.
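To frame that disagreement, one can model effective bandwidth as a simple hit-rate-weighted average of cache and VRAM bandwidth (the hit rates and bandwidth figures below are made up for illustration, not AMD's numbers):

```python
# Illustrative only: effective bandwidth of a cache + VRAM setup, modelled as
# a hit-rate-weighted average. Hit rates and cache bandwidth are hypothetical.
def effective_bw(hit_rate: float, cache_bw: float, vram_bw: float) -> float:
    return hit_rate * cache_bw + (1.0 - hit_rate) * vram_bw

# Hypothetical comparison: 192-bit GDDR6 @ 20 Gbps = 480 GB/s behind a big cache,
# vs. 256-bit @ 20 Gbps = 640 GB/s behind a smaller one.
print(effective_bw(hit_rate=0.55, cache_bw=2000.0, vram_bw=480.0))  # ~1316 GB/s
print(effective_bw(hit_rate=0.40, cache_bw=2000.0, vram_bw=640.0))  # ~1184 GB/s
# Whether the bigger cache actually wins depends on how the hit rate holds up
# at higher resolutions, which is exactly the point of contention here.
```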

Or to 192-bit, which you definitely want.
Again, larger die. You don't want that in the current overpriced state of the market.

(I mean I'd want the chip, I just wouldn't want the price of it)
 

On the sides.
If you look at the die, you will see that the sixteen 16-bit memory controllers are placed on two of the four sides of the die.
You have plenty of space to include more 16-bit controllers.

[Attached: Navi 48 block diagram]
 
On the sides.
If you look at the die, you will see that the sixteen 16-bit memory controllers are placed on two of the four sides of the die.
You have plenty of space to include more 16-bit controllers.

[Attached: Navi 48 block diagram]
That's just a diagram; it doesn't scale with the die. Please mark it on the die shot that I attached. You'll see that there is no empty space on it.

Edit: Here's a better one.
[Attached: Navi 48 die shot]
 
It's because of the memory controller. To have 24 GB VRAM, you need either a 192-bit bus, which is not enough, or a 384-bit one, which makes the GPU bigger and more expensive. Neither of these is a good solution in the current market. 16 GB on 256 bits is the price-to-performance sweet spot.

Similarly on Navi 44. To have a 12 GB variant, you need to cut down the 128-bit bus to 96 bits. You don't want that.
Or 1.5 GB / 3 GB memory chips...
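For reference, here's the same back-of-the-envelope capacity math with non-power-of-two chip densities (3 GB / 24 Gbit modules exist for GDDR7; for GDDR6 these densities are more hypothetical):

```python
# Same capacity math as before, but with non-power-of-two chip densities
# (1.5 GB = 12 Gbit, 3 GB = 24 Gbit), assuming such chips were available.
def vram_gb(bus_width_bits: int, chip_gb: float) -> float:
    return (bus_width_bits // 32) * chip_gb

for bus in (128, 192, 256):
    print(f"{bus}-bit: {vram_gb(bus, 1.5)} GB with 1.5 GB chips, "
          f"{vram_gb(bus, 3)} GB with 3 GB chips")

# 128-bit: 6.0 GB with 1.5 GB chips, 12 GB with 3 GB chips
# 192-bit: 9.0 GB with 1.5 GB chips, 18 GB with 3 GB chips
# 256-bit: 12.0 GB with 1.5 GB chips, 24 GB with 3 GB chips
```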
 
AMD was obviously expecting this gen to be a complete wipe-out thanks to their MCM designs getting scrapped, so N44 was always supposed to be a bottom-bin die, and N48 was cobbled together last minute (hence N48 instead of N43 or N42, which is roughly where it belongs in the stack, because the numbering goes up the later the die is designed) to be a lean, mean, clocked-to-the-sky machine.

Trying to extract logic from AMD's current product stack is going to be an exercise in frustration, because the product stack itself is a last-minute gamble to make sure they didn't miss this gen entirely. Turns out Nvidia is too busy making a quintillion dollars on AI to care about consumer Blackwell, and now we have to act like RDNA 4 was anything other than a desperate gamble by AMD not to get stuck with an xx50-series part as their only new offering on the market.

I'm sure AMD wishes they had done a lot of things differently for RDNA 4 knowing what they know now; they could have come in absurdly strong with a top-to-bottom monolithic stack, but they had to go to war with the cards they have, not the ones they wish they'd designed.
 
AMD was obviously expecting this gen to be a complete wipe-out thanks to their MCM designs getting scrapped, so N44 was always supposed to be a bottom-bin die, and N48 was cobbled together last minute (hence N48 instead of N43 or N42, which is roughly where it belongs in the stack, because the numbering goes up the later the die is designed) to be a lean, mean, clocked-to-the-sky machine.

Trying to extract logic from AMD's current product stack is going to be an exercise in frustration, because the product stack itself is a last-minute gamble to make sure they didn't miss this gen entirely. Turns out Nvidia is too busy making a quintillion dollars on AI to care about consumer Blackwell, and now we have to act like RDNA 4 was anything other than a desperate gamble by AMD not to get stuck with an xx50-series part as their only new offering on the market.

I'm sure AMD wishes they had done a lot of things differently for RDNA 4 knowing what they know now; they could have come in absurdly strong with a top-to-bottom monolithic stack, but they had to go to war with the cards they have, not the ones they wish they'd designed.
They needed something to release their RT improvements and FSR 4 on. Waiting until UDNA is ready wouldn't have been a very good option.
 
Well, at least you understood what I was saying.

It depends on the price; if it's only $30, then sure, you'll go for the higher-VRAM card. But still - I don't see a reason for the 8 GB variant not to exist at all.
I think it might have to do with the paradox of choice.

The concern some posters raise here, I speculate, is more about consumer psychology: having the save-some-money-for-less-VRAM option may attract financially restricted and compulsive buyers, who in turn may regret not "saving just a little more and buying that 16 GB card" at some (maybe quite foreseeable) point in the future.
And when rendering a VRAM-bound game, I think we can all see how frustrating the thought of "if only I had spent a little more back then" can be.

I agree with you - it's really good to have the freedom of choice. If I have a firm grip on my use cases - clearly knowing what games I'm going to play with this GPU - there's gotta be scenarios where the 9060 XT 8 GB is going to be the better fit. But most customers don't have a firm grip on these technicalities - they might not even know what kind of games they're going to, or are going to want to, play.

It kinda feels like the iPhone 7 storage problem: 32 or 128 gigs. People didn't have trouble with the 32 GB model existing; they had trouble with the 64 GB model not existing. They don't like the feeling of running out of storage on a perfectly functioning phone, or the feeling of "if only I had spent a little more..."
 
I have a question - which one do you think will be faster - RX 7600 XT 16 GB, or the "new" RX 9060 XT 8 GB?
And another - if the latter proves to be slower, how will AMD explain this generational regression?
Why would the 9060 XT be slower than the 7600 XT? It's half a 9070 XT, so it should be just a tiny bit slower than the 5060 Ti, which makes it faster than the 7600 XT.
 
Why would the 9060 XT be slower than the 7600 XT? It's half a 9070 XT, so it should be just a tiny bit slower than the 5060 Ti, which makes it faster than the 7600 XT.
He's likely alluding to the 8 GB buffer on the 9060 XT 8 GB, compared to 16 GB on the 7600 XT. It's entirely possible that in a number of games the 8 GB card will lose to the older (and on paper slower) 16 GB card due to VRAM exhaustion. That being said, the 7600 also has an 8 GB version, and the 9060 XT also has a 16 GB version.

Albeit the 7600 8 GB lacks the XT naming that the 7600 XT 16 GB has.
AMD could have avoided much of the bad publicity this generation by omitting the XT from the 8 GB 9060 XT's name and just calling it the 9060 8 GB, while the other would remain the 9060 XT 16 GB.

Honestly, I'm a bit baffled as to why they even gave the 8 GB card the XT moniker, as the 9070 series already has both XT and non-XT versions (which have the same amount of VRAM but differ in core count).
 
He's likely alluding to the 8 GB buffer on the 9060 XT 8 GB, compared to 16 GB on the 7600 XT. It's entirely possible that in a number of games the 8 GB card will lose to the older (and on paper slower) 16 GB card due to VRAM exhaustion. That being said, the 7600 also has an 8 GB version, and the 9060 XT also has a 16 GB version.

Albeit the 7600 8 GB lacks the XT naming that the 7600 XT 16 GB has.
AMD could have avoided much of the bad publicity this generation by omitting the XT from the 8 GB 9060 XT's name and just calling it the 9060 8 GB, while the other would remain the 9060 XT 16 GB.

Honestly, I'm a bit baffled as to why they even gave the 8 GB card the XT moniker, as the 9070 series already has both XT and non-XT versions (which have the same amount of VRAM but differ in core count).

Probably because there's a 9060 non-XT still in the pipe, for after they've harvested enough defective dies.

The xx50 branding of cards seems to be toxic at this point.
 
Probably because there's a 9060 non-XT still in the pipe, for after they've harvested enough defective dies.

The xx50 branding of cards seems to be toxic at this point.
9070 XT 16GB
9070 16GB
9070 GRE 12GB
9060 XT 16GB
9060 GRE 12GB
9060 8GB

This would be somewhat logical, but unfortunately that's not the world we live in...
 
He's likely alluding to the 8 GB buffer on the 9060 XT 8 GB, compared to 16 GB on the 7600 XT. It's entirely possible that in a number of games the 8 GB card will lose to the older (and on paper slower) 16 GB card due to VRAM exhaustion. That being said, the 7600 also has an 8 GB version, and the 9060 XT also has a 16 GB version.

Albeit the 7600 8 GB lacks the XT naming that the 7600 XT 16 GB has.
AMD could have avoided much of the bad publicity this generation by omitting the XT from the 8 GB 9060 XT's name and just calling it the 9060 8 GB, while the other would remain the 9060 XT 16 GB.

Honestly, I'm a bit baffled as to why they even gave the 8 GB card the XT moniker, as the 9070 series already has both XT and non-XT versions (which have the same amount of VRAM but differ in core count).
Ah okay, yes - if you take into account the games where 8 GB is an issue at ultra settings, sure, they might end up neck and neck or something.
 
Ah okay, yes - if you take into account the games where 8 GB is an issue at ultra settings, sure, they might end up neck and neck or something.
Some games already choke at 1080p medium settings.
 