
NVIDIA GeForce RTX 5070 SUPER Possible Specs Emerge: 24 Gbit Memory Takes Centerstage

Recently bought a 9070 XT, so I'm wondering how it will compare in performance to the 5070 Super.
AMD could really grab market share if they released something like a 9060 XTX or a 9065 that's stronger than the 9060 XT, which I feel is too weak for 1440p.
 
I wonder if this means there will be 12GB versions of the 5060 & 5060 Ti? If they do this and keep the price in check, they'll fill a hole in the budget market and shut the naysayers up.

I hemmed and hawed for weeks about this. 5070 12GB or wait for 5070 S 18GB? 5060 Ti 16GB or wait for a possible 5060 TiS 12 GB? Will the 5060 change at all?

I ended up deciding that the 5060 and 5060 Ti will likely stay at 8 and 16 GB as they are now, because the VRAM options are already covered. No 5060 Ti Super, as there's no precedent. The 5070 12GB will likely also stay, with the 18GB 5070 Super going for $599 or maybe even $649 MSRP, which also covers the VRAM options well at that tier.

And I bought a 5070 because methinks the 5070 Super will command a premium over MSRP for quite a while with that VRAM allocation, and the drop back toward MSRP could be a year away. Though as I live in the US, I paid a premium over MSRP anyway; there's no avoiding that at this time.
 
5070 12GB or wait for 5070 S 18GB? 5060 Ti 16GB or wait for a possible 5060 TiS 12 GB? Will the 5060 change at all?
That's one of the reasons I'm waiting to upgrade from my 3080. The GPU market is a mess and the best way forward for some of us is unclear, especially since my current card mostly does what's needed of it, and current offerings don't deliver a compelling enough performance uptick to provide much motivation.

I ended up deciding that the 5060 and 5060 Ti will likely stay at 8 and 16 GB as they are now, because the VRAM options are already covered. No 5060 Ti Super, as there's no precedent. The 5070 12GB will likely also stay, with the 18GB 5070 Super going for $599 or maybe even $649 MSRP, which also covers the VRAM options well at that tier.
All possible. However, NVidia has in the past frequently diversified their product lineup to spread across more price brackets and reach a wider group of customers.

Though as I live in the US, I paid a premium over MSRP anyway but there's no avoiding that at this time.
Exactly.
 
I wonder if this means there will be 12GB versions of the 5060 & 5060ti? IF they do this and they keep the price in check, they'll fill hole in the budget market and shut the nay-sayers up.
But they are already selling the 8GB versions that could have been 12GB from the get-go. NVIDIA wants to sell those same people a 4GB "upgrade" later, because that will be the only thing they can afford once the 8GB cards they already bought can't run games anymore.

The RTX 5060 should've come in only two versions: RTX 5060 16GB and RTX 5060 Ti 16GB. Everything else was corporate BS. Also, the price premium on an extra 8GB of VRAM is just wild. It shouldn't cost that much.
 
But they are already selling the 8GB versions that could have been 12GB from the get-go. NVIDIA wants to sell those same people a 4GB "upgrade" later, because that will be the only thing they can afford once the 8GB cards they already bought can't run games anymore.

The RTX 5060 should've come in only two versions: RTX 5060 16GB and RTX 5060 Ti 16GB. Everything else was corporate BS. Also, the price premium on an extra 8GB of VRAM is just wild. It shouldn't cost that much.
You felt the need to fire off that, um, comment why exactly?
 
But they are already selling the 8GB versions that could have been 12GB from the get-go. NVIDIA wants to sell those same people a 4GB "upgrade" later, because that will be the only thing they can afford once the 8GB cards they already bought can't run games anymore.

The RTX 5060 should've come in only two versions: RTX 5060 16GB and RTX 5060 Ti 16GB. Everything else was corporate BS. Also, the price premium on an extra 8GB of VRAM is just wild. It shouldn't cost that much.
Exactly. Nvidia could have made 12GB the baseline amount of VRAM, but then they couldn't get consumers to upgrade again in two years, since 8GB already causes some games to crash or not even run.
The RTX 5060 with only 8GB of VRAM seems hilariously dumb, and so does the 5070 12GB at its price point, given that the 3060 had a 12GB version and the RX 9070 has 16GB for the same price.
If 12GB were the base amount, it would definitely shut up the "8GB will forever be enough VRAM" diehard Nvidia fans.
 
Exactly. Nvidia could have made 12GB the baseline amount of VRAM, but then they couldn't get consumers to upgrade again in two years, since 8GB already causes some games to crash or not even run.
The RTX 5060 with only 8GB of VRAM seems hilariously dumb, and so does the 5070 12GB at its price point, given that the 3060 had a 12GB version and the RX 9070 has 16GB for the same price.
If 12GB were the base amount, it would definitely shut up the "8GB will forever be enough VRAM" diehard Nvidia fans.
Can we please not start this lameass line of crap again?
 
You quoted me. I'm curious why you felt the need to brand-bash NVidia while quoting me.
Brand bash? What does that even mean? I just pointed out BS that NVIDIA does and gets flamed for. That's not bashing, that's criticizing. Big difference. You just happened to post the post containing it, so I quoted it. It has nothing to do with you and everything to do with NVIDIA. AMD is doing the same; they're just riding along on the NVIDIA train at this point with the market share they have, so I can't really blame them, though they could have used that to their advantage in marketing the RX 9060 XT, and they haven't.
 
Brand bash? What does that even mean? I just pointed out BS that NVIDIA does and gets flamed for. That's not bashing, that's criticizing. Big difference. You just happened to post the post containing it, so I quoted it. It has nothing to do with you and everything to do with NVIDIA. AMD is doing the same; they're just riding along on the NVIDIA train at this point with the market share they have, so I can't really blame them, though they could have used that to their advantage in marketing the RX 9060 XT, and they haven't.
How can you not blame them but blame Nvidia? :D

Is Jensen AMD's CEO? Is he making decisions for them? What in the world is that freaking logic?
 
18GB and a mild (4%) bump to the core, coupled with slightly higher clocks from the extra 25W might let it catch up with the vanilla 9070.

The 5070 was 10% slower than a 9070 and short on VRAM, which made it a pretty bad buy if you take both cards solely at MSRP. In the real world the 9070 sits above MSRP and the 5070 12GB has actually dropped below MSRP in several regions, so they're still pretty close on performance/$; you'd choose Nvidia for API support and AMD to avoid the inevitable 12GB time bomb.

Presumably Nvidia is trying to stop the 9070 from poaching 5070 sales, so they won't price it too much higher. But the idea of Nvidia giving away free cores and VRAM capacity is unlikely, so I doubt we'll see a $549 5070S and an official price cut on the 5070 like we had with the 40-series Super launch.
 
Brand bash? What does that even mean?
Are you kidding? Are you trying to emulate osmium or do you just not understand your own senseless twaddle?
That's not bashing, that's criticizing. Big difference.
No, it's brand bashing.
AMD doing the same, they are just along on the NVIDIA train at this point with the market share they have so I can't really blame them though they could use that to their advantage and marketing with RX 9060 XT but they haven't.
You didn't mention AMD until AFTER I called you out on your nonsense. Push off.

This article is about the potential 5070 Ti 24GB; it's NOT about the 8GB card debate.
 
Are you kidding? Are you trying to emulate osmium or do you just not understand your own senseless twaddle?

No, it's brand bashing.

You didn't mention AMD until AFTER I called you out on your nonsense. Push off.

This article is about the potential 5070 Ti 24GB; it's NOT about the 8GB card debate.

Oh shut up, I've talked shit about AMD too, several times. Stop your BS crusade. Not to mention my current Radeon is my first AMD card after a literal decade of owning GeForce cards. So just stuff it.

Also, you seem to have trouble differentiating criticism from bashing. Criticism is what I was doing. Bashing is just saying NVIDIA or AMD sucks with no further clarification why, and I clearly stated why. It's also widely agreed that what NVIDIA is doing is BS driven entirely by greed. They deserve all the criticism they can get, especially since they are "upgrading" cards that don't really need 24 frigging GB of memory while starving the 8GB ones that absolutely needed more.
 
Getting back on topic after that hilarious nonsense;

It would seem there is a solid amount of merit to these rumors about 24GB cards, but there's also seemingly a very good chance of 12GB 5060/5060 Ti and 18GB 5070/5070 Ti cards as well, which NVidia might have had planned all along for when the 3GB chips were widely available.

I can't find it anywhere, does anyone know if memory makers are doing 1.5GB chips?
 
I can't find it anywhere, does anyone know if memory makers are doing 1.5GB chips?
I've not heard anything about that - and with 2GB (16Gbit) chips likely being better on price/capacity and being pumped out at a ridiculous commodity rate, it's not worth the effort of changing production.

If there's demand for it, I'm sure it'll get made, but everyone's pushing for higher density, not lower.
 
I've not heard anything about that - and with 2GB (16Gbit) chips likely being better on price/capacity and being pumped out at a ridiculous commodity rate, it's not worth the effort of changing production.

If there's demand for it, I'm sure it'll get made, but everyone's pushing for higher density, not lower.
Anything is possible. Wasn't really expecting it but never hurts to ask.
 
JEDEC standards have 12Gbit (1.5GB) modules but even going back to GDDR6, no GPUs ever used them.

I suspect it's supply and demand. Customers buy 8Gbit and 16Gbit chips, so the manufacturers ramp up capacity at those sizes, and economies of scale kick in to make them cheaper, making the chance of anyone wanting the 12Gbit capacity even lower.
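The arithmetic behind all of these capacity options is simple: GDDR6/GDDR7 packages each present a 32-bit interface, so the chip count follows from the bus width, and card capacity is chips times per-chip density. A quick sketch (the 1.5GB case is the hypothetical one being discussed here):

```python
# VRAM capacity from bus width and per-chip density.
# GDDR6/GDDR7 packages use a 32-bit interface, so chips = bus width / 32.
def vram_gb(bus_width_bits: int, density_gb: float, clamshell: bool = False) -> float:
    chips = bus_width_bits // 32
    if clamshell:  # two chips share each 32-bit channel, as on the 4060 Ti / 5060 Ti 16GB
        chips *= 2
    return chips * density_gb

print(vram_gb(192, 2))                  # 12  -> RTX 5070 with 2GB (16Gbit) chips
print(vram_gb(192, 3))                  # 18  -> rumored 5070 Super with 3GB (24Gbit) chips
print(vram_gb(128, 2, clamshell=True))  # 16  -> 5060 Ti 16GB
print(vram_gb(192, 1.5))                # 9.0 -> hypothetical 1.5GB (12Gbit) chips
```

This is also why 12GB never fits a 128-bit or 256-bit card without mixed densities: the chip count is fixed by the bus, so only the per-chip density (or clamshell doubling) can move the total.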
 
JEDEC standards have 12Gbit (1.5GB) modules but even going back to GDDR6, no GPUs ever used them.

I suspect it's supply and demand. Customers buy 8Gbit and 16Gbit chips, so the manufacturers ramp up capacity at those sizes, and economies of scale kick in to make them cheaper, making the chance of anyone wanting the 12Gbit capacity even lower.
I was thinking something along the lines of a 9GB card with 6 total chips. While 6 chips may seem more expensive than 3, manufacturing works in ways that are not always intuitive. Six 1.5GB chips might work out to be less costly than three of the 3GB chips, which would make for a better budget offering.
 
I was thinking something along the lines of a 9GB card with 6 total chips. While 6 chips may seem more expensive than 3, manufacturing works in ways that are not always intuitive. Six 1.5GB chips might work out to be less costly than three of the 3GB chips, which would make for a better budget offering.
Maybe double-stacked, like the 4060Ti/5060Ti 16GB variants? There are additional PCB costs involved with running GDDR packages in dual-rank like that, so that might negate any potential cost savings from dropping from 12GB to 9GB VRAM.
 
9GB card with 6 total chips.
Making a 1.5 GB chip instead of a 2 GB one is, AT MOST, 20 percent cheaper. Realistically the difference is in the single digits (if it exists at all; with some techniques, it doesn't). 3 GB chips might be about 40 to 80 percent more expensive than 2 GB ones, but since they all cost NVIDIA roughly 3 dollars a chip, I don't see how shrinking to 1.5 GB is at all feasible. Makes absolutely zero sense. If they want to screw their customers, they've clearly shown they can do it far more easily and drastically as it is.

The memory itself is the cheapest thing in these chips. You can think of them as generic tap water in fancy bottles: you go from one gallon to two, great, it costs you 3 more cents in water, and 50 dollars more in the vanadium-titanium-fuckanium alloy used for the bottle.

Six chips of 0.75× the standard capacity will ALWAYS be more expensive than four chips of standard capacity.
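Plugging the post's own ratios into numbers makes the point concrete. Take a hypothetical per-chip price X for a 2GB chip, assume a 1.5GB chip costs nearly the same (single-digit percent less at best) and a 3GB chip costs 1.2X to 1.7X, as estimated above. All prices here are illustrative placeholders, not real figures:

```python
# Illustrative cost comparison for a 9GB configuration, using the ratios
# estimated above. X is a hypothetical price for a 2GB chip.
X = 3.0  # assumed dollars per 2GB chip (placeholder, not a real quote)

six_small = 6 * (0.95 * X)       # six 1.5GB chips, generously 5% cheaper each
three_big_best = 3 * (1.2 * X)   # three 3GB chips, best case
three_big_worst = 3 * (1.7 * X)  # three 3GB chips, worst case

print(round(six_small, 2), round(three_big_best, 2), round(three_big_worst, 2))
# Even in the 3GB chip's worst case, six smaller chips cost more for the same 9GB.
```

Under these assumed ratios, the six-chip route never wins on BOM cost alone; whatever case there is for it would have to come from elsewhere (yield, packaging, supply).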
 
Making a 1.5 GB chip instead of a 2 GB one is, AT MOST, 20 percent cheaper. Realistically the difference is in the single digits (if it exists at all; with some techniques, it doesn't). 3 GB chips might be about 40 to 80 percent more expensive than 2 GB ones, but since they all cost NVIDIA roughly 3 dollars a chip, I don't see how shrinking to 1.5 GB is at all feasible. Makes absolutely zero sense. If they want to screw their customers, they've clearly shown they can do it far more easily and drastically as it is.
Needless negativity aside, I've been in manufacturing, there are many variables that can affect the end consumer price. There is no way for any of us to definitively say it could or couldn't work. It's a guessing game until someone actually does it.
 
Needless negativity aside, I've been in manufacturing, there are many variables that can affect the end consumer price. There is no way for any of us to definitively say it could or couldn't work. It's a guessing game until someone actually does it.
In this particular instance, the complexity is almost identical. They'll make a 1.5 GB chip for X money and a 3 GB chip for 1.2X, or 1.7X in the worst case. What's the point for anyone to opt for the smaller option when you can get twice as much for not much more?

We're far more likely to see 4+ GB GDDR7 modules than 1.5 GB ones.
 