Thursday, August 26th 2021

NVIDIA Readying GeForce RTX 3090 SUPER, A Fully Unlocked GA102 with 400W Power?

NVIDIA is readying the GeForce RTX 3090 SUPER, the first "SUPER" series model of the RTX 30-series, following a recent round of "Ti" refreshes for its product stack. According to kopite7kimi and Greymon55, who each have a high strike rate with NVIDIA rumors, the RTX 3090 SUPER could finally max out the 8 nm "GA102" silicon on which nearly all high-end models of this NVIDIA GeForce generation are based. A fully unlocked GA102 comes with 10,752 CUDA cores, 336 Tensor cores, 84 RT cores, 336 TMUs, and 112 ROPs. The RTX 3090 stops short of maxing this out, with its 10,496 CUDA cores.

NVIDIA's strategy with the alleged RTX 3090 SUPER will be to not only max out the GA102 silicon, with its 10,752 CUDA cores, but also equip it with the fastest possible GDDR6X memory variant, which ticks at a 21 Gbps data rate, compared to 19.5 Gbps on the RTX 3090 and 19 Gbps on the RTX 3080 and RTX 3080 Ti. At this speed, across the chip's 384-bit wide memory bus, the RTX 3090 SUPER would enjoy just over 1 TB/s of memory bandwidth. Besides more CUDA cores, it's possible that the GPU Boost frequency could be increased. All this comes at a cost, though, with Greymon55 predicting a total graphics power (TGP) of at least 400 W, compared to the 350 W of the RTX 3090. A product launch is expected within 2021.
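The 1 TB/s figure follows directly from the numbers above: effective bandwidth is the per-pin data rate multiplied by the bus width in bytes. A quick sanity check (all figures from the article; the Python below is just arithmetic, not anything NVIDIA publishes):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Effective memory bandwidth in GB/s: per-pin data rate (Gbps)
    times the number of pins (bus width), divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# Rumored RTX 3090 SUPER: 21 Gbps GDDR6X on a 384-bit bus
print(memory_bandwidth_gbs(21.0, 384))   # 1008.0 GB/s, i.e. just over 1 TB/s

# Existing RTX 3090 for comparison: 19.5 Gbps on the same bus
print(memory_bandwidth_gbs(19.5, 384))   # 936.0 GB/s
```

The same formula reproduces the RTX 3080's 760 GB/s (19 Gbps on a 320-bit bus), which is a good check that the rumored spec is internally consistent.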
Sources: Greymon55 (Twitter), kopite7kimi (Twitter), WCCFTech

99 Comments on NVIDIA Readying GeForce RTX 3090 SUPER, A Fully Unlocked GA102 with 400W Power?

#76
joey121215
Dr. DroAre you implying you're the only one here that has a 3090 and a 5950X? If we're "complaining" it's because it's senseless from more perspectives than one. Being an enthusiast definitely isn't about just buying whatever overpriced slop just because; quite the contrary, we're lovers of technology here. And such a product, as stated by the leaker, is a regression against the existing RTX 3090 model in more ways than one, while not offering anything new for the vast sum of money that it would cost.
If you have a 3090 already, you should be excited about this. As long as you can snag one at MSRP (which, given the high price of the card, shouldn't be that hard, especially since the 3090 is already the easiest one to get), you're most likely going to make a profit selling your used 3090 and get the performance boost of the 3090 SUPER. It's more GA102 dies being used for gaming cards instead of professional cards; Nvidia is just allocating perfectly printed GA102 dies to the gaming community instead of putting them into professional cards. The dies being used for these cards would never be used in lower-end cards in the first place, and without Nvidia making money on the higher-end cards you'd not see as many advances from Nvidia as we do, and those lower-end cards everyone posting here seems to want would be either more expensive or less powerful. It also shows that Nvidia is confident in the number of units they're producing. No, I'm not saying I'm the only one here with a 3090 and 5950X, but please tell me how the several people saying this card alone is going to max out their power supply have a 3090 and 5950X, when the combined power draw of those two is over the draw of this one card? Just because a leaker has drawn their own conclusions about an unconfirmed product (since neither the price nor the actual wattage were leaked, just guessed at) doesn't mean you have to, but everyone on this site seems to have taken the leaker's GUESSES at their word. If you're not excited about this card, that's great: it's NOT FOR YOU. Don't spit in the faces of the people who are excited about it; instead, look at it as the good thing that it is. You know what's going to allow Nvidia to ramp up production and open new manufacturing capacity to produce more cards that people like yourself want? More money. You know what's going to allow the 40xx series to push the envelope even further? Pushing the envelope on the 30xx series as far as they can.
Nvidia prices its lower-range cards at fair prices. The enthusiast level is for people who aren't concerned about the price, only about pushing the envelope as far as they can, just like an exotic car isn't for people looking to get from point A to point B.

Do none of you realize that the reason Nvidia is releasing this card is that they have more dies than they can sell to the professional crowd? If they didn't, this card would not exist; Nvidia is always going to make a larger profit off the professional crowd than the gaming crowd. To me this just signals the beginning of the end of the card shortage; it'll just trickle down from here.
Posted on Reply
#77
MrMeth
What I want to know is, if they increase the price of this card further (currently 3090s sell for $2,800-$3,000 in Canada!!), will they make sure that it's properly cooled this time, or will they put the memory on the back with no passive cooling again and cross their fingers and hope for the best? I don't understand how Nvidia gets away with this crap continuously!!
Posted on Reply
#78
Vayra86
joey121215Lmao, y'all are great PC enthusiasts, complaining about enthusiast-level cards just because you can't afford them. I can't afford a Lamborghini either, but I don't complain about the price, I just appreciate the craftsmanship. If you don't want them to make 3090 SUPERs then y'all really don't want lower-grade cards either.

Whatever, I'm done with this forum. Got too many games to play on my 3090/5950X
Don't let the door hit your enthusiast ass as you go, eh?
Simplykt0420I bet they could make at least three 3060s or 3070s out of one 3090 wafer... it's stupid to make a high-end GPU 10 percent better than their already-expensive flagship... they would make more people happy making lower-level GPUs...
The 3090 and 3080 are the same silicon. Nvidia has no urge to waste any capacity, and they never have. Every chip that works goes somewhere.
joey121215If you have a 3090 already, you should be excited about this. As long as you can snag one at MSRP (which, given the high price of the card, shouldn't be that hard, especially since the 3090 is already the easiest one to get), you're most likely going to make a profit selling your used 3090 and get the performance boost of the 3090 SUPER. It's more GA102 dies being used for gaming cards instead of professional cards; Nvidia is just allocating perfectly printed GA102 dies to the gaming community instead of putting them into professional cards. The dies being used for these cards would never be used in lower-end cards in the first place, and without Nvidia making money on the higher-end cards you'd not see as many advances from Nvidia as we do, and those lower-end cards everyone posting here seems to want would be either more expensive or less powerful. It also shows that Nvidia is confident in the number of units they're producing. No, I'm not saying I'm the only one here with a 3090 and 5950X, but please tell me how the several people saying this card alone is going to max out their power supply have a 3090 and 5950X, when the combined power draw of those two is over the draw of this one card? Just because a leaker has drawn their own conclusions about an unconfirmed product (since neither the price nor the actual wattage were leaked, just guessed at) doesn't mean you have to, but everyone on this site seems to have taken the leaker's GUESSES at their word. If you're not excited about this card, that's great: it's NOT FOR YOU. Don't spit in the faces of the people who are excited about it; instead, look at it as the good thing that it is. You know what's going to allow Nvidia to ramp up production and open new manufacturing capacity to produce more cards that people like yourself want? More money. You know what's going to allow the 40xx series to push the envelope even further? Pushing the envelope on the 30xx series as far as they can.
Nvidia prices its lower-range cards at fair prices. The enthusiast level is for people who aren't concerned about the price, only about pushing the envelope as far as they can, just like an exotic car isn't for people looking to get from point A to point B.

Do none of you realize that the reason Nvidia is releasing this card is that they have more dies than they can sell to the professional crowd? If they didn't, this card would not exist; Nvidia is always going to make a larger profit off the professional crowd than the gaming crowd. To me this just signals the beginning of the end of the card shortage; it'll just trickle down from here.
Cool story, but it falls flat on the fact that Nvidia doesn't have or want its own fabs. All they can do is push Apple or another fatty out of the way to gain access to specific nodes. This is why Ampere got baked mostly on Samsung and only a tiny set of top-end enterprise stuff goes through TSMC: quite simply because Samsung's node isn't up to the task.

So, really, being all happy about a SUPER that is still inferior to what's possible even with the same architecture, so that Nvidia makes more money "to produce more"... it really doesn't get more backwards than that. You pay a premium for sub-top, cut-down chips on a sub-optimal node.

Enjoy it anyway. But a 3090 really isn't great in any way, shape or form if you ask me. The fact is AMD has the better chips now, node- and efficiency-wise.

I do think your last sentence is accurate. Shortages are slowly going away. But it has nothing to do with Nvidia sales or money.
Posted on Reply
#79
Solid State Soul ( SSS )
AretakIt's crazy. People used to mock the power consumption of cards like the R9 290X, which drew 240-250W on average. Now people happily shove cards pulling 100W+ more in their system. It's sad that power consumption has only gone up and up in the GPU world.
Nvidia has been focusing on RT-core ray tracing and DLSS tensor-core performance instead of improving the performance per watt of the CUDA cores. We have made zero performance-per-watt improvements since Pascal, and that's the reason why AMD cards are now slightly more efficient than Nvidia's.

Imo ray tracing was a mistake, and the performance benefits of DLSS could have easily been made up with actual upgrades in performance gen after gen instead of ray tracing. People are going to disagree with me, and that's fine; Nvidia spent hundreds of millions to convince layman consumers ray tracing was the future, when in reality it made things worse for benefits only an eagle eye would see.
Posted on Reply
#80
Chrispy_
Vayra86Nice Samsung node says hi again, and again; not even with a ten-foot pole will I touch this power-hog crap. Not paying the Nv premium just to get a worse chip, it's TSMC or bust.
The Samsung node seems fine for GA-104 and below.
Performance/Watt of the 3060/3060Ti/3070 is fantastic - way better than the previous-gen TSMC 12 nm and competitive with the RDNA2 stuff made on TSMC 7 nm.

I think the real problem is that Ampere's design simply isn't power-efficient at scale. Nvidia and Nvidia apologists are quick to blame Samsung, but the real blame here is on Nvidia for making a GPU with >28bn transistors that is >50% bigger than the previous gen. There's no way that was ever not going to be a power-guzzling monster. To put it into context, Nvidia's GA-102 has more transistors than the 6900XT, and you have to remember that almost a third of the transistors in Navi21 are just cache that uses very little power.
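The "almost a third" cache figure can be roughly sanity-checked. The numbers below (Navi 21's 128 MB Infinity Cache and its ~26.8 billion transistor count) come from public specs, not the post itself, and a classic 6-transistor SRAM cell is assumed while tag arrays and peripheral logic are ignored, so treat this as a back-of-the-envelope sketch:

```python
# Back-of-the-envelope: transistors spent on Navi 21's Infinity Cache.
# Assumes a 6T SRAM cell; ignores tags/peripheral logic (both assumptions).
cache_bytes = 128 * 1024 * 1024          # 128 MB Infinity Cache
transistors_per_bit = 6                  # classic 6T SRAM cell
cache_transistors = cache_bytes * 8 * transistors_per_bit

navi21_total = 26.8e9                    # Navi 21 transistor count (public figure)
print(f"{cache_transistors / 1e9:.1f}bn")         # 6.4bn transistors in cells alone
print(f"{cache_transistors / navi21_total:.0%}")  # ~24% of the transistor budget
```

Cells alone land at roughly a quarter of the die's budget; with tag arrays and peripheral logic on top, the "almost a third" estimate in the post is plausible.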
Posted on Reply
#81
Solid State Soul ( SSS )
At this rate, 4-slot coolers are going to become very common. Look at a high-end card from the mid-2000s and it's a single-slot PCB with a small fan on a small heatsink; now we have air-conditioner radiators on graphics cards. It's like we are evolving backwards.
Posted on Reply
#82
sukathukassa
NordicThis card is cool. I will enjoy looking at the benchmarks like I enjoy watching an F1 race. I will never be able to afford one of these, nor do I care; that isn't the point. This card is a powerhouse and is exciting from an enthusiast's perspective.
Yet somehow this reminds me of yesterday's GP
Posted on Reply
#83
Vayra86
Chrispy_The Samsung node seems fine for GA-104 and below.
Performance/Watt of the 3060/3060Ti/3070 is fantastic - way better than the previous-gen TSMC 12 nm and competitive with the RDNA2 stuff made on TSMC 7 nm.

I think the real problem is that Ampere's design simply isn't power-efficient at scale. Nvidia and Nvidia apologists are quick to blame Samsung, but the real blame here is on Nvidia for making a GPU with >28bn transistors that is >50% bigger than the previous gen. There's no way that was ever not going to be a power-guzzling monster. To put it into context, Nvidia's GA-102 has more transistors than the 6900XT, and you have to remember that almost a third of the transistors in Navi21 are just cache that uses very little power.
The price of RT - I mentioned this during the Turing releases too, and you're right. For smaller chips it's 'okay' - but let's not forget the fact that even 'okay' GA104 has a substantially higher TDP for each tier than previous-gen Turing, and that one already pushed it up too. The peak clocks are lower. So it's efficient... but still big, and it won't clock high, and I suppose only a cheaper Samsung node makes it marketable. If you think of it, the 3090 is really the worst possible chip they can bake on it, yet it's positioned at the top with ditto pricing. Cut down... :D Put it next to 1080ti efficiency/temps/TDP and it gets pretty laughable.

Is it Nvidia using too many transistors...? Remember, lower peak clocks... so that's going to be an early turning point when you size up. If a node can't support big chips well, isn't it just a node with more limitations than the others? It's not the smallest, either. It's not the more efficient one with smaller chips, either. I have no conclusive answer, tbh, but the competition offers perspective.

Ampere's design, had it been on 7 nm TSMC, would probably sit 20-40 W lower with no trouble, depending on what SKU you look at. But even so, those chips are still effin' huge... and I'm actually seeing Nvidia making an 'AMD' move here, going bigger and bigger, making their chips harder to market in the end. There are limits to size; Turing was already heavily inflated, and the trend seems to continue as Nvidia plans for full-blown RT weight going forward. It is really as predicted... I think AMD is being VERY wise postponing that expansion of GPU hardware. Look at what's happening around us - shortages don't accommodate mass sales of huge chips at all. Gaming for the happy few won't ever survive very long. I think Nvidia is going to be forced to either change or price itself out of the market.

And... meanwhile, we don't see massive aversion to less capable RT chips either. It's still 'fun to have' but never 'can't be missed'; only a few are on that train if I look around. It's still just a box of effects that can be fixed in other ways, and devs still do that, even primarily so. Consoles not being very fast at RT is another nail in the coffin - and those are certainly going to last longer than Ampere or Nvidia's next gen. I think the jury is still out on what the market consensus is going to be - but multi-functional chips that can do a little bit OR get used for raw performance are the only real way forward, long term. RT effects won't get easier. Another path I'm seeing is some hard limit (or baseline) of non-RT performance across the entire stack, with just the RT bits getting scaled up, if the tech has matured in gaming. But something's gonna give.
Solid State Soul ( SSS )Nvidia has been focusing on RT-core ray tracing and DLSS tensor-core performance instead of improving the performance per watt of the CUDA cores. We have made zero performance-per-watt improvements since Pascal, and that's the reason why AMD cards are now slightly more efficient than Nvidia's.

Imo ray tracing was a mistake, and the performance benefits of DLSS could have easily been made up with actual upgrades in performance gen after gen instead of ray tracing. People are going to disagree with me, and that's fine; Nvidia spent hundreds of millions to convince layman consumers ray tracing was the future, when in reality it made things worse for benefits only an eagle eye would see.
We should not be surprised if CUDA efficiency has reached its peak. I think they already moved there with Maxwell, and Pascal cemented it with high peak clocks through power-delivery tweaks and small feature cuts. Given a good node, what they CAN do is clock CUDA a good 300 MHz higher again. More cache has also been implemented since Turing (and Ampere added more, I believe). Gonna be interesting what they'll try next.
Posted on Reply
#84
MentalAcetylide
joey121215Lmao, y'all are great PC enthusiasts, complaining about enthusiast-level cards just because you can't afford them. I can't afford a Lamborghini either, but I don't complain about the price, I just appreciate the craftsmanship. If you don't want them to make 3090 SUPERs then y'all really don't want lower-grade cards either.

Whatever, I'm done with this forum. Got too many games to play on my 3090/5950X
ffs, the issue isn't price, but rather the idea that they're making an even more expensive card that has very little performance gain over its predecessor (the RTX 3090) and slapping "Super" onto it, when they could stick to focusing on producing more of the lower-tier cards. What does the Super have? A few more CUDA cores & Tensor cores? It's just stupid, imo. With their line of reasoning, they should just start producing 4000-series cards... It's like they're changing their card lineup more often than they change their underwear.
Posted on Reply
#85
Solid State Soul ( SSS )
Vayra86We should not be surprised if CUDA efficiency has reached its peak. I think they already moved there with Maxwell, and Pascal cemented it with high peak clocks through power-delivery tweaks and small feature cuts. Given a good node, what they CAN do is clock CUDA a good 300 MHz higher again. More cache has also been implemented since Turing (and Ampere added more, I believe). Gonna be interesting what they'll try next.
The 1070 and 1080 were 150 W and 180 W TDP cards; the 3070 and 3080 are 250 W and 350 W cards - a 67-94% power consumption increase. Flagship Nvidia cards like the 1080 Ti maxed out at 250 W; now flagship Ampere draws over 400 W :twitch::fear:. That's ridiculous. Where are the performance-per-watt improvements since 2016??? :(

oh right, Nvidia spent it on ray tracing and tensor cores, good investment :shadedshu:
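Taking the TDP figures quoted in the post at face value (note the 3070's official board power is actually 220 W, a bit below the 250 W cited, so treat these as rough), the generational jumps are easy to put numbers on:

```python
# TDP comparison using the figures quoted in the post above (watts)
tdps = {"1070": 150, "3070": 250, "1080": 180, "3080": 350}

increase_70 = (tdps["3070"] - tdps["1070"]) / tdps["1070"]
increase_80 = (tdps["3080"] - tdps["1080"]) / tdps["1080"]
print(f"x070 class: +{increase_70:.0%}")  # +67%
print(f"x080 class: +{increase_80:.0%}")  # +94%
```

Even with the official 220 W figure for the 3070, the increase is nearly 50% at the x070 tier and almost double at the x080 tier.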
Posted on Reply
#86
Vayra86
Solid State Soul ( SSS )The 1070 and 1080 were 150 W and 180 W TDP cards; the 3070 and 3080 are 250 W and 350 W cards - a 67-94% power consumption increase. Flagship Nvidia cards like the 1080 Ti maxed out at 250 W; now flagship Ampere draws over 400 W :twitch::fear:. That's ridiculous. Where are the performance-per-watt improvements since 2016??? :(

oh right, Nvidia spent it on ray tracing and tensor cores, good investment :shadedshu:
Precisely the reason I have not touched nor supported a single Nvidia GPU past Pascal. This isn't a path forward. It's backwards to implement inefficient render tech and brute-force it. It serves shareholders and a 'more, more, more' economy that we should know by now is utterly pointless and self-destructive.

The fact that a 5-year-old card still runs everything even beyond 1080p only underlines that fact. RT is a desperate attempt to create new demand out of a few lights and shadows. Not better gaming in any way.
Posted on Reply
#87
Solid State Soul ( SSS )
Vayra86Precisely the reason I have not touched nor supported a single Nvidia GPU past Pascal. This isn't a path forward. It's backwards to implement inefficient render tech and brute-force it. It serves shareholders and a 'more, more, more' economy that we should know by now is utterly pointless and self-destructive.

The fact that a 5-year-old card still runs everything even beyond 1080p only underlines that fact. RT is a desperate attempt to create new demand out of a few lights and shadows. Not better gaming in any way.
Exactly. Nvidia should have focused on large performance improvements to make high-refresh-rate gaming past 60 Hz a standard. Just when even mid-range GPUs can push above 60 frames in modern games, making high-refresh-rate gaming accessible to most, Nvidia decided, you know what's great? A reflection feature that halves your performance, for some reflections you need to watch YouTube comparisons to spot the differences... Ridiculous.

Telemetry data shows that most current-gen console users prefer to play in the 60 fps performance mode rather than the ray-tracing mode, because that's what makes the experience better.

Nothing feels better than firing up a new game and seeing how smooth it runs at 144 Hz; now that dream is a lot farther away unless you buy $1,400 GPUs every gen or two.
Posted on Reply
#88
BryanNitro
And I could mod it and make it a 340% improvement overall, based on power usage, hardware wear and tear, and a new set of clock and power-stepping levels that are up to 39% more stable.

Software is the reason why I can't make it more stable at higher levels. They put BIOS-flashing software in all the driver updates recently, so they flashed everyone's cards without them knowing - so they capped me...

That's the reason why I'm selling the fastest MINING BULLETPROOF DEATHMACHINE CARD on eBay right now. 300+ fps all day, 200+ at 4K (up to your CPU and power supply at that point), but she can mine at peaks of 125.5 MH/s at 116 W, 70 to 90 MH/s all-day average.

Just replace the RAM and VRM thermal pads after 2 months of hard use.
Posted on Reply
#89
rutra80
xorbe400W, this is out of control. Some of these prebuilt PCs can no longer be shipped to some states because they use too much estimated power per year.
Naah, you just need to register them as agricultural machinery used to heat up the barn.
Posted on Reply
#90
dukki98
nguyenLet's hope Nvidia releases a 3060 Super and 3070 Super along with the 3090 Super; there are rather large performance gaps between the 3060-3060Ti and the 3070Ti-3080
Yeah, a 3060 Super closer to the Ti than the non-Ti version, but with 12 GB of 16 Gbps memory, will be enough to match the PS5 at 1440p for years to come, especially when you consider the stronger RT cores and DLSS...
Posted on Reply
#91
AleXXX666
yotano211It would be .25% faster and 50% more expensive at retail price.
:) you're right
Posted on Reply
#92
purecain
How can they not have worked on efficiency? I can't believe the card is going to use 400 W. I won't be buying the card for that reason alone. Roll on the 4090... :slap:
Posted on Reply
#93
AleXXX666
dukki98Yeah, a 3060 Super closer to the Ti than the non-Ti version, but with 12 GB of 16 Gbps memory, will be enough to match the PS5 at 1440p for years to come, especially when you consider the stronger RT cores and DLSS...
It'd better be 256-bit; don't pray for those extra 4 gigs lol...
Posted on Reply
#95
rutra80
Liver shall be more desired and it grows back ☝️
Posted on Reply
#96
cst1992
Already donated that for the 3090.
Posted on Reply
#97
yotano211
rutra80Liver shall be more desired and it grows back ☝️
I'm no medical expert, but I don't think it grows back.
Posted on Reply
#98
rutra80
yotano211I'm no medical expert but I don't think it grows back
As little as 51% of the original liver mass can regenerate back to its full size.
Posted on Reply
#99
yotano211
rutra80As little as 51% of the original liver mass can regenerate back to its full size.
Holy $***, it does grow back.

Look mama, I learned something today.
Posted on Reply
Copyright © 2004-2021 www.techpowerup.com. All rights reserved.
All trademarks used are properties of their respective owners.