Sunday, August 30th 2020

Gainward GeForce RTX 3090 and RTX 3080 Ampere Pictured, Slides Confirm Specs

A mega dump of the Gainward GeForce RTX 3090 Phoenix GS and RTX 3080 Phoenix GS reveals not only the common board design of the two cards, but also the final specs of the RTX 3080 and RTX 3090. The RTX 3090 features 5,248 CUDA cores, and 24 GB of 19.5 Gbps GDDR6X memory across a 384-bit memory bus, which belts out 936 GB/s of memory bandwidth. The Gainward Phoenix GS runs the RTX 3090 at 1725 MHz boost frequency. The RTX 3080, on the other hand, features 4,352 CUDA cores, and 10 GB of 19 Gbps GDDR6X memory across a 320-bit memory bus, with 760 GB/s of memory bandwidth. Gainward is running the RTX 3080 at 1740 MHz on the Phoenix GS.
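As a back-of-the-envelope check, the quoted bandwidth figures follow directly from bus width and per-pin data rate (bandwidth = bus width in bytes × data rate). A minimal sketch:

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(384, 19.5))  # RTX 3090: 936.0 GB/s
print(bandwidth_gbs(320, 19.0))  # RTX 3080: 760.0 GB/s
```

Both results match the figures on Gainward's slides.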

What's interesting is the board power figures put out by Gainward. The RTX 3090 typical board power (at least for the Phoenix GS) is rated at 350 W, while that of the RTX 3080 is rated at 320 W. These explain why we're seeing custom-design RTX 3090 cards with either three 8-pin PCIe power connectors, or in the case of the Founders Edition card, the 12-pin connector that's capable of 600 W power delivery. Many of the custom-design RTX 3080 cards we've come across have two 8-pin PCIe inputs. The slides also list out "2nd generation RTX technology" and "3rd gen tensor cores." Gainward's board features a meaty triple-slot, triple-fan cooling solution with RGB LED illumination. We expect Palit's cards to look very similar to these (with different cooler shroud designs).
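The connector choices line up with the rated board power once you tally the standard PCIe power budgets (75 W from the slot, 150 W per 8-pin connector). A rough sketch, assuming spec-limit draw:

```python
def board_power_budget(num_8pin: int, slot_watts: int = 75) -> int:
    """Maximum spec-compliant board power: PCIe slot plus 150 W per 8-pin connector."""
    return slot_watts + 150 * num_8pin

print(board_power_budget(3))  # three 8-pin: 525 W budget, ample for a 350 W RTX 3090
print(board_power_budget(2))  # two 8-pin: 375 W budget, enough for a 320 W RTX 3080
```

Two 8-pin inputs would leave only 25 W of headroom over the RTX 3090's 350 W rating, which is presumably why custom RTX 3090 boards go with three.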

Update 06:09 UTC: More pics follow, courtesy harukaze5719.

Sources: VideoCardz, harukaze5719 (Twitter)

27 Comments on Gainward GeForce RTX 3090 and RTX 3080 Ampere Pictured, Slides Confirm Specs

#1
Xex360
Only one thing left for nVidia to announce, why we can't afford it.
Posted on Reply
#2
Cybrshrk
Xex360
Only one thing left for nVidia to announce, why we can't afford it.
Lol, I'll be getting the upgrade to the 3090 for only $150-250 out of pocket, thanks to a great resale price and smart timing on selling my 2080 Ti.
Posted on Reply
#3
blobster21
it will look great....as a poster in the toilet.
Posted on Reply
#4
cynic01
So 3080 won't support SLI after all.
Posted on Reply
#5
ratirt
It would seem it's just a larger Turing, packing more cores into each of the card tiers. The 3080 has exactly the same number of cores as the 2080 Ti and a higher TDP (320 W vs 260 W). Well, I'm not getting hyped till I see the benchmarks. We will see what this new release actually improves and gives to the consumer besides the price bump.
Posted on Reply
#6
Upgrayedd
Anyone have any idea why they don't have DP 2.0? You think a refresh could implement it?
Posted on Reply
#7
ZoneDymo
When I think Gainward, I always think of this sexy beast:



Reminded me of an old Dodge Charger. Not sure if the cooling was particularly good, but man does it look good.
Posted on Reply
#9
ratirt
ZoneDymo
When I think Gainward, I always think of this sexy beast:



Reminded me of an old Dodge Charger. Not sure if the cooling was particularly good, but man does it look good.
Yeah, it kinda does :) The 1969 Dodge Charger, I'd guess :)
Posted on Reply
#10
chstamos
What are they actually doing with 24 friggin gigabytes of RAM on that flagship? Even the boutique one percenter PCs it will be going into won't have that much for system RAM. I mean, the thing probably won't be fast enough to draw 8k (as if it mattered), so what's the point? What am I missing here?

Other than that, like most gamers, I'm most interested in flagships as an indication of how the subsequent mid-range/mid-high-end GPUs of the line will perform. Call me crazy, but a power-hungry, VRAM-munching monster like this does not exactly inspire confidence in a reasonably scaled-down version. Hope I'm wrong, of course.
Posted on Reply
#11
medi01
I don't see 3 8-pins on the pics above.
Posted on Reply
#12
Vayra86
cynic01
So 3080 won't support SLI after all.
It's dead. I know it's hard to believe, but it's been dead since Nvidia axed the SLI fingers on the midrange. Only a matter of time from that point onwards. With that, dev support got eradicated faster than you could blink. Some devs still do it, but if they do, there is no sound business case for it. After all, it's effort and support cost for virtually no gain. DX12 mGPU was dead when it was announced, too.
ZoneDymo
When I think Gainward, I always think of this sexy beast:



Reminded me of an old Dodge Charger. Not sure if the cooling was particularly good, but man does it look good.
It cooled okay. Not a winner, but good enough for sure. The biggest USP was that those fans were easily replaceable, too. You can open the shroud, click in new ones, and close it back up.
Posted on Reply
#13
Chomiq
I almost forgot they even existed.
Posted on Reply
#14
lemoncarbonate
You know, after the era of the Radeon 5970 and 6990, I've always wondered what the point is of putting "Long GPU support up to 470 mm" etc. in case marketing. We barely see GPUs longer than 30 cm (12" for imperial folks) these days, and modern ATX/mATX cases can fit literally any modern GPU on the market right now.
And here we are, back in the era when long GPU support was a thing.

That being said, I'm more interested in RTX 3070/3060 or RDNA2 rather than these overpriced things.
Posted on Reply
#15
Animalpak
No big Nvidia chunk of a heatsink to be seen.

All the graphics cards from other vendors look like the usual designs.

Even the power connectors: no 12-pin obligation.
Posted on Reply
#16
Am*
Card looks good but I don't have much confidence in this upcoming generation of Nvidia cards having power efficiency on their side -- at least from the rumored 400W power requirement for the GPU alone. Sounds like it will be a Fermi 2.0 -- unless these cards are going to come with big compute performance gains for once.

Definitely looks beefier than the awful looking Founders Edition design, which I think is going to alienate most people with both the proprietary connector as well as the triple slot requirement (definitely a no-go for ITX and most mATX builds I reckon).
Posted on Reply
#17
ComedicHistorian
Finally! I've been waiting weeks for Gainward(??) to release these pics
Posted on Reply
#18
Upgrayedd
Vayra86
It's dead. I know it's hard to believe, but it's been dead since Nvidia axed the SLI fingers on the midrange. Only a matter of time from that point onwards. With that, dev support got eradicated faster than you could blink. Some devs still do it, but if they do, there is no sound business case for it. After all, it's effort and support cost for virtually no gain. DX12 mGPU was dead when it was announced, too.



It cooled okay. Not a winner, but good enough for sure. The biggest USP was that those fans were easily replaceable, too. You can open the shroud, click in new ones, and close it back up.
Well, they've mostly dropped support for older generations. They still update and add profiles, but it's mostly for Turing. I'd like to see some scaling numbers from Maxwell through Turing: the old SLI vs HB SLI vs NVLink. I would like to know if NVLink provides better scaling and a smoother feel in titles that didn't scale well anyway.

I fully expect CP2077 to support it.
Posted on Reply
#19
Vayra86
Upgrayedd
Well, they've mostly dropped support for older generations. They still update and add profiles, but it's mostly for Turing. I'd like to see some scaling numbers from Maxwell through Turing: the old SLI vs HB SLI vs NVLink. I would like to know if NVLink provides better scaling and a smoother feel in titles that didn't scale well anyway.

I fully expect CP2077 to support it.
Just a matter of time. The few titles that still get good support don't make it a feasible thing to keep doing. The fact is, a fast-dwindling number of consumers has SLI-capable GPUs, and devs have limited budgets to make games. I wouldn't stick my head in the sand if I were you.
Posted on Reply
#20
B-Real
Cybrshrk
Lol, I'll be getting the upgrade to the 3090 for only $150-250 out of pocket, thanks to a great resale price and smart timing on selling my 2080 Ti.
Wow, so you're getting milked for $150-250? GG WP! :)
Posted on Reply
#21
Upgrayedd
You think an add-in RT-only card would sell if it meant you only dropped 5 frames instead of 50 with RT enabled? Like the PhysX cards of decades past.
Posted on Reply
#22
kiriakost
Am*
Card looks good but I don't have much confidence in this upcoming generation of Nvidia cards having power efficiency on their side
I always open my wallet when a card has an excellent cooling system along with the best power efficiency.
A gamer with talent and quick, correct decision-making can win the most trophies, compared to a player with a monster VGA who is afraid to fight or die in battle.
Posted on Reply
#23
bonehead123
Upgrayedd
Anyone have any idea why they don't have DP 2.0? You think a refresh could implement it?
Well, OBVIOUSLY....

That feature will either only come on the ultra-even-moar-expensive, super-duper-Ti +++++ cards, or better yet, the NEXT revision, therefore giving them yet anutha excuse to milk moar $$ out of your wallets hahahaha ..:laugh:..:cry:..:roll:
Posted on Reply
#24
Lycanwolfen
I'm still running two 1070 Tis in SLI; it runs everything at 4K perfectly for me.

With the new cards out, hopefully the price of 2070 Supers comes down to a good price point so I can SLI them.
Posted on Reply
#25
Anymal
I hope they didn't print all the boxes with 7 nm in the spec list.
Posted on Reply