Friday, August 28th 2020

NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

Just ahead of the September launch, specifications of NVIDIA's upcoming RTX Ampere lineup have been leaked by industry sources over at VideoCardz. According to the website, three alleged GeForce SKUs are being launched in September - RTX 3090, RTX 3080, and RTX 3070. The new lineup features major improvements: 2nd generation ray-tracing cores and 3rd generation tensor cores made for AI and ML. When it comes to connectivity and I/O, the new cards use the PCIe 4.0 interface and have support for the latest display outputs like HDMI 2.1 and DisplayPort 1.4a.

The GeForce RTX 3090 comes with 24 GB of GDDR6X memory running on a 384-bit bus at 19.5 Gbps. This gives a memory bandwidth of 936 GB/s. The card features the GA102-300 GPU with 5,248 CUDA cores running at 1695 MHz, and is rated for 350 W TGP (board power). While the Founders Edition cards will use NVIDIA's new 12-pin power connector, non-Founders Edition cards, from board partners like ASUS, MSI and Gigabyte, will be powered by two 8-pin connectors. Next up are the specs for the GeForce RTX 3080, a GA102-200 based card that has 4,352 CUDA cores running at 1710 MHz, paired with 10 GB of GDDR6X memory running at 19 Gbps. The memory is connected to a 320-bit bus that achieves 760 GB/s of bandwidth. The board is rated at 320 W and the card is designed to be powered by dual 8-pin connectors. And finally, there is the GeForce RTX 3070, which is built around the GA104-300 GPU with an as-yet-unknown number of CUDA cores. We only know that it uses the older non-X GDDR6 memory, which runs at 16 Gbps on a 256-bit bus. The GPUs are supposedly manufactured on TSMC's 7 nm process, possibly the EUV variant.
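As a quick sanity check, the bandwidth figures follow directly from bus width and per-pin data rate. The short Python sketch below reproduces them from the leaked numbers (the RTX 3070 value is derived from its leaked bus width and memory speed, not stated in the leak itself):

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Peak bandwidth (GB/s) = bus width in bytes x per-pin data rate (Gbps)
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(384, 19.5))  # RTX 3090: 936.0 GB/s
print(bandwidth_gb_s(320, 19.0))  # RTX 3080: 760.0 GB/s
print(bandwidth_gb_s(256, 16.0))  # RTX 3070: 512.0 GB/s (derived, not quoted in the leak)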
Source: VideoCardz

216 Comments on NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

#126
dicktracy
7nm confirmed! Some fanboys are going to have to make up new lies for Ampere.
Posted on Reply
#127
Jinxed
rtwjunkie
My personal opinion is Nvidia is still experimenting. Waiting another gen might be a good idea for you. I may either do that as well or pick up a used 2080 Ti.
Given the hundreds of hours of developer-oriented videos, workshops and presentations about techniques to implement raytracing, even lending developers to companies like CD Projekt, DICE, Microsoft, 4A and Epic to help them implement raytracing in their games and engines, and developing new AI denoising and supersampling technologies to support raytracing, along with what seems like a massive investment in raytracing silicon real estate on their GPUs, it sure does not seem like they are experimenting. It seems like they are fully committed. But of course you are entitled to your opinion.
Posted on Reply
#128
Vayra86
Jinxed
Given the hundreds of hours of developer-oriented videos, workshops and presentations about techniques to implement raytracing, even lending developers to companies like CD Projekt, DICE, Microsoft, 4A and Epic to help them implement raytracing in their games and engines, and developing new AI denoising and supersampling technologies, along with what seems like a massive investment in raytracing silicon real estate on their GPUs, it sure does not seem like they are experimenting. It seems like they are fully committed. But of course you are entitled to your opinion.
Commitment is nice, results count :)

Turing didn't exactly generate momentum, did it?
Posted on Reply
#129
BoboOOZ
dicktracy
7nm confirmed! Some fanboys are going to have to make up new lies for Ampere.
7 nm or 14 nm, who cares? What matters is performance, price, and efficiency.
Anyway, don't get all hyped up. To quote VideoCardz:
"The data that we saw clearly mention the 7nm fabrication node. At this time we are unable to confirm if this is indeed true."
Posted on Reply
#130
RedelZaVedno
rtwjunkie
Let me help you out. The only reason you think you “need” more than 10GB VRAM is because of lazy devs dumping textures upon textures there just because it is there.

As to why NVIDIA is putting 24GB VRAM on the 3090? Marketing. “I’m King of the Hill.” Whatever you want to call it.
That is just not true. When a game streams textures in real time (MS Flight Simulator 2020) or has procedurally generated worlds (SpaceEngine), you easily surpass 10 GB of VRAM usage. I hit 12.7 GB of VRAM usage when flying over Dubai at 4K/ultra, and filled all 16 GB of the Radeon VII's VRAM in SpaceEngine. xx80 GPUs are HIGH END GPUs meant to be used at 4K resolution, so 10 GB of VRAM just won't cut it. Hell, the Xbox Series X/PS5 have 16 GB of GDDR6 on board (on a 320-bit bus in the Xbox's case). Sure, it's not totally comparable because they lack DDR4, but they still have ultra-fast SSDs to compensate to a degree.
Posted on Reply
#131
Jinxed
Vayra86
Commitment is nice, results count :)
We have to wait four more days for results (and then some for the reviews). But it's fun times for hardware fans either way.
Posted on Reply
#132
rtwjunkie
PC Gaming Enthusiast
Jinxed
Given the hundreds of hours of developer-oriented videos, workshops and presentations about techniques to implement raytracing, even lending developers to companies like CD Projekt, DICE, Microsoft, 4A and Epic to help them implement raytracing in their games and engines, and developing new AI denoising and supersampling technologies, along with what seems like a massive investment in raytracing silicon real estate on their GPUs, it sure does not seem like they are experimenting. It seems like they are fully committed. But of course you are entitled to your opinion.
The thing is, they are still only committed to getting it right. Content is still sparse. Name a few devs and a few upcoming titles on top of the handful of minimally RT'd titles so far, and we are still in the development phase, at buyer expense.

Fact is, every one of the released games still looks fantastic without RTRT. None of these features they are making available on super-priced GPUs makes playing the game better, nor are they necessary. Game immersion (at least in SP games) comes from gameplay and a well-written story. Graphics only enhance it.
RedelZaVedno
xx80 GPUs are HIGH END GPUs meant to be used at 4K resolution
They never have been actual 4K GPUs, no matter the generation. Only the 80 Ti's have been, and even then only in some currently released games; a year later, when newer games come out, they are left behind.
Posted on Reply
#133
RedelZaVedno
rtwjunkie
They never have been actual 4K GPUs, no matter the generation. Only the 80 Ti's have been, and even then only in some currently released games; a year later, when newer games come out, they are left behind.
Anything costing 800 bucks should be 4K capable, IMHO. The 1080 Ti/Radeon VII/2080(S) are all 4K-capable cards... It's 2020, for God's sake; people are buying 8K TV sets and we're still discussing whether GPUs costing near a grand should be 4K capable?
Posted on Reply
#134
ppn
4K capable means 60 FPS (not counting the 1% lows) with every setting at low or disabled where applicable, in almost every game under the sun, or 99% of them. By that measure it is still very capable, I think. Even a 2070 is capable then; yay, let's sell it for $800. Now, if you are talking about every setting maxed, those goalposts are constantly moving. That kind of thinking leads to very short-lived glory and a badly spent $800.
Posted on Reply
#135
Jinxed
rtwjunkie
Fact is, every one of the released games still looks fantastic without RTRT. None of these features they are making available on super-priced GPUs makes playing the game better, nor are they necessary. Game immersion (at least in SP games) comes from gameplay and a well-written story. Graphics only enhance it.
And I agree there, as I told Vayra86:
And while a good story and gameplay are still the key, and it's sad that some games put graphical fidelity first (I agree on that), adding more immersion and realism is a huge boost when the gameplay is good.
Why NOT enhance it? I remember playing Thief 1 and 2, and they were great. Very primitive shadows, but still so immersive. Then Thief 3 came out, also with excellent gameplay and story but with a few generations of graphics enhancements, and that was such a massive boost. Did you need it in the previous two? No. Did it add a LOT to the 3rd episode? Absolutely. It's progress. It still baffles me that some people put their brand preferences above such an incredible technical advancement as raytracing, just because one brand is way behind.
Posted on Reply
#136
BoboOOZ
RedelZaVedno
Anything costing 800 bucks should be 4K capable, IMHO. The 1080 Ti/Radeon VII/2080(S) are all 4K-capable cards... It's 2020, for God's sake; people are buying 8K TV sets and we're still discussing whether GPUs costing near a grand should be 4K capable?
$500-600 consoles will be 4K capable, so enthusiast GPUs should definitely be 4K capable.
Posted on Reply
#137
rtwjunkie
PC Gaming Enthusiast
Jinxed
And I agree there, as I told Vayra86:


Why NOT enhance it? I remember playing Thief 1 and 2, and they were great. Very primitive shadows, but still so immersive. Then Thief 3 came out, also with excellent gameplay and story but with a few generations of graphics enhancements, and that was such a massive boost. Did you need it in the previous two? No. Did it add a LOT to the 3rd episode? Absolutely. It's progress. It still baffles me that some people put their brand preferences above such an incredible technical advancement as raytracing, just because one brand is way behind.
No brand preference here. Every few years I go AMD, but mostly I use Nvidia. I simply call things as I see them. I have a preference, but certainly not a loyalty.

I do agree, since I already said it, that gee-whiz graphics can enhance an already immersive game. But the cost for that extra needs to be reasonable, considering that none of it is mandatory for an enjoyable game.
Posted on Reply
#138
theoneandonlymrk
Jinxed
And I agree there, as I told Vayra86:


Why NOT enhance it? I remember playing Thief 1 and 2, and they were great. Very primitive shadows, but still so immersive. Then Thief 3 came out, also with excellent gameplay and story but with a few generations of graphics enhancements, and that was such a massive boost. Did you need it in the previous two? No. Did it add a LOT to the 3rd episode? Absolutely. It's progress. It still baffles me that some people put their brand preferences above such an incredible technical advancement as raytracing, just because one brand is way behind.
Okay, less than five.

Starting to feel like a Family Guy slapping skit.
ppn
4K capable means 60 FPS (not counting the 1% lows) with every setting at low or disabled where applicable, in almost every game under the sun, or 99% of them. By that measure it is still very capable, I think. Even a 2070 is capable then; yay, let's sell it for $800. Now, if you are talking about every setting maxed, those goalposts are constantly moving. That kind of thinking leads to very short-lived glory and a badly spent $800.
That's what you perceive as capable; meanwhile, a fair few are gaming at 4K 60 in many games with whatever they have.
Neither you nor I get to set the rules for everyone.
Posted on Reply
#139
Jinxed
rtwjunkie
No brand preference here. Every few years I go AMD, but mostly I use Nvidia. I simply call things as I see them. I have a preference, but certainly not a loyalty.

I do agree, since I already said it, that gee-whiz graphics can enhance an already immersive game. But the cost for that extra needs to be reasonable, considering that none of it is mandatory for an enjoyable game.
Reasonable is relative. It seems to me most people on tech forums these days focus too much on the high end. But the significance of the RTX 3090, strangely enough, is more important for the low end. Let me explain: if we really see a 4x or greater improvement in raytracing performance, we can start getting raytracing cards all the way down to the bottom of the stack. There may not be any more GTX cards. And even the lowest gaming card, like an RTX 3050, could have more raytracing power than the RTX 2060 has now. Do you realize the significance of that? Then raytracing REALLY becomes massive. It's so surprising that almost no one mentions that in their articles.
Posted on Reply
#140
P4-630
Jinxed
There may not be any more GTX cards. And even the lowest gaming card, like an RTX 3050, could have more raytracing power than the RTX 2060 has now.
The other intriguing tidbit is the claim that GTX is dead, and that there will be RTX prefixes up and down the GeForce stack, with Tensor and RT Cores being dropped into even the lowest spec Ampere GPUs.
www.pcgamer.com/nvidia-ampere-turing-not-aging-well/
Posted on Reply
#142
rtwjunkie
PC Gaming Enthusiast
Jinxed
Reasonable is relative. It seems to me most people on tech forums these days focus too much on the high end. But the significance of the RTX 3090, strangely enough, is more important for the low end. Let me explain: if we really see a 4x or greater improvement in raytracing performance, we can start getting raytracing cards all the way down to the bottom of the stack. There may not be any more GTX cards. And even the lowest gaming card, like an RTX 3050, could have more raytracing power than the RTX 2060 has now. Do you realize the significance of that? Then raytracing REALLY becomes massive. It's so surprising that almost no one mentions that in their articles.
Yes, I do recognize the significance of a card like the 3090. I've been around quite a while; I know how tech filters down.

I like to skip gens though, so buying mid-level doesn’t work for me. Even so, cost is a factor, and it’s not likely Nvidia will get my top-end dollars for the 3090.
Posted on Reply
#143
Rob94hawk
Chaitanya
I will wait to see how these new GPUs perform in Helicon before deciding on an upgrade.
I'm waiting on how it handles Zork.
Posted on Reply
#144
Dave65
Not buying a thing until Big Navi comes out. Then I will decide!
Posted on Reply
#145
Sithaer
I'm more interested in the cheaper mid-range/budget options. Most likely even the 3060 will be out of my budget range, but if these new-gen cards push down the prices of the previous/'current' gen cards, then that's all fine with me. :)

RT I'm not exactly hyped about; it's a nice idea/tech, but I'm not interested enough to pay extra for it yet, regardless of the brand.
Maybe 1-2 gens later, when it's more common/matured.
Posted on Reply
#146
harm9963
Will preorder my 3090 at Micro Center here in Houston for pickup; I live down the street!
Posted on Reply
#147
dragontamer5788
rtwjunkie
Fact is, every one of the released games still looks fantastic without RTRT. None of these features they are making available on super-priced GPUs makes playing the game better, nor are they necessary. Game immersion (at least in SP games) comes from gameplay and a well-written story. Graphics only enhance it.
Let's make a hypothetical: let's say that raytraced shadows would help the player see where enemy mooks are in some generic stealth game. Obviously, Raytracing would be very good in this kind of situation (and it'd be great to actually have the new Raytracing feature interact with gameplay features and/or puzzles). However, what happens to all the normal guys without Raytracing? They can no longer play the game effectively. So you need to come up with a rasterization trick to approximate the shadows, so that the game runs on their computers.

We're simply not at the stage yet where Raytracing can be used for a mechanical advantage. Otherwise, you'd alienate too many gamers from the game.

---------

It's really hard for me to come up with good uses of Raytracing that would actually affect the gameplay loop, however. Outside of observing shadows, or maybe mirror reflections... but even "Portal" ended up with tons of reflections (fake, non-raytraced ones, but good enough that most people didn't notice the issues). So it's not like you need Raytracing to make "accurate enough for video games" kinds of mechanics.
Posted on Reply
#148
Hotobu
dragontamer5788
Let's make a hypothetical: let's say that raytraced shadows would help the player see where enemy mooks are in some generic stealth game. Obviously, Raytracing would be very good in this kind of situation (and it'd be great to actually have the new Raytracing feature interact with gameplay features and/or puzzles). However, what happens to all the normal guys without Raytracing? They can no longer play the game effectively. So you need to come up with a rasterization trick to approximate the shadows, so that the game runs on their computers.

We're simply not at the stage yet where Raytracing can be used for a mechanical advantage. Otherwise, you'd alienate too many gamers from the game.

---------

It's really hard for me to come up with good uses of Raytracing that would actually affect the gameplay loop, however. Outside of observing shadows, or maybe mirror reflections... but even "Portal" ended up with tons of reflections (fake, non-raytraced ones, but good enough that most people didn't notice the issues). So it's not like you need Raytracing to make "accurate enough for video games" kinds of mechanics.
To your point, it's hard for me to come up with many good uses outside of visual fidelity, but here are a few:

- a Hitman game where you use reflections to your advantage
- maybe some FPS/multiplayer games with some reflective surfaces
- perhaps enemies that can actually *see and hear* as opposed to having generic sight and hearing cones


Still, I'm taken aback by the number of people who seem to be downplaying Raytracing as a technology. It's true that lighting is accurate enough for games, but true Raytracing should take visual fidelity a step further. I think when Raytracing starts becoming more of a thing, we'll look back and see how much of a departure from reality current lighting actually is.
Posted on Reply
#149
steen
Vayra86
That would make the appearance of the 10GB one even more questionable.

OK! Let's compare the actual chips. Both are GA102 dies! What gives?! It's not even a 104.

Devs do indeed... but now check that price tag again for this GPU. Hello? This screams bad yields, scraps, and leftovers to me, sold at a premium. Let's see how that VRAM is wired...
It's a combination of factors, including yield (as you've alluded to), memory bus width, granularity of GDDR6X, & cost. The 3080 10 GB with a 20% reduced memory bus (& associated memory controllers), & 13 fewer 19 Gbps GDDR6X modules & associated power stages (I'm assuming), yields a drop of only 30 W TGP. Consider the 3950X vs the 3900X. If raster perf has only a marginal increase, as implied, we can see why there was some concern over the allocation of resources within the TA arch. I believe they've gone all in with RTX (2xFP32) as well as the far more flexible tensor cores, & will push TF32 and other reduced-precision formats heavily. It will work especially well with game engines built around DLSS reduced-res upscaling. If the NN model(s) continue with reconstruction improvements on the order of DLSS 2.0, then who cares if only 25% of displayed pixels came from the game engine & the rest is magic, right?
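For what it's worth, here's a quick back-of-the-envelope on that 25% figure, assuming DLSS performance mode, i.e. a half-resolution-per-axis internal render (a rough sketch, not official numbers):

output_w, output_h = 3840, 2160                     # 4K display output
render_w, render_h = output_w // 2, output_h // 2   # 1920 x 1080 internal render
fraction = (render_w * render_h) / (output_w * output_h)
print(f"{fraction:.0%}")  # 25% of displayed pixels come from the engine; the rest is reconstructed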
Posted on Reply
#150
TechLurker
Something I noticed not being mentioned is that they seem to be settling on advertising the cards as PCIe 4.0 ready (whether or not it's really necessary yet), which likely answers the question/dilemma GamersNexus touched upon regarding how NVIDIA would advertise their cards. While it's an insignificant element for those in the know, it's a big thing for the masses, who'd just look at it and then panic because their Intel mobo doesn't have PCIe 4.0 capability, and the only platforms on the market with PCIe 4.0 at the time of release are AMD's.

I'm actually hoping NVIDIA decides to completely follow through on that marketing, if only to see how Intel would spin things in order to stop would-be RTX 3000 series buyers from panic-buying an AMD build to go along with their new GPU (which would hilariously also benefit the same mobo makers frustrated with Intel's failure to deliver its intended PCIe 4.0 support).
Posted on Reply