Friday, August 28th 2020

NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

Just ahead of the September launch, specifications of NVIDIA's upcoming RTX Ampere lineup have been leaked by industry sources over at VideoCardz. According to the website, three alleged GeForce SKUs are launching in September: the RTX 3090, RTX 3080, and RTX 3070. The new lineup features major improvements: 2nd-generation ray-tracing cores and 3rd-generation tensor cores built for AI and ML workloads. When it comes to connectivity and I/O, the new cards use the PCIe 4.0 interface and support the latest display outputs like HDMI 2.1 and DisplayPort 1.4a.

The GeForce RTX 3090 comes with 24 GB of GDDR6X memory running on a 384-bit bus at 19.5 Gbps. This gives a memory bandwidth of 936 GB/s. The card features the GA102-300 GPU with 5,248 CUDA cores running at 1695 MHz, and is rated for 350 W TGP (board power). While the Founders Edition cards will use NVIDIA's new 12-pin power connector, non-Founders Edition cards, from board partners like ASUS, MSI, and Gigabyte, will be powered by two 8-pin connectors. Next up are the specs for the GeForce RTX 3080, a GA102-200 based card with 4,352 CUDA cores running at 1710 MHz, paired with 10 GB of GDDR6X memory running at 19 Gbps. The memory is connected over a 320-bit bus that achieves 760 GB/s of bandwidth. The board is rated at 320 W, and the card is designed to be powered by dual 8-pin connectors. And finally, there is the GeForce RTX 3070, which is built around the GA104-300 GPU with an as-yet-unknown number of CUDA cores. We only know that it uses the older non-X GDDR6 memory running at 16 Gbps on a 256-bit bus. The GPUs are supposedly manufactured on TSMC's 7 nm process, possibly the EUV variant.
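Those bandwidth figures are simple arithmetic: peak bandwidth in GB/s is the per-pin data rate in Gbps times the bus width in bits, divided by 8 bits per byte. A minimal Python sketch reproducing the leaked numbers (the 512 GB/s result for the RTX 3070 is our own arithmetic from the quoted 16 Gbps and 256-bit figures, not a number from the leak):

```python
# Peak theoretical memory bandwidth: data rate (Gbps per pin) x bus width (bits) / 8.
def memory_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Return peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# Figures from the leak; the RTX 3070 bandwidth is derived, not quoted.
cards = {
    "RTX 3090": (19.5, 384),  # GDDR6X -> 936 GB/s
    "RTX 3080": (19.0, 320),  # GDDR6X -> 760 GB/s
    "RTX 3070": (16.0, 256),  # GDDR6  -> 512 GB/s (derived)
}

for name, (rate, width) in cards.items():
    print(f"{name}: {memory_bandwidth_gb_s(rate, width):.0f} GB/s")
```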
Source: VideoCardz

216 Comments on NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

#101
ZoneDymo
EarthDog
Do I really have to answer why a flagship has 24GB while another card down the stack has less? You've been here long enough... think about it. :)

But it ISN'T the flagship. There isn't a 'close enough'. 10GB = pedestrian, lolololol. It's 25% more than the 2080... and a mere 1GB less than the 2080 Ti. I'd call it an improvement... especially for where it is intended to play games. ;)

That is a reaction to the third line of the post I quoted. Premature is premature. ;)
No, but you do have to answer why that gap is so insanely huge, more than twice the RAM? Borderline 2.5? That is just insane.
And again, the midrange of old, the RX480, had 8GB of RAM and the GTX1060 had 6GB of RAM... to have an RTX3080 now with 10GB is just pathetic imo with an eye on progression and placement.
Posted on Reply
#102
AnarchoPrimitiv
Jinxed
Oh really? And this is what?
Me recalling what happened, namely that someone made the claim that 5nm was all reserved by Nvidia when there are numerous sources claiming otherwise. But if you disagree that's fine too; it's not a big deal.

I'll be happy to move on; I would just like to respectfully ask why you felt it necessary to label me an AMD fanboy when nothing I did indicated that, why you also felt the need to be personally offended by my joke, and why you then decided to make me personally a target of your ire and make a bunch of negative assumptions about my character. That's all.

I think when I said "did you bother reading before commenting", you were assuming I was doing it in an antagonistic way when I was sincerely just asking if you had happened to miss those comments previous to your own. That's why I'm confused, because never once was I intending any of my comments to be combative, but they were somehow taken as such. So now, I'm trying to understand why that was, that's all. I'm more than happy to just drop it, but I'd prefer for us to come to an understanding, end on a more cordial and friendly note, and walk away amicably so that I can then do my best to avoid misunderstandings in the future. But if you feel that's not necessary, that's more than fine too.

Again, I'm saying all this in a purely friendly, sincere, and respectful way, and not in an antagonistic or confrontational way at all.
Posted on Reply
#103
EarthDog
Vayra86
Wait and see mode for me, simple.
Certainly better than......
Vayra86
The whole thing looks like a complete failure if you ask me, and Turing was no different.
:p :D :toast:
ZoneDymo
No, but you do have to answer why that gap is so insanely huge, more than twice the RAM?
Now you're talking. :)
Posted on Reply
#104
Vayra86
Jinxed
You'll forgive me if I don't believe you. But I am willing to give it the benefit of the doubt.
I'm curious what you think is my motivation then if I would lie about this. But you are free to believe whatever you like ;)
EarthDog
Certainly better than......
:p :D :toast:

Now you're talking. :)
Fair enough lol
Posted on Reply
#105
rtwjunkie
PC Gaming Enthusiast
ZoneDymo
Do you need more than 10GB of VRAM... well yeah, if a game developer wants to stuff massive high-quality textures on your 4K or higher-resolution screen, it does.
Why would the 3090 have 24GB if 10 was enough...
Let me help you out. The only reason you think you “need” more than 10GB VRAM is because of lazy devs dumping textures upon textures there just because it is there.

As to why NVIDIA is putting 24GB VRAM on the 3090? Marketing. “I’m King of the Hill.” Whatever you want to call it.
Posted on Reply
#106
RedelZaVedno
ZoneDymo
No, but you do have to answer why that gap is so insanely huge, more than twice the RAM? Borderline 2.5? That is just insane.
And again, the midrange of old, the RX480, had 8GB of RAM and the GTX1060 had 6GB of RAM... to have an RTX3080 now with 10GB is just pathetic imo with an eye on progression and placement.
I agree, 10GB is just not enough on a xx80 GPU these days. MS Flight Simulator 2020 called for 12.7GB of VRAM while flying over Dubai at 4K/ultra. 16GB of VRAM should be a safe amount on high-end 4K-capable GPUs. But I do believe Nvidia's AIBs will offer a 20GB variant (according to Igor's Lab, which is one of the most trusted leakers imo), but for more $$$, price gouging wherever it can, it's Ngreedia after all ;-)
Posted on Reply
#107
Vayra86
rtwjunkie
Let me help you out. The only reason you think you “need” more than 10GB VRAM is because of lazy devs dumping textures upon textures there just because it is there.

As to why NVIDIA is putting 24GB VRAM on the 3090? Marketing. “I’m King of the Hill.” Whatever you want to call it.
Well it's pretty strange to see 11GB on a 3-year-old GPU and then 10GB on the newer one that is going to be much faster. That's a step back no matter how you spin it.

As to the 24GB... it speaks of a product lineup that is out of balance, not as intended, whatever you want to attribute to it. But they sure as hell don't do it just because it's a nice number to win with. Previous GPUs underline that. VRAM cap was never a real battle, especially not in the high end.
Posted on Reply
#108
medi01
Cyberpunk will have customizable genitalia (as they are misogynists, only penises).

Imagine how glorious all that will be when the RT gimmick is slapped on it.

That alone justifies paying $2000 for the 3090 (although I think it will cost $1400, basically like the 2080 Ti).
Posted on Reply
#109
Jinxed
AnarchoPrimitiv
I'll be happy to move on; I would just like to respectfully ask why you felt it necessary to label me an AMD fanboy when nothing I did indicated that, why you also felt the need to be personally offended by my joke, and why you then decided to make me personally a target of your ire and make a bunch of negative assumptions about my character. That's all.
What I actually said was:
Exactly. From the attempts to downplay raytracing to the AMD fans talking about future nodes, it doesn't seem like the Red
AMD fans are not necessarily AMD fanboys. You said that, not me. Although I use that term when it's really obvious. I don't recall, and can't find, anywhere in this discussion where I called YOU an AMD fanboy. You used that first. Are you done?
Posted on Reply
#110
rtwjunkie
PC Gaming Enthusiast
Jinxed
Yes, AMD fans have been trying to downplay raytracing ever since the Turing GPUs were released. So yes, 2 years LOL
Lots of rational, clear-headed Nvidia fans have downplayed RT as well.
Vayra86
Well its pretty strange to see 11GB on a 3 year old GPU and then 10GB on the newer one that is going to be much faster. That's a step back no matter how you spin it.
You’re comparing apples to oranges. Model to model is what you must compare. Does the 2080 have 11GB VRAM? No. It occupies the same space within its lineup as the 3080 does. The 3080 improves upon that by adding 2GB.

11GB is what the top-ranked 2080Ti has. It occupies the same place in its family as the new 3090 will.
Posted on Reply
#111
Jinxed
rtwjunkie
Lots of rational, clear-headed Nvidia fans have downplayed RT as well.
Not sure which ones you're talking about. But I certainly recall a lot of people who were known to hate Nvidia for years doing exactly that, only to say: "Hey, I have a Nvidia GPU, I'm unbiased / Nvidia fan / etc.". That, more than anything else, shows me how bad the situation of Team Red is, if even their most adamant fans are buying Nvidia GPUs (provided they were telling the truth about that in the first place).
Posted on Reply
#112
rtwjunkie
PC Gaming Enthusiast
Jinxed
You'll forgive me if I don't believe you. But I am willing to give it the benefit of the doubt.
Make up your mind. You are either giving him the benefit of the doubt or you don’t believe him.
Posted on Reply
#113
Vayra86
rtwjunkie
Lots of rational, clear-headed Nvidia fans have downplayed RT as well.


You’re comparing apples to oranges. Model to model is what you must compare. Does the 2080 have 11GB VRAM? No. It occupies the same space within its lineup as the 3080 does. The 3080 improves upon that by adding 2GB.

11GB is what the top-ranked 2080Ti has. It occupies the same place in its family as the new 3090 will.
No, I'm comparing performance to performance because the tiers and price tiers change every gen. You know this. Please. The simple fact is, a much faster GPU gets a lower VRAM cap. And on top of that, the 2080ti compared to 3080 still does that a gen later than the 1080ti. That's a looong freeze on VRAM to then give a flagship 24GB.

Let me put it differently. It doesn't speak for the 3080's lifespan as it did for a 1080 with 8GB during Pascal. Catch my drift?
Posted on Reply
#114
Jinxed
rtwjunkie
Make up your mind. You are either giving him the benefit of the doubt or you don’t believe him.
He actually stated multiple things in his reply, and I am replying to multiple statements. I don't believe that it's as simple as him making a consumer choice. But I give him the benefit of the doubt that he's not going to buy any GPU if AMD also brings a raytracing-oriented GPU. Read the whole quoted part, please.
Posted on Reply
#115
ODOGG26
AnarchoPrimitiv
What are you talking about? The reason he was bringing up AMD nodes is that someone literally made the completely false claim that Nvidia had 5nm reserved to the point of blocking AMD out, and then people were correcting him, that's all. Seems straightforward with no ulterior motives, at least to me?


Dude, it's a joke to lighten the mood, and instead of laughing, you decided to somehow take it personally again.... Also, why does making a joke about Nvidia automatically make someone an AMD fanboy? I don't follow how that works.... are there only two choices in this world, either be personally offended for Nvidia or be an AMD fanboy? How about: I have no loyalty and buy whatever offers the best value at the price I can afford? Is that OK with you, or am I forced into this false binary you've imagined?

Never would I imagine that a lighthearted joke to cut the tension, directed toward someone a million miles away from this forum, would result in someone here being personally offended and then attacking me personally, when I did no such thing. Why was it necessary to make it personal?

"ran out of arguments"? What argument was I making? I believe I didn't make a single argument in this entire forum, and it seems like you're imagining a fight that doesn't exist.... I'm sincerely just trying to understand what occurred here, and why you think I'm engaging in combat when all I did was make a joke to make people laugh, and haven't attacked anyone or called anyone out. So far the only thing I did was try and point out to someone that people were talking about AMD process nodes to correct someone else who had brought it up first and that made a false statement about Nvidia reserving all the 5nm production when there's numerous, reputable sources to the contrary .... Why are you attempting drag me into a fight with you for no reason I can discern?

Again, I'm not trying to be combative, never have been, I'm just sincerely and genuinely trying to understand what's going on to de-escalate the situation and find out what's occurred to remedy it
Yea, he and the other guy didn't follow why the node replies were there, went finger-happy on their keyboards, and appointed people AMD fanboys, which to me clearly shows who was fanboying. Like you said, people really need to lighten up. It's not that serious.
Posted on Reply
#116
M2B
Vayra86
No, I'm comparing performance to performance because the tiers and price tiers change every gen. You know this. Please. The simple fact is, a much faster GPU gets a lower VRAM cap. And on top of that, the 2080ti compared to 3080 still does that a gen later than the 1080ti. That's a looong freeze on VRAM to then give a flagship 24GB.

Let me put it differently. It doesn't speak for the 3080's lifespan as it did for a 1080 with 8GB during Pascal. Catch my drift?
The 3080 most probably will have a 20GB variant.
Posted on Reply
#117
rtwjunkie
PC Gaming Enthusiast
Vayra86
No, I'm comparing performance to performance because the tiers and price tiers change every gen. You know this. Please. The simple fact is, a much faster GPU gets a lower VRAM cap. And on top of that, the 2080ti compared to 3080 still does that a gen later than the 1080ti. That's a looong freeze on VRAM to then give a flagship 24GB.

Let me put it differently. It doesn't speak for the 3080's lifespan as it did for a 1080 with 8GB during Pascal. Catch my drift?
Just don’t compare price tiers and you’re good. Just the chips and where they are in their tiers. It keeps the picture much clearer. :)

Sure, you’re correct on the 3080 not being a huge leap in VRAM. But honestly, it may mostly not be needed. Don’t you frequently argue that devs are loading way more into VRAM than necessary?
Posted on Reply
#118
Vayra86
M2B
The 3080 most probably will have a 20GB variant.
That would make the appearance of the 10GB one even more questionable.
rtwjunkie
Just don’t compare price tiers and you’re good. Just the chips and where they are in their tiers. It keeps the picture much clearer. :)

Sure, you’re correct on the 3080 not being a huge leap in VRAM. But honestly, it may mostly not be needed. Don’t you frequently argue that devs are loading way more into VRAM than necessary?
OK! Let's compare the actual chips. Both are GA102 dies! What gives?! It's not even a 104.

Devs do indeed... but now check that price tag again for this GPU. Hello? This screams bad yields, scraps, and leftovers to me, sold at a premium. Let's see how that VRAM is wired...
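As a rough guide to how that VRAM could be wired: GDDR6/GDDR6X chips expose 32-bit interfaces, so the bus width fixes the chip count, and capacity follows from per-chip density, doubled in a clamshell layout (chips on both sides of the PCB). A minimal Python sketch of that constraint, assuming the 1 GB and 2 GB densities available at the time:

```python
# Each GDDR6/GDDR6X chip has a 32-bit interface, so bus width / 32 gives the
# chip count; a clamshell layout doubles the chips (and capacity) per channel.
def possible_capacities(bus_width_bits, densities_gb=(1, 2)):
    """List the VRAM capacities a given bus width allows."""
    chips = bus_width_bits // 32
    configs = []
    for d in densities_gb:
        configs.append(f"{chips * d} GB ({chips} x {d} GB chips)")
        configs.append(f"{chips * d * 2} GB (clamshell, {2 * chips} x {d} GB chips)")
    return configs

# 320-bit (RTX 3080): 10 GB with 1 GB chips; 20 GB via clamshell or 2 GB chips
print("320-bit:", possible_capacities(320))
# 384-bit (RTX 3090): 12 GB with 1 GB chips; 24 GB via clamshell or 2 GB chips
print("384-bit:", possible_capacities(384))
```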
Posted on Reply
#119
ODOGG26
Jinxed
Oh really? And this is what?


He was not the one starting the process nodes argument. Let's just end it right here and move on.
No, he didn't start a process node argument. You did. He seemed to be genuinely curious or confused as to why these would be the clocks if using the same node as another player who managed higher clocks. He wasn't bashing NVIDIA or anything, because it seems like those numbers may or may not be right. But let's move on 'cause you clearly aren't comprehending things properly atm.
Posted on Reply
#120
HammerON
The Watchful Moderator
ODOGG26
Yea he and the other guy didn't follow why the node replies were there and went finger happy on their keyboards and to appoint people a Fanboy for AMD which to me clearly shows who was fanboying. Like you said, people really need to lighten up. It's not that serious.
Alright folks, stop with the fanboy crap - pretty please.
Remember that it is okay to disagree (and just leave it at that) and it often does not work to challenge those with different opinions than yours.
Carry on.
Posted on Reply
#121
M2B
Vayra86
That would make the appearance of the 10GB one even more questionable.



OK! Let's compare the actual chips. Both are GA102 dies! What gives?! It's not even a 104.
Probably because of the expense.
If they only release a 20GB variant, it should end up being at the very least $100 more expensive and add nothing to performance (for now at least), so they might think it's better to have, let's say, a base price of 799 instead of 999 so people don't freak out that much.
And let's be honest here, 20GB is very much overkill, and if you don't want to keep your card for ~4 years, why pay more...
Posted on Reply
#122
BoboOOZ
HammerON
Alright folks, stop with the fanboy crap - pretty please.
Remember that it is okay to disagree (and just leave it at that) and it often does not work to challenge those with different opinions than yours.
Carry on.
Or just use the ignore function, it works wonders :peace:
Posted on Reply
#123
rtwjunkie
PC Gaming Enthusiast
Vayra86
That would make the appearance of the 10GB one even more questionable.



OK! Let's compare the actual chips. Both are GA102 dies! What gives?! It's not even a 104.

Devs do indeed... but now check that price tag again for this GPU. Hello? This screams bad yields, scraps, and leftovers to me, sold at a premium. Let's see how that VRAM is wired...
My personal opinion is Nvidia is still experimenting. You waiting another gen might be a good idea. I may either do that as well or pick up a used 2080Ti.
Posted on Reply
#124
Vayra86
rtwjunkie
My personal opinion is Nvidia is still experimenting. You waiting another gen might be a good idea. I may either do that as well or pick up a used 2080Ti.
You said it. +1.
M2B
Probably because of the expense.
If they only release a 20GB variant, it should end up being at the very least $100 more expensive and add nothing to performance (for now at least), so they might think it's better to have, let's say, a base price of 799 instead of 999 so people don't freak out that much.
And let's be honest here, 20GB is very much overkill, and if you don't want to keep your card for ~4 years, why pay more...
I agree, but then that raises the question of how things are balanced in terms of margins, etc. If they can't produce that kind of GPU at 800... something is amiss. They have a node shrink, and they're even pretty late with it as well. The fact is, RT demands a much bigger die and we're paying the full price on it. The fact also is, the RT content is still rare. By the time there is sufficient content, the GPU will be ready for replacement.
Posted on Reply
#125
theoneandonlymrk
Jinxed
So are higher resolutions, shadows, more detailed models, and basically all the graphical advances over the years. All just graphical updates. By your logic, we could all just stay at 640x480. In reality, all those advances, including raytracing, improve immersion. And while a good story and gameplay are still the key, and it's sad that some games put graphical fidelity first, I agree on that, adding more immersion and realism is a huge boost when the gameplay is good.
We sort of are with DLSS and upscaling, no?

Anyway,
I'm eager for a tech face-off, less of the over-the-fence sh#t talking and more doing.

With the upcoming GPU releases plus consoles at the same time, I can't see them all being winners, not with the Rona kicking wages to the kerb.
Posted on Reply