
NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

No, I'm not running away, I'm making a choice as a consumer. It's really that simple. If AMD produces something similar, it's a no-buy in either camp.
You'll forgive me if I don't believe you. But I am willing to give it the benefit of the doubt.
 
Do I really have to answer why a flagship has 24GB versus another card down the stack having less? You've been here long enough... think about it. :)

But it ISN'T the flagship. There isn't a 'close enough'. 10GB = pedestrian, lolololol. It's 25% more than the 2080... and a mere 1GB less than the 2080 Ti. I'd call it an improvement... especially at where it is intended to play games. ;)

That is a reaction to the third line of the post I quoted. Premature is premature. ;)

No, but you do have to answer why that gap is so insanely huge. More than twice the RAM? Borderline 2.5×? That is just insane.
And again, the midrange of old, the RX 480, had 8GB of RAM and the GTX 1060 had 6GB... to have an RTX 3080 now with 10GB is just pathetic IMO, with an eye on progression and placement.
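For reference, here's a quick sketch of that flagship-to-x80 VRAM ratio over the last couple of generations (Pascal and Turing figures are the shipping specs; the Ampere numbers are from this leak and could still change):

```python
# Flagship-to-x80 VRAM ratio per generation.
# Pascal/Turing capacities are the shipping specs; the Ampere
# figures come from this leak and are not confirmed.
lineups = {
    "Pascal": {"GTX 1080": 8, "GTX 1080 Ti": 11},
    "Turing": {"RTX 2080": 8, "RTX 2080 Ti": 11},
    "Ampere (leak)": {"RTX 3080": 10, "RTX 3090": 24},
}

for gen, cards in lineups.items():
    (x80, x80_gb), (flagship, flagship_gb) = cards.items()
    ratio = flagship_gb / x80_gb
    print(f"{gen}: {flagship} vs {x80} -> {flagship_gb}GB / {x80_gb}GB = {ratio:.2f}x")
```

That prints 1.38x for Pascal and Turing versus 2.40x for the leaked Ampere lineup, which is where the "borderline 2.5" above comes from.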
 
Oh really? And this is what?

Me recalling what happened, namely that someone made the claim that 5nm was all reserved by Nvidia, while there are numerous sources claiming otherwise. But if you disagree, that's fine too; it's not a big deal.

I'll be happy to move on. I would just like to respectfully ask why you felt it necessary to label me an AMD fanboy when nothing I did gave any indication of that, why you felt the need to be personally offended by my joke, and why you then decided to make me personally a target of your ire and make a bunch of negative assumptions about my character. That's all.

I think when I said "did you bother reading before commenting", you assumed I was doing it in an antagonistic way, when I was sincerely just asking whether you had happened to miss the comments previous to your own. That's why I'm confused: never once did I intend any of my comments to be combative, yet they were somehow taken as such. So now I'm trying to understand why that was, that's all. I'm more than happy to just drop it, but I'd prefer for us to come to an understanding, end on a more cordial and friendly note, and walk away amicably, so that I can do my best to avoid misunderstandings in the future. But if you feel that's not necessary, that's more than fine too.

Again, I'm saying all this in a purely friendly, sincere and respectful way, and not in an antagonistic or confrontational way at all.
 
You'll forgive me if I don't believe you. But I am willing to give it the benefit of the doubt.

I'm curious what you think my motivation would be, then, if I were lying about this. But you are free to believe whatever you like ;)

Certainly better than......
:p :D :toast:

Now you're talking. :)

Fair enough lol
 
Do you need more than 10GB of VRAM? Well, yeah, if a game developer wants to stuff massive high-quality textures onto your 4K or higher resolution screen, you do.
Why would the 3090 have 24GB if 10 was enough?
Let me help you out. The only reason you think you “need” more than 10GB VRAM is because of lazy devs dumping textures upon textures there just because it is there.

As to why NVIDIA is putting 24GB VRAM on the 3090? Marketing. “I’m King of the Hill.” Whatever you want to call it.
 
No, but you do have to answer why that gap is so insanely huge. More than twice the RAM? Borderline 2.5×? That is just insane.
And again, the midrange of old, the RX 480, had 8GB of RAM and the GTX 1060 had 6GB... to have an RTX 3080 now with 10GB is just pathetic IMO, with an eye on progression and placement.
I agree, 10GB is just not enough on an xx80 GPU these days. MS Flight Simulator 2020 called for 12.7GB of VRAM usage while flying over Dubai at 4K/ultra. 16GB of VRAM should be a safe amount on high-end 4K-capable GPUs. But I do believe Nvidia's AIBs will offer a 20GB variant (according to Igor's Lab, which is one of the most trusted leakers IMO), but for more $$$, price gouging wherever it can; it's Ngreedia after all ;-)
 
Let me help you out. The only reason you think you “need” more than 10GB VRAM is because of lazy devs dumping textures upon textures there just because it is there.

As to why NVIDIA is putting 24GB VRAM on the 3090? Marketing. “I’m King of the Hill.” Whatever you want to call it.

Well, it's pretty strange to see 11GB on a 3-year-old GPU and then 10GB on the newer one that is going to be much faster. That's a step back no matter how you spin it.

As to the 24GB... it speaks of a product lineup that is out of balance, not as intended, whatever you want to attribute it to. But they sure as hell don't do it just because it's a nice number to win with. Previous GPUs underline that. The VRAM cap was never a real battleground, especially not in the high end.
 
Cyberpunk will have customizable genitalia (as they are misogynists, only penises).

Imagine how glorious all that will be when the RT gimmick is slapped on it.

That alone justifies paying $2000 for the 3090 (although I think it will cost $1400, basically like the 2080 Ti).
 
I'll be happy to move on. I would just like to respectfully ask why you felt it necessary to label me an AMD fanboy when nothing I did gave any indication of that, why you felt the need to be personally offended by my joke, and why you then decided to make me personally a target of your ire and make a bunch of negative assumptions about my character. That's all.

What I actually said was:
Exactly. From the attempts to downplay raytracing to the AMD fans talking about future nodes, it doesn't seem like the Red
AMD fans are not necessarily AMD fanboys. You said that, not me. Although I use that term when it's really obvious. I don't recall, and can't find, anywhere in this discussion where I called YOU an AMD fanboy. You used that first. Are you done?
 
Yes, AMD fans have been trying to downplay raytracing since Turing GPUs were released. So yes, 2 years LOL
Lots of rational, clear-headed Nvidia fans have downplayed RT as well.

Well, it's pretty strange to see 11GB on a 3-year-old GPU and then 10GB on the newer one that is going to be much faster. That's a step back no matter how you spin it.
You’re comparing apples to oranges. Model to model is what you must compare. Does the 2080 have 11GB of VRAM? No. It occupies the same space within its lineup as the 3080 does. The 3080 improves upon that by adding 2GB.

11GB is what the top-ranked 2080 Ti has, the same place in its family that the new 3090 will occupy.
 
Lots of rational, clear-headed Nvidia fans have downplayed RT as well.
Not sure which ones you're talking about. But I certainly recall a lot of people who were known to hate Nvidia for years doing that, only to then say: "Hey, I have an Nvidia GPU, I'm unbiased / an Nvidia fan / etc.". That, more than anything else, shows me how bad Team Red's situation is, if even their most adamant fans are buying Nvidia GPUs (provided they were telling the truth about that in the first place).
 
You'll forgive me if I don't believe you. But I am willing to give it the benefit of the doubt.
Make up your mind. You are either giving him the benefit of the doubt or you don’t believe him.
 
Lots of rational, clear-headed Nvidia fans have downplayed RT as well.


You’re comparing apples to oranges. Model to model is what you must compare. Does the 2080 have 11GB of VRAM? No. It occupies the same space within its lineup as the 3080 does. The 3080 improves upon that by adding 2GB.

11GB is what the top-ranked 2080 Ti has, the same place in its family that the new 3090 will occupy.

No, I'm comparing performance to performance, because the tiers and price tiers change every gen. You know this. Please. The simple fact is, a much faster GPU gets a lower VRAM cap. And on top of that, the 2080 Ti compared to the 3080 still does that a gen later than the 1080 Ti. That's a looong freeze on VRAM to then give a flagship 24GB.

Let me put it differently. It doesn't speak for the 3080's lifespan the way 8GB did for a 1080 during Pascal. Catch my drift?
 
Make up your mind. You are either giving him the benefit of the doubt or you don’t believe him.
He actually stated multiple things in his reply, and I am replying to multiple statements. I don't believe that it's as simple as him making a consumer choice. But I give him the benefit of the doubt that he's not going to buy any GPU if AMD also brings out a raytracing-oriented GPU. Read the whole quoted part, please.
 
What are you talking about? The reason he was bringing up AMD nodes is that someone literally made the completely false claim that Nvidia had 5nm reserved to the point of blocking AMD out, and then people were correcting him. That's all; it seems straightforward with no ulterior motives, at least to me.


Dude, it's a joke to lighten the mood, and instead of laughing, you decided to somehow take it personally again.... Also, why does making a joke about Nvidia automatically make someone an AMD fanboy? I don't follow how that works.... are there only two choices in this world, either be personally offended on Nvidia's behalf or be an AMD fanboy? How about I have no loyalty and buy whatever offers the best value at the price I can afford? Is that OK with you, or am I forced into this false binary you've imagined?

Never would I have imagined that a light-hearted joke to cut the tension, directed at someone a million miles away from this forum, would result in someone here being personally offended and then attacking me personally, when I did no such thing. Why was it necessary to make it personal?

"Ran out of arguments"? What argument was I making? I don't believe I made a single argument in this entire thread, and it seems like you're imagining a fight that doesn't exist.... I'm sincerely just trying to understand what occurred here, and why you think I'm engaging in combat when all I did was make a joke to make people laugh, and I haven't attacked anyone or called anyone out. So far the only thing I did was point out to someone that people were talking about AMD process nodes to correct someone else who had brought it up first and made a false statement about Nvidia reserving all the 5nm production, when there are numerous reputable sources to the contrary.... Why are you attempting to drag me into a fight for no reason I can discern?

Again, I'm not trying to be combative, and never have been. I'm just sincerely and genuinely trying to understand what's going on, to de-escalate the situation and find out what happened so we can remedy it.

Yeah, he and the other guy didn't follow why the node replies were there, went trigger-happy on their keyboards, and started labelling people AMD fanboys, which to me clearly shows who was fanboying. Like you said, people really need to lighten up. It's not that serious.
 
No, I'm comparing performance to performance, because the tiers and price tiers change every gen. You know this. Please. The simple fact is, a much faster GPU gets a lower VRAM cap. And on top of that, the 2080 Ti compared to the 3080 still does that a gen later than the 1080 Ti. That's a looong freeze on VRAM to then give a flagship 24GB.

Let me put it differently. It doesn't speak for the 3080's lifespan the way 8GB did for a 1080 during Pascal. Catch my drift?

The 3080 most probably will have a 20GB variant.
 
No, I'm comparing performance to performance, because the tiers and price tiers change every gen. You know this. Please. The simple fact is, a much faster GPU gets a lower VRAM cap. And on top of that, the 2080 Ti compared to the 3080 still does that a gen later than the 1080 Ti. That's a looong freeze on VRAM to then give a flagship 24GB.

Let me put it differently. It doesn't speak for the 3080's lifespan the way 8GB did for a 1080 during Pascal. Catch my drift?
Just don't compare price tiers and you're good. Just the chips and where they are in their tiers. It keeps the picture much clearer. :)

Sure, you're correct that the 3080 isn't a huge leap in VRAM. But honestly, it may mostly not be needed. Don't you frequently argue that devs are loading way more into VRAM than necessary?
 
The 3080 most probably will have a 20GB variant.

That would make the appearance of the 10GB one even more questionable.

Just don't compare price tiers and you're good. Just the chips and where they are in their tiers. It keeps the picture much clearer. :)

Sure, you're correct that the 3080 isn't a huge leap in VRAM. But honestly, it may mostly not be needed. Don't you frequently argue that devs are loading way more into VRAM than necessary?

OK! Let's compare the actual chips. Both are GA102 dies! What gives?! It's not even a 104.

Devs do indeed... but now check that price tag again for this GPU. Hello? This screams bad yields, scraps and leftovers to me, sold at a premium. Let's see how that VRAM is wired...
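For anyone wondering, here's a rough sketch of how the capacities follow from the memory bus in the leaked specs. The 1GB-per-package GDDR6X density and the clamshell layout on the 3090 are assumptions of mine, not anything confirmed in the leak:

```python
# How GDDR6X capacity falls out of bus width (assumptions: 32-bit packages,
# 1GB (8Gb) density, clamshell = chips on both sides of the PCB).
CHIP_BUS_BITS = 32
CHIP_CAPACITY_GB = 1

def vram_gb(bus_width_bits: int, clamshell: bool = False) -> int:
    """Total VRAM implied by a given bus width."""
    chips = bus_width_bits // CHIP_BUS_BITS
    if clamshell:
        chips *= 2  # two packages share each 32-bit channel
    return chips * CHIP_CAPACITY_GB

print(vram_gb(320))                  # leaked 3080: 320-bit -> 10GB
print(vram_gb(384, clamshell=True))  # leaked 3090: 384-bit -> 24GB
print(vram_gb(320, clamshell=True))  # hypothetical 20GB AIB 3080
```

So a 20GB 3080 would keep the same 320-bit bus, just with twice the memory per channel, which is why the rumoured AIB variant wouldn't need a different GA102 configuration.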
 
Oh really? And this is what?


He was not the one starting the process nodes argument. Let's just end it right here and move on.

No, he didn't start a process node argument. You did. He seemed to be genuinely curious or confused as to why these would be the clocks if they're using the same node as another player who managed higher clocks. He wasn't bashing NVIDIA or anything, because it seems like those numbers may or may not be right. But let's move on, because you clearly aren't comprehending things properly atm.
 
Yeah, he and the other guy didn't follow why the node replies were there, went trigger-happy on their keyboards, and started labelling people AMD fanboys, which to me clearly shows who was fanboying. Like you said, people really need to lighten up. It's not that serious.
Alright folks, stop with the fanboy crap - pretty please.
Remember that it is okay to disagree (and just leave it at that) and it often does not work to challenge those with different opinions than yours.
Carry on.
 
That would make the appearance of the 10GB one even more questionable.



OK! Let's compare the actual chips. Both are GA102 dies! What gives?! It's not even a 104.

Probably because of the expense.
If they only release a 20GB variant, it would end up being at the very least $100 more expensive and add nothing to performance (for now at least), so they might think it's better to have, let's say, a base price of 799 instead of 999 so people don't freak out as much.
And let's be honest here, 20GB is very much overkill, and if you don't want to keep your card for ~4 years, why pay more...
 
Alright folks, stop with the fanboy crap - pretty please.
Remember that it is okay to disagree (and just leave it at that) and it often does not work to challenge those with different opinions than yours.
Carry on.
Or just use the ignore function, it works wonders :peace:
 
That would make the appearance of the 10GB one even more questionable.



OK! Let's compare the actual chips. Both are GA102 dies! What gives?! It's not even a 104.

Devs do indeed... but now check that price tag again for this GPU. Hello? This screams bad yields, scraps and leftovers to me, sold at a premium. Let's see how that VRAM is wired...
My personal opinion is that Nvidia is still experimenting. Waiting another gen might be a good idea for you. I may either do that as well or pick up a used 2080 Ti.
 
My personal opinion is that Nvidia is still experimenting. Waiting another gen might be a good idea for you. I may either do that as well or pick up a used 2080 Ti.

You said it. +1.

Probably because of the expense.
If they only release a 20GB variant, it would end up being at the very least $100 more expensive and add nothing to performance (for now at least), so they might think it's better to have, let's say, a base price of 799 instead of 999 so people don't freak out as much.
And let's be honest here, 20GB is very much overkill, and if you don't want to keep your card for ~4 years, why pay more...

I agree, but then that raises the question of how things are balanced in terms of margins etc.... if they can't produce that kind of GPU at 800... something is amiss. They have a node shrink, and they're even pretty late with it as well. The fact is, RT demands a much bigger die and we're paying full price for it. The fact also is, RT content is still rare. By the time there is sufficient content, the GPU will be ready for replacement.
 