
Possible NVIDIA GeForce RTX 3090, RTX 3080, and "TITAN Ampere" Specs Surface

As someone who is finally looking to upgrade after 5-6 years, the amount of blue balling going on here is really getting on my nerves. Just release the friggin' specs and an expected shipping date. Absolutely ridiculous, this whole charade.
Agreed. What's the point in speculating anyway? We get what we get, that's it.

Btw, if these are real specs, I suspect a Ti or SUPER version coming for each tier later on, as none of these specs seem to use a fully enabled GPU. But using the same GPU for three tiers seems unlikely, to say the least (unless production is really that bad).
 
If Ampere features "GDDR6x", we should see these memory chips launching sometime before the graphics cards.

I won't speculate on precise clock speeds or SM counts. I believe this is a major new architecture, totally rebalanced with changed caches, so it will be hard to predict its performance characteristics.

But I do think memory bandwidth will be a deciding factor in how well these new cards scale. I saw some of the latest speculation about Navi 21 featuring a 448-bit memory bus at 21 Gbps… It will be interesting to see how well that could be implemented.
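For what it's worth, peak GDDR bandwidth is just bus width times per-pin data rate, so rumored figures like that are easy to sanity-check. A minimal sketch, assuming the 448-bit / 21 Gbps numbers from the speculation above (unconfirmed), with the RTX 2080 Ti's 352-bit / 14 Gbps GDDR6 as a known reference point:

```python
# Peak theoretical memory bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8.0

print(peak_bandwidth_gb_s(448, 21.0))  # rumored Navi 21 config -> 1176.0 GB/s (speculative)
print(peak_bandwidth_gb_s(352, 14.0))  # RTX 2080 Ti, 352-bit GDDR6 @ 14 Gbps -> 616.0 GB/s
```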
 
Oh, huge hardware numbers. Remember, Ampere is 7 nm, so clocks should be much higher than usual... I'd say easily 250 MHz more base clock than usual, and then add a lot more overclocking headroom on top.
Also, it seems NVIDIA is holding back a lot of cores, nearly 3000! The maximum core counts are over 8000, yes, you read that right.


Big Navi... I've seen it brought up again... and it can't get anywhere near the RTX 3090. What dream world are some people living in?

Right now AMD can't even beat a 12 nm RTX 2070 Super OC, and then it still has to beat its four bigger brothers. That's plenty of work left for the legendary Big Navi.

Also, we all KNOW that Big Navi is still a 7 nm GPU with no special new upgrades... remember this: I bet Big Navi will be a water-cooled GPU... I still can't believe it, these days.
Sure, I'm sure it should beat the old 16 nm GTX 1080 Ti OC by now... it must... but remember to check its TDP and efficiency.


Well, 'my' leak is a small one, and I warn AMD fans: don't expect anything it CAN'T be; even the RTX 3070 is too much for it.


Anyway, competition is always good. You will see both in September and October, NVIDIA and AMD.

By the way, I'm also waiting for Intel's high-end GPU... exciting...

Great weekend, all!

If you're not a bot, you should try to organize your ideas better. To expect any competition from Intel in the GPU market is to be, at least, misinformed. AMD and Nvidia hold tons of patents on everything GPU-related, 3D rendering, etc.

@topic: 4 GB should be enough for low-end GPUs, but the 4K+ market will require considerable VRAM upgrades.
 
If there is GDDR6X, we'll have to wait for the real GDDR7, not this half-baked imitation.
 
If they really do come out the door with an RTX 3090, NVIDIA is concerned about Sienna.
 
Yeah, but it's coming from AMD, not Nvidia. Even the 5500 XT had 4 GB. I'm not purposefully defending 4 GB, as games' minimum requirements will continue to climb alongside bigger VRAM capacities, but for 1080p 60 FPS, 4 GB will remain the standard until Hopper/RDNA3 or RTX 5000/RDNA4. The simple reason is that most low-end users, whom the 4 GB cards are aimed at, won't really stand a chance at acquiring 6 GB or 8 GB midrange cards (sure, we could say the 5500 XT is low end with 8 GB, but other specs bottleneck its performance), and unless AMD and Nvidia produce a 1650 Super-like card with 6 GB of VRAM, I don't see a viable reason for 4 GB to vanish yet.
 
I hoped for something better.
The switch from 12/14 nm to 7 nm should allow twice the number of CUDA cores at the same clock speed and TDP, and roughly the same die size. 20% extra doesn't seem like much. If this is true, we will get significantly smaller chips than last gen, or maybe NVIDIA used the extra space for more RT cores. I am not sure how I feel about that, when we haven't even reached 4K at 120 Hz. Even 3440x1440 at 120 Hz can't be done at max settings in most modern games on an RTX 2080 Ti.
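To put rough numbers on that, here is a minimal sketch comparing a naive "double the cores at the same area" expectation against a ~20% uplift, using the RTX 2080 Ti's 4352 CUDA cores as the baseline (the 20% figure is just the rumored bump discussed in this thread, not a confirmed spec):

```python
# Compare a naive "full node shrink" core doubling with the rumored ~20% uplift.
BASELINE_CORES = 4352                     # RTX 2080 Ti (cut-down TU102)

rumored = round(BASELINE_CORES * 1.20)    # ~20% more cores per the rumor (unconfirmed)
doubled = BASELINE_CORES * 2              # what "twice the cores at the same area" would imply

print(rumored, doubled)                   # 5222 vs. 8704
```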
 
I hoped for something better.
The switch from 12/14 nm to 7 nm should allow twice the number of CUDA cores at the same clock speed and TDP, and roughly the same die size. 20% extra doesn't seem like much. If this is true, we will get significantly smaller chips than last gen, or maybe NVIDIA used the extra space for more RT cores. I am not sure how I feel about that, when we haven't even reached 4K at 120 Hz. Even 3440x1440 at 120 Hz can't be done at max settings in most modern games on an RTX 2080 Ti.
There are also 'IPC' improvements to be had with the move to a new architecture.

4K 120 Hz? There are only a small handful of those monitors around, and they are cost-prohibitive to most people in the first place. I think more people would rather see a performance improvement in RT (which everyone can use) than see 4K/120 come to fruition this generation.
 
If there is GDDR6X, we'll have to wait for the real GDDR7, not this half-baked imitation.

Nah man more X is always more better.
 
There are also 'IPC' improvements to be had with the move to a new architecture.

4K 120 Hz? There are only a small handful of those monitors around, and they are cost-prohibitive to most people in the first place. I think more people would rather see a performance improvement in RT (which everyone can use) than see 4K/120 come to fruition this generation.
Who needs 4k/120 Hz anyway? Even 120 Hz alone at any resolution is something only a handful of gamers will ever need. I personally think the advantage over 60 Hz is just placebo.

Nah man more X is always more better.
That's why Intel processors are not as interesting as they used to be. They're still stuck with K, while AMD switched to X years ago.
 
If the 3070 gives 2080 Ti performance and doesn't cost over $600, I may actually upgrade from my 980 Ti... I'm thinking I won't be that lucky; on the price, that is.
It's a nice thought, but it's never gonna happen. The only thing that will beat a 2080 Ti in their new lineup is a 3080 Ti. It's the Nvidia way...
 
If you're not a bot, you should try to organize your ideas better. To expect any competition from Intel in the GPU market is to be, at least, misinformed. AMD and Nvidia hold tons of patents on everything GPU-related, 3D rendering, etc.

@topic: 4 GB should be enough for low-end GPUs, but the 4K+ market will require considerable VRAM upgrades.


Intel can pay royalties to use the aforementioned patents; that's never an obstacle. Also, Intel has such a large workforce that they could literally develop their own technology and still be very competitive. It just hasn't been their priority, though.

Also, Intel can cross-license that IP with AMD, which they have already done.
 
There are also 'IPC' improvements to be had with the move to a new architecture.

4K 120 Hz? There are only a small handful of those monitors around, and they are cost-prohibitive to most people in the first place. I think more people would rather see a performance improvement in RT (which everyone can use) than see 4K/120 come to fruition this generation.

Yes, and there will be some clock speed improvement as well, but even if we can get to 120 Hz at 4K in current-gen titles, it may not be enough next year (or in CP 2077 - edit: or in any Assassin's Creed game).

As for who needs 4K at 120 Hz - plenty of TVs support it, and 27" 4K monitor prices are getting lower and lower too (looking at the EVE Spectrum - I paid a similar amount for a 27" 1440p 60 Hz IPS monitor 9 or 10 years ago). I don't think it is outrageous to think that in 2020 someone who pays $1000+ for a graphics card will want to pair it with a high-refresh-rate 4K or ultrawide monitor. Why else would you spend so much on a graphics card?
 
Yes, and there will be some clock speed improvement as well, but even if we can get to 120 Hz at 4K in current-gen titles, it may not be enough next year (or in CP 2077 - edit: or in any Assassin's Creed game).

As for who needs 4K at 120 Hz - plenty of TVs support it, and 27" 4K monitor prices are getting lower and lower too (looking at the EVE Spectrum - I paid a similar amount for a 27" 1440p 60 Hz IPS monitor 9 or 10 years ago). I don't think it is outrageous to think that in 2020 someone who pays $1000+ for a graphics card will want to pair it with a high-refresh-rate 4K or ultrawide monitor. Why else would you spend so much on a graphics card?


Unfortunately, 4K monitor prices are not getting lower. They have stayed at the same level for dozens of months. The cheapest 4K monitor is around 230 euros, while the cheapest 1080p monitor is around 60-70 euros. That is extremely strong price dumping by the 1080p monitor manufacturers, and it effectively distorts the whole market and stalls innovation and progress.
 
I still think high-end GPUs like the 3080 and up should have a minimum of 16 GB of memory. If this is true and it's 10 GB, I'm disappointed.

I just wish HBM were more of a thing, maybe in some kind of combined setup to keep costs down, but then with games and so on being programmed with the faster memory in mind.
 
Save your wallet for 5 nm. As amazing as this is (I mean, the jump from 14 nm at 25 MTr/mm² to 7 nm at 65.6 MTr/mm² shrinks a 2080 Ti to under 300 mm²), ffs, it is just the beginning, and you don't want to be unprepared when it happens. Yeah, there will be a 3080, and a 3080 Super, and then a 4080; the right time to buy is with the 4080 Super, or even the 5080, or 3-4 years from now. For now you can take a 3070 or something under $399 if it performs like a 2080 Ti.
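As a back-of-the-envelope check of that shrink figure, here is a minimal sketch using TU102's published numbers (roughly 18.6 billion transistors on a 754 mm² die) together with the density values quoted above:

```python
# Scale the TU102 (RTX 2080 Ti) die area by transistor density to estimate a 7 nm equivalent.
TU102_TRANSISTORS_MTR = 18_600      # ~18.6 billion transistors, expressed in millions
DENSITY_12_14NM = 25.0              # MTr/mm^2, figure quoted above
DENSITY_7NM = 65.6                  # MTr/mm^2, figure quoted above

implied_area_now = TU102_TRANSISTORS_MTR / DENSITY_12_14NM   # ~744 mm^2 (actual die is 754 mm^2)
implied_area_7nm = TU102_TRANSISTORS_MTR / DENSITY_7NM       # ~284 mm^2, i.e. "under 300 mm^2"

print(round(implied_area_now), round(implied_area_7nm))
```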

The problem is humanity hasn't invented cryogenic sleep yet... so what are you going to do during those 4 years? ;)

4K TVs have already been invading our living rooms for quite a while, with no juice to feed them.
 
I hoped for something better.
The switch from 12/14 nm to 7 nm should allow twice the number of CUDA cores at the same clock speed and TDP, and roughly the same die size. 20% extra doesn't seem like much. If this is true, we will get significantly smaller chips than last gen, or maybe NVIDIA used the extra space for more RT cores. I am not sure how I feel about that, when we haven't even reached 4K at 120 Hz. Even 3440x1440 at 120 Hz can't be done at max settings in most modern games on an RTX 2080 Ti.
You probably forgot Turing dies are at the size limit of what 12nm manufacturing allows. That's really not where you want to be, if you can help it ;)
 
Who needs 4k/120 Hz anyway? Even 120 Hz alone at any resolution is something only a handful of gamers will ever need. I personally think the advantage over 60 Hz is just placebo.


That's why Intel processors are not as interesting as they used to be. They're still stuck with K, while AMD switched to X years ago.
You would actually have to be blind not to notice the difference between 60 and 120 Hz.
 
I personally think the advantage over 60 Hz is just placebo.

Have you spent any serious time with 120+ Hz? I used to say the same thing until I bought one. As for 165+ Hz, I'm not really worried about it.
 
So much speculation going around these days. It feels like AMD and Nvidia themselves are participating in the misinformation.

Anyway, these are some of the least credible specs that I've seen.

This.

The competition for next gen is going to be so fierce that neither one wants to show its hand, and I have no doubt both are leaking fake specs. Even the decision of who will release first will be causing great angst at AMD and Nvidia.
 
Who needs 4k/120 Hz anyway? Even 120 Hz alone at any resolution is something only a handful of gamers will ever need. I personally think the advantage over 60 Hz is just placebo.


That's why Intel processors are not as interesting as they used to be. They're still stuck with K, while AMD switched to X years ago.

:laugh: You better sit this one out.
 
Intel can pay royalties to use the aforementioned patents; that's never an obstacle. Also, Intel has such a large workforce that they could literally develop their own technology and still be very competitive. It just hasn't been their priority, though.

Also, Intel can cross-license that IP with AMD, which they have already done.

Just like anyone can just pay royalties for x86... wait! They can pay for lawyers as Intel sues the living daylights out of any company silly enough to even consider that.
 