Wednesday, October 21st 2020

NVIDIA Reportedly Cancels Launch of RTX 3080 20 GB, RTX 3070 16 GB

Fresh reports floating in the rumor mill's circulatory system claim that NVIDIA has backtracked on its plans to launch higher-VRAM versions of its RTX 3080 and the (meanwhile delayed) RTX 3070. These cards launched with 10 GB of VRAM for the RTX 3080 and 8 GB for the RTX 3070, with reports circulating as early as their announcement that double-capacity versions would hit the market just a few months later - specifically, in December of this year. Videocardz, however, claims that these long-rumored 20 GB and 16 GB SKUs have now been canceled by NVIDIA, which relayed the news to its AIB partners - and the choice of "canceled", not "postponed", seems deliberate.

For cards theoretically shipping come December, this is indeed short notice, but it might be enough for AIB partners to funnel all their GA102-200 (RTX 3080) and GA104-400 (RTX 3070) silicon towards the already - if not readily - available models. This report, Videocardz claims, has been confirmed by two of their sources, and comes on the very day specifications for AMD's RX 6000 series leaked. It's likely NVIDIA already had knowledge of its competition's designs and performance targets, however, so this could be seen as nothing more than a coincidence. One of the publication's sources claims GDDR6X yields might be the cause of the cancellation, but that doesn't explain why the alleged RTX 3070 16 GB card (with its GDDR6 chips) was also canceled. Remember: these are rumors about cards that were never announced by NVIDIA themselves, so take them with the appropriate salt-mine level of skepticism.

88 Comments on NVIDIA Reportedly Cancels Launch of RTX 3080 20 GB, RTX 3070 16 GB

#76
5200north
mechtech: I never understood the reasoning or desire for so much RAM on a gaming card. Even at 4K resolution, some games barely break the 4 GB mark.
www.techpowerup.com/review/the-witcher-3-benchmark-performance-test/3.html
Witcher 3 didn't even break 2 GB at 4K.
Especially since the typical graphics card life is 3 years, dumping all that cash on RAM that may never get used before the card is tossed seems a bit wasteful to me.

I would say 8-12 GB for 4K over the next 3 years should be more than adequate. I think TPU should start adding GPU VRAM usage to its reviews.

Maybe graphics card makers should put SO-DIMM slots on the cards and make 4 GB modules, so people can add 32 GB if they want to burn money.

As for me I am satisfied with an 8GB card, and I don't feel like spending extra cash on ram that I will never use.
what you are saying is true if you only use a pc for games. i am into 3d modeling and 20gb of video ram would have been beyond amazing. i know i could get a 3090 but that would bust my budget, and the only reason to get it would be the vram.

i knew a 3080 with 20gb of vram was too good to be true.
#77
Jdp245
repman244: Anyway, IMO 20GB is overkill. By the time games saturate that, the GPU will be too slow anyway.
Kind of reminds me of the HD 5870 2GB edition, which never really saw the benefit of its doubled VRAM, since the GPU had become useless for the games that could use it.
For most games, it is overkill. But some games are already maxing out 10 GB. For example, Microsoft Flight Simulator will fill that up right now. Not sure if more VRAM on a 3080 would help that title, or if it is preloading a bunch of stuff it doesn’t need, but I can’t imagine we won’t see titles within the lifetime of the 3080 that do better with more than 10 GB at max resolutions.
#78
rtwjunkie
PC Gaming Enthusiast
Jdp245: For most games, it is overkill. But some games are already maxing out 10 GB. For example, Microsoft Flight Simulator will fill that up right now. Not sure if more VRAM on a 3080 would help that title, or if it is preloading a bunch of stuff it doesn’t need, but I can’t imagine we won’t see titles within the lifetime of the 3080 that do better with more than 10 GB at max resolutions.
Again, for the thousandth time: allocation of VRAM is NOT usage. This has been tested repeatedly, from 12 GB down to 6 GB.

If you really think in the 3080 lifetime that game developers are going to forfeit sales by deliberately making games that will basically ignore the lower VRAM amounts of lesser cards, then you are unaware that they need to make a profit. You only do that by making games that are within the capability of mainstream cards, usually of the previous generation.
#79
medi01
Just several weeks ago, DF, in their "exclusive preview", used the fact that Doom's ultra textures do not fit into 8 GB to mislead people about 2080 vs. 3080 performance.
#80
dasa
Jdp245: some games are already maxing out 10 GB. For example, Microsoft Flight Simulator will fill that up right now.
It used up to 12.3 GB of VRAM with my Radeon VII at 3440x1440, though sometimes it only used about 10 GB of VRAM while system RAM held that extra 2 GB.
Since my system had a lot of VRAM, FS2020 never used much more than 10-14 GB of system RAM despite having 32 GB available, while people with less VRAM were reporting it needed more than 16 GB.

If games were made with higher texture detail and less pop-in, they could use 20GB no problem, and with minimal performance hit for those with enough memory, as the GPU performance hit comes more from lighting effects, which are lighter on memory usage.

I believe AMD could beat NVIDIA this round, and NVIDIA will release a revision of the 3000 series within six months to claim back its edge, likely with more memory.
As others have said, adding more VRAM to the current line now doesn't make much sense, as the cards are already expensive and draw enough power as it is, but with some tweaks a new revision of the cards soon after could make more sense.
#81
xenocide
It was always a stupid idea. Anyone who's been around for a while can tell you what happened with the "higher VRAM" models of cards back in the day: they always performed almost identically to their lower-VRAM equivalents. The cards are designed for X GB of VRAM, and adding more usually means the GPU cannot properly feed all that memory anyway, so most of that VRAM goes completely unutilized. It was just an excuse to charge an extra $100/$200+ on a 3080/3090 from people who assume more VRAM = more performance.
#82
Jdp245
rtwjunkie: If you really think in the 3080 lifetime that game developers are going to forfeit sales by deliberately making games that will basically ignore the lower VRAM amounts of lesser cards, then you are unaware that they need to make a profit. You only do that by making games that are within the capability of mainstream cards, usually of the previous generation.
This is not the same thing as whether more VRAM would increase performance over the lifespan of the model. Games are made to work on a wide variety of hardware. That doesn’t mean that all hardware will be able to max out all of the features of the game. It may be the case that the GPU performance will bottleneck the 3080 before having “only” 10 GB of VRAM will. But that is the question, not whether quality of games can be scaled down to lower-performing hardware. (Of course it can.)
#83
rtwjunkie
PC Gaming Enthusiast
Jdp245: This is not the same thing as whether more VRAM would increase performance over the lifespan of the model. Games are made to work on a wide variety of hardware. That doesn’t mean that all hardware will be able to max out all of the features of the game. It may be the case that the GPU performance will bottleneck the 3080 before having “only” 10 GB of VRAM will. But that is the question, not whether quality of games can be scaled down to lower-performing hardware. (Of course it can.)
AGAIN, VRAM allocation by games is NOT VRAM usage. And I said nothing about whether games can scale down.

What you are not comprehending is that the tests done right here on TPU have shown no loss in game quality or performance from going to lower VRAM amounts. What this clearly shows is that games are only filling up what is available, NOT using it.
#84
mechtech
5200north: what you are saying is true if you only use a pc for games. i am into 3d modeling and 20gb of video ram would have been beyond amazing. i know i could get a 3090 but that would bust my budget, and the only reason to get it would be the vram.

i knew a 3080 with 20gb of vram was too good to be true.
Yep, which is why I specifically said "gaming cards".
#86
John Naylor
If it was ever a plan, NVIDIA is obviously convinced AMD's cards are not a threat. Multiple SKUs with different VRAM amounts have never been about performance but about user perception - perception, not reality. In other words, to borrow the words of Inigo Montoya, "I don't think that word means what you think it means", and that word is VRAM "allocation". Just because a game decides to allocate a certain amount of VRAM doesn't mean the card actually benefits from more VRAM. Every time two versions of a card with different VRAM amounts have been tested, the only times the extra VRAM has been shown to make a difference in fps is when the GPU was unable to produce enough fps to make the game playable in the first place. I don't care if doubling the VRAM brings a 33% improvement when that so-called improvement takes me from 15 to 20 fps. The game is still unplayable.
#87
Waldorf
funny how ppl think that because there is a new chip, it needs more vram.
same for vram bandwidth, where ppl whine when the numbers don't go up (or even decrease),
yet it has never been an issue in real-world use, as other things improve (like compression), reducing the need for more bandwidth in the first place.

- 60-80% of all gamers on the planet are running FHD, so yeah, they really need those 10gb.
- by the time you use more than 10gb, most chips run out of power, hence it's useless to have more unless cards get faster.
- no Ti/S version has increased vram over the non-Ti/S, not sure why ppl always name those models. past releases show it (how about the 1060 coming in 3gb and 6gb, with no Ti or S?).
it's not like they can't say xx65/75/85 and never release anything S/Ti.

and regarding prices:
how many of you have "protested" that porsches/lambos (or any equivalent) are overpriced and need to be as cheap and affordable as your chevy/toyota/vw (etc)?
right.
just because "you" think something is overpriced or poor value doesn't mean that's true for everyone.
buy or don't, your choice. but don't sour it for others who will (like ppl with $100k+/yr incomes, or nothing else to pay for).
#88
John Naylor
Fry178: - by the time you use more than 10gb, most chips run out of power, hence it's useless to have more unless cards get faster.
- no Ti/S version has increased vram over the non-Ti/S, not sure why ppl always name those models. past releases show it (how about the 1060 coming in 3gb and 6gb, with no Ti or S?).
it's not like they can't say xx65/75/85 and never release anything S/Ti.
The reason they were cancelled is that the extra VRAM is a marketing illusion only... no side-by-side comparison testing from the 6xx, 7xx, 9xx, 10xx, or 20xx series has ever shown a significant difference between two different VRAM versions of the same card. As you crank up the settings to a point where VRAM makes a difference, as W1zzard has said, "you'll run out of shading power long before memory becomes a problem". The 1060 illustrated this perfectly - so much so that when NVIDIA prepared the 3 GB card, they disabled 11% of the shaders; otherwise performance would have been identical. If the "not enough VRAM" claim were true, the 6% performance difference at 1080p would have to increase at 1440p... and yet the difference was the same 6%. "Needs more VRAM" theory... busted.