Saturday, September 19th 2020

NVIDIA Readies RTX 3060 8GB and RTX 3080 20GB Models

A GIGABYTE webpage meant for redeeming the RTX 30-series Watch Dogs Legion + GeForce NOW bundle lists the graphics cards eligible for the offer, including a large selection based on unannounced RTX 30-series GPUs. Among these are references to a "GeForce RTX 3060" with 8 GB of memory and, more interestingly, a 20 GB variant of the RTX 3080. The list also confirms the RTX 3070S with 16 GB of memory.

The RTX 3080 launched last week comes with 10 GB of memory across a 320-bit memory interface, using 8 Gbit memory chips, while the RTX 3090 achieves its 24 GB capacity by piggy-backing two such chips per 32-bit channel (chips on either side of the PCB). It's conceivable that the RTX 3080 20 GB will adopt the same method. There is a vast price gap between the RTX 3080 10 GB and the RTX 3090, which NVIDIA could look to fill with the 20 GB variant of the RTX 3080. The question of whether you should wait for the 20 GB variant of the RTX 3080 or pick up the 10 GB variant right now will depend on the performance gap between the RTX 3080 and RTX 3090. We'll answer this question next week.
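For a rough sense of how those capacities fall out of the bus width, here is a minimal back-of-envelope sketch (assuming 8 Gbit, i.e. 1 GB, GDDR6X chips, one per 32-bit channel, doubled up in clamshell mode):

# Back-of-envelope: GDDR6X capacity from bus width and chip density.
# Assumes 8 Gbit (1 GB) chips, one per 32-bit channel, with clamshell
# mode placing a second chip per channel on the back of the PCB.

def vram_gb(bus_width_bits, chip_gbit=8, clamshell=False):
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    if clamshell:
        chips *= 2                        # two chips share each channel
    return chips * chip_gbit // 8         # Gbit -> GB

print(vram_gb(320))                   # RTX 3080: 10 GB
print(vram_gb(320, clamshell=True))   # rumored RTX 3080 20 GB
print(vram_gb(384, clamshell=True))   # RTX 3090: 24 GB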
Source: VideoCardz

157 Comments on NVIDIA Readies RTX 3060 8GB and RTX 3080 20GB Models

#151
Raendor
Dudebro-420: Yeah, that's what the article said.
Hey "genious", you realize the article could be updated because it didn't say it first?
Posted on Reply
#152
efikkan
Vayra86: Maybe I'm wrong, and yes, it's about gut feeling more than anything, because really that's all we have looking forward in time. GPU history is not without design and balancing failures; we all know this.
Remember that the RTX 3060/3070/3080 will be Nvidia's most important models for the next two years or so. Nvidia also has a very good idea of the upcoming top games, since it has access to many of them during development. There is no one in the world more capable of predicting what will be needed in the next few years than they are, but that doesn't make them immune to stupid business decisions, of course.
Vayra86: In the end, success or failure of advances in GPUs will come down to how devs (get to) utilize them, whether or not they understand them, and whether or not they're workable. I'm seeing the weight move from tried-and-trusted VRAM capacity to different areas. Not all devs are going to keep up. In that sense it's sort of a good sign that Nvidia's system for Ampere is trying to mimic next-gen console features, but still. This is going to be maintenance-intensive for sure.
This simply comes down to the fact that there are very few moves by developers that would increase VRAM utilization without also increasing bandwidth and compute requirements just as much. If you increase model or texture detail, the others go up too. Increasing frame rate increases the others but not the capacity needs. Basically, the main things that could increase VRAM requirements disproportionately are statically allocating more resources without any kind of resource streaming, or someone coming up with a very different rendering technique.
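To put rough numbers on that coupling, here is a minimal illustrative sketch (the RGBA8 format, mipmap overhead, and resolutions are assumptions for illustration, not figures from any particular game): doubling texture resolution roughly quadruples both the memory a texture occupies and the data that has to be moved to stream and sample it, so capacity and bandwidth needs rise together.

# Illustrative only: texture detail scales capacity and data movement together.
# Assumes uncompressed RGBA8 (4 bytes/texel) with a ~1.33x mipmap overhead;
# real engines use block compression, but the scaling argument is the same.

def texture_mb(resolution):
    return resolution * resolution * 4 * (4 / 3) / 1024**2

for res in (1024, 2048, 4096):
    mb = texture_mb(res)
    print(f"{res}x{res}: ~{mb:.0f} MB resident, ~{mb:.0f} MB moved if it has to be streamed in")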
Posted on Reply
#153
lexluthermiester
efikkan: If you increase model/texture details, the others go up too.
Not always, and even when it does happen, it's not always symmetrical.
Posted on Reply
#154
medi01
Doom is the example of "textures do not fit": 2080 performance is drastically affected when increasing texture quality so that it no longer fits into 8 GB (but fits into the 3080's 10 GB, thus making it look better).
Posted on Reply
#155
londiste
medi01: Doom is the example of "textures do not fit": 2080 performance is drastically affected when increasing texture quality so that it no longer fits into 8 GB (but fits into the 3080's 10 GB, thus making it look better).
I bet in addition to that Doom is an example of Vulkan memory management peculiarities ;)
Posted on Reply
#156
lexluthermiester
medi01: Doom is the example of "textures do not fit": 2080 performance is drastically affected when increasing texture quality so that it no longer fits into 8 GB (but fits into the 3080's 10 GB, thus making it look better).
And there are a great many other games that fall into that category.
londiste: I bet in addition to that Doom is an example of Vulkan memory management peculiarities ;)
Not really.
Posted on Reply
#157
John Naylor
BArms: Maybe they should launch the original 3080 first? I really don't consider meeting less than 1% of the demand a real launch.
Nvidia was unable to meet demand for the 2080 Ti right up to the 3080 launch, and this is by no means an unusual thing ... part of it is the normal production-line ramp-up issues, but most of it is opportunists looking to resell.
btarunr: The logical next step to DirectStorage and RTX-IO is graphics cards with resident non-volatile storage (i.e. SSDs on the card). I think there have already been some professional cards with onboard storage (though not leveraging tech like DirectStorage). You install optimized games directly onto this resident storage, and the GPU has its own downstream PCIe root complex that deals with it.
See the "Play Crysis on 3090" article:
www.techpowerup.com/273007/crysis-3-installed-on-and-run-directly-from-rtx-3090-24-gb-gddr6x-vram
nikoya: They should release the 3080 20 GB ASAP.
I'd wait till we see the performance comparisons ... not what MSI Afterburner or anything else says is being used (allocated) ... I want to see how fps is impacted. So far with the 6xx, 7xx, and 9xx series, there have been few observable impacts between cards with different RAM amounts when performance was above 30 fps. We have seen an effect with poor console ports, and we've seen illogical effects where the percentage difference was larger at 1080p than at 1440p ... which makes no sense. But no comparison I have read has been able to show a significant fps impact.

In instances where a card comes in an X RAM version and a 2X RAM version, we have seen various utilities say that VRAM between 1X and 2X is being used (allocated), but we have not seen a significant difference in fps when fps > 30. We have even seen games refuse to install on the X RAM card, but after installing the game with the 2X RAM card, it plays at the same fps with no impact on the user experience when you switch back to the X RAM version afterwards.

GTX 770 4GB vs 2GB Showdown - Page 3 of 4 - AlienBabelTech
Valantar: Why on earth would a 60-tier GPU in 2020 need 16GB of VRAM? Even 8GB is plenty for the types of games and settings that GPU will be capable of handling for its useful lifetime. Shader and RT performance will become a bottleneck at that tier long before 8GB of VRAM does. This is no 1060 3GB.
The 1060 3GB was purposely gimped with 11% fewer shaders because, otherwise, it would have shown no impact from having half the VRAM. VRAM usage increases with resolution, so the 6% fps advantage that the 6GB card with 11% more shaders had should have widened significantly at 1440p. It did not, thereby proving the VRAM difference was of no consequence. Of the two games that had more significant impacts, in one of them the gap between the 3 GB and 6 GB cards was actually smaller at 1440p than at 1080p ... clearly this defiance of logic points to other factors being at play in that instance. The 1060 was a 1080p card ... if 3 GB was inadequate at 1080p, then it should have been a complete washout at 1440p, and it wasn't ....

tpucdn.com/review/msi-gtx-1060-gaming-x-3-gb/images/perfrel_1920_1080.png
tpucdn.com/review/msi-gtx-1060-gaming-x-3-gb/images/perfrel_2560_1440.png
ppn: 3060 6GB, 3840 CUDA cores.
3060 8GB, 4864 CUDA cores.

There you have it, the 8GB version is 30% faster.
They have to cut the GPU down, otherwise it will be obvious that the VRAM has no impact on fps when within the playable range.
nguyen: At 4K, FS2020 will try to eat as much VRAM as possible (>11 GB), but that doesn't translate to real performance.
VRAM allocation is much different from VRAM usage (a sketch of where those reported numbers come from follows at the end of this post).

Video Card Performance: 2GB vs 4GB Memory - Puget Custom Computers
GTX 770 4GB vs 2GB Showdown - Page 3 of 4 - AlienBabelTech
Is 4GB of VRAM enough? AMD’s Fury X faces off with Nvidia’s GTX 980 Ti, Titan X | ExtremeTech

From 2nd link
"There is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it claims to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s. "
Posted on Reply