
NVIDIA Readies RTX 3060 8GB and RTX 3080 20GB Models

Joined
May 12, 2016
Messages
259 (0.09/day)
Processor Intel Core i7 11700
Motherboard Asus b560-i ROG
Cooling Thermalright Assassin King Mini
Memory G.Skill Trident Z 3600
Video Card(s) RTX 3080 FE
Display(s) Dell S2721DGF
Case Ncase M1
Power Supply Corsair SF750
Mouse HyperX
Keyboard HyperX
Joined
Jun 10, 2014
Messages
2,902 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Maybe I'm wrong, and yes, it's about gut feeling more than anything, because really that's all we have looking forward in time. GPU history is not without design and balancing failures; we all know this.
Remember that the RTX 3060/3070/3080 will be Nvidia's most important models for the next two years or so. Nvidia also has a very good idea of the upcoming top games, since it has access to many of them during development. There is no one in the world more capable of predicting what will be needed in the next few years, but that doesn't make them immune to stupid business decisions, of course.

In the end, the success or failure of advances in GPUs will come down to how devs (get to) utilize them: whether or not they understand them, and whether or not they're workable. I'm seeing the weight move from tried and trusted VRAM capacity to different areas, and not all devs are going to keep up. In that sense it's sort of a good sign that Nvidia's approach for Ampere is trying to mimic next-gen console features, but still, this is going to be maintenance intensive for sure.
This simply comes down to the fact that there are very few moves by developers that would increase VRAM utilization without also increasing bandwidth and computational requirements just as much. If you increase model/texture detail, the others go up too. Increasing frame rate increases the others but not the capacity needs. Basically, the main things that could increase VRAM requirements disproportionately to everything else are statically allocating more resources with no resource streaming at all, or someone coming up with a very different rendering technique.
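To put rough numbers on that coupling, here's a minimal Python sketch; the format, mip overhead, and per-frame texture counts are assumptions for illustration, not measurements. The point is that raising texture detail raises footprint and bus traffic by the same factor.

# Back-of-the-envelope: why texture detail raises capacity AND bandwidth together.
# All figures are illustrative assumptions, not measurements.

def texture_bytes(width, height, bytes_per_texel=4, mip_chain=True):
    # Approximate VRAM footprint of one RGBA8 texture; a full mip chain adds ~33%.
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mip_chain else base

for res in (2048, 4096, 8192):
    size_mb = texture_bytes(res, res) / 2**20
    print(f"{res}x{res} texture: ~{size_mb:.0f} MB in VRAM with mips")

# If a frame samples the equivalent of a few hundred such textures, the bytes
# crossing the memory bus scale by the same factor as the footprint, so a 4x
# capacity jump also means roughly 4x more bandwidth demand.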
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Doom is the example of "textures do not fit":
2080 performance is drastically affected when increasing texture quality so that it doesn't fit into 8GB (but fits into the 3080's 10GB, thus making it look better).
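For anyone wondering why the drop is so steep once textures stop fitting: whatever spills past the frame buffer has to come over PCIe, which is an order of magnitude slower than GDDR6X. A minimal sketch with assumed bandwidth figures (rough public numbers, used only for illustration):

# Toy model of what happens when the working set exceeds VRAM.
VRAM_BW_GBPS  = 760   # ~RTX 3080 GDDR6X (assumed)
PCIE4_BW_GBPS = 32    # PCIe 4.0 x16, one direction (assumed)

def memory_time_ms(working_set_gb, vram_gb, touched_per_frame_gb):
    # Fraction of the touched data that has to come from system RAM each frame.
    spill = max(0.0, (working_set_gb - vram_gb) / working_set_gb)
    local_gb  = touched_per_frame_gb * (1 - spill)
    remote_gb = touched_per_frame_gb * spill
    return (local_gb / VRAM_BW_GBPS + remote_gb / PCIE4_BW_GBPS) * 1000

print(f"9 GB set on a 10 GB card: {memory_time_ms(9, 10, 2):.1f} ms of memory traffic per frame")
print(f"9 GB set on an 8 GB card: {memory_time_ms(9, 8, 2):.1f} ms of memory traffic per frame")

Even a small spill dominates the frame time, which is why the fps cliff shows up as soon as the texture pool no longer fits.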
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Doom is the example of "textures do not fit":
2080 performance is drastically affected when increasing texture quality so that it doesn't fit into 8GB (but fits into the 3080's 10GB, thus making it look better).
I bet in addition to that Doom is an example of Vulkan memory management peculiarities ;)
 
Joined
Jul 5, 2013
Messages
25,559 (6.47/day)
Doom is the example of "textures do not fit":
2080 performance is drastically affected when increasing texture quality so that it doesn't fit into 8GB (but fits into the 3080's 10GB, thus making it look better).
And there are a great many other games that fit into that same category.

I bet in addition to that Doom is an example of Vulkan memory management peculiarities ;)
Not really.
 
Joined
Mar 18, 2015
Messages
2,960 (0.89/day)
Location
Long Island
Maybe they should launch the original 3080 first? I really don't consider meeting less than 1% of the demand a real launch.

Nvidia was unable to meet demand for the 2080 Ti right up to the 3080 launch, and that is by no means an unusual thing ... part of this is the normal production-line ramp-up issues, but most of it is the opportunists looking to resell.

The logical next step to DirectStorage and RTX IO is graphics cards with resident non-volatile storage (i.e. SSDs on the card). I think there have already been some professional cards with onboard storage (though not leveraging tech like DirectStorage). You install optimized games directly onto this resident storage, and the GPU has its own downstream PCIe root complex that deals with it.
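A rough sketch of why pulling storage and decompression closer to the GPU is attractive; this is only a toy timing model with assumed throughput numbers, not any real DirectStorage/RTX IO API, and an on-card SSD would shrink the transfer step even further.

# Toy timing model: classic asset-load path vs. a GPU-direct, DirectStorage-style path.
# All throughput figures are assumptions for illustration only.
NVME_GBPS        = 7.0   # PCIe 4.0 NVMe read
CPU_INFLATE_GBPS = 1.5   # CPU decompression (assumed)
PCIE_GBPS        = 16.0  # effective host-to-VRAM copy (assumed)
GPU_INFLATE_GBPS = 20.0  # GPU decompression (assumed)

def classic_path_s(compressed_gb, ratio=2.0):
    raw = compressed_gb * ratio
    # disk -> system RAM, decompress on the CPU, copy raw data to VRAM
    return compressed_gb / NVME_GBPS + raw / CPU_INFLATE_GBPS + raw / PCIE_GBPS

def gpu_direct_path_s(compressed_gb, ratio=2.0):
    # disk -> VRAM while still compressed, then inflate on the GPU
    return compressed_gb / NVME_GBPS + compressed_gb / PCIE_GBPS + compressed_gb * ratio / GPU_INFLATE_GBPS

print(f"classic path:    {classic_path_s(4.0):.1f} s for 4 GB of compressed assets")
print(f"GPU-direct path: {gpu_direct_path_s(4.0):.1f} s for 4 GB of compressed assets")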

See the Play Crysis on 3090 article.

They should release the 3080 20GB ASAP.

I'd wait till we see the performance comparisons ... not what MSI Afterburner or anything else says is being used (allocated, really) ... I wanna see how fps is impacted. So far with the 6xx, 7xx, and 9xx series, there have been few observable impacts between cards with different RAM amounts when performance was above 30 fps. We have seen an effect with poor console ports, and we have seen illogical effects where the % difference was larger at 1080p than at 1440p ... which makes no sense. But every comparison I have read has not been able to show a significant fps impact.

In instances where a card comes in an X RAM version and a 2X RAM version, we have seen various utilities say that VRAM between 1X and 2X is being used (allocated), but we have not seen a significant difference in fps when fps > 30. We have even seen games refuse to install on the X RAM card, but after installing the game with the 2X RAM card, it plays at the same fps with no impact on the user experience when you switch back to the X RAM version afterwards.

GTX 770 4GB vs 2GB Showdown - Page 3 of 4 - AlienBabelTech

Why on earth would a 60-tier GPU in 2020 need 16GB of VRAM? Even 8GB is plenty for the types of games and settings that GPU will be capable of handling for its useful lifetime. Shader and RT performance will become a bottleneck at that tier long before 8GB of VRAM does. This is no 1060 3GB.

The 1060 3GB was purposely gimped with 11% fewer shaders because, otherwise, it showed no impact from having half the VRAM. VRAM usage increases with resolution, so the 6% fps advantage that the 6GB card with 11% more shaders had should have widened significantly at 1440p. It did not, thereby proving that the capacity was of no consequence. Of the 2 games that showed more significant impacts, in one of them the gap between the 3 GB and 6 GB cards was actually smaller at 1440p than at 1080p ... clearly this defiance of logic points to other factors being at play in that instance. The 1060 was a 1080p card ... if 3 GB was inadequate at 1080p, then it should have been a complete washout at 1440p, and it wasn't ...
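To put the same argument in numbers (the fps figures below are assumptions for illustration, not the review data): if the gap between the two cards stays around the shader difference instead of blowing up at 1440p, the shaders, not the capacity, are the limiter.

# GTX 1060 6GB vs 3GB: the 6GB part also has more shaders, which muddies VRAM comparisons.
shaders_6gb, shaders_3gb = 1280, 1152
print(f"shader-count advantage of the 6GB card: {shaders_6gb / shaders_3gb - 1:.1%}")  # ~11%

# Assumed (illustrative) results; a VRAM wall would make the 1440p gap explode instead.
fps = {"1080p": {"6GB": 68, "3GB": 64}, "1440p": {"6GB": 47, "3GB": 44}}
for res, f in fps.items():
    gap = f["6GB"] / f["3GB"] - 1
    print(f"{res}: {gap:.1%} gap -> tracks the shader deficit, not a capacity cliff")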


3060 6GB: 3,840 CUDA cores
3060 8GB: 4,864 CUDA cores

There you have it; the 8GB version has ~27% more CUDA cores, so it will be correspondingly faster.

They have to cut the GPU down, otherwise it will be obvious that the VRAM has no impact on fps when within the playable range.
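Quick sanity check on those core counts, assuming performance scales roughly linearly with CUDA cores at similar clocks:

cores_8gb, cores_6gb = 4864, 3840   # rumored RTX 3060 configurations
print(f"core advantage of the 8GB SKU: {cores_8gb / cores_6gb - 1:.0%}")  # ~27%
# A gap of that size is explained by the cut-down GPU, not by the extra 2 GB of VRAM.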

At 4K, FS2020 will try to eat as much VRAM as possible (>11 GB), but that doesn't translate into real performance.

VRAM allocation is much different from VRAM usage.
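A minimal sketch of the difference, using a plain host-side pool as a stand-in (the class and sizes are made up for illustration): an engine that reserves a big pool up front looks "full" to a monitoring overlay even though the frame only touches a fraction of it, and the same pattern applies to VRAM pools.

# Allocation vs. usage: overlays report the reservation, not what a frame actually touches.
class TexturePool:
    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb   # what a VRAM monitor would report as "used"
        self.resident = {}               # what the frame actually samples from

    def upload(self, name, size_mb):
        if sum(self.resident.values()) + size_mb > self.capacity_mb:
            raise MemoryError("pool exhausted")
        self.resident[name] = size_mb

    def report(self):
        touched = sum(self.resident.values())
        print(f"allocated: {self.capacity_mb} MB, actually used: {touched} MB")

pool = TexturePool(capacity_mb=8000)     # "uses 8 GB" according to the overlay
pool.upload("terrain_albedo", 340)
pool.upload("character_set", 512)
pool.report()                            # allocated: 8000 MB, actually used: 852 MB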

Video Card Performance: 2GB vs 4GB Memory - Puget Custom Computers
GTX 770 4GB vs 2GB Showdown - Page 3 of 4 - AlienBabelTech
Is 4GB of VRAM enough? AMD’s Fury X faces off with Nvidia’s GTX 980 Ti, Titan X | ExtremeTech

From the 2nd link:
"There is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it claims to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s. "
 