Tuesday, December 25th 2018

NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type
NVIDIA drew consumer ire for differentiating its GeForce GTX 1060 into two variants based on memory, the GTX 1060 3 GB and GTX 1060 6 GB, with the two also featuring different GPU core-configurations. The company plans to double down - or should we say, triple down - on its sub-branding shenanigans with the upcoming GeForce RTX 2060. According to VideoCardz, citing a GIGABYTE leak about regulatory filings, NVIDIA could be carving out not two, but six variants of the RTX 2060!
There are at least two parameters that differentiate the six (that we know of anyway): memory size and memory type. There are three memory sizes, 3 GB, 4 GB, and 6 GB. Each of the three memory sizes comes in two memory types, the latest GDDR6 and the older GDDR5. Based on the six RTX 2060 variants, GIGABYTE could launch up to thirty-nine SKUs. When you add up similar SKU counts from NVIDIA's other AIC partners, there could be upward of 300 RTX 2060 graphics card models to choose from. It won't surprise us if, in addition to memory size and type, GPU core-configurations also vary between the six RTX 2060 variants, compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019.
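The variant math is simple combinatorics: three memory sizes times two memory types gives the six SKU families, and doubling that for the "A" and "non-A" ASIC classes yields the twelve possible device IDs. A minimal sketch of that enumeration (the naming scheme is purely illustrative, not confirmed by the leak):

```python
from itertools import product

memory_sizes = ["3 GB", "4 GB", "6 GB"]   # three capacities from the leak
memory_types = ["GDDR6", "GDDR5"]         # two memory types per capacity
asic_classes = ["A", "non-A"]             # existing TU106 ASIC binning

# 3 sizes x 2 types = 6 RTX 2060 variants
variants = list(product(memory_sizes, memory_types))
print(f"{len(variants)} memory variants")

# 6 variants x 2 ASIC classes = up to 12 distinct device IDs
device_ids = list(product(variants, asic_classes))
print(f"up to {len(device_ids)} device IDs")

for (size, mem_type), asic in device_ids:
    print(f"RTX 2060 {size} {mem_type} ({asic} chip)")
```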
Source:
VideoCardz
230 Comments on NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type
Thanks though. :)
I have met people who won't play a game if they can't max it out, and also people who won't play a game since they will be getting an upgrade soon. I haven't met a person who looks at system requirements, unironically, since 2011. Majority rule is not a valid argument. It never has been, but if you want to play with that - consoles as a whole (not individual platforms) are bigger than PC Gaming. RTS is extremely demanding, what you are looking for seems to be turn-based strategy or tactics. Those you can tolerate at 30 fps.
PCs are not some ephemeral platform that is so different from consoles. The requirements in games do not mean much, that is the point. BTW, Wolfenstein: The New Order used to be playable (before a game update broke its wobbly engine) at 60 fps, 1080p on a 6850. Even using your wacky ideas for what system requirements should mean, this is illogical.
3GB is way too little. 4GB is pathetic too. It is not a good idea at all. VRAM matters, modding matters, the 1160/2060 is fast enough for it to matter even for people who misunderstand how VRAM works.
BUT WHY?! Why can't there just be one or two models?
Nothing new here, apart from the ever increasing complexity when buying one of these GPUs. The takeaway is the same: when you're shopping in this territory, you will always be making tradeoffs from the moment you purchase the card. And those tradeoffs can range well into uncomfortable territory for even casual gamers. As for the rest of the four-page discussion about quality settings... not very relevant I'd say.
For a meaty discussion on 3GB vs more GB, we have live examples of the 780, 780 Ti and 1060 3GB already running into those limits at settings these GPUs can push comfortably. That says enough, because the 2060 will be faster on the core. Balance is meh, and 4GB and up is recommended. It's not even relevant what the specs on the game's box say in that regard, it's about how a GPU is balanced.
The key element is price, and knowing Nvidia, they will price it far too high. So, TL;DR, these cards are going to be eclipsed by a more power hungry AMD alternative. /thread and Happy New Year :toast:
I base my conclusions on facts, and the fact is that GTX 1060 3GB is still a good option for many buyers.
And as I've said many times already, just because some need more memory doesn't mean everyone needs it.
Modding and high-res texture packs is a niche thing - an edge case, if you're one of those who do it, buy a card with more memory, it's as simple as that.
Different GPUs have different levels of compression, and different ways of allocating and managing memory.
It's not like there aren't any alternatives at the same price point. Back when the 780 (Ti) and 7970 were hot, that was different. Today, it's silly to pick one. It's simply not the preferable choice, especially given the often tiny price gap for something better.
Comparing Kepler directly to Turing is not fair, newer GPU architectures are more efficient.
What the hell were they thinking? And 3 GB of VRAM for a midrange card is an insult.
Hopefully for future buyers, it might be fake...
If that is something you feel comfy spending ~200 bucks on, by all means. I'd suggest spending 220~240 to get double the VRAM and more consistency alongside higher performance. These 2060s are going to present a choice along those lines, and we all know they will, so let's stop fooling each other. This is a typical penny-wise / pound-stupid trade-off.
That extra 1GB of memory makes a noticeable difference.
"Different GPUs have different levels of compression"
That's mostly about memory bandwidth and not the frame buffer.
You can't use better textures on a 3GB card vs another 3GB card and say "my GPU has better memory compression".
In real-world scenarios it doesn't work like that.
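To put a rough number on that: an asset's VRAM footprint is set by its storage format (block-compressed BC formats at roughly 0.5-1 byte per texel versus 4 bytes for uncompressed RGBA8), and that footprint is identical on any 3 GB card regardless of the GPU's lossless delta-color compression, which only saves bandwidth. A back-of-the-envelope sketch, with the texture sizes and counts purely hypothetical:

```python
def texture_vram_mb(width, height, bytes_per_texel, mip_chain=True):
    """Approximate VRAM footprint of one texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mip_chain else base
    return total / (1024 ** 2)

# 2048x2048 textures: BC7 stores 1 byte/texel, uncompressed RGBA8 stores 4 bytes/texel
bc7   = texture_vram_mb(2048, 2048, 1.0)   # ~5.3 MB
rgba8 = texture_vram_mb(2048, 2048, 4.0)   # ~21.3 MB

# A hypothetical scene streaming 400 such textures
print(f"BC7 set:   {400 * bc7 / 1024:.1f} GB")    # ~2.1 GB - already tight on 3 GB
print(f"RGBA8 set: {400 * rgba8 / 1024:.1f} GB")  # ~8.3 GB - impossible on 3 GB
```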
Why? I already explained it to you: the 32-bit barrier is gone. Games aren't treading lightly with their memory footprint anymore. Xbox One X ports have access to 11 GiB RAM + VRAM. As the next generation of consoles launch, it's going to go even higher. 3 GiB is like a person with a broken leg: they'll live but not well. Games want more memory. Run out and framerate tanks.
Some games will work fine, but most visually high-end titles will struggle.
Battlefield V is already unplayable at 1080p with less than 4 GB of VRAM.
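As a rough illustration of why the framerate tanks once VRAM runs out: anything evicted to system RAM has to come back over PCIe, which is an order of magnitude slower than local GDDR6. The figures below are assumptions (a 192-bit bus at 14 Gbps, as rumored for the 6 GB card, and PCIe 3.0 x16), not confirmed specs:

```python
# Why spilling past VRAM hurts: local memory vs PCIe fallback bandwidth.
gddr6_gbps_per_pin = 14
bus_width_bits = 192
vram_bandwidth = gddr6_gbps_per_pin * bus_width_bits / 8   # ~336 GB/s

pcie3_x16_bandwidth = 15.75                                # GB/s, theoretical per direction

print(f"VRAM:          {vram_bandwidth:.0f} GB/s")
print(f"PCIe fallback: {pcie3_x16_bandwidth:.1f} GB/s")
print(f"Penalty:       ~{vram_bandwidth / pcie3_x16_bandwidth:.0f}x slower "
      "for assets evicted to system RAM")
```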
But none of these people would actually buy such a card. It's funny: people with 2080s telling other members that "3 GB is fine".
But hey... they are "experienced", they know it all, and the rest of us are ignorant.
P.S. Yes, i am being sarcastic.