Tuesday, December 25th 2018

NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

NVIDIA drew consumer ire for differentiating its GeForce GTX 1060 into two variants based on memory, the GTX 1060 3 GB and GTX 1060 6 GB, with the two also featuring different GPU core-configurations. The company plans to double down - or should we say, triple down - on its sub-branding shenanigans with the upcoming GeForce RTX 2060. According to VideoCardz, citing a GIGABYTE leak about regulatory filings, NVIDIA could be carving out not two, but six variants of the RTX 2060!

There are at least two parameters that differentiate the six (that we know of, anyway): memory size and memory type. There are three memory sizes: 3 GB, 4 GB, and 6 GB. Each of the three memory sizes comes in two memory types, the latest GDDR6 and the older GDDR5. Based on the six RTX 2060 variants, GIGABYTE could launch up to thirty-nine SKUs. When you add up similar SKU counts from NVIDIA's other AIC partners, there could be upward of 300 RTX 2060 graphics card models to choose from. It won't surprise us if, in addition to memory size and type, GPU core-configurations also vary between the six RTX 2060 variants, compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019.
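For those keeping count, the headline figures fall out of simple combinatorics. Below is a minimal sketch assuming only the leaked parameters (three memory sizes, two memory types, and the existing "A"/"non-A" ASIC classes); the 39-SKU and ~300-model counts additionally depend on how many cooler and clock variants each partner builds, which can't be derived from the leak.

```python
# Back-of-the-envelope math for the rumored RTX 2060 line-up.
# Assumption: the leaked parameters hold (3 memory sizes x 2 memory types),
# plus the two existing TU106 ASIC classes ("A" and "non-A").
memory_sizes = ["3 GB", "4 GB", "6 GB"]
memory_types = ["GDDR6", "GDDR5"]
asic_classes = ["A", "non-A"]

variants = [(size, mem_type) for size in memory_sizes for mem_type in memory_types]
print(f"Memory variants: {len(variants)}")                           # 6
print(f"Possible device IDs: {len(variants) * len(asic_classes)}")   # 12
for size, mem_type in variants:
    print(f"  GeForce RTX 2060 {size} {mem_type}")
```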
Source: VideoCardz

230 Comments on NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

#126
lexluthermiester
CandymanGR: I don't think the extra clock can cover a 1/3 bandwidth cut.
It's been done on previous-gen cards from both AMD and NVIDIA. We'll see what happens once the review/information embargoes are lifted.
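For reference, a rough sketch of where that "1/3" figure could come from, assuming a 192-bit bus with 12 Gbps GDDR6 against 8 Gbps GDDR5 (both data rates and the bus width are assumptions; none of these SKUs' specs were confirmed at the time):

```python
# Rough peak-bandwidth comparison for hypothetical RTX 2060 memory configs.
# The 192-bit bus and the 12/8 Gbps data rates are assumptions, not confirmed specs.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * effective data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

gddr6 = peak_bandwidth_gbs(192, 12.0)  # 288 GB/s
gddr5 = peak_bandwidth_gbs(192, 8.0)   # 192 GB/s
print(f"GDDR6: {gddr6:.0f} GB/s, GDDR5: {gddr5:.0f} GB/s")
print(f"GDDR5 bandwidth deficit: {1 - gddr5 / gddr6:.0%}")  # ~33%, i.e. the "1/3 cut"
```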
Posted on Reply
#127
Xzibit
lexluthermiester: Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there are Steam's own stats that show most people turn AA down or off, most of them running at 1080p. Pixel density at 1080p, 1440p and up mostly eliminates the need for AA, as the "pixel laddering" effect isn't noticeable or pronounced like it was at lower resolutions, so AA simply isn't needed. Try it yourself. Turn it off and see if it matters that much to you. Be objective though.
Still can't find it. As for pixel density, the survey doesn't specify what size monitor people use either.
lexluthermiester: I know they've had those stats available. I just can't find it. Maybe they took it down?
EDIT:
There are still these:
store.steampowered.com/hwsurvey
They never have shown AA use in those surveys.
Posted on Reply
#128
EarthDog
lexluthermiester: I know they've had those stats available. I just can't find it. Maybe they took it down?
EDIT:
There are still these:
store.steampowered.com/hwsurvey
I don't ever recall seeing AA stats there. I can't even fathom how they would keep track of something so variable anyway.

Thanks though. :)
Posted on Reply
#129
FordGT90Concept
"I go fast!1!11!1!"
The 6 GiB (11.16%) variant is obviously more popular than the 3 GiB (6.53%) and 5 GiB (1.86%) ones.
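A quick sanity check using just the shares quoted above (treating each percentage as that variant's share of surveyed systems):

```python
# Relative popularity of GTX 1060 variants, using the survey shares quoted above.
shares = {"6 GiB": 11.16, "3 GiB": 6.53, "5 GiB": 1.86}  # percent of surveyed systems
print(f"6 GiB vs 3 GiB: {shares['6 GiB'] / shares['3 GiB']:.2f}x")  # ~1.71x
print(f"6 GiB vs 5 GiB: {shares['6 GiB'] / shares['5 GiB']:.2f}x")  # ~6.00x
```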
Posted on Reply
#130
Charcharo
bug: You do, however, possess an uncanny ability to mix together all sorts of things barely related to the subject.

The simple truth is games happen to be playable at settings other than max. Some will look better, some will look worse, depending on budget and who coded it and made the assets. But I have never met a person who didn't play a game because they couldn't max out shadows or textures. I have met people that delayed playing a game, because they were planning an upgrade and thought they'd make the best of it.
That you can't see the relation is not my problem. It is pretty obvious to me...

I have met people who won't play a game if they can't max it out, and also people who won't play a game because they will be getting an upgrade soon. I haven't met a person who unironically looks at system requirements since 2011.
EarthDog: Oh, I'd bet good money a majority wouldn't consider 30 fps to be playable in most genres/titles. RTS, I can do... FPS... I'd cry and likely get a headache...

Playable and enjoyable I'm kind of using interchangeably. I mean... 15 is playable. The game plays... but as for the experience and it being enjoyable, the majority tend to agree 30 fps isn't for PCs.

But... this is all a bit OT. I'd love a thread to get to the bottom of why 30 fps seems different on a console versus a PC (I know why movies can get away with it).
Majority rule is not a valid argument. It never has been, but if you want to play with that - consoles as a whole (not individual platforms) are bigger than PC gaming. RTS is extremely demanding; what you are looking for seems to be turn-based strategy or tactics. Those you can tolerate at 30 fps.

PCs are not some ephemeral platform that is so different from consoles. The requirements in games do not mean much; that is the point. BTW, Wolfenstein: The New Order used to be playable (before a game update broke its wobbly engine) at 60 fps, 1080p on a 6850. Even using your wacky ideas of what system requirements should mean, this is illogical.


3GB is way too little. 4GB is pathetic too. It is not a good idea at all. VRAM matters, modding matters, and the 1160/2060 is fast enough for it to matter even for people who misunderstand how VRAM works.
Posted on Reply
#131
Batailleuse
M2B: You can easily run almost every single game on a 3GB card at 1080p just fine, but you're gonna lower the texture quality in some games and that's all. The 3GB version is not ideal but is not as useless as some people think; it actually might be a good-value card for those who need a fairly powerful GPU and don't care about maximum-quality textures or VRAM-related settings.
If the 3GB RTX 2060 is going to be priced under $200, I don't see a problem, but anything higher than $200 is unacceptable.
8GB of VRAM on an $800 card (RTX 2080) is more disappointing to me than 3GB on a budget card.
What's the point of buying a 3GB 2060 to run everything on low? You can buy a 4-6GB 1060, or even a 1070, that will do better for cheaper.
Posted on Reply
#132
lexluthermiester
Xzibit: They never have shown AA use in those surveys.
Yes they have. Those kinds of fine-grained details are available to the Steam client and get reported to the Steam telemetry service. The info has been there and available; it just doesn't seem to be there anymore.
Posted on Reply
#133
Ruru
S.T.A.R.S.
ShurikN: 3GB on a mid-range GPU in 2019... I'm at a loss for words.
Still kicking with a GTX 780 (I know, it was high-end back in the day).

BUT WHY?! Why can't there be just one or two models?
Posted on Reply
#134
Vayra86
This just goes to show that NVIDIA still cannibalizes the x50-x60 range in every possible way they can. They've been doing it since the 550 Ti and never stopped. I'm not that familiar with the generations prior to it when it comes to the VRAM trickery, but maybe this was their thing for longer.

Nothing new here, apart from the ever-increasing complexity of buying one of these GPUs. The takeaway is the same: when you're shopping in this territory, you will always be making tradeoffs from the moment you purchase it. And those tradeoffs can range well into uncomfortable territory for even casual gamers. As for the rest of the four-page discussion about quality settings... not very relevant, I'd say.

For a meaty discussion on 3GB vs. more GB, we have live examples of the 780, 780 Ti and 1060 3GB already running into those limits at settings these GPUs can push comfortably. That says enough, because the 2060 will be faster on the core. The balance is meh, and 4GB and up is recommended. It's not even relevant what the specs on the game's box say in that regard; it's about how a GPU is balanced.


The key element is price, and knowing NVIDIA, they will price it far too high, so TL;DR these cards are going to be eclipsed by a more power-hungry AMD alternative. /thread and Happy New Year :toast:
Posted on Reply
#135
goodeedidid
ShurikN: 3GB on a mid-range GPU in 2019... I'm at a loss for words.
Hey, they give you the choice; if you wanna go budget, then do it. Nobody is going to be rocking a 2060 with a 4K screen, so 3GB shouldn't be that bad - for FHD, that is.
Posted on Reply
#136
efikkan
Charcharo: 3GB is way too little. 4GB is pathetic too. It is not a good idea at all. VRAM matters, modding matters, and the 1160/2060 is fast enough for it to matter even for people who misunderstand how VRAM works.
So how do you pick your threshold? Is it an arbitrary number? It's 2018, therefore cards need x amount of memory?
I base my conclusions on facts, and the fact is that GTX 1060 3GB is still a good option for many buyers.

And as I've said many times already, just because some need more memory doesn't mean everyone needs it.
Modding and high-res texture packs are a niche thing - an edge case. If you're one of those who do it, buy a card with more memory; it's as simple as that.
Posted on Reply
#137
FordGT90Concept
"I go fast!1!11!1!"
4 is okay (not so much in the RTX 2060's case because of gimped bandwidth); >4 is preferred.
Posted on Reply
#138
efikkan
FordGT90Concept: 4 is okay (not so much in the RTX 2060's case because of gimped bandwidth); >4 is preferred.
Why is 4GB okay and 3GB not?
Different GPUs have different levels of compression, and different ways of allocating and managing memory.
Posted on Reply
#139
Tsukiyomi91
I would just rather take the full 6GB GDDR6 variant anyway, because any smaller/slower memory type will tank the GPU a lot and more complaints will show up...
Posted on Reply
#140
Vayra86
efikkan: Why is 4GB okay and 3GB not?
Different GPUs have different levels of compression, and different ways of allocating and managing memory.
Call it the value of experience - lacking that, 3GB is a great choice for a midrange GPU. Then you use one, you learn it's not as rosy as the reviews and benches told you, and you make the better choice next time. Or you're that casual a gamer that you never touch a game that hits a limit, or accidentally never use settings that hit a limit. In those edge cases, yes, 3GB is fine.

It's not like there aren't any alternatives at the same price point. Back when the 780 (Ti) and 7970 were hot, that was different. Today, it's silly to pick one. It's simply not the preferable choice, especially given the often tiny price gap for something better.
Posted on Reply
#141
efikkan
Vayra86: Call it the value of experience - lacking that, 3GB is a great choice for a midrange GPU. Then you use one, you learn it's not as rosy as the reviews and benches told you, and you make the better choice next time. Or you're that casual a gamer that you never touch a game that hits a limit, or accidentally never use settings that hit a limit. In those edge cases, yes, 3GB is fine.

It's not like there aren't any alternatives at the same price point. Back when the 780 (Ti) and 7970 were hot, that was different. Today, it's silly to pick one. It's simply not the preferable choice, especially given the often tiny price gap for something better.
In the real world, the GTX 1060 3GB works just fine as an entry mid-range card, and that's what experience tells us.

Comparing Kepler directly to Turing is not fair; newer GPU architectures are more efficient.
Posted on Reply
#142
HZCH
Confusing AF

What the hell were they thinking? And 3GB of VRAM for a midrange card is an insult.

Hopefully for future buyers, it might be fake...
Posted on Reply
#143
Vayra86
efikkan: In the real world, the GTX 1060 3GB works just fine as an entry mid-range card, and that's what experience tells us.

Comparing Kepler directly to Turing is not fair; newer GPU architectures are more efficient.
You said it right. Entry midrange. Bottom barrel. And releasing a similar 3GB 'next gen' means it has dropped lower than that.

If that is something you feel comfy spending ~200 bucks on, by all means. I'd suggest spending 220-240 to get double the VRAM and more consistency alongside higher performance. These 2060s are going to present a choice along those lines - we all know they will - so let's stop fooling each other. This is a typical penny-wise/pound-stupid trade-off.
Posted on Reply
#144
M2B
efikkan: Why is 4GB okay and 3GB not?
Different GPUs have different levels of compression, and different ways of allocating and managing memory.
4GB allows you to use Ultra/High-quality textures in most games, whereas Ultra textures are out of reach on a 3GB card in newer AAA titles. Sometimes you need to put the texture quality on Medium to avoid stuttering and other problems on a 3GB card.
That extra 1GB of memory makes a noticeable difference.

"Different GPUs have different levels of compression"
That's mostly about memory bandwidth and not the frame buffer.
You can't use better textures on a 3GB card vs. another 3GB card and say "my GPU has better memory compression".
In real-world scenarios, it won't work like that.
Posted on Reply
#145
FordGT90Concept
"I go fast!1!11!1!"
efikkan: In the real world, the GTX 1060 3GB works just fine as an entry mid-range card, and that's what experience tells us.

Comparing Kepler directly to Turing is not fair; newer GPU architectures are more efficient.
The 6 GiB version of the GTX 1060 outsold the 3 GiB version 2:1. What does that tell you? Now we're almost 2.5 years later. I wouldn't be surprised if it shifts to 4:1 or even 8:1 with the RTX 2060.

Why? I already explained it to you: the 32-bit barrier is gone. Games aren't treading lightly with their memory footprint anymore. Xbox One X ports have access to 11 GiB of RAM + VRAM. As the next generation of consoles launches, it's going to go even higher. 3 GiB is like a person with a broken leg: they'll live, but not well. Games want more memory. Run out, and the framerate tanks.
Posted on Reply
#146
ShurikN
Buying 3GB in 2019 is buying for yesterday, not tomorrow. There is zero future-proofing in it. That would be fine if it were a $120 card; in reality, it'll be around $300.
Some games will work fine, but most visually high-end titles will struggle.
Battlefield 5 is already unplayable at 1080p with less than 4GB of VRAM.
Posted on Reply
#147
CandymanGR
Yes, but still people here try to persuade the rest of us that... "3GB is just fine". As if we haven't seen for ourselves how far that is from the truth.
But none of these people would actually buy such a card. It's funny, people with 2080s telling other members that "3GB is fine".
But hey... they are "experienced"; they know it all, and the rest of us are ignorant.

P.S. Yes, I am being sarcastic.
Posted on Reply
#148
Tatty_Two
Gone Fishing
FordGT90Concept: The 6 GiB version of the GTX 1060 outsold the 3 GiB version 2:1. What does that tell you? Now we're almost 2.5 years later. I wouldn't be surprised if it shifts to 4:1 or even 8:1 with the RTX 2060.

Why? I already explained it to you: the 32-bit barrier is gone. Games aren't treading lightly with their memory footprint anymore. Xbox One X ports have access to 11 GiB of RAM + VRAM. As the next generation of consoles launches, it's going to go even higher. 3 GiB is like a person with a broken leg: they'll live, but not well. Games want more memory. Run out, and the framerate tanks.
To be fair, that was not just due to the increased memory but possibly also the 128 extra shader units the 6GB version had; it was just a faster card even when not using the additional memory, for some $40 or $50 more... The combination of both just made so much more sense for those who could stretch to the additional cost.
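For context, a rough sketch of that trade-off; the core counts are the GTX 1060's published specs (1,280 vs. 1,152 CUDA cores), and the $249/$199 launch MSRPs are used only as ballpark price figures:

```python
# Rough value comparison of GTX 1060 6GB vs 3GB.
# Core counts are published specs; launch MSRPs are used as ballpark price figures.
cores_6gb, cores_3gb = 1280, 1152
price_6gb, price_3gb = 249, 199

core_gain = cores_6gb / cores_3gb - 1    # ~11% more shader units
price_gain = price_6gb / price_3gb - 1   # ~25% higher launch price
print(f"Extra shaders: {cores_6gb - cores_3gb} ({core_gain:.0%} more)")
print(f"Price premium: ${price_6gb - price_3gb} ({price_gain:.0%} more)")
```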
Posted on Reply
#149
EarthDog
Tatty_One: To be fair, that was not just due to the increased memory but possibly also the 128 extra shader units the 6GB version had; it was just a faster card even when not using the additional memory, for some $40 or $50 more... The combination of both just made so much more sense for those who could stretch to the additional cost.
Indeed... if anyone knew about it. That wasn't exactly advertised much. Prospective buyers would have to know, or look up and compare specs. Most looking at these see "6GB is more than 3GB" and think it's better. You give consumers too much credit. :p
Posted on Reply
#150
bug
This is off-topic (if this thread ever had one), but I was just looking at some ads for laptops. The GPU model wasn't even mentioned; it was "Nvidia graphics with 4GB of RAM". A bit scary, imho.
Posted on Reply