Tuesday, December 25th 2018

NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

NVIDIA drew consumer ire for differentiating its GeForce GTX 1060 into two variants based on memory, the GTX 1060 3 GB and GTX 1060 6 GB, with the two also featuring different GPU core configurations. The company plans to double down, or should we say triple down, on its sub-branding shenanigans with the upcoming GeForce RTX 2060. According to VideoCardz, citing a GIGABYTE leak about regulatory filings, NVIDIA could be carving out not two, but six variants of the RTX 2060!

There are at least two parameters that differentiate the six (that we know of, anyway): memory size and memory type. There are three memory sizes, 3 GB, 4 GB, and 6 GB. Each of the three memory sizes comes in two memory types, the latest GDDR6 and the older GDDR5. Based on the six RTX 2060 variants, GIGABYTE could launch up to thirty-nine SKUs. When you add up similar SKU counts from NVIDIA's other AIC partners, there could be upward of 300 RTX 2060 graphics card models to choose from. It won't surprise us if, in addition to memory size and type, GPU core configurations also vary between the six RTX 2060 variants, compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019.
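For the combinatorially minded, the rumored matrix works out as follows (a quick sketch in Python; the labels are illustrative, not actual NVIDIA device IDs):

```python
# Hypothetical enumeration of the rumored RTX 2060 lineup; labels are
# illustrative, not NVIDIA's actual device IDs.
from itertools import product

sizes = ["3 GB", "4 GB", "6 GB"]
mem_types = ["GDDR6", "GDDR5"]
asic_classes = ["A", "non-A"]

variants = list(product(sizes, mem_types))                  # 3 x 2 = 6 memory variants
device_ids = list(product(sizes, mem_types, asic_classes))  # 6 x 2 = 12 possible device IDs

print(len(variants), len(device_ids))  # -> 6 12
```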
Source: VideoCardz

230 Comments on NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

#151
phill
Has Nvidia gone mad?? So much effort for a middle-of-the-road card; I just don't get it...
Posted on Reply
#152
micropage7
Sometimes I feel like they're poisoning the market. I mean, they create small gaps to push consumers to add a little more money to get the higher model.
Posted on Reply
#153
bug
micropage7Sometimes I feel like they're poisoning the market. I mean, they create small gaps to push consumers to add a little more money to get the higher model.
Yeah, I hate them for that, too. I mean, before Nvidia, this tactic was almost unheard of. Oh, wait!
Posted on Reply
#154
Space Lynx
Astronaut
Can't wait for 7nm AMD GPUs to make Nvidia cry. Sick of this crap.
Posted on Reply
#155
Vayra86
micropage7Sometimes I feel like they're poisoning the market. I mean, they create small gaps to push consumers to add a little more money to get the higher model.
You can go to a random bar in town and find the same trickery in a simple booze price list. Or even a supermarket, where all that matters is where the product is positioned: at eye level or far below.
Posted on Reply
#156
efikkan
FordGT90ConceptThe 6 GiB version of the GTX 1060 outsold the 3 GiB version 2:1. What does that tell you?
That means Nvidia sold more GTX 1060 3GB cards than AMD did of RX 400/500 combined (according to the Steam survey), so that tells me more people want this card than Polaris.

And don't forget the 6 GB version of the GTX 1060 is about 5-6% faster too.
FordGT90ConceptWhy? I already explained it to you: the 32-bit barrier is gone.
Stop with this nonsense. 32-bit CPUs/OSes have NOTHING to do with memory capacity.
ShurikNBuying 3GB in 2019 is buying for yesterday, not tomorrow. There is zero future-proofing with it. Which would be fine if it were a $120 card. In reality it'll be around $300.

Some games will work fine, but most visually high-end titles will struggle.

Battlefield 5 is already unplayable at 1080p with less than 4GB VRAM.
Aah, the eternal "future-proofing" argument.
I remember all those who bought GCN over Kepler because it was more "future-proof" in Direct3D 12. Then the R9 390(X) with 8 GB for "future-proofing". And then Fiji with HBM for "future-proofing", but then suddenly memory capacity didn't matter any more, because HBM was so glorious. Then Polaris with 8 GB for "future-proofing", because memory capacity suddenly mattered again.

In the real world it's a balancing act. You'll have to guess your requirements for the immediate future. But taking "future-proofing" too far is going to be wasted money in the end. History has proven that paying extra for a lot of "future-proofing" has never paid off.
lynx29Can't wait for 7nm AMD GPUs to make Nvidia cry. Sick of this crap.
Then prepare yourself for disappointment.
Posted on Reply
#157
Space Lynx
Astronaut
efikkanThen prepare yourself for disappointment.
Sure thing, buddy.
Posted on Reply
#158
Casecutter
Just to ask: with a 192-bit bus, how would they access 4 GB with only three lanes?
Posted on Reply
#159
efikkan
CasecutterJust to ask: with a 192-bit bus, how would they access 4 GB with only three lanes?
Two options that I know of:
- Disable one memory controller and run a 128-bit bus, possibly compensating with faster memory.
- Use an imbalanced memory configuration, like the GTX 660/660 Ti.
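For a rough sense of how the first option plays out, here's a back-of-the-envelope sketch (assuming typical data rates of 8 Gbps for GDDR5 and 14 Gbps for GDDR6; these are not confirmed RTX 2060 specs):

```python
# Peak memory bandwidth = bus width (in bytes) x per-pin data rate.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(192, 8.0))   # 192-bit GDDR5 -> 192.0 GB/s
print(bandwidth_gb_s(128, 8.0))   # 128-bit GDDR5 -> 128.0 GB/s
print(bandwidth_gb_s(128, 14.0))  # 128-bit GDDR6 -> 224.0 GB/s
```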
Posted on Reply
#160
Unregistered
LOL at 3GB...

*Looks through closet for old 3GB 7970... yup... launched in 2012...*

I don't get it, but OK... guess the 2060 is the "Sucker's Edition"...
Posted on Reply
#161
bug
lynx29Can't wait for 7nm AMD GPUs to make Nvidia cry. Sick of this crap.
Yeah, people have been waiting for that since AMD's nomenclature used four digits.

Edit: Oops, I forgot about the 285.
Posted on Reply
#162
Vayra86
efikkanThat means Nvidia sold more GTX 1060 3GB cards than AMD did of RX 400/500 combined (according to the Steam survey), so that tells me more people want this card than Polaris.

And don't forget the 6 GB version of the GTX 1060 is about 5-6% faster too.


Stop with this nonsense. 32-bit CPUs/OSes have NOTHING to do with memory capacity.


Aah, the eternal "future-proofing" argument.
I remember all those who bought GCN over Kepler because it was more "future-proof" in Direct3D 12. Then the R9 390(X) with 8 GB for "future-proofing". And then Fiji with HBM for "future-proofing", but then suddenly memory capacity didn't matter any more, because HBM was so glorious. Then Polaris with 8 GB for "future-proofing", because memory capacity suddenly mattered again.

In the real world it's a balancing act. You'll have to guess your requirements for the immediate future. But taking "future-proofing" too far is going to be wasted money in the end. History has proven that paying extra for a lot of "future-proofing" has never paid off.


Then prepare yourself for disappointment.
You can believe whatever you want to believe, and if you think 3GB GPUs are the shit in 2019, more power to you. The sales records show a different picture, with higher-capacity cards vastly outselling 3GB models even in the midrange. Compare that to the Kepler days, when standard high-end VRAM was 2GB. Nobody in their right mind bought the 4GB 670 or 680, simply because games could barely even allocate more than 1.5GB. Today they allocate 6GB and up without issues. Game development has changed a bit. There is a reason the 970 was released with 4GB when Maxwell popped up (up from 3GB on the similarly powerful GTX 780 Ti), and that amount doubled for the next-gen equivalent, the GTX 1070. It's clear as day the balance has completely shifted towards the console norm in terms of VRAM. There is a reason even the midrange RX 480 comes in 8GB flavors too.

Explain this: how does a 5-6% performance gap translate to half or a third less VRAM? Where is the balance in that? And why would you *not* suffer a performance hit from such a cut-down when you push data over the same, rather narrow bus?

Common sense: use it, instead of gazing endlessly at performance summaries that reduce all detail to a single percentage and are rarely based on a fully comprehensive benchmark suite. Reviews are an indicator, not an absolute, all-encompassing truth. People apparently still didn't get that memo. It's the exception that proves the rule when it comes to VRAM, and you only need one edge case to kill the experience.
Posted on Reply
#163
Gasaraki
M2BYou can easily run almost every single game on a 3GB card at 1080p just fine, but you're going to lower the texture quality in some games, and that's all. The 3GB version is not ideal, but it's not as useless as some people think; it might actually be a good value card for those needing a fairly powerful GPU who don't care about maximum-quality textures or VRAM-related settings.
If the 3GB RTX 2060 is going to be priced under $200, I don't see a problem, but anything higher than $200 is unacceptable.
8GB of VRAM on an $800 card (RTX 2080) is more disappointing to me than 3GB on a budget card.
OMG, I NEEDS 12GB of RAM!!!

The truth is that if most people game at 1080p (which they do), 3GB of VRAM should be enough for the vast majority of games. I game at 3440x1440 and most games don't break 4GB of VRAM.
Posted on Reply
#164
Tatty_Two
Gone Fishing
GasarakiOMG, I NEEDS 12GB of RAM!!!

The truth is that if most people game at 1080p (which they do), 3GB of VRAM should be enough for the vast majority of games. I game at 3440x1440 and most games don't break 4GB of VRAM.
I struggle to comprehend what games you are playing, then. I moved from a 4GB 290X to a 1070 a few months back. I only play one game, World of Tanks, and when they completely updated their game engine, the 290X on ultra settings went overnight from an average of 2.9GB usage to 4.4GB, and that game is hardly demanding even on ultra at my 2560x1080.
Posted on Reply
#165
cyneater
I love how everyone is triggered by the 3, 4, and 6GB variants.

What about the 11GB flagship... it's the same as the last generation... :P And it's an odd number.
12GB would be better, or 16GB :P

Since there is no competition, Nvidia could stick their logo on a turd and market it.

Where is the blast processing?
Posted on Reply
#166
kanecvr
NxodusAMD = overheating, power hungry, unstable, bad software, no innovation BUT IT'S CHEAPER!!
NVIDIA = expensive

I really don't get the NVIDIA haters, AMD is pure shit, it's the Walmart variant of video cards
You're kidding, right? The latest Nvidia software is garbage. AMD Adrenalin has better features, includes out-of-the-box OC and fan profiles, runs and feels smoother, is less taxing than GeForce Experience and launches faster, doesn't nag you to create an account to use some features, and it's been bug-free for years, as opposed to Nvidia's drivers, which had versions with critical bugs like "forgetting" to spin up your fans under load, mismanaging GPU voltage, failing to install over stock signed Windows drivers, and so on.

And don't get me started on overheating. My 1080 FE would quickly go to 83C and throttle down to 1300MHz, causing it to perform WORSE than my old 1070. I took the FE card back and got an MSI card, which did pretty much the same thing. I had to buy and install a $100 cooler to get the card to stop throttling. Same for power usage: the 180W TDP on the 1080 is pure fiction. Under load my 1080 draws 200-240W on its own (tested with an ammeter on the second 12V rail of the PSU, which only the video card uses: 20 amps x 12V = 240W). I thought the card would draw 180W if not allowed to boost, but at stock 1530MHz it draws almost 16 amps: 12V x 16A = 192W. If you're referring to the 1060, then yeah, those are cool cards: 70-75C even with cheap, crap coolers, and they are fast cards. They're OK for 1080p, but that's it. The 580 can do a lot better, especially overclocked versions like the XFX 580 GTS OC Black Edition (that card does get pretty freakin' hot, though).

As for instability: you've never used an AMD card, right? I've had a 7950, then two, then a 280X, then bought a second one, and then a 290 (non-X); they were all rock-solid. I also played around with a Vega 64, and while the max FPS is not as high as on a 1080 (in some games), the minimum FPS and frame times are miles better on the Vega, so much so that most games are noticeably smoother, even though the framerate is a tiny bit lower. I've been trying to trade my 1080 for a Vega 64, but guess what: nobody wants to take the trade! The only reason I switched to Nvidia is the mining boom, which made AMD cards climb in price to a silly degree. A Vega 64 was twice the price I paid for my 1080, so I said screw that and bought what made sense at the time.

You are DEFINITELY RIGHT on the innovation part, though... AMD needs to get off their asses and release something competitive, not that 590 (i.e. overclocked 580) BULLSHIT. And this goes for both AMD and Nvidia fans: left to their own devices, Nvidia will end up charging $5000 for a high-end GPU.
lexluthermiesterFor 1080p screens (which most gamers are still using), 3GB is still reasonable.


That will depend on the settings level.
The 470 and 570 are great little cards, and so is the 580. I picked up a 4GB 580 in November for $150 for my living room PC (i5 2500K @ 4GHz, mATX form factor) and it runs 1080p @ ultra flawlessly. I even play some games in 4K (less demanding ones like Civ 6 and some oldies). For that price Nvidia was offering a 1050 Ti, which is significantly slower.
Posted on Reply
#167
CandymanGR
efikkanStop with this nonsense. 32-bit CPUs/OSes have NOTHING to do with memory capacity.
A 32-bit CPU cannot address a 64-bit memory space, and you need 64-bit addressing in order to access more than 4 gigabytes of memory.
There is NO way for memory above 4GB to be addressable by a 32-bit CPU, not even with virtual memory paging. At least not on an x86 CPU.
efikkanTwo options that I know of:
- Disable one memory controller and run a 128-bit bus, possibly compensating with faster memory.
- Use an imbalanced memory configuration, like the GTX 660/660 Ti.
There is NO way to cover the deficit of cutting a third of the memory bus with higher clocks, because GDDR5 memory has limitations on the speeds it can achieve. That's why I think they are using GDDR6 for the same model as well. Because IT MATTERS for this generation, even for the middle range. Probably the GDDR6 model will be much faster, and maybe with a slightly different core config. I guess Nvidia knows GDDR5 is not as fast as the core needs its memory to be, but they don't give a damn. Milking the cow is the way for them. You would need a 50% increase in memory speed to cover the deficit.
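The arithmetic behind that figure, as a quick sketch (the 8 Gbps GDDR5 data rate is a typical value, assumed here for illustration):

```python
# How much faster must memory run on a 128-bit bus to match a 192-bit bus?
full_bus, cut_bus = 192, 128   # bus widths in bits
gddr5_rate = 8.0               # Gbps per pin, a typical figure (assumption)

required_rate = gddr5_rate * full_bus / cut_bus
print(required_rate)                           # -> 12.0 Gbps
print((required_rate / gddr5_rate - 1) * 100)  # -> 50.0 (% increase needed)
```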

It's so funny seeing you guys trying to defend something that sucks so hard. Really, some people here should consider a new career in comedy (that was a joke).
Posted on Reply
#168
bug
Tatty_OneI struggle to comprehend what games you are playing, then. I moved from a 4GB 290X to a 1070 a few months back. I only play one game, World of Tanks, and when they completely updated their game engine, the 290X on ultra settings went overnight from an average of 2.9GB usage to 4.4GB, and that game is hardly demanding even on ultra at my 2560x1080.
You don't have to update a game engine for that effect. Just upscale your textures 2x and you get 4x* the VRAM usage without actually improving quality.

*without factoring in compression
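To put rough numbers on that (a minimal sketch; the 2048x2048 RGBA8 texture is just an illustrative case):

```python
# VRAM footprint of an uncompressed RGBA8 texture before and after a 2x upscale.
def texture_mib(width: int, height: int, bytes_per_texel: int = 4) -> float:
    return width * height * bytes_per_texel / 2**20

print(texture_mib(2048, 2048))  # -> 16.0 MiB
print(texture_mib(4096, 4096))  # -> 64.0 MiB: 4x the memory for a 2x upscale
```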
CandymanGRA 32-bit CPU cannot address a 64-bit memory space, and you need 64-bit addressing in order to access more than 4 gigabytes of memory.
There is NO way for memory above 4GB to be addressable by a 32-bit CPU, not even with virtual memory paging. At least not on an x86 CPU.
Ah, this misconception has been with us since the Athlon 64 days. I suggest you look up PAE; the address space hasn't been confined by the architecture for quite some time. It's awkward to use, so the practice isn't all that widespread (afaik), but it exists.
Posted on Reply
#169
CandymanGR
bugAh, this misconception has been with us since the Athlon 64 days. I suggest you look up PAE; the address space hasn't been confined by the architecture for quite some time. It's awkward to use, so the practice isn't all that widespread (afaik), but it exists.
Show me an example of a 64-bit application working on a 32-bit CPU, then. There is no way to have a 32-bit application with a 64-bit address space on a 32-bit CPU.
There are no 64-bit memory registers on a 32-bit CPU.

Edit: If my style of writing feels aggressive, sorry. I am not attacking anyone; I just disagree with passion.
Posted on Reply
#170
RichF
This is what happens when corporations feel no fear.

No fear of consumer retaliation for anti-consumer practices.

No fear of government oversight reining anti-consumer practices in (really the same thing, since governments are supposed to be people elected to do the people's work).

This is what happens when there is monopoly, duopoly, and quasi-monopoly.

The tech world has far too little competition in a lot of areas, and this is what consumers get. If you don't like it, you're not going to get anywhere by engaging with forum astroturfers. Organize and take political action.
Posted on Reply
#171
bug
CandymanGRShow me an example of a 64-bit application working on a 32-bit CPU, then. There is no way to have a 32-bit application with a 64-bit address space on a 32-bit CPU.
There are no 64-bit memory registers on a 32-bit CPU.

Edit: If my style of writing feels aggressive, sorry. I am not attacking anyone; I just disagree with passion.
You refuse to educate yourself with the same passion, it would seem.
Posted on Reply
#172
CandymanGR
bugYou refuse to educate yourself with the same passion, it would seem.
Thanks for your valuable input.
Posted on Reply
#173
bug
CandymanGRThanks for your valuable input.
I gave you my input above: look up PAE and read about it. (Not because I'm too lazy to go into detail, but because it's a lot to read.)
You're acting as if that never happened.
Posted on Reply
#174
CandymanGR
bugI gave you my input above: look up PAE and read about it. (Not because I'm too lazy to go into detail, but because it's a lot to read.)
You're acting as if that never happened.
And I also told you 'it doesn't work well, not even with virtual memory paging', but you also act as if nothing happened. Those CPUs cannot "see" the whole memory at once. Memory paging sucks; it's ancient technology. That's why we moved to x86-64.
You forget that PAE may indeed support a larger physical memory range, but only in THEORY, because in reality the virtual address space of those CPUs (Pentium Pro and later) remained 32-bit. That changed with AMD's x86-64.

Edit: In any case, I won't say more about this, because I think it is off-topic.
Posted on Reply
#175
FordGT90Concept
"I go fast!1!11!1!"
bugAh, this misconception has been with us since the Athlon 64 days. I suggest you look up PAE; the address space hasn't been confined by the architecture for quite some time. It's awkward to use, so the practice isn't all that widespread (afaik), but it exists.
PAE is not practical in gaming (as stated repeatedly). The latency is too high and framerates plummet. It's like going into the 3.5-4.0 GiB territory of a GTX 970.

The point of mentioning it is that it marks a watershed moment. When games were developed for 32-bit, their memory usage was very restricted. The moment games switched to 64-bit, memory was suddenly available, so developers sought to use it. The Fury X marks the transition: 4 GiB was okay then, but it definitely isn't okay now, especially in premium cards.

Just look at the response to this thread. All but two people, by my count, are scoffing at the notion of a 3 GiB 2060. It's sad that yields are so low that they feel the need to debut four extra models of sub-par cards under the same brand.
Posted on Reply