Tuesday, December 25th 2018

NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

NVIDIA drew consumer ire for differentiating its GeForce GTX 1060 into two variants based on memory, the GTX 1060 3 GB and GTX 1060 6 GB, with the two also featuring different GPU core-configurations. The company plans to double down (or should we say, triple down) on its sub-branding shenanigans with the upcoming GeForce RTX 2060. According to VideoCardz, citing a GIGABYTE leak about regulatory filings, NVIDIA could be carving out not two, but six variants of the RTX 2060!

There are at least two parameters that differentiate the six (that we know of, anyway): memory size and memory type. There are three memory sizes, 3 GB, 4 GB, and 6 GB, and each comes in two memory types, the latest GDDR6 and the older GDDR5. Based on these six RTX 2060 variants, GIGABYTE alone could launch up to thirty-nine SKUs. Add up similar SKU counts from NVIDIA's other AIC partners, and there could be upward of 300 RTX 2060 graphics card models to choose from. It won't surprise us if, in addition to memory size and type, GPU core-configurations also vary between the six variants, compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019.
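For the curious, the article's arithmetic works out as follows. A quick back-of-the-envelope sketch in Python; note that only GIGABYTE's ~39 SKUs come from the leak, and the AIC partner count below is a hypothetical placeholder:

```python
# Back-of-the-envelope math behind the article's figures.
memory_sizes = [3, 4, 6]                  # GB
memory_types = ["GDDR6", "GDDR5"]
asic_classes = ["A", "non-A"]

variants = [(size, mem) for size in memory_sizes for mem in memory_types]
print(len(variants))                      # 6 RTX 2060 variants

device_ids = len(variants) * len(asic_classes)
print(device_ids)                         # up to 12 new device IDs

gigabyte_skus = 39                        # from the leak
partners = 8                              # hypothetical AIC partner count
print(partners * gigabyte_skus)           # 312, i.e. "upward of 300" models
```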
Source: VideoCardz

230 Comments on NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

#26
Mescalamba
M2BYou can easily run almost every single game on a 3 GB card at 1080p just fine; you'll just have to lower the texture quality in some games, and that's all. The 3 GB version is not ideal, but it's not as useless as some people think. It might actually be a good value card for those who need a fairly powerful GPU and don't care about maximum-quality textures or VRAM-related settings.
If the 3 GB RTX 2060 is priced under $200, I don't see a problem, but anything higher than $200 is unacceptable.
8 GB of VRAM on an $800 card (RTX 2080) is more disappointing to me than 3 GB on a budget card.
Some games simply will NOT run on 3 GB. There aren't that many of them, but that said, 6 GB is the bare minimum. I wouldn't consider anything with less than 8 GB an upgrade at this point.
#27
Recus
First, AMD fanboys blame Nvidia for keeping AIBs on a short leash. Then Nvidia lets AIBs run loose... and it's still Nvidia's fault. :laugh:
#28
ArbitraryAffection
NxodusI'm not trolling, but the NVIDIA hate these days is slowly getting to me.

Just look at the forums: AMD sub-forum has 387 pages while NVIDIA has 230. AMD cards are shit-tier. Whenever a new game gets released the forums are full of AMD owners crying.
Sure thing hon.
#29
Rowsol
This is really stupid. Just test each config and pick the one that makes the most sense. I don't see them doing this crap with the more expensive models, so why this one?
#30
lexluthermiester
ArbitraryAffection3GB is cutting it really fine IMO, even at 1080p. Fallout 4 will use 3 GB+ at 1080p without the HD textures (the HD pack is a joke of unoptimised BS, though), and Far Cry 5 will use over 4 GB at 1080p too.
That assumes full AA, which most people turn down or off, which naturally reduces the memory footprint, even with HD texturing.
NxodusAMD is the retarded child that you have to hide in the attic when guests come over to visit.
Please take your fanboy trolling elsewhere, no one here cares about such silly nonsense..
#31
ArbitraryAffection
lexluthermiesterThat assumes full AA, which most people turn down or off, which naturally reduces the memory footprint, even with HD texturing.
True, but with 2060 approaching 1070 performance, surely you'd expect people to want to play with settings maxed? I always thought the xx60 series were about max settings FHD gaming.
#32
Paganstomp
I'm going for the one that has the fastest changing RGB lighting on it.
#33
lexluthermiester
ArbitraryAffectionTrue, but with 2060 approaching 1070 performance, surely you'd expect people to want to play with settings maxed? I always thought the xx60 series were about max settings FHD gaming.
Not really. I have a 2080 and still turn off AA. No real need for it at 1080p and above. Most gamers prefer framerate over AA.
#34
ArbitraryAffection
lexluthermiesterNot really. I have a 2080 and still turn off AA. No real need for it at 1080p and above. Most gamers prefer framerate over AA.
Fair enough. I suppose it depends on the type of AA, too. I'm actually running TAA in Fallout 4, and the 570 manages it at 60 fps, but I'm also turning down other things like shadows. Warframe also uses TAA, but that's really light, and I'm at 120 FPS more often than not at FHD. Hilariously, Fallout 76 uses over 7 GB of VRAM at 1080p. Don't even ask why; the textures don't look anywhere near good enough, haha. I think it's caching to RAM, or Bethesda just copy-pasted their Fallout 4 high-res textures into it (really badly optimised). I would still prefer more VRAM, though. I feel like 8 GB on my 570 was a real bargain; even though I won't be able to use it all before I run out of GPU power, at least I'll have zero VRAM issues.
#35
efikkan
FordGT90ConceptYou can get an 8 GiB RX 580 for $220. A 3 GiB card these days shouldn't be going for much more than $120 (the realm of budget cards).
I think one RTX 2060 is plenty to fill the market. But comparing GPUs and setting prices based on memory size is pure BS. 3GB is fine for an entry mid-range card as long as proper benchmarking shows it's fine.
ArbitraryAffectionMakes me even more happy with the 8GB on the RX 570 I just picked up for £150. (plus 2x free AAA games). Yes I know 2060 will be faster but AMD offering 8GB at this price point is insane value. Oh and the card runs 1080p just fine.
RTX 2060 will perform far beyond RX 570, a low-end card. Brag all you want about 8GB, there is no way that card needs it.
ArbitraryAffection3GB is cutting it really fine IMO, even at 1080p. Fallout 4 will use 3 GB+ at 1080p without the HD textures (the HD pack is a joke of unoptimised BS, though), and Far Cry 5 will use over 4 GB at 1080p too.
Memory usage doesn't mean memory requirement; many applications allocate more memory than they need. What matters is performance, or lack thereof. Stuttering might be an indicator of too little memory.
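efikkan's stuttering point can be made concrete: running out of VRAM typically shows up as occasional frame-time spikes (resources being paged over PCIe) rather than a lower average FPS. A minimal sketch, assuming you already have a list of per-frame times in milliseconds; the sample data below is invented for illustration:

```python
# Flag stutter: compare worst-case (99th-percentile) frame times to the median.
# A smooth run has a tight distribution; VRAM thrashing produces rare huge spikes.
def is_stuttery(frame_times_ms, ratio=3.0):
    ordered = sorted(frame_times_ms)
    median = ordered[len(ordered) // 2]
    p99 = ordered[min(len(ordered) - 1, int(len(ordered) * 0.99))]
    return p99 > ratio * median

smooth = [16.7] * 99 + [20.0]          # steady ~60 fps
vram_starved = [16.7] * 99 + [120.0]   # occasional huge spike

print(is_stuttery(smooth))        # False
print(is_stuttery(vram_starved))  # True
```

Both runs have nearly the same average FPS, which is exactly why average-FPS benchmarks alone can hide a VRAM shortfall.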
#36
27MaD
BlueberriesPeople had a hard time with 3GB vs 6GB...
And now there are six variants, differing in memory size and type.
#37
kings
A 3GB card in 2019 for $300 incoming!

I hope AMD wakes up fast! Maybe they will be able to bring some serious competition at 7 nm, at least as long as Nvidia continues with 12 nm/16 nm.
#38
oxidized
Six variants of the 2060, rofl. This is totally insane, and I really hope it isn't true, but seeing something similar happen with the 1060 doesn't give me much hope. They should've upgraded the 2060 to 8 GB and maybe made a 4 GB version, both GDDR6 of course; or, if the price was right, the 4 GB variant could've had GDDR5X just to cut the costs.
#39
FordGT90Concept
"I go fast!1!11!1!"
RecusFirst, AMD fanboys blame Nvidia for keeping AIBs on a short leash. Then Nvidia lets AIBs run loose... and it's still Nvidia's fault. :laugh:
NVIDIA is the one shipping 6 variants of the silicon to AIBs. AIBs install the GDDR chips that are compatible. AIBs can only control which chips they get, not what is available.
efikkanI think one RTX 2060 is plenty to fill the market. But comparing GPUs and setting prices based on memory size is pure BS. 3GB is fine for an entry mid-range card as long as proper benchmarking shows it's fine.
DRAM is a large chunk of the cost to manufacture a video card. Additionally, having so little VRAM makes the card less valuable to gamers even at 1920x1080. 64-bit games (which most are these days) will use >4 GiB of VRAM if it is available. Couldn't care less about benchmarks. More and more games not only do graphics on GPU, but computation on GPU as well.
#40
efikkan
FordGT90Concept64-bit games (which most are these days) will use >4 GiB of VRAM if it is available.
"64-bit" games have nothing to do with GPU memory usage. That makes no sense from a technical perspective.
FordGT90ConceptCouldn't care less about benchmarks.
This really says it all, doesn't it?
FordGT90ConceptMore and more games not only do graphics on GPU, but computation on GPU as well.
They do, but this is still irrelevant.
#41
Durvelle27
lexluthermiesterNot really. I have a 2080 and still turn off AA. No real need for it at 1080p and above. Most gamers prefer framerate over AA.
I have an RTX 2070 and play everything on max settings with AA on triple monitors.
#42
lexluthermiester
Durvelle27I have an RTX 2070 and play everything on max settings with AA on triple monitors.
And your framerates are?
#43
bug
Write your congressman, limit the number of designs to one! Because choice is to be avoided! :kookoo:

Wth guys, this is just different memory capacities and types. I'd be more curious whether this means different memory bus widths and, like the 1060 before it, different internal configurations.
#44
Durvelle27
lexluthermiesterAnd your framerates are?
Depends on the game

I actually did a full mini review showing its performance. You can find it under my profile content.

It actually holds its own very well.
#45
Vya Domus
FordGT90Concept64-bit games (which most are these days) will use >4 GiB of VRAM if it is available.
Double precision is rarely, if ever, used for shading in games. That is, unless you meant something else? Also, just because your application is compiled for 64-bit machines doesn't mean there's a direct correlation between that and memory usage. A program that uses a smidgen of 64-bit data types may occupy just a couple of megabytes, or several gigabytes. It's about quantity, not just the data types that are used.
RecusFirst, AMD fanboys blame Nvidia for keeping AIBs on a short leash. Then Nvidia lets AIBs run loose... and it's still Nvidia's fault. :laugh:
You are completely out of touch with this subject.
NxodusAMD is pure shit
Nah, that'd be your comment.
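Vya Domus's point about quantity versus data-type width can be shown in a few lines of Python (a stand-in for the general case, using 64-bit integers):

```python
import array

# Same 64-bit element type ('q' = signed 64-bit int), very different footprints:
# memory use is driven by how many elements you allocate, not by the type's width.
small = array.array('q', range(1_000))       # ~8 KB of payload
large = array.array('q', range(1_000_000))   # ~8 MB of payload

print(small.itemsize, large.itemsize)        # both 8 bytes per element
print(small.itemsize * len(small))           # 8000 bytes
print(large.itemsize * len(large))           # 8000000 bytes
```

A 1000x difference in footprint from the identical element type, which is why "64-bit game" by itself predicts nothing about VRAM usage.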
#46
CandymanGR
What? No 5 GB version? 3, 4, 6... where is the 5? I guess they missed that one.

Nvidia has fucked it up, and no matter what fanboys tell you, RTX was a fiasco.
And people who claim 3 GB is enough for 1080p gaming are twisting the truth. That is true only for esports games, which are designed to be lightweight. You CANNOT play a lot of modern AAA games because they demand more than 3 GB of VRAM EVEN at 1080p on medium/high settings (GTA V, Far Cry 4, Shadow of Mordor, etc.), and this will happen more often as newer games come out. So let's cut the crap, shall we?

P.S. Before you dismiss me as an AMD fanboy: of the more than 20 GPUs I've owned, only two were AMD.
#47
lexluthermiester
CandymanGRNvidia has fucked it up, and not matter what fanboys are telling you, the RTX was a fiasco.
No it isn't. Whiners have made a bunch of needless noise. I have one. It rocks.
#48
Vya Domus
Your threadcrapping is getting annoying.
#49
CandymanGR
lexluthermiesterNo it isn't. Whiners have made a bunch of needless noise. I have one. It rocks.
I don't trust a single word from people who say that what they bought is the best. And also, 3 GB is NOT enough. You are biased as hell.
#50
Vya Domus
The point isn't whether 3 GB is enough. It's just rather pathetic to ship a card that will likely cost at least $250 with just 3 GB in 2019.