
NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

It can't directly address anything outside of a 32-bit virtual address space (which covers both system RAM and VRAM). Like system RAM, VRAM can be reached through a physical-address-extension scheme to get around that, but performance is terrible, so it's not generally viable for games.

Point is, the reason 1, 2, or 3.5 GiB was okay for so long is that games were developed for the Xbox 360 and PlayStation 3, which only had 512 MiB of memory. Everything was kept 32-bit because AAA games had to fit in such a tiny footprint on consoles anyway. With the transition to the Xbox One and PlayStation 4 and their 8 GiB of memory, 64-bit games are everywhere, and with them a "smoke it if you got it" approach to memory usage. That's why the GTX 970 has aged poorly and why so many in this thread scoff at 3 GiB variants of graphics cards. 3 GiB was more than enough several years ago. It isn't anymore, because the 32-bit/512 MiB veil has finally and truly been lifted.
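To put numbers on that, here's a minimal C sketch (just arithmetic, no driver magic) of why a flat 32-bit pointer tops out at 4 GiB:

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* A 32-bit pointer can distinguish 2^32 byte addresses, period. */
    uint64_t addressable = (uint64_t)1 << 32;

    printf("32-bit virtual address space: %llu bytes (%llu GiB)\n",
           (unsigned long long)addressable,
           (unsigned long long)(addressable >> 30));

    /* That 4 GiB must hold the game's code, heap, stacks, and every
       resource the driver maps in; a 32-bit Windows process typically
       only gets 2 GiB of it by default. */
    return 0;
}
```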

Even the 3GB 7970/280s are getting that way; the 6GB variants are doing better.
 
It can't directly address anything outside of a 32-bit virtual address space (which covers both system RAM and VRAM). [...] That's why the GTX 970 has aged poorly and why so many in this thread scoff at 3 GiB variants of graphics cards.
Please don't mix up register width with memory capacity; one has nothing to do with the other, and I cringe every time someone conflates them.

Also be aware that GPU memory is a separate address space controlled by the GPU. In theory, there is nothing preventing you from having a GPU with >4GB memory on a 32-bit system.
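To illustrate, here's a toy C sketch; gpu_addr_t and gpu_alloc are invented for the example and aren't any real driver API. The point is that a device address is just a number in the GPU's own space, so it isn't limited by the width of the host's pointers:

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical device address type: the GPU addresses its own memory
   with (say) 64-bit addresses no matter how wide the host CPU is. */
typedef uint64_t gpu_addr_t;

/* Hypothetical bump allocator standing in for the driver: it hands
   back a device address, not a host pointer, so nothing here requires
   the host process to map the memory into its own address space. */
static gpu_addr_t gpu_alloc(gpu_addr_t *next_free, uint64_t size)
{
    gpu_addr_t addr = *next_free;
    *next_free += size;
    return addr;
}

int main(void)
{
    gpu_addr_t next_free = 0;

    for (int i = 0; i < 3; i++) {
        gpu_addr_t buf = gpu_alloc(&next_free, 3ull << 30); /* 3 GiB */
        printf("buffer %d at device address 0x%010llx%s\n", i,
               (unsigned long long)buf,
               buf > 0xFFFFFFFFull ? "  (beyond 32-bit reach)" : "");
    }
    return 0;
}
```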
 
Please don't mix up register width with memory capacity; one has nothing to do with the other, and I cringe every time someone conflates them.
I didn't.

Also be aware that GPU memory is a separate address space controlled by the GPU.
Graphics driver, rather. It could be a WOW32 driver, which is why the 4 GiB limitation may still apply.

In theory, there is nothing preventing you from having a GPU with >4GB memory on a 32-bit system.
Just like you could put 64 GiB in a 32-bit system, but it would only be able to address 4 GiB of it. Moot argument.


I'm trying to get a definitive answer from engineers on this question because it's not something that's really been addressed in relation to D3D10 and newer.
 
I'll ask the simple question(s)... How does a chip built with a GDDR6 memory controller run GDDR5? Okay, sure, it's a combined GDDR5/GDDR6 memory controller and they fuse off the bits not needed in post, but wouldn't that make the die bigger, with parts wasted?
Why do I have that GTX 970 4GB taste back all of a sudden?
 
Just like you could put 64 GiB in a 32-bit system, but it would only be able to address 4 GiB of it. Moot argument.
That's wrong in two ways. Firstly, a 32-bit CPU can have a larger-than-32-bit physical address space, as some Intel CPUs did; similarly, the 16-bit Intel 8086 had a 20-bit address bus, and the 16-bit 80286 had a 24-bit address bus. Secondly, the GPU's internal addressing is not controlled from the CPU side, nor is GPU memory mirrored in system memory. You can have more GPU memory than system memory and have no trouble filling it up, and whether the CPU and OS are 32-bit has nothing to do with this.
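For a concrete picture of register width vs. address width, here's the 8086's real-mode segment:offset arithmetic in a few lines of C; 16-bit registers, yet a 20-bit physical address:

```c
#include <stdio.h>
#include <stdint.h>

/* The 16-bit 8086 built a 20-bit physical address from two 16-bit
   registers: physical = segment * 16 + offset. */
static uint32_t real_mode_addr(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

int main(void)
{
    /* Classic landmark: the text-mode video buffer at segment 0xB800. */
    printf("0xB800:0x0000 -> 0x%05X\n", real_mode_addr(0xB800, 0x0000));

    /* 16-bit registers, yet the top of the 1 MiB (20-bit) space is
       still reachable: */
    printf("0xF000:0xFFFF -> 0x%05X\n", real_mode_addr(0xF000, 0xFFFF));
    return 0;
}
```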

To end this: 32-/64-bit has nothing to do with the graphics memory usage of games, so please keep the two separate.
 
Imagine each memory variant also having different shader counts to spice it up. Maximum consumer confusion.
 
nor is GPU memory mirrored in system memory.
Now that definitely depends on the graphics API and the application. GPU memory might be mirrored, depending on what the developer is trying to do.
You can have more GPU memory than system memory and have no trouble filling it up, and whether the CPU and OS are 32-bit has nothing to do with this.
GPU address space still needs to get mapped like any other peripheral's. Your statement is flat-out incorrect. Also, don't go using PAE as a response; we all know how rickety that barge is.
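Roughly like this, as a deliberately scaled-down toy in C (window_set and window_write are invented for the example; real drivers do this through a PCI BAR aperture and DMA): the CPU only ever sees a fixed-size window into VRAM, and larger uploads slide the window along in chunks:

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Scaled-down toy: a 64-byte "aperture" into 256 bytes of "VRAM".
   On real hardware, think a 256 MiB BAR window into 8 GiB of VRAM;
   the chunking logic is the same. */
enum { VRAM_SIZE = 256, APERTURE = 64 };

static uint8_t  vram[VRAM_SIZE]; /* stand-in for the card's memory */
static uint32_t window_base;     /* where the aperture currently points */

/* Invented helpers: retarget the window, then write through it. */
static void window_set(uint32_t base)             { window_base = base; }
static void window_write(uint32_t off, uint8_t b) { vram[window_base + off] = b; }

/* Upload a buffer larger than the aperture by sliding the window. */
static void upload(const uint8_t *src, uint32_t len)
{
    for (uint32_t done = 0; done < len; done += APERTURE) {
        uint32_t chunk = (len - done < APERTURE) ? len - done : APERTURE;
        window_set(done);                      /* move the aperture */
        for (uint32_t i = 0; i < chunk; i++)   /* copy through it   */
            window_write(i, src[done + i]);
    }
}

int main(void)
{
    uint8_t texture[VRAM_SIZE];
    memset(texture, 0xAB, sizeof texture);
    upload(texture, sizeof texture);           /* four window moves */
    printf("last byte landed in vram: 0x%02X\n", vram[VRAM_SIZE - 1]);
    return 0;
}
```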
 
Write your congressman, limit the number of designs to one! Because choice is to be avoided! :kookoo:

Wth guys, this is just different memory capacities and types. I'd be more curious whether this means different memory bus widths and, like the 1060 before it, different internal configurations.
The fact that there is a 4GB variant is a total giveaway.
 
Wow... just wow. Can't say I like this many variants... holy cow.

I'd have to imagine you'd be in a minority saying AA isn't needed at 1080p. Even in FPS games I notice AA being off at that resolution. If I wanted to play Minecraft, I would. :p

4K is the only place I can disable it, and even then I still manage to notice.

Now, if it's an FPS issue, of course, shut it off... but at 1080p it comes at a big cost to IQ.
The number of variants is nuts. This must be how they plan on clearing out their GDDR5X stockpile :laugh:.
 
This is getting ridiculous, a 3GB mid-range GPU in 2019? I have an RX 580 4GB in my second build and have problems playing AC Odyssey with high-quality textures turned on. I hope AMD doesn't catch NGreedia's flu. The GTX 1070 Ti/1080 Ti were the last NVIDIA GPUs worth buying in 2018, and by the looks of it, things aren't going to change for the better in 2019.
 
I see a paid shill showed up in short order to protect Nvidia. C'mon, Nvidia. You gotta hire Clinton levels of shills to have a chance (note: not a political remark, literally a reference to the scale).
Why would Nvidia need protection in this case?
You don't want a 3GB mid-range card, you don't buy one. That's how it was before, that's how it still is.
 
Great news! I was afraid the RTX 2060 would turn out to be like the half dozen different variants of the GTX 1060 and confuse people...

You don't want a 3GB mid-range card, you don't buy one. That's how it was before, that's how it still is.

So let it be written, so let it be done
 
Please note that I did not say there are no use cases for having more memory; just because some may need it doesn't mean everyone needs it. For many, the cheaper cards offer much more value. It's no accident that both AMD and Nvidia offer their RX 480/580 and GTX 1060 in low- and high-memory configurations respectively.

And as I've mentioned, the fact that a game allocates more memory doesn't mean it needs it. To evaluate that, you need to look for reductions in performance and/or rendering quality.
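One way to make that evaluation concrete, as a rough C sketch with made-up frame times rather than a real benchmarking tool: compare the average FPS against the 1% lows, because running out of VRAM shows up as stutter in the lows long before it drags down the average:

```c
#include <stdio.h>
#include <stdlib.h>

/* qsort comparator: descending frame times, worst frames first. */
static int cmp_desc(const void *a, const void *b)
{
    double x = *(const double *)a, y = *(const double *)b;
    return (x < y) - (x > y);
}

/* Report average FPS and "1% low" FPS from frame times in ms.
   The 1% low is the average FPS over the worst 1% of frames. */
static void report(double *ms, size_t n)
{
    double sum = 0.0, worst_sum = 0.0;
    size_t worst_n = n / 100 ? n / 100 : 1;

    for (size_t i = 0; i < n; i++)
        sum += ms[i];
    qsort(ms, n, sizeof *ms, cmp_desc);
    for (size_t i = 0; i < worst_n; i++)
        worst_sum += ms[i];

    printf("avg: %.1f fps, 1%% low: %.1f fps\n",
           1000.0 * (double)n / sum,
           1000.0 * (double)worst_n / worst_sum);
}

int main(void)
{
    /* Made-up capture: mostly 16 ms frames with periodic 60 ms spikes,
       the typical signature of VRAM paging stutter. The average looks
       healthy; the 1% low gives the stutter away. */
    double ms[1000];
    for (int i = 0; i < 1000; i++)
        ms[i] = (i % 100 == 0) ? 60.0 : 16.0;
    report(ms, 1000);
    return 0;
}
```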


If you keep expanding it, it's no longer the mid-range. ;)
In the mid-range, the GTX 1060 3GB/6GB have been the better choice over the RX 480/580 4GB/8GB. The only slice in there where AMD have no direct competition from Nvidia is the new RX 590, but the only argument for that one is if you can't afford a GTX 1070. The RX 590 is still a little odd, considering how close to the GTX 1060 and RX 580 it really is. In general, AMD is hardly competitive in the mid-range, and except for possibly the RX 590, they can't claim to offer better value. This is only going to get more challenging once the RTX 2060 arrives.
In the low-end, AMD have more compelling options, like the RX 570 vs. the GTX 1050 Ti.

Video games these days definitely DO need more than 3GB. Even 4GB is not enough; I am an ex-Fury owner, trust me.

We know the tired argument about allocation versus actually needing it, but games actually do need it at this point in time. Also, the "GPU too weak" argument is moot: the 7970 is weaker and still sees improvement going from the 3GB to the 6GB model. Besides, textures are one of the cheapest and best ways to improve immersion, provided one has the VRAM. I'd rather lower lighting on a weaker GPU than textures. Genuinely using more VRAM won't tank performance in the vast majority of cases.

To add more to the argument: mods. We are talking about PC gaming GPUs, so modifications matter. Many mods improve graphics and textures but are VRAM-heavy, so for a lot of people the VRAM size is important for that reason as well.

3GB on an RTX 2060, a GPU likely above the 1060, maybe something between the Fury and the 1070 in performance, with 3GB of VRAM? Yeah, it is bad. Even 4GB is bad there. Even 6GB is not great. And there are games and mods that can crush 8GB of VRAM at 1440p as well, so...

I will ignore the part about the RX 570 and 580 not being competitive, since it's likely bait.
 
Video games these days definitely DO need more than 3GB. Even 4GB is not enough; I am an ex-Fury owner, trust me.

It depends on what you play, really. Just look at the list of games TPU has reviewed; most of them will run fine with 3GB of VRAM. Sure, that raises the question of whether you need an RTX 2060 if you mainly play those games, but that's another discussion.

And yes, I would prefer mid-range cards to start at 4GB, but until we see prices and internal configurations, we really don't know what's going on here. We're just foaming at the sight of Nvidia in the title.
 
Would you honestly recommend a 1060 3GB, at all? I mean, seriously? The 6GB, yes, I get that. But the 3GB? No. The card is a joke.
Weren't you the one talking about hostility a few posts back? ;)

The GTX 1060 3GB is certainly no joke, and no more a joke than, e.g., the RX 580 4GB. You are too fixated on specifications, specifications which you may not fully understand. Pascal and Turing have more advanced memory compression than Polaris, and the GPUs allocate memory differently. Comparing across architectures is not that simple, which is why I keep reminding people to look at benchmarks matching their use case. I even have a 3GB version in one of my PCs, and I've put them into several other builds as well.

And you keep missing the point: is more better? Yes, certainly! But what if you don't need more? There is a good price difference between the GTX 1060 3GB and 6GB, and between the RX 580 4GB and 8GB. A gamer on a very tight budget at 1080p who wants a higher framerate rather than higher details would certainly appreciate having the option.

My issue with the various versions of the GTX 1060 is the naming; I would have called the 3GB version the GTX 1060 and the 6GB version the GTX 1065 or 1060 Ti, etc.
 
It depends on what you play, really. Just look at the list of games TPU has reviewed; most of them will run fine with 3GB of VRAM. [...]

This is silly. People do not go, "Well, I have X GPU, so I won't play games A through C." Almost all modern GPUs can achieve decent performance in all games, without exceptions. If you accept console settings, a 1050 Ti will do 50-60 fps at 1080p. If you accept 30 fps, the RX 570 can do high settings at 4K, or Ultra at 1080p. People with RX 560s won't just say, "Well, I won't play DOOM at all!"; they will lower a few settings and have a great experience all the same.

And for what it's worth, VRAM choking takes time to truly become an issue. I can tell you games like DOOM can and do choke my old Fury at 1440p due to demand for more memory, despite what reviews show. Reviewers likely test scenes for shorter amounts of time, do not force Nightmare settings, and/or are in parts of the game that are heavy, but not on memory.

And the modding thing is always a factor, especially since it is popular across the entire PC gaming community, from people with 2080 Tis to people with GT 1030s.
 
Would you honestly recommend a 1060 3GB, at all? I mean, seriously? The 6GB, yes, I get that. But the 3GB? No. The card is a joke.
The 3GB version was $40 less for something like 6% less performance at launch. I'm not sure how things have changed since then.
 
Almost all modern GPUs can achieve decent performance in all games, without exceptions.
They don't? Aren't there minimum and recommended requirements on games to attain the optimal gaming experience?

The only people who 'accept' console settings are those who have to. You can lower settings in a game and make it look worse than a console. You can also crank settings up and get 30 FPS, which on a PC, in most genres, is NOT an acceptable average (and the 1% lows would be abhorrent). So it is not silly to ask, "I have XXX GPU, can I play this game well?" Then again, not all PC users prefer, or can afford, a PC that will look better than a console. Half the point of a PC is that in many cases it looks better than a console. The lines have blurred over time, but there is plenty of reason to look at a GPU and ask whether it can play a game.
 
They don't? Aren't there minimum and recommended requirements on games to attain the optimal gaming experience? [...] The only people who 'accept' console settings are those who have to.


Do not look at minimum and recommended system requirements. They are made up and make no sense at all. I do not even look at them these days, and have not for over a decade.

The difference between Medium and Ultra settings... is overrated most of the time. Plus, most of the time, "Ultra" is not even true Ultra these days. Witcher 3's settings menu does not compare to what even a pleb like me can do in the ini file in two minutes without any real modding. That is real Ultra settings. :)

If you can afford an RX 560/GTX 1050, then you are already quite a bit above the base PS4 and Xbox One. The PS4 Pro is about matched by the 1050 Ti, and the Xbox One X always loses in games vs. the GTX 1060 6GB.

I do not like using different standards for different things. One standard and zero hypocrisy or the discussion is worthless.
 
I do not like using different standards for different things. One standard and zero hypocrisy or the discussion is worthless.
Me neither. In this case you are setting up a second standard (yours) and dismissing what the majority follows. So......... there is that. :)

I didn't say there is a huge difference between settings, but it is certainly noticeable (which will depend on the title, of course). Again, people don't game on PC in order to turn settings down (unless they have to... $). What you as an enthusiast do to manipulate game files isn't what I would call 'standard' either. So, to keep on the one-standard, zero-hypocrisy line of thinking: people don't generally do that.

Again, those requirements are there for a reason. Choosing not to follow them is your choice; however, they are there to attempt to set expectations for gameplay on given hardware.
 
If you can afford an RX 560/GTX 1050, then you are already quite a bit above the base PS4 and Xbox One. The PS4 Pro is about matched by the 1050 Ti, and the Xbox One X always loses in games vs. the GTX 1060 6GB.
Don't forget that many budget gamers upgrade old computers with a new cheap graphics card, just saying…
 
Can the RTX 2060 play Crysis?
 
Me neither. In this case you are setting up a second standard (yours) and dismissing what the majority follows. [...] What you as an enthusiast do to manipulate game files isn't what I would call 'standard' either. [...] Again, those requirements are there for a reason.


The majority buys AAA games like Origins, Odyssey, Wolfenstein 2, DOOM, Witcher 3 and others just fine...

What I am doing as an enthusiast is enabling what Ultra used to mean 15-20 years ago. :) Nothing more, just a return to how things used to be. I do not think people have to do that, since I think the "Ultra or nothing" mentality in PC gaming is insane.

Those requirements do not reflect reality, and they have not for the past decade. Show me two or three times in the last five years when they meant something and I will give you some props. As it is, they are often just lies.
 