
NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

I don't think this is enough variants. Please, let's add GTX 1160s to this. :kookoo:
 
Makes me even happier with the 8GB on the RX 570 I just picked up for £150 (plus two free AAA games). Yes, I know the 2060 will be faster, but AMD offering 8GB at this price point is insane value. Oh, and the card runs 1080p just fine.

I had to sell my ENTIRE PC because I got behind on the rent. In fact, the only thing new in my current PC is this graphics card. Bargain-basement PC. Check my PC specs if you want to know, but 99% of it is either super-cheap second-hand or freebies given to me by friends.
 
AMD = overheating, power hungry, unstable, bad software, no innovation BUT IT'S CHEAPER!!
NVIDIA = expensive

I really don't get the NVIDIA haters, AMD is pure shit, it's the Walmart variant of video cards
 
For 1080p screens (which most gamers are still using), 3GB is still reasonable
3GB is cutting it really fine IMO, even at 1080p. Fallout 4 will use 3GB+ at 1080p without the HD textures (the HD pack is a joke of unoptimised BS, though), and Far Cry 5 will use over 4GB at 1080p too. While most games could be played easily on 3GB with medium or less-than-ultra textures, it is completely lacking any form of future-proofing. A bad investment IMO.

AMD = overheating, power hungry, unstable, bad software, no innovation BUT IT'S CHEAPER!!
NVIDIA = expensive

I really don't get the NVIDIA haters, AMD is pure shit, it's the Walmart variant of video cards

Low quality troll is low quality.
 
You can easily run almost every single game on a 3GB card at 1080p just fine, but you're going to lower the texture quality in some games, and that's all. The 3GB version is not ideal, but it's not as useless as some people think; it actually might be a good-value card for those needing a fairly powerful GPU who don't care about maximum-quality textures or VRAM-related settings.
If the 3GB RTX 2060 is going to be priced under $200, I don't see a problem, but anything higher than $200 is unacceptable.
8GB of VRAM on an $800 card (RTX 2080) is more disappointing to me than 3GB on a budget card.

Some games will simply NOT run on 3GB. There aren't that many of them, but that said, 6GB is the bare minimum. I wouldn't consider anything with less than 8GB an upgrade at this point.
 
First AMD fanboys blame Nvidia for holding AIBs on a short leash. Then Nvidia lets AIBs run loose... still Nvidia's fault. :laugh:
 
Low quality troll is low quality.

I'm not trolling, but the NVIDIA hate these days is slowly getting to me.

Just look at the forums: AMD sub-forum has 387 pages while NVIDIA has 230. AMD cards are shit-tier. Whenever a new game gets released the internet forums are full of AMD owners crying.
 
I'm not trolling, but the NVIDIA hate these days is slowly getting to me.

Just look at the forums: AMD sub-forum has 387 pages while NVIDIA has 230. AMD cards are shit-tier. Whenever a new game gets released the forums are full of AMD owners crying.
Sure thing hon.
 
This is really stupid. Just test each config and pick the one that makes the most sense. I don't see them doing this crap with the more expensive models, so why this one?
 
3GB is cutting it really fine IMO, even at 1080p. Fallout 4 will use 3GB+ at 1080p without the HD textures (the HD pack is a joke of unoptimised BS, though), and Far Cry 5 will use over 4GB at 1080p too.
That assumes full AA, which most people turn down or off; doing so naturally reduces the memory footprint, even with HD texturing.
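To put rough numbers on the framebuffer side of that claim, here's a back-of-envelope sketch (assuming RGBA8 color plus a 32-bit depth/stencil target at 1080p; real engines add G-buffers, textures, and other resources on top of this):

```python
# Rough estimate of raw render-target memory at 1080p with and without MSAA.
# Assumes 4 bytes/pixel color (RGBA8) and 4 bytes/pixel depth/stencil.

def framebuffer_mib(width, height, msaa_samples):
    bytes_per_pixel = 4 + 4                       # color + depth/stencil
    total = width * height * bytes_per_pixel * msaa_samples
    return total / (1024 ** 2)                    # convert bytes to MiB

print(framebuffer_mib(1920, 1080, 1))   # no AA: ~15.8 MiB
print(framebuffer_mib(1920, 1080, 4))   # 4x MSAA: ~63.3 MiB
```

Even at 4x MSAA the raw targets are only tens of MiB, so the bigger VRAM swings usually come from textures and intermediate buffers rather than the multisampled targets themselves.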

AMD is the retarded child that you have to hide in the attic when guests come over to visit.
Please take your fanboy trolling elsewhere; no one here cares about such silly nonsense.
 
That assumes full AA, which most people turn down or off; doing so naturally reduces the memory footprint, even with HD texturing.
True, but with the 2060 approaching 1070 performance, surely you'd expect people to want to play with settings maxed? I always thought the xx60 series was about max-settings FHD gaming.
 
I'm going for the one that has the fastest changing RGB lighting on it.
 
True, but with the 2060 approaching 1070 performance, surely you'd expect people to want to play with settings maxed? I always thought the xx60 series was about max-settings FHD gaming.
Not really. I have a 2080 and still turn off AA. No real need for it at 1080p and above. Most gamers prefer framerate over AA.
 
Not really. I have a 2080 and still turn off AA. No real need for it at 1080p and above. Most gamers prefer framerate over AA.
Fair enough. I suppose it depends on the type of AA, too. I'm actually running TAA in Fallout 4, and the 570 manages it at 60fps, but I'm also turning down other things like shadows. Warframe also uses TAA, but that's really light, and I'm at 120FPS more often than not at FHD. Hilariously, Fallout 76 uses over 7GB of VRAM at 1080p. Don't even ask why; the textures don't look anywhere near good enough, haha. I think it's just caching, or Bethesda copy-pasted their Fallout 4 high-res textures into it (really badly optimised). I would still prefer more VRAM, though. I feel like 8GB on my 570 was a real bargain; even though I won't be able to use it all before I run out of GPU power, at least I'll have zero VRAM issues.
 
You can get an 8 GiB RX 580 for $220. A 3 GiB card these days shouldn't be going for much more than $120 (the realm of budget cards).
I think one RTX 2060 is plenty to fill the market. But comparing GPUs and setting prices based on memory size is pure BS. 3GB is fine for an entry mid-range card as long as proper benchmarking shows it's fine.

Makes me even happier with the 8GB on the RX 570 I just picked up for £150 (plus two free AAA games). Yes, I know the 2060 will be faster, but AMD offering 8GB at this price point is insane value. Oh, and the card runs 1080p just fine.
The RTX 2060 will perform far beyond the RX 570, a low-end card. Brag all you want about 8GB; there is no way that card needs it.

3GB is cutting it really fine IMO, even at 1080p. Fallout 4 will use 3GB+ at 1080p without the HD textures (the HD pack is a joke of unoptimised BS, though), and Far Cry 5 will use over 4GB at 1080p too.
Memory usage doesn't mean memory requirement; many applications allocate more memory than they need. What matters is performance, or the lack thereof. Stuttering might be an indicator of too little memory.
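The allocation-versus-requirement distinction can be illustrated with a CPU-side analogy (pure Python, nothing GPU-specific): a grow-as-you-append list reserves more memory than its contents strictly need, just as an engine may reserve VRAM it isn't actively touching.

```python
import sys

# A list grown by append() over-allocates spare capacity so future appends
# are cheap; a list built at its final size does not. A memory readout sees
# the reserved capacity, not the working set.
grown = []
for i in range(100):
    grown.append(i)

exact = list(range(100))  # built at its final size, no spare capacity

print(sys.getsizeof(grown))  # bytes reserved (includes spare slots)
print(sys.getsizeof(exact))  # bytes for exactly 100 slots
```

On CPython the grown list typically reports more bytes than the exact one, yet neither number says how much the program actually needs at any instant, which is the same trap as reading a raw VRAM counter.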
 
A 3GB card in 2019 for $300 incoming!

I hope AMD wakes up fast! Maybe they will be able to bring some serious competition on 7nm, at least as long as Nvidia continues with 12nm/16nm.
 
Six variants of the 2060, rofl. This is totally insane; I really hope this isn't true, but seeing something similar happen with the 1060 doesn't give me much hope. They should've upgraded the 2060 to 8GB and maybe made a 4GB version, both GDDR6 of course, or, if the price was right, the 4GB variant could've had GDDR5X just to cut the costs.
 
First AMD fanboys blame Nvidia for holding AIBs on a short leash. Then Nvidia lets AIBs run loose... still Nvidia's fault. :laugh:
NVIDIA is the one shipping six variants of the silicon to AIBs. AIBs install whichever GDDR chips are compatible; they can only control which chips they get, not what is available.
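For what it's worth, the six-SKU matrix in the headline presumably comes from crossing capacities with memory types. A quick sketch (the specific capacities and types here are this thread's reading of the rumor, not confirmed specs):

```python
from itertools import product

# Assumed rumor: three capacities x two memory types = six RTX 2060 SKUs.
capacities = ["3GB", "4GB", "6GB"]
mem_types = ["GDDR5", "GDDR6"]

variants = [f"RTX 2060 {cap} {mem}" for cap, mem in product(capacities, mem_types)]
print(len(variants))  # 6
for v in variants:
    print(v)
```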

I think one RTX 2060 is plenty to fill the market. But comparing GPUs and setting prices based on memory size is pure BS. 3GB is fine for an entry mid-range card as long as proper benchmarking shows it's fine.
DRAM is a large chunk of the cost to manufacture a video card. Additionally, having so little VRAM makes the card less valuable to gamers even at 1920x1080. 64-bit games (which most are these days) will use >4 GiB of VRAM if it is available. Couldn't care less about benchmarks. More and more games not only do graphics on GPU, but computation on GPU as well.
 
64-bit games (which most are these days) will use >4 GiB of VRAM if it is available.
"64-bit" games have nothing to do with GPU memory usage. That makes no sense from a technical perspective.

Couldn't care less about benchmarks.
This really says it all, doesn't it?

More and more games not only do graphics on GPU, but computation on GPU as well.
They do, but this is still irrelevant.
 
Not really. I have a 2080 and still turn off AA. No real need for it at 1080p and above. Most gamers prefer framerate over AA.
I have an RTX 2070 and play everything at max settings with AA on triple monitors.
 
Write your congressman, limit the number of designs to one! Because choice is to be avoided! :kookoo:

Wth guys, this is just different memory capacity and type. I'd be more curious if this means different memory bus widths and, like the 1060 before it, different internal configurations.
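Bus width matters because peak bandwidth scales linearly with it. A rough sketch (the per-pin data rates below are typical for each memory generation, not confirmed 2060 specs):

```python
# Peak memory bandwidth (GB/s) = (bus width in bits / 8) * per-pin data rate (Gbps).
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(192, 14.0))  # 192-bit GDDR6 @ 14 Gbps -> 336.0 GB/s
print(bandwidth_gb_s(192, 8.0))   # 192-bit GDDR5 @ 8 Gbps  -> 192.0 GB/s
```

So the same capacity on a narrower bus or slower memory type can lose a large chunk of bandwidth, which is why bus width would be a more interesting differentiator than the GB number on the box.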
 