Monday, August 10th 2020

Video Memory Sizes Set to Swell as NVIDIA Readies 20GB and 24GB GeForce Amperes

NVIDIA's GeForce RTX 20-series "Turing" graphics cards did not increase video memory sizes compared to the GeForce GTX 10-series "Pascal," although the memory itself is faster on account of GDDR6. This could change with the GeForce RTX 30-series "Ampere," as the company looks to increase memory sizes across the board in a bid to shore up ray-tracing performance. WCCFTech has learned that in addition to a variety of strange new memory bus widths, such as 320-bit, NVIDIA could introduce certain higher variants of its RTX 30-series cards with video memory sizes as high as 20 GB and 24 GB.
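Those figures line up with simple GDDR6 arithmetic: each memory chip occupies a 32-bit channel, and chips currently ship in 8 Gb (1 GB) or 16 Gb (2 GB) densities. A minimal sketch of the math (the bus widths and chip densities here are assumptions based on common GDDR6 parts, not confirmed specifications for any SKU):

```python
# Rough VRAM capacity arithmetic for GDDR6 configurations.
# Assumption: one chip per 32-bit channel, at 1 GB or 2 GB per chip
# (common GDDR6 densities; not confirmed for any specific card).
CHANNEL_WIDTH_BITS = 32

def vram_capacity_gb(bus_width_bits: int, gb_per_chip: int) -> int:
    """Total VRAM in GB for a given bus width and per-chip density."""
    chips = bus_width_bits // CHANNEL_WIDTH_BITS
    return chips * gb_per_chip

for bus in (256, 320, 384):
    for density in (1, 2):
        print(f"{bus}-bit bus, {density} GB chips -> "
              f"{vram_capacity_gb(bus, density)} GB")
# A 320-bit bus with 2 GB chips yields 20 GB; 384-bit with 2 GB chips
# yields 24 GB, matching the rumored top-end configurations.
```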

Memory sizes of 20 GB or 24 GB aren't new for NVIDIA's professional-segment Quadro products, but they're certainly new for GeForce, with only the company's TITAN-series products breaking the 20 GB mark, at prices north of $2,000. Much of NVIDIA's high-end appears to rest on segmentation of the PG132 common board design, coupled with the GA102 silicon, from which the company could carve out several SKUs spaced far apart in its product stack. NVIDIA's next-generation GeForce "Ampere" family is expected to debut in September 2020, with higher-end product launches running through late Q3 and Q4 of 2020.
Sources: WCCFTech, VideoCardz

65 Comments on Video Memory Sizes Set to Swell as NVIDIA Readies 20GB and 24GB GeForce Amperes

#26
B-Real
Xex360: Didn't ATI use a 512-bit bus? I believe it was on the HD2900XT.
Yes, they did in many products, like the 290/290X and 390/390X. The weird part is not the big number but the number itself: the usual widths are 128-bit, 256-bit, 384-bit, 512-bit. But 320-bit? Then again, as someone wrote earlier, the 8800 GTS had a 320-bit interface.
Posted on Reply
#27
zlobby
DuxCro: WCCF is the crappiest tech tabloid on the net. They just keep pressing F5 on reddit and posting absolutely every rumor they find.
Yeah, I'm pretty sure no one forces you to read them.
Posted on Reply
#28
bug
Xex360: Didn't ATI use a 512-bit bus? I believe it was on the HD2900XT.
Yeah, it's not new. Back in the day, manufacturing a board to support that was prohibitively expensive (traces, layers), so everyone cut back to 256-bit memory interfaces.
Posted on Reply
#29
Athlonite
Xex360: Didn't ATI use a 512-bit bus? I believe it was on the HD2900XT.

32 GB, HBM2, 4096-bit

I'll just leave this here
Posted on Reply
#30
Th3pwn3r
yeeeeman: 2080ti owners crying incoming...
And why are we supposed to be crying? I'll get a 3080ti too. Ahhhh, I know now, tears of joy :D
Posted on Reply
#31
bug
Athlonite: 32 GB, HBM2, 4096-bit
I'll just leave this here
Because HBM is totally the same as GDDR, right? :wtf:
Posted on Reply
#32
Assimilator
TheDeeGee: More memory means they have an excuse for the insane prices.
NVIDIA isn't going to add VRAM to their GPUs just to charge consumers more. That would be a suicidally stupid move, and NVIDIA is run by people who are neither stupid nor suicidal. Somewhat greedy, yes, but not that greedy.
Posted on Reply
#33
Athlonite
bug: Because HBM is totally the same as GDDR, right? :wtf:
It's still designed to do the same job, isn't it? It's just built differently.
Posted on Reply
#34
efikkan
In theory, I wouldn't mind cards having more memory. But I'm worried it will drive up prices, so I would prefer they did it like back in the "old" days: let the AIBs choose between two memory capacities. I once bought a GTX 680 4GB (instead of the usual 2GB) because I needed it for a project, so it's nice to have the option, but most people don't need it. For the current gaming landscape, 8 GB for the mid-range and 12 GB for the high-end should be plenty.

If Nvidia chooses to double their memory, it might be due to what they're expecting from AMD, or because of games they know are coming. But I'm doubtful of the latter, since games are more starved for bandwidth than memory capacity as of now.
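As a rough sketch of the bandwidth side of that argument (the 14-16 Gbps speeds below are common GDDR6 bins assumed for illustration, not confirmed specs):

```python
# Peak theoretical bandwidth of a GDDR memory interface:
# bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

for bus in (256, 320, 384):
    for rate in (14.0, 16.0):
        print(f"{bus}-bit @ {rate} Gbps -> {bandwidth_gbs(bus, rate):.0f} GB/s")
# e.g. 320-bit at 14 Gbps is 560 GB/s: widening the bus raises bandwidth
# without needing bigger (or more) memory chips per channel.
```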
Posted on Reply
#36
EarthDog
efikkan: If Nvidia chooses to double their memory, it might be due to what they're expecting from AMD, or because of games they know are coming. But I'm doubtful of the latter, since games are more starved for bandwidth than memory capacity as of now.
I'm doubtful of either. I'd be surprised if AMD comes with more than 16...
CrAsHnBuRnXp: Nvidia just teased Ampere for Aug 31st.
links.......?
Posted on Reply
#37
efikkan
EarthDog: I'm doubtful of either. I'd be surprised if AMD comes with more than 16...
You might be right, but it also depends on which card they serve with 16GB. I don't think many outside either company have a good idea of what's coming.
Perhaps "big navi" is a ~60 CU, 256-bit card: affordable and decent, but no high-end card. ;)
Posted on Reply
#39
bug
I'm not sure where GDDR6 stands, but AFAIK 8 GB of fast memory will draw 5-10 W.
My money's on this being made up entirely, or something for datacenter parts.
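Scaling that 5-10 W per 8 GB figure linearly (a rough assumption that ignores chip density, clamshell layouts, and data-rate differences), the rumored capacities would look like this:

```python
# Back-of-the-envelope: scale the quoted 5-10 W per 8 GB figure linearly.
# Assumption: power scales roughly with capacity (ignores chip count,
# clamshell layouts, and per-pin speed; ballpark only).
BASE_GB, W_LOW, W_HIGH = 8, 5.0, 10.0

def scaled_power(total_gb: int):
    """Linear extrapolation of the quoted 5-10 W per 8 GB estimate."""
    factor = total_gb / BASE_GB
    return W_LOW * factor, W_HIGH * factor

for total in (8, 20, 24):
    low, high = scaled_power(total)
    print(f"{total} GB -> ~{low:.0f}-{high:.0f} W")
# 20 GB -> ~12-25 W; 24 GB -> ~15-30 W. Noticeable, but small next to
# a high-end card's total board power.
```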
Posted on Reply
#40
ThrashZone
Hi,
Mainstream 20-series cards were pretty skimpy on memory, only 8 GB compared to the 1080 Ti's 11 GB, so any additional memory would be welcome.
But then again, they aren't talking or guessing about mainstream cards here, are they, and these would be north of any mainstream gaming budget.
Posted on Reply
#41
Tomorrow
ThrashZone: Hi,
Mainstream 20-series cards were pretty skimpy on memory, only 8 GB compared to the 1080 Ti's 11 GB, so any additional memory would be welcome.
But then again, they aren't talking or guessing about mainstream cards here, are they, and these would be north of any mainstream gaming budget.
The 2080 Ti was 11 GB, just like the 1080 Ti. Though yes, the 1080 Ti's direct replacement, the 2080, had 8 GB.

Imho Turing is/was one of the worst Nvidia series. Not because it was late, slow, or hot; rather, its price/perf was very poor. It did not offer a meaningful upgrade over Pascal (unless you upgraded from a 1060 to a 2080 or something). The launch was botched, unprofessional, and very haphazard. Very unusual for Nvidia, who is usually known for well-executed launches.
The new features are only now becoming somewhat worth it, but game support is still very limited.

I expect Ampere to be a much bigger step due to 7/8nm and possible AMD competition.
Posted on Reply
#42
lexluthermiester
btarunr: WCCFTech has learned that in addition to a variety of strange new memory bus widths, such as 320-bit, NVIDIA could introduce certain higher variants of its RTX 30-series cards with video memory sizes as high as 20 GB and 24 GB.
Gonna have to agree with many of the above users: citing WCCFTech as a reference source is a very bad idea, as they are a bit of a trash site. Additionally, the use of a 320-bit memory bus is uncommon but certainly not new. NVIDIA has used it several times in the past, specifically with the GeForce 8800 GTS, the GTX 470, the GTX 570, and the Quadro 5000. The first usage dates back 14 years.
Posted on Reply
#44
ThrashZone
Tomorrow: The 2080 Ti was 11 GB, just like the 1080 Ti. Though yes, the 1080 Ti's direct replacement, the 2080, had 8 GB.

Imho Turing is/was one of the worst Nvidia series. Not because it was late, slow, or hot; rather, its price/perf was very poor. It did not offer a meaningful upgrade over Pascal (unless you upgraded from a 1060 to a 2080 or something). The launch was botched, unprofessional, and very haphazard. Very unusual for Nvidia, who is usually known for well-executed launches.
The new features are only now becoming somewhat worth it, but game support is still very limited.

I expect Ampere to be a much bigger step due to 7/8nm and possible AMD competition.
Hi,
Yep.
I recently got a $150 EVGA 980 Ti SC just to hold me over on a meager machine, and a friend said something about a 2060 KO, a $350 card, lol. Both 6 GB cards :)
But yeah, I'm passing on the 20 series completely by design.
All the early dead ones pretty much made that policy.
Posted on Reply
#45
Metroid
That is good; the more the merrier. 8 GB is already the norm, and looking to the future, 16 GB or more is ideal.
Posted on Reply
#46
Th3pwn3r
Tomorrow: The 2080 Ti was 11 GB, just like the 1080 Ti. Though yes, the 1080 Ti's direct replacement, the 2080, had 8 GB.

Imho Turing is/was one of the worst Nvidia series. Not because it was late, slow, or hot; rather, its price/perf was very poor. It did not offer a meaningful upgrade over Pascal (unless you upgraded from a 1060 to a 2080 or something). The launch was botched, unprofessional, and very haphazard. Very unusual for Nvidia, who is usually known for well-executed launches.
The new features are only now becoming somewhat worth it, but game support is still very limited.

I expect Ampere to be a much bigger step due to 7/8nm and possible AMD competition.
If you are gaming at 4K, the upgrade from a 1080 Ti to a 2080 Ti is definitely worth it. Pricing was so bad for both the 10 and 20 series because of the miners and shortages, plus Nvidia being greedy.
Posted on Reply
#47
ThrashZone
Th3pwn3r: If you are gaming at 4K, the upgrade from a 1080 Ti to a 2080 Ti is definitely worth it. Pricing was so bad for both the 10 and 20 series because of the miners and shortages, plus Nvidia being greedy.
Hi,
When the 1080 Ti released, mining wasn't a problem.
It was months after release; the EVGA 1080 Ti FTW3, very close to a top-of-the-line GPU, was $770 at Micro Center where I got mine, and other 1080 Tis were much lower, in the $600 range.
But yes, mining later completely exaggerated prices.
Posted on Reply
#48
Franzen4Real
Assimilator: I've been beating this particular drum of quality over quantity for a while now. Good to know I'm not alone.
Exactly. And quality articles tend to also lead to quality discussion/comments ;)
EarthDog: I really don't see a point in more than 12GB/16GB for this generation.
I agree. This would be especially true if asset streaming from SSDs becomes widely used this gen.
Posted on Reply
#49
QUANTUMPHYSICS
As I sit here with my 2080 Ti and its 11 GB of VRAM... I wonder what I could possibly need 20 GB or 24 GB for?

And all I'm playing is flight sims and Counter-Strike: GO.
Posted on Reply
#50
medi01
Soo... both next-gen consoles go with 16 GB shared between GPU and CPU while targeting 4K. For Sony it's unified; Microsoft has 10 GB of higher-throughput and 6 GB of lower-throughput RAM.

Much faster SSD-to-GPU-RAM transfers, hence no need to pre-load data that is not immediately displayed, is the stated reason for going with conservative sizes (Cerny @ Sony).

Why would next-gen PC graphics cards need more than 16 GB? For people without SSDs?
Even going beyond 10 GB seems like overkill.
Posted on Reply