Wednesday, November 4th 2020

NVIDIA Reportedly Working on GeForce RTX 3080 Ti Graphics Card with 20 GB GDDR6X VRAM

A leak from renowned (and usually on-point) leaker Kopite7kimi claims that NVIDIA has finally settled on new graphics cards to combat AMD's RX 6800 threat after all. Having previously been reported (though never confirmed) to be working on double-memory configurations of its RTX 3070 and RTX 3080 graphics cards (with 16 GB GDDR6 and 20 GB GDDR6X, respectively), the company is now said to have settled on a 20 GB RTX 3080 Ti to face an (apparently; pending independent reviews) resurgent AMD.

The RTX 3080 Ti specs paint a card with the same CUDA core count as the RTX 3090, with 10496 FP32 cores, over the same 320-bit memory bus as the RTX 3080. Kopite includes board and SKU numbers (PG133 SKU 15) along with a new GPU codename: GA102-250. The performance differentiators against the RTX 3090 stand to be the memory amount, the memory bus, and possibly core clockspeed; memory speed and board TGP are reported to mirror those of the RTX 3080, so somewhat reduced clocks compared to that graphics card are expected. That CUDA core count means NVIDIA is essentially divvying up the same GA102 die between its RTX 3090 (good luck finding one in stock) and the reported RTX 3080 Ti (so good luck finding one of those in stock as well, should the time come). It is unclear how pricing would work out for this SKU, but pricing comparable to that of the RX 6900 XT is the more sensible speculation. Take this report with the usual amount of NaCl.
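For rough context, a back-of-the-envelope sketch of what those memory figures imply, assuming the reported card really does mirror the RTX 3080's 19 Gbps GDDR6X over a 320-bit bus (the 19.5 Gbps and 384-bit figures are the RTX 3090's published specs):

```python
# Theoretical memory bandwidth: bus width (bits) x data rate (Gbps per pin) / 8.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(320, 19.0))   # RTX 3080 / reported RTX 3080 Ti: 760.0 GB/s
print(bandwidth_gbs(384, 19.5))   # RTX 3090: 936.0 GB/s
```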
Sources: Kopite7kimi @ Twitter, via Videocardz

140 Comments on NVIDIA Reportedly Working on GeForce RTX 3080 Ti Graphics Card with 20 GB GDDR6X VRAM

#1
Vayra86
Again? Oh of course, the rumored Nvidia Ampere stack is full, as they've already had a 3060ti, 3070ti... next week we'll probably see another bit of rumor about the 3060ti. Rinse and repeat?
#2
ExcuseMeWtf
Godfall's announced system requirements, and suddenly the borderline-vaporware RTX 3080 non-Ti isn't looking so hot anymore with its 10 GB :roll:
#4
Accelerator
It might be one of the earlier ES (engineering sample) cards, only discovered recently and being pitched as a 3080 Ti. In the 20 series there were ES versions of the 2080 Ti with 4352 CUDA cores and 12 GB, 4480 CUDA cores with 11 GB or 12 GB, and 4608 CUDA cores with 11 GB or 12 GB. However, those cards were never released; you can only find them on second-hand trading platforms.
#5
the54thvoid
Intoxicated Moderator
It's all a bit silly when there's absolutely minimal stock in the channel. I'm guessing we'll see these in proper numbers early 2021?
#6
Chomiq
They need to start working on getting the cards to stores at msrp instead of scalping left and right on pre-orders that take forever to deliver.
#7
delshay
Man, I really love competition. I'm expecting a price cut of the 3090 some time in the future (within the next three months).
#8
AnarchoPrimitiv
Nvidia might as well announce five more cards at $1,000+ to target a market that represents an exceedingly small number of consumers, and with virtually no stock, so that for all intents and purposes it only exists on paper, all for the sake of.... Pride? Fuel for online fanboys/trolls? Maybe instead they should work on actually getting cards on the shelves so people can actually buy them, instead of spending effort on creating more paper launches.
#9
londiste
Vayra86: Again? Oh of course, the rumored Nvidia Ampere stack is full, as they've already had a 3060ti, 3070ti... next week we'll probably see another bit of rumor about the 3060ti. Rinse and repeat?
Interesting how Nvidia plans are leaking these days. AIBs, I suppose?
After all the rumors, this 20GB 3080Ti does kind of make sense though. From specs it has the sole purpose of matching RX 6900XT (that undercuts 3090 primarily with price).
#10
ZoneDymo
I wonder if the article below is somehow related….
#11
Space Lynx
Astronaut
lol nvidia just got a fire lit under their butt. its glorious. jensen get skill boi! Lisa Su got them skills, chick is straight up level 90 Ninjitsu
#12
Vya Domus
There is something quite surreal about planning to launch a new product that provides maybe 10% more performance at best when the rest of the lineup is still almost nonexistent in stores. Their strategy is also bewildering: this is clearly going to be a $1,000 card competing with the 6900 XT, but why? AMD was never going to sell many 6900 XTs, so why not lower the price of the 3080 or 3070? Nvidia is still acting arrogantly, refusing to compete on price and instead offering a largely worthless single-digit performance differential. You know who kept doing that as well? Intel, and look what happened: AMD gained a colossal amount of mindshare in just 2-3 years, everyone marvels at every one of their products and scoffs whenever Intel comes out with something new, plus they ended up having to lower their prices anyway.
londiste: Interesting how Nvidia plans are leaking these days. AIBs, I suppose?
That is indeed quite odd; there are usually no significant leaks regarding Nvidia products, not even right up to launch. But this time it was very different; for instance, the FE design was leaked months before release.

But the thing is most of these leaks are more often than not intentional, meant to drive interest in upcoming products.
#13
ShurikN
delshay: Man, I really love competition. I'm expecting a price cut of the 3090 some time in the future (within the next three months).
The 3090 is never going to get a price cut that soon, if ever; that's why we're getting a 3080 Ti.
#14
ratirt
londiste: Interesting how Nvidia plans are leaking these days. AIBs, I suppose?
After all the rumors, this 20GB 3080Ti does kind of make sense though. From specs it has the sole purpose of matching RX 6900XT (that undercuts 3090 primarily with price).
It kinda makes sense but these cards will still be a niche. I'd rather focus on the 3070 and 6800 type of performance mostly.
#15
medi01
:D:D:D

I guess this means:

#16
Vayra86
the54thvoid: It's all a bit silly when there's absolutely minimal stock in the channel. I'm guessing we'll see these in proper numbers early 2021?
I'm having a strong Vega deja vu, where the company in question realizes it has a released stack with weak margins that will still get overrun completely. Ampere seems to be moving into that space rapidly as more information gets out from both camps.

The result was pretty weak availability until much later when the card's performance was hardly competitive. Sure, that was touted as a demand / production problem, too.

Smoke > Fire.
londiste: Interesting how Nvidia plans are leaking these days. AIBs, I suppose?
After all the rumors, this 20GB 3080Ti does kind of make sense though. From specs it has the sole purpose of matching RX 6900XT (that undercuts 3090 primarily with price).
The 20 GB card made sense to begin with, but only because the 3080 is somehow based on 10 GB. The VRAM capacity still makes no sense in the larger picture. 12 or 16 would have been much more suitable. Realistically, their 20 GB release would completely cannibalize their halo 3090, even before they managed to sell those properly, and it would also make the 3080 less competitive in a way.

Watch this gen's VRAM requirements unfold... already it's looking like the 3070 might not even be in an optimal place either. This also fits right in with Nvidia pre-empting those console announcements and overpromising on availability. They know they're screwed.
#17
Vya Domus
Vayra86: The VRAM capacity still makes no sense in the larger picture.
It makes sense to them: 10 GB appears to be too little and 20 GB too much. Psychologically, you need to either settle for something you know isn't quite enough or bite the bullet and pay the extra $300 or whatever this one will cost.
#18
Vayra86
Vya Domus: It makes sense to them: 10 GB appears to be too little and 20 GB too much. Psychologically, you need to either settle for something you know isn't quite enough or bite the bullet and pay the extra $300 or whatever this one will cost.
It doesn't fit with previous Nvidia generations, where every time they offered sufficient VRAM in the high end with some wiggle room on top. Nvidia still has some outs, like DLSS and other proprietary stuff they announced with Ampere, but still... I think they got blindsided here. Above all, Nvidia is a company that wants to sell and wants positive mindshare. That is what brought them to where they are now, not shitty releases that aren't quite enough. This is an exception to the rule.
#19
TumbleGeorge
10 GB... 12 GB via 384-bit vs 20 GB via 320(?)-bit vs 24 GB via 384-bit?

Guess:
RTX 3080 10 GB: cut $100-150, from $699 to $599-549, to compete with the RX 6700 XT?
RTX 3080 12 GB to compete with the RX 6800 (without "XT")
RTX 3080 Ti 20 GB to compete with the RX 6800 XT, the RX 6900 XT, and the RTX 3090 too :D To compete well with the RX 6800 XT, the RTX 3080 Ti 20 GB must get a price equal to the RTX 3080 10 GB's launch MSRP ($699) or something close, $749-799(?)


Panic defense?
#20
ne6togadno
Marketing wars.
Nvidia has trouble providing quantities of their new lineup.
People on the internet are considering canceling pre-orders and switching to "Big Navi".
So AMD pours oil on the fire by asking a partner to announce a 12 GB VRAM requirement (which obsoletes 2/3 of the current Nvidia lineup and 100% of last gen) for their upcoming game.
Minutes later, a leaker leaks news of a 3080 Ti with 20 GB VRAM (because, you know, 20 > 16).
I'll put a pint on a bet that we'll see more Ti/Super leaks soon.
And another pint that Godfall at 4K ultra settings will play at 60 fps just fine on an 8 GB card, if the GPU itself has enough horsepower to do 4K@60.
#21
The Quim Reaper
People missing out on early 3080 stock haven't just dodged a bullet but a full on artillery barrage... :p
#22
Vayra86
ne6togadno: Marketing wars.
Nvidia has trouble providing quantities of their new lineup.
People on the internet are considering canceling pre-orders and switching to "Big Navi".
So AMD pours oil on the fire by asking a partner to announce a 12 GB VRAM requirement (which obsoletes 2/3 of the current Nvidia lineup and 100% of last gen) for their upcoming game.
Minutes later, a leaker leaks news of a 3080 Ti with 20 GB VRAM (because, you know, 20 > 16).
I'll put a pint on a bet that we'll see more Ti/Super leaks soon.
And another pint that Godfall at 4K ultra settings will play at 60 fps just fine on an 8 GB card, if the GPU itself has enough horsepower to do 4K@60.
You can run games with insufficient VRAM just fine; stutter doesn't always appear in reviews and canned benchmarks. But I'm sure the W1zz will be right on the money trying to figure that out if the moment arrives. Nvidia can deploy a lot of driver trickery to still provide a decent experience; they've done similar with the 970, for example - all the VRAM-related bugs were fixed on that GPU.

Does that all TRULY mean there is enough VRAM though? That is debatable. The real comparison is side by side and careful analysis of frametimes. Gonna be interesting.
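For illustration, a minimal sketch (not W1zzard's actual methodology; the run data below is made up) of why frametime percentiles catch VRAM-related stutter that an FPS average hides:

```python
# Compare average FPS against "1% low" FPS (here: FPS at the 99th-percentile frametime).
def stutter_metrics(frametimes_ms):
    n = len(frametimes_ms)
    ordered = sorted(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    p99 = ordered[min(n - 1, int(n * 0.99))]   # 99th-percentile frametime in ms
    return avg_fps, 1000.0 / p99

# Two hypothetical 100-frame runs: the second has occasional 60 ms spikes,
# e.g. textures being swapped over PCIe once VRAM runs out.
smooth = [16.7] * 100
spiky  = [15.0] * 95 + [60.0] * 5
print(stutter_metrics(smooth))  # ~ (59.9, 59.9)
print(stutter_metrics(spiky))   # average FPS still ~58, but the 1% low collapses to ~16.7
```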
#23
londiste
Vayra86: The 20 GB card made sense to begin with, but only because the 3080 is somehow based on 10 GB. The VRAM capacity still makes no sense in the larger picture. 12 or 16 would have been much more suitable. Realistically, their 20 GB release would completely cannibalize their halo 3090, even before they managed to sell those properly, and it would also make the 3080 less competitive in a way.
You are looking at the reasoning from the wrong side. 12 or 16 do not make inherently more sense than 10 or 20; we are just more used to seeing those capacities.

GA102 has a 384-bit memory bus that Nvidia can play with. For some reason they did not want to do a full-width card as the x80; product segmentation is definitely a big reason, but I would suspect not the only one (power or yields, perhaps), especially with the rumored 3080 Ti still having a 320-bit bus. 16 GB would mean going down to a 256-bit memory bus, and they seem to want to avoid that, probably because of the sizable hit to bandwidth. Basically, lots of considerations.
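To visualize that trade-off (illustrative only: one GDDR6X chip per 32-bit channel, 1 GB per chip today, or double that via clamshell/denser chips):

```python
# Capacity options per GA102 memory bus width, at one 32-bit channel per chip.
for bus_bits in (256, 320, 384):
    channels = bus_bits // 32
    print(f"{bus_bits}-bit bus -> {channels} chips -> {channels} GB or {channels * 2} GB")

# 256-bit bus -> 8 chips -> 8 GB or 16 GB
# 320-bit bus -> 10 chips -> 10 GB or 20 GB
# 384-bit bus -> 12 chips -> 12 GB or 24 GB
```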
Vayra86: Nvidia can deploy a lot of driver trickery to still provide a decent experience; they've done similar with the 970, for example - all the VRAM-related bugs were fixed on that GPU.
There were no real bugs to be fixed; that whole 3.5+0.5 GB split was a design problem in hardware. From what I could see at the time, what Nvidia eventually did as a workaround was force basic usage, like Windows' Aero and other non-gaming allocations (which did not care about speed that much), into that last 0.5 GB, and use it for 3D loads only when absolutely necessary. It did help that there were only a couple of games in the first couple of years of the GTX 970's lifespan that actually had memory consumption in the 3.5-4 GB range, effectively making the impact surface surprisingly small.
#24
Vayra86
londiste: You are looking at the reasoning from the wrong side. 12 or 16 do not make inherently more sense than 10 or 20; we are just more used to seeing those capacities.

GA102 has a 384-bit memory bus that Nvidia can play with. For some reason they did not want to do a full-width card as the x80; product segmentation is definitely a big reason, but I would suspect not the only one (power or yields, perhaps), especially with the rumored 3080 Ti still having a 320-bit bus. 16 GB would mean going down to a 256-bit memory bus, and they seem to want to avoid that, probably because of the sizable hit to bandwidth. Basically, lots of considerations.
I agree; I think this started with the choice of Samsung and having to adapt to those limitations. But in the end we're looking at these products as customers, right? 10 GB doesn't seem like the optimal selling point for capacity, so why would Nvidia design something that's a choice of evils? They even know they have work to do to gain parity with consoles, if you look at some of the tech they announced, like RTX IO. You're right, lots of considerations; I question whether they made the right ones.
#25
owen10578
londiste: GA102 has a 384-bit memory bus that Nvidia can play with. For some reason they did not want to do a full-width card as the x80; product segmentation is definitely a big reason, but I would suspect not the only one (power or yields, perhaps), especially with the rumored 3080 Ti still having a 320-bit bus. 16 GB would mean going down to a 256-bit memory bus, and they seem to want to avoid that, probably because of the sizable hit to bandwidth. Basically, lots of considerations.
Nah, definitely not yield reasons: if you look at all the 3080 cards, they all have missing VRAM chips in the same spots. If the GPU dies were binned for bad memory controllers, it would be impossible for all of them to have a defective controller on the same channel. This is purely a product segmentation decision. Power isn't much of a reason either; adding an extra 64-bit channel won't add much more than probably 10 W at most. They kept the "3080 Ti", if true, at 320-bit just so the 3090 still has some sort of advantage, at least on paper, but I suspect it won't have a real performance impact outside of some edge cases.