
NVIDIA Reportedly Working on GeForce RTX 3080 Ti Graphics Card with 20 GB GDDR6X VRAM

GA102 has a 384-bit memory bus that Nvidia can play with. For some reason they did not want to do a full-width card as the x80; product segmentation is definitely a big reason, but I suspect it is not the only one (power or yields, perhaps), especially with the rumored 3080 Ti still having a 320-bit bus. 16 GB would mean going down to a 256-bit memory bus, and they seem to want to avoid that, probably because of the sizable hit to bandwidth. Basically, lots of considerations.
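To put rough numbers on that bandwidth hit, here is a quick back-of-the-envelope sketch (assuming ~19 Gbps GDDR6X; actual per-SKU speeds may differ):

```python
# Back-of-the-envelope GDDR6X bandwidth for different bus widths.
# Assumes ~19 Gbps per pin (3080-class speed); real SKUs may differ.
DATA_RATE_GBPS = 19  # effective per-pin data rate

for bus_width_bits in (256, 320, 384):
    # bits per transfer / 8 bits-per-byte * Gbps per pin = GB/s
    bandwidth_gbs = bus_width_bits / 8 * DATA_RATE_GBPS
    print(f"{bus_width_bits}-bit bus @ {DATA_RATE_GBPS} Gbps -> {bandwidth_gbs:.0f} GB/s")
```

Dropping from 320-bit to 256-bit costs roughly a fifth of the bandwidth at the same memory speed.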

Nah, definitely not yield reasons: if you look at all the 3080 cards, they all have the missing VRAM chips in the same spot. If the GPU dies were binned for bad memory controllers, it would be impossible for all of them to have the defective controller on the same channel. This is purely product segmentation. Power isn't it either; adding an extra 64-bit channel won't add much more than about 10 W at most. They kept the "3080 Ti", if true, at 320-bit just so the 3090 still has some sort of advantage, at least on paper, but I suspect it won't have a real performance impact outside of some edge cases.
 
So AMD pours oil on the fire by asking a partner to announce a 12 GB VRAM requirement (which obsoletes two thirds of the current Nvidia lineup and 100% of last gen) for their upcoming game.
Imagine the outrage if Nvidia did something like that ;)

Power isn't it either; adding an extra 64-bit channel won't add much more than about 10 W at most.
This is something that seems very strange to me for the 3080/3090. Unless GPU-Z returns bogus data, MVDDC power usage is quite high and the GPU itself seems to draw less than I would expect from the total. Check the GPU Chip Power Draw and MVDDC Power Draw readings in GPU-Z. The numbers themselves vary, but the relative amounts seem surprising to me. Also, 3090 cards seem to have heavy-duty backplates; I do not remember seeing backplates with heatpipes before.

For comparison, my 2080 reports 200-210 W of power draw on the GPU and about 20 W on the RAM, plus minor amounts left over for other stuff. I seem to remember the GPU taking the majority of the power budget in earlier generations as well. Has something changed in how this is reported, or is GDDR6X really this power hungry?

Just as an example, here is the first 3080 GPU-Z screenshot a Google search returned:
[Image: nvidia-geforce-rtx-3080-memory-oc-20-gbps-teaser-videocardz-5.png]
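If anyone wants to check their own card, here is a minimal sketch that averages those two readings from a GPU-Z sensor log ("Log to file"). The column names below are assumptions; they vary by card and GPU-Z version, so check your log's header line first.

```python
# Rough sketch: average GPU chip vs. memory (MVDDC) power draw from a
# GPU-Z sensor log. Column names are assumptions -- verify against your log.
import csv

GPU_COL = "GPU Chip Power Draw [W]"   # assumed column name
MEM_COL = "MVDDC Power Draw [W]"      # assumed column name

gpu_samples, mem_samples = [], []
with open("GPU-Z Sensor Log.txt", newline="") as f:
    for row in csv.DictReader(f):
        # GPU-Z pads fields with spaces, so strip keys and values before use
        row = {k.strip(): (v or "").strip() for k, v in row.items() if k}
        try:
            gpu_samples.append(float(row[GPU_COL]))
            mem_samples.append(float(row[MEM_COL]))
        except (KeyError, ValueError):
            continue  # skip rows without usable numbers

if gpu_samples and mem_samples:
    print(f"GPU chip avg: {sum(gpu_samples) / len(gpu_samples):.1f} W")
    print(f"MVDDC avg:    {sum(mem_samples) / len(mem_samples):.1f} W")
```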
 
The 3090 is never going to get a price cut that soon, if ever; that's why we are getting a 3080 Ti.

Sadly, this^. Nvidia does not drop pricing, because that would mean admitting fault to a degree.
Instead they just launch new SKUs to compete.
 
This is something that seems very strange to me for the 3080/3090. Unless GPU-Z returns bogus data, MVDDC power usage is quite high and the GPU itself seems to draw less than I would expect from the total. Check the GPU Chip Power Draw and MVDDC Power Draw readings in GPU-Z. The numbers themselves vary, but the relative amounts seem surprising to me. Also, 3090 cards seem to have heavy-duty backplates; I do not remember seeing backplates with heatpipes before.

For comparison, my 2080 reports 200-210 W of power draw on the GPU and about 20 W on the RAM, plus minor amounts left over for other stuff. I seem to remember the GPU taking the majority of the power budget in earlier generations as well. Has something changed in how this is reported, or is GDDR6X really this power hungry?

Yes, GDDR6X seems to be much more power hungry on both the memory controller and the memory chips, judging by all the more elaborate memory cooling solutions. But again, just one extra chip and one extra 64-bit channel won't draw that much more power. I'd be very surprised if someone could prove me wrong on this.
 
So... basically, Nvidia has been scrambling, trying various configs to fill the perf/price gap with AMD.
 
Nvidia's problem is the complacency that has built up over the years. They led the market and everything was permitted to them, including pricing their products however they pleased. They lulled themselves into believing this would go on forever and that they wouldn't even have to make much of an effort to maintain the status quo.
 
LOL, Jensen has done it again! Price hikes of Turing are here to stay and no one seems to be pissed about it anymore.

2080 Ti $999 -> 3080 Ti $999+ probably
2080 $699 -> 3080 $699
2070 $499 -> 3070 $499
2060S $399 -> 3060 Ti $399 probably

An xx60 MID RANGE class GPU costing the same amount as an entire console. Pure madness.
 
I am not surprised; this is probably the reason they cancelled the 20 GB version of the RTX 3080: it would have gotten too expensive.

Welcome to the Nvidia Screw-over train, we take your money and screw you over after a little while :roll:

The same thing happened to Pascal Titan owners: when the GTX 1080 Ti launched it was cheaper and faster at a lot of things than the Titan was, so the Titan got phased out because they can't have a card that outperforms their Titan, and then a new Pascal Titan was launched with even more CUDA cores :laugh:
 
This might be the first Ampere card outside of the 3090 that's not hamstrung by a small memory pool.

Unfortunately it'll most probably be priced $200 too high.
 
Price hikes of Turing are here to stay and no one seems to be pissed about it anymore.
I'm plenty pissed, but there's nothing to be done, considering AMD will follow suit on prices. The price war has died.
 
Panic mode.
I am so happy with this news, haha.
Another nuclear reactor to warm up the house in the upcoming winter season.
Survivor? :(
 
This is going to grind the gears of early adopters. Want to know what will grind them even more? When the refreshes pop up on the 7 nm node next year, they will probably be called the Super Hyper Ultra edition for lolz. Then we can also have more models to pick from for extra confusion, at prices that aren't mainstream... and of course, you can have one for a mere $1,000,000 from a scalper near you.
 
You can run games with insufficient VRAM just fine. Stutter doesn't always appear in reviews and canned benchmarks, but I'm sure W1zz is right on the money trying to figure that out if the moment arrives. Nvidia can deploy a lot of driver trickery to still provide a decent experience; they did something similar with the 970, for example - all the VRAM-related bugs were fixed on that GPU.

Does all that TRULY mean there is enough VRAM, though? That is debatable. The real comparison is side by side, with careful analysis of frametimes. Gonna be interesting.
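If anyone wants to do that comparison themselves, here is a minimal sketch for crunching a frametime log (assuming a plain text file with one frametime in milliseconds per line, exported from whatever capture tool you use):

```python
# Minimal frametime analysis: average FPS plus the 99th-percentile frametime
# ("1% low"), which is where VRAM-related stutter tends to show up.
def analyze(path: str) -> None:
    with open(path) as f:
        frametimes_ms = sorted(float(line) for line in f if line.strip())
    if not frametimes_ms:
        return

    avg_ms = sum(frametimes_ms) / len(frametimes_ms)
    # 99th-percentile frametime ~= the threshold of the slowest 1% of frames
    p99_index = min(int(len(frametimes_ms) * 0.99), len(frametimes_ms) - 1)
    p99_ms = frametimes_ms[p99_index]

    print(f"Average: {1000 / avg_ms:.1f} fps ({avg_ms:.2f} ms)")
    print(f"1% low:  {1000 / p99_ms:.1f} fps ({p99_ms:.2f} ms)")

analyze("frametimes.txt")  # hypothetical log file name
```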
Through the years of playing games and following game requirements, I have the impression that devs decide requirements, and VRAM in particular, either by the cards available (or coming within a month or two) on the market, or more often just by throwing darts.

On my 290X 4 GB I've played titles that "required" 6 or even 8 GB of VRAM just fine. I've dialed texture quality up above the recommended setting when I didn't like how the game looked and still had no problems from lack of VRAM. So judging what is enough based on game requirements is a bit pointless.
Set a price range, check what meets your performance requirements, and buy the card with the highest amount of VRAM that fits your budget; you're good to go. By the time games look too ugly because you had to lower textures, the card will be long dead FPS-wise.
As for the 970, the problem was never the amount of VRAM; the slow 0.5 GB segment was what caused the issues, as it tanked performance very hard. Once Nvidia isolated that 0.5 GB with drivers, 970s worked fine even with titles that required 4+ GB of VRAM.
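To put numbers on why that slow segment hurt, here is a rough sketch using the commonly reported 970 memory split (treat the figures as approximations):

```python
# Rough bandwidth of each GTX 970 memory partition, assuming 7 Gbps GDDR5
# on a 256-bit bus split into a 224-bit (3.5 GB) and a 32-bit (0.5 GB) segment.
DATA_RATE_GBPS = 7

fast_gbs = 224 / 8 * DATA_RATE_GBPS   # ~196 GB/s for the 3.5 GB segment
slow_gbs = 32 / 8 * DATA_RATE_GBPS    # ~28 GB/s for the 0.5 GB segment

print(f"3.5 GB segment: ~{fast_gbs:.0f} GB/s")
print(f"0.5 GB segment: ~{slow_gbs:.0f} GB/s")
print(f"The slow segment is roughly {fast_gbs / slow_gbs:.0f}x slower")
```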

On the tech level, both camps have a different approach to working around VRAM limitations.
Nvidia's lossless compression allows them to keep lower capacities and a narrower bus while preserving higher performance, so they fit as little memory as possible for bigger margins.

With GCN, AMD had to throw a lot of memory bandwidth at the problem (the 7970 had a 384-bit bus, the 290X 512-bit, and Fury, Vega and the VII used HBM buses of 4096, 2048 and 4096 bits respectively) to provide enough "fuel" for the GPU, but it was never enough. Since RDNA, AMD's memory bus tops out at 256-bit, which used to be reserved for their mid-range cards (no doubt the 5700 XT itself is a mid-range card), and now with RDNA2 even their top-tier 6900 has a 256-bit bus. Sure, the new cache provides higher speeds, but you still need to feed that cache at an adequate rate, and AMD thinks that what used to be a mid-range bus is now enough even for the flagship.
I think the 16 GB of VRAM on AMD's cards is targeted more at feeding that cache (load all the textures into VRAM so the cache has instant access without calls out to system RAM or storage), and/or they believe they can get a significant performance boost from direct CPU access to VRAM, so they made sure to provide enough VRAM for devs to play with. A rough illustration of the cache idea is sketched below.
It will be interesting to see if those things really help AMD.
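Here is that rough illustration of the cache idea; the hit rate and bandwidth figures are made-up assumptions for the sake of the example, not AMD's numbers:

```python
# Illustration only: how a large on-die cache can stretch a narrower bus.
# All figures below are assumptions, not measured or official numbers.
def effective_bandwidth(dram_gbs: float, cache_gbs: float, hit_rate: float) -> float:
    # Requests served from the cache use the fast on-die path; misses go to DRAM.
    return hit_rate * cache_gbs + (1 - hit_rate) * dram_gbs

narrow_bus = 512     # GB/s, e.g. a 256-bit GDDR6 card (assumption)
wide_bus = 936       # GB/s, e.g. a 384-bit GDDR6X card (assumption)
on_die_cache = 1600  # GB/s, assumed cache bandwidth

for hit_rate in (0.0, 0.4, 0.6):
    eff = effective_bandwidth(narrow_bus, on_die_cache, hit_rate)
    print(f"hit rate {hit_rate:.0%}: ~{eff:.0f} GB/s effective vs {wide_bus} GB/s raw on the wide bus")
```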

Imagine the outrage if Nvidia did something like that ;)
I don't have to imagine anything; they already did it with HairWorks, forced tessellation, the GameWorks extensions and PhysX. I don't remember the outrage, though. :rolleyes:
Now that AMD holds the consoles and devs have to optimize for AMD's hardware, the coin has flipped and Nvidia gets quite jumpy whenever something comes close to taking away the "performance crown".
A single game announcement is enough to cause... leakages :rolleyes:
By the way, PhysX has been open source for some time now ;)
 
I'm plenty pissed, but there's nothing to be done, considering AMD will follow suit on prices. The price war has died.

Yeah, it seems like AMD has chosen higher profit margins over gaining more market share. I'm getting out of the GPU market: I'll hold on to my 1080 Ti as long as I can and then buy something for 300 bucks on the second-hand market. I'm unwilling to support greed.
 
You all think this is just another gaming card? How naive...

The 3090 is a workstation class GPU that will be sold in droves to VFX studios and freelancers, who will buy them in pairs to use NVlink and get a much needed 48 GB for 3D rendering.

The 3080 Ti fills the gap for low-end workstations. The 3080's 10 GB just doesn't cut it for rendering or even complex video editing and FX. AMD's 6900 XT was looking like the right purchase until this announcement.

That's why this 3080 Ti makes tons of sense outside gaming. I, for one, will buy it the instant I can find it in stock.
 
The 3090 is a workstation class GPU that will be sold in droves to VFX studios and freelancers, who will buy them in pairs to use NVlink and get a much needed 48 GB for 3D rendering.

lol, no.
Those VFX studios are buying Mac Pros
 
Who cares.

"Auto-Notify"
"Not-In-Stock"
"Backordered"
 
Called it. I said from the start, when the 3080 20 GB was rumored, that it would be a Ti version.
 
Nvidia is still acting arrogantly, refusing to compete on price and instead offering a largely worthless single-digit performance differential.

Pride.
 
LOL, Jensen has done it again! Price hikes of Turing are here to stay and no one seems to be pissed about it anymore.

1080 Ti $700 - 2080 Ti $1200 -> 3080 Ti $1200+ probably
1080 $500 - 2080 $699 -> 3080 $699
1070 $350 - 2070 $499 -> 3070 $499
1060 $250 - 2060S $399 -> 3060 Ti $399 probably

An xx60 MID RANGE class GPU costing the same amount as an entire console. Pure madness.

Added for perspective.
 