
MSI GeForce RTX 3070 Ti SUPRIM X and Ventus 3X Pictured, 8GB GDDR6X Confirmed, GA104-based

btarunr

Editor & Senior Moderator
Here are some of the first press-shots of the upcoming MSI GeForce RTX 3070 Ti SUPRIM X and MSI GeForce RTX 3070 Ti Ventus 3X graphics cards. Boxes of the cards confirm 8 GB GDDR6X as the memory configuration of the RTX 3070 Ti. Taking a close look at the press-shot of the RTX 3070 Ti SUPRIM X, and comparing it with those of the already-launched RTX 3070 SUPRIM X and RTX 3080 SUPRIM X, we find that the card looks closer to the RTX 3070 SUPRIM X. This would indicate that MSI is reusing the PCB and cooler design from that card, which means that the RTX 3070 Ti likely maxes out the GA104 silicon, rather than being a heavily cut-down GA102.

A maxed-out GA104 would mean 6,144 CUDA cores spread across 48 streaming multiprocessors, 192 tensor cores, 48 RT cores, 192 TMUs, and 96 ROPs. The chip also features a 256-bit wide memory interface, which we now know is capable of handling fast GDDR6X memory. Besides significantly increased memory bandwidth, the RTX 3070 Ti could also dial up GPU clock speeds. NVIDIA probably finds these changes sufficient to compete with the Radeon RX 6800, which outclasses the RTX 3070 in non-raytraced gaming.
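A quick back-of-the-envelope sketch in Python puts numbers on that bandwidth jump. The 19 Gbps figure is an assumption borrowed from the RTX 3080's GDDR6X; NVIDIA has not confirmed the 3070 Ti's memory speed:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak memory bandwidth in GB/s: (bus width in bits / 8 bits per byte) * per-pin rate.
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(256, 14.0))  # RTX 3070, GDDR6 @ 14 Gbps      -> 448.0 GB/s
print(peak_bandwidth_gbs(256, 19.0))  # RTX 3070 Ti @ 19 Gbps (assumed) -> 608.0 GB/s
```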



View at TechPowerUp Main Site
 
Compete with 8 GB against 16 GB, like how? It'll be 10-12% faster than the 3070, the same jump as from the 2070 (October) to the 2070 SUPER (July the next year), +12% at the same price. So if this comes in at $699, multiplied by 3x for the mining craze, it would be a disaster.
 
Did they figure out a way to properly cool GDDR6X? If this isn't an LHR model, the 3070 Ti would be a godsend for mining.
 
I'm surprised they're sticking to 8 GB, and more surprised that the rumored 3080 Ti will have only 12 GB. NVIDIA is really dropping the ball on the VRAM front; we've had 8 GB cards since the Radeon R9 290X back in 2014. At this point that amount should be standard for low-end cards.
 
I'm surprised they're sticking to 8 GB, and more surprised that the rumored 3080 Ti will have only 12 GB. NVIDIA is really dropping the ball on the VRAM front; we've had 8 GB cards since the Radeon R9 290X back in 2014. At this point that amount should be standard for low-end cards.
Well, the most probable problem is the use of GDDR6X VRAM. Micron is the only manufacturer of it, and there are no 16 Gb chips out there, so you either double the chip count in a clamshell configuration or leave it at 8 GB.
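The chip-count arithmetic is easy to sketch in Python. Each GDDR6/6X package sits on a 32-bit channel, so the bus width fixes the chip count, and clamshell mode doubles it; the same formula also covers the rumored 3080 Ti and the 3090 discussed below:

```python
def gddr_capacity_gb(bus_width_bits: int, chip_density_gbit: int = 8,
                     clamshell: bool = False) -> float:
    # Each GDDR6/6X chip occupies one 32-bit channel of the memory bus.
    chips = bus_width_bits // 32
    if clamshell:
        chips *= 2  # two chips share each channel, one per side of the PCB
    return chips * chip_density_gbit / 8  # Gbit per chip -> GB

print(gddr_capacity_gb(256))                  # 8.0 GB  -> 3070 Ti as boxed
print(gddr_capacity_gb(256, clamshell=True))  # 16.0 GB -> would need 16 chips
print(gddr_capacity_gb(384))                  # 12.0 GB -> rumored 3080 Ti
print(gddr_capacity_gb(384, clamshell=True))  # 24.0 GB -> RTX 3090
```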
 
I'm surprised they're sticking to 8 GB, and more surprised that the rumored 3080 Ti will have only 12 GB. NVIDIA is really dropping the ball on the VRAM front; we've had 8 GB cards since the Radeon R9 290X back in 2014. At this point that amount should be standard for low-end cards.
High-density GDDR6X is still incredibly expensive, hence why it's only featured on the 3090. The 3080 Ti would have to have either 12 GB or 24 GB, and the latter would render the 3090 obsolete. And if they don't want to render the 3090 obsolete, then they can't use the expensive double-density chips on the 3070 Ti either, because that would render the 3080 Ti obsolete.
 
Did they lose an "E" somewhere?

It is nice to look at, I suppose...
 
Well, I'm quite surprised they've stuck with 8 GB of VRAM. I play Metro Exodus at 3840x1080 (~4.15 MP) and already see about 7.9 GB of VRAM in use. Yes, I get that NVIDIA is slightly more efficient with VRAM usage, but my half-4K resolution has already eaten up almost 8 GB, and I wonder how much more VRAM is needed for the somewhat higher resolution of 3440x1440 (~4.95 MP). Regardless, I recall one of my games hitting just under 10 GB of VRAM; I'll have to check which game that was...
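For what it's worth, here's the pixel math with a deliberately naive extrapolation, sketched in Python. The scaling is an upper bound, since only render targets grow with resolution while textures and geometry stay fixed; the 7.9 GB baseline is just my Metro Exodus observation above:

```python
def megapixels(width: int, height: int) -> float:
    return width * height / 1e6

BASE_W, BASE_H, BASE_VRAM_GB = 3840, 1080, 7.9  # observed in Metro Exodus

for w, h in [(3840, 1080), (3440, 1440), (3840, 2160)]:
    mp = megapixels(w, h)
    # Naive upper bound: scale observed VRAM use linearly with pixel count.
    est = BASE_VRAM_GB * mp / megapixels(BASE_W, BASE_H)
    print(f"{w}x{h}: {mp:.2f} MP, naive estimate <= ~{est:.1f} GB")
```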
 
High-density GDDR6X is still incredibly expensive, hence why it's only featured on the 3090. The 3080 Ti would have to have either 12 GB or 24 GB, and the latter would render the 3090 obsolete. And if they don't want to render the 3090 obsolete, then they can't use the expensive double-density chips on the 3070 Ti either, because that would render the 3080 Ti obsolete.

True, it would be expensive to double the amount, and it puts the 3090 in a really rough spot. But with how insane prices have been, and the willingness of gamers to buy cards at the $1,000 mark, they could easily recover the cost.

I guess the 'upgrade' to GDDR6X chips is biting them in the a$$ this generation.
 
Everybody is bitching about the VRAM thing, but they don't realize that almost no game uses even half of that at 1080p or even 1440p, which is what these cards are designed for... And I'm talking about ACTUAL usage, not caching extra VRAM for paging.
 
8 GB huh? Knew it. Well, at least I won't regret my 3070. Time to be glad my card doesn't have G6Xtra Hot. :p

And it seems people still don't know the difference between VRAM allocation and actual usage. Many AAA games allocate all of the VRAM; that doesn't mean they're using all of it. Good luck actually using over 8 GB in anything at 1440p.
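If anyone wants to check what their own GPU actually holds, NVML exposes it; here's a minimal sketch using the pynvml bindings. Caveats: NVML's device-wide "used" counter includes driver reservations, and per-process numbers come back as None on Windows under WDDM, so treat this as an approximation rather than ground truth:

```python
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Device-wide view: total vs. currently used VRAM.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"used {mem.used / 2**30:.2f} GiB of {mem.total / 2**30:.2f} GiB")

# Per-process view: what each running graphics app holds.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = (f"{proc.usedGpuMemory / 2**30:.2f} GiB"
            if proc.usedGpuMemory else "n/a (WDDM)")
    print(f"pid {proc.pid}: {used}")

pynvml.nvmlShutdown()
```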
 
Well, the most probable problem is the use of GDDR6X VRAM. Micron is the only manufacturer of it, and there are no 16 Gb chips out there, so you either double the chip count in a clamshell configuration or leave it at 8 GB.
We don't need GDDR6Xtreme temperatures, we just want future-proof VRAM for current-gen gaming. The consoles and AMD GPUs have 16 GB of memory; it's time for NVIDIA to catch up. This reeks of a monopoly, forcing people to keep buying new cards when their not-that-old ones will already struggle to max out details a few years from now.
 
Everybody is bitching about the VRAM thing, but they don't realize that almost no game uses even half of that at 1080p or even 1440p, which is what these cards are designed for... And I'm talking about ACTUAL usage, not caching extra VRAM for paging.

No, just take Resident Evil Village as an example. We warned that 8 GB of VRAM wouldn't be enough for future games, especially with RTX features, and behold, we were right. So now people need to upgrade again next year if they don't want less performance than a 3060 with RTX on. RTX, which these cards were made for. Oh, the irony. Did I mention how much I HATE IRONY?!
 
Everybody is bitching about the VRAM thing, but they don't realize that almost no game uses even half of that at 1080p or even 1440p, which is what these cards are designed for... And I'm talking about ACTUAL usage, not caching extra VRAM for paging.
Are you kidding?

Go look at PC port reviews of AAA games from 2019 until now: almost all of them consume 5.7-6 GB minimum, and can consume up to 8 or 12 GB when they're further enhanced on PC, like Doom Eternal or Resident Evil Village. Those are PS4 games; what will happen when PC ports of PS5-only games release? VRAM requirements will climb even higher.

New COD titles will also eat as much VRAM as you have available for allocation, so we're starting to see devs raising VRAM requirements because of the new consoles, yet NVIDIA is still releasing upper-mid-range cards with 8 GB like it's 2016. Ridiculous.
 
Everybody is bitching about the VRAM thing, but they don't realize that almost no game uses even half of that at 1080p or even 1440p, which is what these cards are designed for... And I'm talking about ACTUAL usage, not caching extra VRAM for paging.
Who on earth thinks the 3070 Ti is a 1080p gaming card? LMFAO, dude, 1080p is the realm of the 5500 XT these days, the 5600 XT for high-refresh-rate systems. The 3070/Ti are 1440p144/4K60 cards, and they'll need the VRAM to last. GPUs have long lifespans these days, especially given the high cost of modern cards.
 