
MSI GeForce RTX 3090 Ti Suprim X

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,986 (3.74/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The MSI GeForce RTX 3090 Ti is an impressive quad-slot graphics card. In our review, we found that it runs quieter than the other RTX 3090 Ti cards tested today. MSI also gave the card a large factory overclock, and it handles 4K60 with ease.

 
A Porsche 911-class card for bragging rights and for those who need ultimate performance no matter what (overclockers, people with a ton of money, etc.). A last hurrah of the Ampere architecture.

There's nothing to discuss really.

I'm looking forward to Ada Lovelace. Hopefully we'll get decent performance below 200 W. This trend of increasing power consumption doesn't look healthy.
 
Interesting: "power consumption is very high, but when taking the achieved performance into account, it roughly matches the RTX 3090. Compared to other Ampere cards this means efficiency is reduced by 10%, and it's 25% worse than AMD's RDNA2 offerings."

That 469 W while gaming is earth-shattering.
 
Interesting: "power consumption is very high, but when taking the achieved performance into account, it roughly matches the RTX 3090. Compared to other Ampere cards this means efficiency is reduced by 10%, and it's 25% worse than AMD's RDNA2 offerings."

That 469 W while gaming is earth-shattering.

People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.

[charts: maximum, peak, and average power consumption]
 
People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.

[charts: maximum, peak, and average power consumption]

The 7990 was a dual-chip card too, and it only pulled 277 W typical. The R9 290 was rubbished at the time for its power consumption.

I wouldn't criticise people who wish to buy the card; that's down to personal liberty. But producing it at all is a backwards step by Nvidia, IMO.
 
People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.
[charts: maximum, peak, and average power consumption]

Do they? “The Radeon R9 295X2 combines two graphics processors to increase performance.” The 3090 Ti is not 2x the 3090 on one PCB.

I'm looking forward to Ada Lovelace. Hopefully we'll get decent performance below 200 W. This trend of increasing power consumption doesn't look healthy.
I hope the next gen in the second half of the year (Ada Lovelace / RDNA3 / Alchemist), or at least the gen after, starts to really innovate again.
The past couple of years bought any performance gains with a massive tax on price, heat output, and energy consumption, effectively stagnating or in some cases regressing as a net result.
 
Can it handle 4K120? No, 60 Hz is only playable on OLED... only if you get the newest 4090 Ti six-slot 600 W monster truck, which only lasts a year until the new games hit.

This is blatantly false. The 1080 Ti, released 5 years ago, still lets you play all modern games (sans RTX) at 1080p/1440p, and most modern triple-A titles at 4K, albeit at slightly reduced visual quality. It roughly matches the performance of the RTX 3060.

Considering that GPUs have been getting more and more expensive recently, game developers are in no rush to obsolete older cards, because that would mean lost sales.

Once again, enthusiasts on tech websites prove how little they care about the world outside. The vast majority of people have GPUs that cost well under $300.

[chart: Steam Hardware Survey, February 2022]


Do they? “The Radeon R9 295X2 combines two graphics processors to increase performance.” The 3090 Ti is not 2x the 3090 on one PCB.

Check the second comment on the article. Also reread my comment about the upcoming dual-die GPUs.

I'm preparing to quit this hobby looking at the power consumption of these cards. I'm just done.

Do. Not. Buy. What a drama. Not.
 
If the 3090 was absurd at $1500, then the 3090 Ti.........WTF?
 
The 7990 was a dual-chip card too, and it only pulled 277 W typical. The R9 290 was rubbished at the time for its power consumption.

I wouldn't criticise people who wish to buy the card; that's down to personal liberty. But producing it at all is a backwards step by Nvidia, IMO.
The fact that he had to pull out some 8-year-old dual 28 nm GPU card and conveniently forgot about the Titan Z. ;)
 
My 3080 (375 W Aorus Master) puts out too much heat into my room to be viable without undervolting and/or 60 fps caps when gaming from April to September. The next GPU I buy won't have a TBP above 250 W. I could probably game for 20-30 minutes with a 3090 Ti in my PC before giving up.
 
I'm trying to understand how the efficiency is comparable to the 3090, given the power and performance numbers from your charts. The 3090 Ti consumes 32% more power (469 W / 355 W) but the performance uplift is only 12% (92 fps / 82 fps). Am I missing something?
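For what it's worth, the 1440p figures quoted here do imply a perf-per-watt regression. A quick sketch, using only the numbers in this post:

```python
# Perf-per-watt check from the Cyberpunk 1440p figures quoted above.
fps_ti, w_ti = 92, 469   # RTX 3090 Ti
fps_90, w_90 = 82, 355   # RTX 3090

power_ratio = w_ti / w_90 - 1                  # ~0.32 -> "32% more power"
perf_ratio = fps_ti / fps_90 - 1               # ~0.12 -> "12% more performance"
eff_ratio = (fps_ti / w_ti) / (fps_90 / w_90)  # ~0.85 -> ~15% worse fps/W
print(f"+{power_ratio:.0%} power, +{perf_ratio:.0%} perf, "
      f"{eff_ratio:.0%} relative efficiency")
```

By these numbers the 3090 Ti is roughly 15% less efficient than the 3090, which is the apparent contradiction the question is asking about.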
 
man that power recommendation.....
 
I'm trying to understand how the efficiency is comparable to the 3090, given the power and performance numbers from your charts. The 3090 Ti consumes 32% more power (469 W / 355 W) but the performance uplift is only 12% (92 fps / 82 fps). Am I missing something?
This is normal. If you power-limit the 3090 Ti, you will get the same or better performance per watt. If you overclock a 3090, you will hit the same power consumption as the 3090 Ti and end up with worse perf/watt.
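The intuition behind power limiting: dynamic power scales roughly with the cube of clock/voltage, while performance scales roughly linearly with clock. A toy illustration (the cube law is a rule-of-thumb assumption for voltage/frequency scaling, not measured data for these cards):

```python
# Toy model: performance ~ clock, board power ~ clock^3.
# Illustrative rule of thumb only, not measured GPU data.
def perf(clock):
    return clock          # relative performance

def power(clock):
    return clock ** 3     # relative power draw

stock, limited = 1.00, 0.90   # hypothetical 10% clock reduction
drop_perf = 1 - perf(limited) / perf(stock)
drop_power = 1 - power(limited) / power(stock)
print(f"perf: -{drop_perf:.0%}, power: -{drop_power:.0%}")
```

Under this model a 10% clock cut costs ~10% performance but saves ~27% power, which is why a power-limited 3090 Ti can match or beat a stock card on perf/watt.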
 
Efficiency-wise it's 50% more power for 25% more performance, so the 3080 10 GB should be about 120% as efficient at 4K, but in the V-Sync 60 Hz test it's only about 110% (110 W vs 100 W).
This is blatantly false. The 1080 Ti, released 5 years ago, still lets you play all modern games (sans RTX) at 1080p/1440p, and most modern triple-A titles at 4K, albeit at slightly reduced visual quality. It roughly matches the performance of the RTX 3060.
Yes, but I expect no compromises from this beefy GPU for years to come in all triple-A titles with all the eye candy enabled. And to think an RTX 4060 Ti could dethrone it in less than a year, leaving it with only its 24 GB buffer to show for itself. So much for future-proofing.
 
Yeah, but that is two GPUs.

We're supposed to be moving forward, not backwards.

Physics is standing in the way, sorry. It's only gonna get worse unless we find a way to send signals without losing a ton of energy in the process. Optical computing is unfortunately limited to data transfer and not much more nowadays.

I still don't understand the need to come and shit on products you don't need, can't afford, or find inappropriate. Why? People normally don't get riled up about luxury cars, houses that cost tens of millions of dollars, etc. Why go crazy about this particular card, which is basically a status item and not much more?

Efficiency-wise it's 50% more power for 25% more performance, so the 3080 10 GB should be about 120% as efficient at 4K, but in the V-Sync 60 Hz test it's only about 110% (110 W vs 100 W).

Yes, but I expect no compromises from this beefy GPU for years to come in all triple-A titles with all the eye candy enabled. And to think an RTX 4060 Ti could dethrone it in less than a year, leaving it with only its 24 GB buffer to show for itself. So much for future-proofing.

You don't know that. We have nothing but rumors about Ada. It may indeed deliver similar performance at a slightly lower power consumption.
 
It seems like a pointless release when there are signs of Ada coming later this year. A regular 3090 would be very close to this after some overclocking. The power consumption is also obscene.
 
I'm trying to understand how the efficiency is comparable to the 3090, given the power and performance numbers from your charts. The 3090 Ti consumes 32% more power (469 W / 355 W) but the performance uplift is only 12% (92 fps / 82 fps). Am I missing something?
MSI 3090 Ti: 149.9 FPS @ 469 W
RTX 3090 FE: 112.3 FPS @ 355 W
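Taken at face value, the power-run figures above work out to nearly identical performance per watt. A quick check, using only the numbers quoted in this post:

```python
# Perf-per-watt from the reviewer's power-run figures quoted above.
fps_3090ti, watts_3090ti = 149.9, 469
fps_3090, watts_3090 = 112.3, 355

eff_ti = fps_3090ti / watts_3090ti   # ~0.320 fps/W
eff_90 = fps_3090 / watts_3090       # ~0.316 fps/W
print(f"3090 Ti: {eff_ti:.3f} fps/W, 3090: {eff_90:.3f} fps/W")
print(f"ratio: {eff_ti / eff_90:.2%}")   # ~101%, i.e. roughly matching
```

That ~1% difference is what supports the review's "roughly matches RTX 3090" efficiency claim, in contrast to the 1440p numbers used in the question.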
 
MSI 3090 Ti: 149.9 FPS @ 469 W
RTX 3090: 112.3 FPS @ 355 W
I thought you used Cyberpunk 2077 at 1440p to calculate the efficiency metric. If that's the case, then it's:

MSI 3090 Ti: 92 fps @ 469 W
RTX 3090: 82 fps @ 355 W
 
I thought you used Cyberpunk 2077 at 1440p to calculate the efficiency metric. If that's the case, then it's:

MSI 3090 Ti: 92 fps @ 469 W
RTX 3090: 82 fps @ 355 W
I'm doing a separate run for power on a different machine (the one with the power-testing capability).
 
At least it's good to see that none of these space heaters are getting any kind of recommendation stamp - though IMO the criticism for excessive power draws ought to be amped up a bit.

As I see it, the only reason Nvidia is launching this is to ease the transition to the rumored power draws of the next generation. Nothing else makes sense. Sure, they're squeezing a few hundred dollars more out of a few thousand people, but that pales in comparison to the R&D costs for making these SKUs even if they are essentially identical to higher clocked 3090s. But the main point has to be the normalization of cards nearing 500W at the high end.
 
Finally there is a graphics card (no matter the price) on which we can play a few years old games (like RDR2) at 1080p@60Hz with full details at "stupidly" low consumption :)
EDIT: Can I ask what game was used in that V-Sync 60 Hz power consumption summary? ...I just read it in the testing details :oops: @W1zzard
 