
MSI GeForce RTX 3090 Ti Suprim X

Joined
May 2, 2017
Messages
7,762 (3.03/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Finally there is a graphics card (no matter the price) on which we can play a few years old games (like RDR2) at 1080p@60Hz, full details, at "stupidly" low power consumption :)
EDIT: Can I ask what game was used in that V-Sync 60Hz power consumption summary? ...I just read it in the testing details :oops: @W1zzard
Considering how little power my 6900 XT consumes at 1440p60 in most games (75W-ish in Elden Ring, though that's hardly very demanding, just buggy AF), it could be pretty much anything - though I kind of expect it to be at 4k given the seeming advantage of Ampere over RDNA2 in that graph. Care to share some details, @W1zzard ?
 
Joined
Jun 21, 2013
Messages
541 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
These are so bad that the Founders Editions in France were posted for sale 3.5 hours ago and they are still in stock, lol.
 
D

Deleted member 202104

Guest
Considering how little power my 6900 XT consumes at 1440p60 in most games (75W-ish in Elden Ring, though that's hardly very demanding, just buggy AF), it could be pretty much anything - though I kind of expect it to be at 4k given the seeming advantage of Ampere over RDNA2 in that graph. Care to share some details, @W1zzard ?

From the Power consumption page (click the Power Consumption Testing Details button near the top):

V-Sync: If you don't need the highest framerate and want to conserve power, running at 60 FPS is a good option. In this test, we run Cyberpunk 2077 at 1920x1080, capped to 60 FPS. This test is also useful for testing a graphics card's ability to react to situations with only low power requirements. For graphics cards that can't reach 60 FPS at 1080p, we report the power draw at the highest achievable frame rate.
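The capped-framerate methodology quoted above boils down to a fixed frame-time budget with any leftover slept off. A minimal sketch of that idea (a generic illustration only, not the reviewer's actual harness; `render_frame` is a placeholder):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.67 ms budget per frame

def render_frame():
    # placeholder for real rendering work
    pass

def run_capped(num_frames):
    """Render num_frames at no more than TARGET_FPS, sleeping off unused budget."""
    start = time.perf_counter()
    deadline = start
    for _ in range(num_frames):
        render_frame()
        deadline += FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            # the GPU/CPU idling here is where the power savings come from
            time.sleep(remaining)
    return time.perf_counter() - start
```

With light per-frame work the loop spends most of each 16.7 ms slice asleep, which is exactly the low-load state this part of the review probes.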
 
Joined
Jan 18, 2020
Messages
686 (0.44/day)
You just know they wanted to push these to 550w+ but the design just couldn't handle it.

Got to wait for the 4090 till we see the full 600W monster!

In terms of performance, I'd be interested to see this up against the XFX 6900 XT Zero WB with the power limit also pushed up to 450W odd.
 
Last edited:
Joined
Sep 20, 2014
Messages
36 (0.01/day)
3 reasons for this release.

1. To ensure the 6950XT does not get performance crown.
2. Inflate value of next gen RTX 4070/4080. Also pumps up their performance per watt improvements.
3. To milk the more-money-than-sense crowd who believe this card being the fastest on the market is worth 2k, while not realizing this privilege will only last 5 months or so. This is the least important factor, simply due to the low volume of this product. The marketing purposes of the first two points are far more valuable.

If this had been an AMD product with similar performance differences, we would be mostly praising AMD for finally getting the performance crown back. But with this launch and the 20% gap in performance, Nvidia has likely succeeded in staving off AMD from taking the performance crown for now.
 
Joined
Jun 14, 2020
Messages
2,678 (1.88/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I'm a bit confused by the comments. I'm an owner of an aftermarket 3090 that can reach 470W with the stock BIOS. Currently running a 550W BIOS. What is new about this? Most 3090s with 3x8-pin could reach roughly the same consumption. Why are people going crazy all of a sudden? Did they expect the 3090 Ti to consume less than the 3090? I'm deeply confused...
 
Joined
Feb 18, 2005
Messages
5,292 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
Obviously fake review, because it doesn't pull the 600+W that the REALLY REALLY SMART people have been claiming for months.

/s, for those who aren't REALLY REALLY SMART.

Yeah, but that is two GPUs.

We're supposed to be moving forward, not backwards.
GA102 has 28.3 billion transistors in 628 mm², or ~45 million transistors per mm².
2x Vesuvius have 12.4 billion transistors in 876 mm², or ~14 million transistors per mm².
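Those density figures are easy to sanity-check with trivial arithmetic (no assumptions beyond the transistor counts and die areas quoted above):

```python
# transistor counts and die areas as quoted above
ga102_density = 28.3e9 / 628      # transistors per mm², single die
vesuvius_density = 12.4e9 / 876   # two dies combined

print(round(ga102_density / 1e6))     # ~45 M transistors/mm²
print(round(vesuvius_density / 1e6))  # ~14 M transistors/mm²
```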

Apparently, fundamental physics escapes you.

These are so bad that the Founders Editions in France were posted for sale 3.5 hours ago and they are still in stock, lol.
I'm sure that has nothing at all to do with a price that very few can afford.

I'm a bit confused by the comments. I'm an owner of an aftermarket 3090 that can reach 470W with the stock BIOS. Currently running a 550W BIOS. What is new about this? Most 3090s with 3x8-pin could reach roughly the same consumption. Why are people going crazy all of a sudden? Did they expect the 3090 Ti to consume less than the 3090? I'm deeply confused...
Your confusion will abate once you realise that most of the people pretending they're horrified are just AMD fanboys.
 
Last edited:
Joined
Dec 5, 2013
Messages
606 (0.16/day)
Location
UK
I'm preparing to quit this hobby looking at the power consumption of these cards. I'm just done.
You don't need to throw the whole PC gaming hobby away just to keep up with someone else's high-end epeen / industry "FOMO hype". Personally I'm perfectly fine with 1080p-1440p + 100-160W GPUs and have zero interest in 4K gaming (so fps never plummeted for me in the first place the way it did for the 4K crowd, who need 4-digit (price & wattage) GPUs to 'brute-force' it back up again). Likewise, having gone through my whole collection of almost 2,000 PC games last weekend, I can't find a single modern AAA performance turd amongst the 500 games with the highest gameplay / hours played / most fun or memorable moments that form the bulk of my playtime. So "I need a 500W GPU or I can't have fun" is definitely not true.
 
Last edited:
Joined
Nov 21, 2009
Messages
20 (0.00/day)
People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.

You do make a point comparing these two halo products, but

Power consumption has already got too damn high across the board compared to this historical chart. If I take into account the current leaks, it will stay the same or get even higher.

This is bad for PC gaming overall, and it will just push people like myself (200W GPU and 200W CPU is the max I can take) to consoles: prices will probably be too high, along with high power consumption and high heat output.
 
Joined
Sep 17, 2014
Messages
20,993 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
And people in the 4090 topic over yonder are saying 'muh muh 600W, of course it won't'... but this one already hits 480W. And yes, 'you don't have to buy it'... but sooner rather than later we've set the norm for much higher TDP GPUs. Turing was up from half a decade of stable top-end TDPs. Ampere was up and away. What's next? Mars? Those nodes aren't getting a whole lot smaller, so perhaps GPUs need some fundamental changes to make their generational jump worthwhile.

You do make a point comparing these two halo products, but

Power consumption has already got too damn high across the board compared to this historical chart. If I take into account the current leaks, it will stay the same or get even higher.

This is bad for PC gaming overall, and it will just push people like myself (200W GPU and 200W CPU is the max I can take) to consoles: prices will probably be too high, along with high power consumption and high heat output.
Important takeaway from that chart: top-end SKUs used to circle 200-225W, with 240W at the upper end. Where are we now? :) 240W is x70-x80 territory. This 3090 Ti doubles it.
 
Last edited:
Joined
Sep 17, 2014
Messages
20,993 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Double power use, yet SLI is "gone". Makes sense to me.
A big culprit is the limitation of die size. Those dies are big already. The clocks need to be high. Where are those chiplet GPUs...

You don't need to throw the whole PC gaming hobby away just to keep up with someone else's high-end epeen / industry "FOMO hype". Personally I'm perfectly fine with 1080p-1440p + 100-160W GPUs and have zero interest in 4K gaming (so fps never plummeted for me in the first place the way it did for the 4K crowd, who need 4-digit (price & wattage) GPUs to 'brute-force' it back up again). Likewise, having gone through my whole collection of almost 2,000 PC games last weekend, I can't find a single modern AAA performance turd amongst the 500 games with the highest gameplay / hours played / most fun or memorable moments that form the bulk of my playtime. So "I need a 500W GPU or I can't have fun" is definitely not true.
This is absolutely true as well... the price of entry into gaming isn't increasing a whole lot, to be fair; the baseline of 'quality' is in a good place even at sub-mid-range. That is, now that GPU prices are going down again... just a little more pls...
 
Joined
Jan 24, 2022
Messages
456 (0.55/day)
You don't need to throw the whole PC gaming hobby away just to keep up with someone else's high-end epeen / industry "FOMO hype". Personally I'm perfectly fine with 1080p-1440p + 100-160W GPUs and have zero interest in 4K gaming (so fps never plummeted for me in the first place the way it did for the 4K crowd, who need 4-digit (price & wattage) GPUs to 'brute-force' it back up again). Likewise, having gone through my whole collection of almost 2,000 PC games last weekend, I can't find a single modern AAA performance turd amongst the 500 games with the highest gameplay / hours played / most fun or memorable moments that form the bulk of my playtime. So "I need a 500W GPU or I can't have fun" is definitely not true.
I'm not putting any GPU above 250W in my machine, though cards under that limit are going to become a rarity at this point. My 3070 has been undervolted to 160W from 240W stock, and it delivers much better efficiency than stock while still beating a 2080 Ti. That's what I want.

Latest and greatest is not what I'm after - but good performance at a reasonable wattage is. And that too is becoming a rarity.

Why are we ditching efficiency for balls-to-the-wall wattage just to get a mere 5-10% increase (if even that) in performance? Just because it's a desktop doesn't mean you should crank up the wattage, or that it doesn't matter since desktops have good cooling. Efficiency still matters. I've heard that 4060 = 3090 and 7600 XT = 6900 XT. Of course they have comparable performance when they also probably have comparable power draw...

Wake me up when we go back to innovating, performance, and efficiency - and not simply turning power sliders up until the GPU is at its limit and selling it as a new model.
 
Last edited:
Joined
Dec 5, 2013
Messages
606 (0.16/day)
Location
UK
Latest and greatest is not what I'm after - but good performance at a reasonable wattage is. And that too is becoming a rarity.
Why are we ditching efficiency for balls-to-the-wall wattage just to get a mere 5-10% increase (if even that) in performance? Just because it's a desktop doesn't mean you should crank up the wattage, or that it doesn't matter since desktops have good cooling. Efficiency still matters. I've heard that 4060 = 3090 and 7600 XT = 6900 XT. Of course they have comparable performance when they also probably have comparable power draw...

Wake me up when we go back to innovating, performance, and efficiency - and not simply turning power sliders up until the GPU is at its limit and selling it as a new model.
Oh I agree. In one of my rigs I have a GTX 1660 (120W, though I even undervolted that to 88W) that runs 99% of what I want to play these days. But I think they've simply hit a wall. 4K and ray tracing drove up demand (as do ever less optimised games) just after all the easy per-generation efficiency leaps we had with Maxwell, Pascal, etc. ended. So the only way of meeting "I need triple the horsepower for my 4K ray tracing" now is to triple the wattage. Personally, I find the whole rat race ridiculous and wouldn't touch a 250W+ GPU with a barge pole either (made easier for me by having lost interest in many "must have" AAA and multiplayer games), but I can see why a lot of people are considering switching to console if the PC industry doesn't get its act together over the next couple of years (and start making games more efficient, if the hardware's architectural efficiency has genuinely hit a hard wall).

Edit: The "canary in the coal mine" signalling that the party is over for massive efficiency gains has been the low end. When you ignore GPUs of different wattage (and Nvidia branding drift) and just compare the same wattage across generations: the GTX 1060 (120W, 2016) was a huge jump over the GTX 960 (120W, 2015) after just 1 year, the GTX 1660S (120W, 2019) was a much smaller jump even after 3 years, and the RTX 3050 (120W, 2022) is hardly any improvement at all after another 3 years. The only reason the RTX 2060 was faster than the GTX 1660 is that the wattage went up to 160W. If you were to take the RTX 2060 and RTX 3060 and benchmark both capped to 120-130W, it would highlight just how much "like for like" efficiency gains have slowed to a crawl since Turing...
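The "same wattage across generations" comparison above is just iso-power performance gain. A trivial helper shows the calculation (the fps values below are hypothetical, purely for illustration, not benchmark data):

```python
def iso_power_gain(fps_old, fps_new):
    """Percent performance gain between two cards held at the same board power."""
    return (fps_new / fps_old - 1) * 100

# hypothetical 120W-capped averages, for illustration only
print(iso_power_gain(40.0, 70.0))            # 75.0 -> a Pascal-sized jump
print(round(iso_power_gain(70.0, 77.0), 1))  # 10.0 -> a crawl
```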
 
Last edited:
Joined
Jun 8, 2015
Messages
4 (0.00/day)
Processor 10875H
Motherboard Tong-fang
Memory 32GB Corsair Vengeance 3200MHz
Video Card(s) RTX 3070 140w
Storage Samsung 970 Evo Plus 1TB + Crucial P2 1TB
Display(s) 17.3 IPS 1440p 165Hz
"Significantly faster than RTX 3090 non-Ti"

It's less than 10% when you compare an aftermarket 3090 vs an aftermarket 3090 Ti; how on earth can less than 10% be deemed significant?
 
Joined
Jun 18, 2018
Messages
150 (0.07/day)
Processor i5 3570K @ 5GHz (1.32V) - de-lid
Motherboard ASRock Z77 Extrem 4
Cooling Prolimatech Genesis (3x Thermalright TY-141 PWM) - direct die
Memory 2x 4GB Corsair VENGEANCE DDR3 1600 MHz CL 9
Video Card(s) MSI GTX 980 Gaming 4G (Alpenföhn Peter + 2x Noctua NF-A12) @1547 Mhz Core / 2000 MHz Mem
Storage 500GB Crucial MX500 / 4 TB Seagate - BarraCuda
Case IT-9001 Smasher -modified with a 140mm top exhaust
Audio Device(s) AKG K240 Studio
Power Supply be quiet! E9 Straight Power 600W
Do. Not. Buy. What a drama. Not.

I still don't understand the need to come and shit on products you don't need/can't afford/find inappropriate. Why?? People normally don't get riled up about luxury cars, or houses which cost tens of millions of dollars, etc. Why go crazy about this particular card, which is basically a status item and not much more?
The issue here is that they don't correlate. A luxury mansion does not drive up normal house prices, and an expensive Bentley won't affect the price of a VW Up.
Nvidia, on the other hand, is using halo products like the Titan and now the xx90 (Ti) branding to establish higher prices throughout the whole line-up.

Sure, you don't have to buy them. However, not pointing out that they are charging more and more for less just normalizes the process.
 
Last edited:

Youlocalbox

New Member
Joined
Mar 29, 2022
Messages
2 (0.00/day)
11% better performance at 4K vs the 3090 isn't really that awful, but it isn't really impressive either, especially because the 3090 Ti is going to be way more expensive at retail.

The ray tracing surprised me a lot; it was good to see a solid jump in ray tracing performance vs the 3090.

Hopefully the retail price isn't too far from the current 3090 average price.
 
Joined
Jul 19, 2016
Messages
476 (0.17/day)
You just know they wanted to push these to 550w+ but the design just couldn't handle it.

Got to wait for the 4090 till we see the full 600W monster!

In terms of performance, I'd be interested to see this up against the XFX 6900 XT Zero WB with the power limit also pushed up to 450W odd.

There might be some aftermarket 4090s that push closer to 700W+, which would be staggeringly stupid to use anywhere outside Siberia or the North Pole.

I'm not putting any GPU above 250W in my machine, though cards under that limit are going to become a rarity at this point. My 3070 has been undervolted to 160W from 240W stock, and it delivers much better efficiency than stock while still beating a 2080 Ti. That's what I want.

Latest and greatest is not what I'm after - but good performance at a reasonable wattage is. And that too is becoming a rarity.

Why are we ditching efficiency for balls-to-the-wall wattage just to get a mere 5-10% increase (if even that) in performance? Just because it's a desktop doesn't mean you should crank up the wattage, or that it doesn't matter since desktops have good cooling. Efficiency still matters. I've heard that 4060 = 3090 and 7600 XT = 6900 XT. Of course they have comparable performance when they also probably have comparable power draw...

Wake me up when we go back to innovating, performance, and efficiency - and not simply turning power sliders up until the GPU is at its limit and selling it as a new model.

I agree with you; 250W is the absolute ceiling for me because of the heat output and noise such a card would spit out. Ideally sub-200W for high end, which is where I thought we'd be by now, but the reality is going to be 3x that for high end instead.
 
Joined
Jan 20, 2019
Messages
1,290 (0.67/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
With energy prices on the high... performance for the asking price on the low... IT'S MADNESS!!!

Honestly, a few years back when purchasing a 1080 Ti for an odd £600/700 I thought I was losing the plot. I was under the impression that prices would eventually become more reasonable with later-generation product stacks. How on earth did we end up going above this sort of price range? Forget the pandemic, shortages or whatnot... the trend was already set; it just got pushed a little ahead. I can't see myself paying more than £600 for a decent gaming card, and even that while feeling I'm being ripped off.

So I have to ask... (forget relative pricing) are these manufacturers pulling more profit with each generational upgrade, or is it in line with costs? If it's the latter, I get it; otherwise I'm pulling a finger (whilst buying their cards lol) at these manufacturers and retailers.
 
Joined
Nov 26, 2021
Messages
1,367 (1.53/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
With energy prices on the high... performance for the asking price on the low... IT'S MADNESS!!!

Honestly, a few years back when purchasing a 1080 Ti for an odd £600/700 I thought I was losing the plot. I was under the impression that prices would eventually become more reasonable with later-generation product stacks. How on earth did we end up going above this sort of price range? Forget the pandemic, shortages or whatnot... the trend was already set; it just got pushed a little ahead. I can't see myself paying more than £600 for a decent gaming card, and even that while feeling I'm being ripped off.

So I have to ask... (forget relative pricing) are these manufacturers pulling more profit with each generational upgrade, or is it in line with costs? If it's the latter, I get it; otherwise I'm pulling a finger (whilst buying their cards lol) at these manufacturers and retailers.
Profit margins.
 
Joined
Nov 6, 2016
Messages
1,588 (0.58/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
People do have a very short memory, and before you say "these are two cards", well, according to rumors, both NVIDIA and AMD will soon release MCM GPUs which are basically dual-die products.

Actually, ONLY AMD will have an MCM GPU next generation, and from all the leaks the Nvidia 4000 series is slated to be even more power hungry.

3 reasons for this release.

1. To ensure the 6950XT does not get performance crown.
2. Inflate value of next gen RTX 4070/4080. Also pumps up their performance per watt improvements.
3. To milk the more-money-than-sense crowd who believe this card being the fastest on the market is worth 2k, while not realizing this privilege will only last 5 months or so. This is the least important factor, simply due to the low volume of this product. The marketing purposes of the first two points are far more valuable.

If this had been an AMD product with similar performance differences, we would be mostly praising AMD for finally getting the performance crown back. But with this launch and the 20% gap in performance, Nvidia has likely succeeded in staving off AMD from taking the performance crown for now.
I don't know... check HWBOT, and all the single-card GPU world records are held by the 6900 XT.

Obviously fake review, because it doesn't pull the 600+W that the REALLY REALLY SMART people have been claiming for months.

/s, for those who aren't REALLY REALLY SMART.


GA102 has 28.3 billion transistors in 628 mm², or ~45 million transistors per mm².
2x Vesuvius have 12.4 billion transistors in 876 mm², or ~14 million transistors per mm².

Apparently, fundamental physics escapes you.


I'm sure that has nothing at all to do with a price that very few can afford.


Your confusion will abate once you realise that most of the people pretending they're horrified are just AMD fanboys.
If I remember correctly, it was all the Nvidia fanboys who couldn't stop talking about efficiency when Maxwell was around, and after that... they never brought it up again.
 
Joined
Jan 20, 2019
Messages
1,290 (0.67/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Profit margins.

That sucks! Honestly, if I had crazy amounts of cash to splurge I still wouldn't buy these top-end cards. I'm just happy to buy anything that gives me around 100-120 fps in the games I play at high settings at 1440p... gonna stick with that performance target! I recently purchased a build from a trusted family friend with a used 2080 Ti at a decent price and I'm over the moon. Speaking of "trusted" sellers, even the used market is a difficult place when you can't tell how hard a card was run during the crypto crunch.
 
Joined
Apr 18, 2013
Messages
1,260 (0.31/day)
Location
Artem S. Tashkinov
The issue here is that they don't correlate. A luxury mansion does not drive up normal house prices, and an expensive Bentley won't affect the price of a VW Up.
Nvidia, on the other hand, is using halo products like the Titan and now the xx90 (Ti) branding to establish higher prices throughout the whole line-up.

Sure, you don't have to buy them. However, not pointing out that they are charging more and more for less just normalizes the process.

And there's... zero reason for the 3090 Ti to drive prices up. If other GPUs are released at prices people cannot afford, those cards will not sell and the company will go out of business. That's called logic. You're exactly right about the halo status, which also means a halo price point no one cares about, except some people here in the comments who never wanted to buy this GPU anyway.

If not for the miners and the weird logistics issues which I cannot really explain (nor have I read anything satisfactory as to why we have a semiconductor crisis: we didn't have one, and then it suddenly emerged, WTF?), we wouldn't have had these insane prices over the past year and a half. The old law of supply and demand at work.
 
Last edited:
Joined
Nov 26, 2021
Messages
1,367 (1.53/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
That sucks! Honestly, if I had crazy amounts of cash to splurge I still wouldn't buy these top-end cards. I'm just happy to buy anything that gives me around 100-120 fps in the games I play at high settings at 1440p... gonna stick with that performance target! I recently purchased a build from a trusted family friend with a used 2080 Ti at a decent price and I'm over the moon. Speaking of "trusted" sellers, even the used market is a difficult place when you can't tell how hard a card was run during the crypto crunch.
It sounds like you got a great deal. To be fair, at MSRP, all cards below the $700 mark, i.e. the RTX 3080/6800 XT and lower, were decently priced. Unfortunately, MSRP turned out to be a mirage.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,783 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
This really isn't a new thing; we've had late-cycle refreshes before, we've had big power consumption before, we've had gunning for the crown before... Oh well, to each their own outrage, it seems.

The RTX 40 series will not have its lineup's price/performance ratio based off the 3090 Ti, nor will the entire lineup have 450W+ power consumption, except maybe the silly halo product that we always knew was stupid for gamers, like the 3090 is and was. Samsung 8nm was sub-optimal compared to TSMC's 7nm, everyone already seems to agree with that, and we know the 40 series is back on TSMC.

There will be <250W cards that likely offer 3080+ performance, and with hot competition from AMD they have every chance of reasonable MSRPs too. Beyond the 3080 10/12GB things just get silly, and that's nothing new, except in name and outright performance.
 