
NVIDIA GeForce RTX 3090 Founders Edition Potentially Pictured: 3-slot Behemoth!

Mitsman

New Member
Joined
Jun 30, 2020
Messages
4 (0.00/day)
Haven't NVIDIA x90 cards historically been dual-GPU cards? That could explain the high price and the card's size. I can't see any other reason why they would bring back an xx90-series card unless it's a dual GPU.
 
Joined
Aug 5, 2008
Messages
557 (0.10/day)
Location
Hampshire, UK
System Name If you name your systems, get a boy/girlfriend...
Processor i7 4770k
Motherboard Asus Maximus VI Formula
Cooling Custom waterloop around Black Ice GTX 360
Memory 16GB DDR3
Video Card(s) GTX 1080 FE
Storage Samsung 850 Pro 1TB
Case HAF 932
Audio Device(s) Onboard
Power Supply Corsair HX750
Software Windows 10 x64
So, as the cooler becomes a larger and larger chunk of the card's cost, we should expect watercooled versions to become cheaper (i.e. not way more expensive than the FE). Right? Right...
 
Joined
Jun 28, 2018
Messages
299 (0.14/day)
Haven't NVIDIA x90 cards historically been dual-GPU cards? That could explain the high price and the card's size. I can't see any other reason why they would bring back an xx90-series card unless it's a dual GPU.

Very unlikely, but it would be a good plot twist. :cool:

Both AMD and Nvidia have been increasingly distancing themselves from the dual-GPU thing, including SLI/Crossfire.

It's a solution that has many disadvantages. The future will be MCM designs and that is what both companies are working on.
 
Joined
Mar 7, 2010
Messages
954 (0.19/day)
Location
Michigan
System Name Daves
Processor AMD Ryzen 3900x
Motherboard AsRock X570 Taichi
Cooling Enermax LIQMAX III 360
Memory 32 GiG Team Group B Die 3600
Video Card(s) Powercolor 5700 xt Red Devil
Storage Crucial MX 500 SSD and Intel P660 NVME 2TB for games
Display(s) Acer 144Hz 27in. 2560x1440
Case Phanteks P600S
Audio Device(s) N/A
Power Supply Corsair RM 750
Mouse EVGA
Keyboard Corsair Strafe
Software Windows 10 Pro
Massive card with a massive price. I doubt AMD will be able to compete. Maybe with RDNA 3, which will bring the chiplet GPU design and hopefully a massive boost in performance.

AMD doesn't need to compete for the three people who can afford that...
 
Joined
Dec 29, 2010
Messages
3,449 (0.71/day)
Processor AMD 5900x
Motherboard Asus x570 Strix-E
Cooling Hardware Labs
Memory G.Skill 4000c17 2x16gb
Video Card(s) RTX 3090
Storage Sabrent
Display(s) Samsung G9
Case Phanteks 719
Audio Device(s) Fiio K5 Pro
Power Supply EVGA 1000 P2
Mouse Logitech G600
Keyboard Corsair K95
That is obnoxiously large, ridonkulous!
 
Joined
Jul 5, 2013
Messages
25,559 (6.49/day)
Honestly though, if done neatly that could be a pretty cool case mod: literally cutting into the side panel to have the video card stick out, basically one of those muscle cars, but for a PC.
As cool as I could likely make it, the base hardware is already slightly bottlenecking the RTX 2080 I have in it now. While there would still be a boost in GPU performance, the CPU bottleneck would become very pronounced. I think the RTX 2080 is the best GPU one can pair with an unlocked socket 1366 Xeon and expect to get reasonably balanced performance.

Some people are exaggerating a lot; the dimensions don't seem to be very different from most 2080 Ti AIB versions.
While you make a good point, it is big enough that fitting it into some systems will be a challenge, and impossible in others.
 
Last edited:
Joined
Mar 18, 2007
Messages
16 (0.00/day)
Processor Phenom II 940 @ 3.6 Ghz
Motherboard Gigabyte MA790X-UD4
Cooling OCZ Vendetta 2
Memory 8 GB DDR2-800 Mushkin Extreme
Video Card(s) Geforce GTX 580
Storage A lot of them. Over 10 TB
Display(s) Benq GW2400D 1920x1200
Case Coolermaster Dominator CM690
Audio Device(s) HDA X-Mystique
Power Supply OCZ StealthXStream 600W+TT VGA Power 250
Software Windows 7 x64
Very unlikely, but it would be a good plot twist. :cool:

Both AMD and Nvidia have been increasingly distancing themselves from the dual-GPU thing, including SLI/Crossfire.

It's a solution that has many disadvantages. The future will be MCM designs and that is what both companies are working on.

Yes, it has SOME disadvantages, but also a lot of advantages. The reason to kill SLI has a lot to do with the overinflated prices of the 2080 Ti. If SLI were to work in most games like in the past, would you rather get a 2080 Ti, or get a second 1080 Ti that would be a lot cheaper and much more powerful?
 
Joined
Sep 8, 2018
Messages
51 (0.02/day)
GeForce GTX 960 = $199 -> fast forward 6 years -> RTX 3060 = $399 (+100%)
GeForce GTX 980 Ti = $649 -> fast forward 6 years -> RTX 3090 = $1,399 (+116%)
RX 470 = $179 -> fast forward 4 years -> RX 5700 = $349 (+95%)
WTF just happened??? A roughly 100% price increase in 6 years... It's not inflation (inflation 2014-2020 = 9.7%), it's GREED Inc.
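(For reference, the percentage jumps above work out from the listed prices as:)

\[
\frac{399-199}{199} \approx 100\%, \qquad
\frac{1399-649}{649} \approx 116\%, \qquad
\frac{349-179}{179} \approx 95\%
\]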

F that, I'm out of DIY PC building until Leather Jacket One and his niece come to their senses, which will probably never happen.

Well, they inserted more tiers compared to before:

960/970/980
1650/1660/2060/2070/2080/2080 Ti/Titan

That's literally more than double the number of tiers per generation (not even counting the Super and Ti refreshes when they happen).

Your previous 960 is more akin to either the 1650 Ti or the 1660 ($200-300 bracket).


Same goes for AMD:

5500 XT/5600 XT/5700 XT, and they didn't even drop a 5800 XT.

But basically your RX 470 would match the 5600 XT, not the 5700 XT, since there was also an RX 480.

Pick the tier that matches against the rest of the current lineup, not simply by naming conventions.

Exactly. I got my 1080 Ti for 670€ on an EVGA deal. I'm not paying more for lower-end cards. I'll keep it as long as I can, and if things are still the same by then, I'm out too. Vote with our wallets.

Yeah, but the 1080 Ti was an upgrade over the 1080.

They dropped the 2080 Ti, unmatched and with nothing it upgraded from, on launch day; the 1080 Ti is more akin to a 2080 Super, which is actually around the same price.

The 2080 Ti/Titan are purely for bragging. Not that RTX has had any use this generation; it's been a total gimmick feature up until now. I have a 2080 Ti and the only game that had it at launch was Control. Too many games said they would support RTX, and maybe they added it eventually, but I'm not waiting six months after a game's release just to see RTX; it's either there at launch or not at all for me.

Hopefully that changes with the next-gen consoles, but we'll see.
 
Joined
Jun 28, 2018
Messages
299 (0.14/day)
Yes, it has SOME disadvantages, but also a lot of advantages. The reason to kill SLI has a lot to do with the overinflated prices of the 2080 Ti. If SLI were to work in most games like in the past, would you rather get a 2080 Ti, or get a second 1080 Ti that would be a lot cheaper and much more powerful?

SLI/CrossFire, even in its best days, always had a lot of problems with support; it was very dependent on per-game driver profiles and the goodwill of game developers.

Then there was the problem of poor scaling. Adding a second GPU could, at best, bring 40% or 50% more performance, but it was usually well below that. In other words, you paid for a whole GPU and got a third to half of its performance out of it, depending on the game. And it got worse the more GPUs you added.

There were rare cases where it paid off, but as a general rule, after a few years it was better to sell the card and buy one from the new generation, avoiding a lot of hassle. Not to mention the heat, noise, and power consumption that SLI/CrossFire usually caused. Often the top GPU was constantly throttling because it had no room to breathe, further reducing the performance gain.

I don't think it has anything to do with the price of the RTX 2080 Ti; Nvidia started down this path long before that. For example, the GTX 1060 in 2016 already had no SLI support. AMD never embarked on cards over $1000 and also abandoned dual-GPU.
 
Joined
Nov 21, 2010
Messages
2,229 (0.46/day)
Location
Right where I want to be
System Name Miami
Processor Ryzen 3800X
Motherboard Asus Crosshair VII Formula
Cooling Ek Velocity/ 2x 280mm Radiators/ Alphacool fullcover
Memory F4-3600C16Q-32GTZNC
Video Card(s) XFX 6900 XT Speedster 0
Storage 1TB WD M.2 SSD/ 2TB WD SN750/ 4TB WD Black HDD
Display(s) DELL AW3420DW / HP ZR24w
Case Lian Li O11 Dynamic XL
Audio Device(s) EVGA Nu Audio
Power Supply Seasonic Prime Gold 1000W+750W
Mouse Corsair Scimitar/Glorious Model O-
Keyboard Corsair K95 Platinum
Software Windows 10 Pro
Have they ever voiced power consumption figures, by the way?


How dare they.


A lovely strawman.

Oops, typo, fixed. Besides, how is that a strawman? When hasn't that been the case in that price segment? That they come within 10-25% of an Nvidia card and price it proportionally, with worse power consumption might I add? Also, when Nvidia added a tier at the bottom to hide their price hike, what did AMD do? Matched Nvidia's new pricing.
 
Last edited:
Joined
Feb 1, 2013
Messages
1,248 (0.30/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
Haven't NVIDIA x90 cards historically been dual-GPU cards? That could explain the high price and the card's size. I can't see any other reason why they would bring back an xx90-series card unless it's a dual GPU.
It might be so obvious we are not seeing it. Hardware is going MCM. DX12 was meant to support multi-GPU effortlessly, same as Vulkan. So the APIs already have it; games just need to implement it. Maybe NVidia is ready to push, for the glory of 4K and RT?


75% scaling in the best case with heaviest draw calls: 1070 + 980 Ti in Explicit Multi-GPU mode.

NVidia silently added checkerboard rendering for multiple GPUs into its drivers 8 months ago.
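For anyone curious what "the APIs already have it" means in practice, here is a minimal, illustrative C++ sketch (assuming the Windows SDK with D3D12/DXGI headers; not from any shipping engine) that enumerates adapters and asks D3D12 how many linked nodes each exposes, which is the hook explicit multi-GPU builds on:

```cpp
// Illustrative sketch only: enumerate GPUs and report D3D12 node counts.
// Link against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;   // skip WARP

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            // GetNodeCount() > 1 means the driver exposes a linked-node group
            // (the SLI/NVLink case); unlinked GPUs simply appear as separate
            // adapters in this loop. Explicit multi-GPU code then routes work
            // via node masks on queues, heaps, and command lists.
            wprintf(L"Adapter %u: %s, D3D12 nodes = %u\n",
                    i, desc.Description, device->GetNodeCount());
        }
    }
    return 0;
}
```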
 
Last edited:

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana - Laissez les bons temps rouler! (Let the good times roll!)
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,888 (4.58/day)
Location
Kepler-186f
Maybe NVidia is ready to push, for the glory of 4K and RT?

for the glory!!!!

Good luck finding a use for this in 12 years!?

Be lucky to get five; things tend to double in performance, making these cards pretty wall mounts.

Normally I would agree with you, but I think those days are gone now. Moving forward, after this bump it will be 5% max gains on the fps end, plus improved DLSS, RT, and whatever other new gimmicks they come up with to keep us spending money and not worrying about the 5%.
 
Joined
May 3, 2018
Messages
2,281 (1.05/day)
Stupid price, stupid size, stupid power consumption. Pathetically desperate effort from Nvidia to cling to the performance crown no matter what. Again so much time devoted to an ultra niche product only 0.01% of gamers will actually buy despite all the keyboard warriors claiming otherwise.
 
Joined
Jan 13, 2011
Messages
219 (0.05/day)
Stupid price, stupid size, stupid power consumption. Pathetically desperate effort from Nvidia to cling to the performance crown no matter what. Again so much time devoted to an ultra niche product only 0.01% of gamers will actually buy despite all the keyboard warriors claiming otherwise.
Halo branding works. Even if it loses them money, halo products raise the perceived worth of products down the line because they raise the perceived worth of the brand itself.

It's unclear whether this apparent reach of a product is about trying to justify RTX/4K and get people actually buying new top-end GPUs (a 1080 Ti still gets you there at 1440p and 4K, so upgrading has been a wash with RTX), or whether the reach comes from anticipating actual competition from AMD, even though Nvidia hasn't had competition for the performance crown since the 1080 Ti was released.
 
Joined
Jul 2, 2015
Messages
117 (0.04/day)
Location
USA
System Name The Mini Nuke
Processor Intel i9-9900K at 4.9GHz, -1 AVX offset, 1.27V
Motherboard Asus Maximus XI Hero WiFi
Cooling NZXT - Kraken X62 Liquid CPU Cooler
Memory G.SKILL TridentZ 32 GB (4 x 8 GB) DDR4-4000 Memory Model F4-4000C17Q-32GTZRB
Video Card(s) ASUS RTX 3080 TUF-RTX3080-O10G-GAMING
Storage Samsung 850 EVO 500GB SSD, Intel 660p M.2 2280 2TB NVMe
Display(s) 1 LG 27GL850-B and 2 ROG Swift PG278Q
Case NZXT - H700 (Black) ATX Mid Tower Case
Power Supply EVGA SuperNOVA 1000 P2 - 1000W Platinum
Mouse Razer Naga Trinity
Keyboard Razer BlackWidow Chroma
Software Windows 10 Pro
They probably leaked the $2000 price so that when $1400 was leaked we'd be happy. But still, these prices are getting a bit nuts for a part to play games on. Console gaming is starting to look a lot more attractive to me.
 
Joined
Nov 1, 2008
Messages
4,213 (0.75/day)
Location
Vietnam
System Name Gaming System / HTPC-Server
Processor i7 8700K (@4.8 Ghz All-Core) / R7 5900X
Motherboard Z370 Aorus Ultra Gaming / MSI B450 Mortar Max
Cooling CM ML360 / CM ML240L
Memory 16Gb Hynix @3200 MHz / 16Gb Hynix @3000Mhz
Video Card(s) Zotac 3080 / Colorful 1060
Storage 750G MX300 + 2x500G NVMe / 40Tb Reds + 1Tb WD Blue NVMe
Display(s) LG 27GN800-B 27'' 2K 144Hz / Sony TV
Case Xigmatek Aquarius Plus / Corsair Air 240
Audio Device(s) On Board Realtek
Power Supply Super Flower Leadex III Gold 750W / Andyson TX-700 Platinum
Mouse Logitech G502 Hero / K400+
Keyboard Wooting Two / K400+
Software Windows 10 x64
Benchmark Scores Cinebench R15 = 1542 3D Mark Timespy = 9758
I wonder if there are any ITX cases that are big enough to fit this beast.
 
Joined
Mar 14, 2008
Messages
511 (0.09/day)
Location
DK
System Name Main setup
Processor i9 12900K
Motherboard Gigabyte z690 Gaming X
Cooling Water
Memory Kingston 32GB 5200@cl30
Video Card(s) Asus Tuf RTX 4090
Storage Adata SX8200 PRO 1 and 2 TB, Samsung 960 EVO, Crucial MX300 750GB Limited Edition
Display(s) HP "cheapass" 34" 3440x1440
Case CM H500P Mesh
Audio Device(s) Logitech G933
Power Supply Corsair RX850i
Mouse G502
Keyboard SteelSeries Apex Pro
Software W11
That beast will not be going into my Dell T3500; it literally won't physically fit. I'm sure it would work, though. Might be time to build a new system... Been dragging my feet for almost a year anyway... I've settled on Threadripper, just trying to decide which one...

Well, as a temp setup while building hard tubing in my primary PC, I put my Asus ROG Strix OC 1080 Ti in a T3500. I had to tweak the closing mechanism for the cards a little, but it worked like a charm :)
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
DX12 was meant to support multi-GPU effortlessly, same as Vulkan.
Effortlessly for the card manufacturer (no need to support it in the driver).

When hasn't that been the case in that price segment?
It is harder to find examples that match your weird take than examples that don't.
The 5700 XT is about 10% slower but 20%+ cheaper than the 2070 Super, and simply faster than the 2070, which it matches price-wise.
There were times when 10-20% slower AMD GPUs were selling for half the NV price.

Maybe NVidia is ready to push, for the glory of 4K and RT?
What the heck, dude, seriously?

There is a baseline of "resolution and framerate acceptable to the users", which determines how much complexity can be dedicated to graphics fidelity.
Now the next-gen consoles are officially targeting 4K (with roughly 2080-class GPUs).
FPS-wise, many studios will settle for 30.
So achieving 4K 30fps on PC will be doable with GPUs somewhat faster than those mentioned above.
There will be no need to "overpower" the baseline four times over to go 4K.

As for RT, it's very close to a mere marketing push by NV at this point.
This was done without any hardware RT whatsoever and, wait for it, people had to ask whether RT was being used or not.
Speaks volumes about the tech itself.

 
Last edited:
Joined
Feb 1, 2013
Messages
1,248 (0.30/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
Effortlessly for the card manufacturer (no need to support it in the driver).
They can make it easier to use, e.g. with GameWorks or some such uniform library.

There is a baseline of "resolution and framerate acceptable to the users", which determines how much complexity can be dedicated to graphics fidelity.
Now the next-gen consoles are officially targeting 4K (with roughly 2080-class GPUs).
FPS-wise, many studios will settle for 30.
So achieving 4K 30fps on PC will be doable with GPUs somewhat faster than those mentioned above.
There will be no need to "overpower" the baseline four times over to go 4K.
Most people who care about the high-end cards are looking to go past 60fps at 4K (because we like our 120-144Hz monitors on PC, mind you). And NVidia likely cares too from their side, because they may really want to deliver on BFGD someday. Put all the pieces together, from API readiness to driver optimizations to tech demos and the new MCM hardware roadmap, and there is no doubt that explicit multi-GPU will be the answer to go beyond 60fps at 4K. (It's the classic throw-money-at-the-problem solution, because additional hardware does scale -- it's just a matter of software support, as demoed by Ashes.)
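On the API-readiness point, here is a minimal, illustrative C++ sketch (Vulkan 1.1 headers and loader assumed; not tied to any engine) of enumerating the physical-device groups that explicit multi-GPU rendering is built on:

```cpp
// Illustrative sketch only: list the device groups the Vulkan driver exposes.
// Link against the Vulkan loader (vulkan-1).
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;          // device groups are core in 1.1

    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount, VkPhysicalDeviceGroupProperties{
                        VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i) {
        // physicalDeviceCount > 1 means a linked group (SLI/NVLink-style) that
        // one logical device can drive explicitly, e.g. AFR or checkerboard
        // splits; unlinked GPUs show up as separate single-device groups.
        std::printf("Device group %u: %u physical device(s)\n",
                    i, groups[i].physicalDeviceCount);
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```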
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
They can make it easier to use, e.g. with GameWorks or some such uniform library.
Possibly, although I doubt it. Besides, with only a tiny part of the market using it, it would be a wasted effort.


there is no doubt that explicit multi-GPU will be the answer to go beyond 60fps at 4K
Definitely not.
As with 4K, you could get there by lowering the complexity of the rendered objects.
If devs, for some far-from-obvious reason (and "I have a device that could show more fps" is one hell of a funny argument), decide to target 60fps, well, you'll have it.

The most recent time someone tried to highlight 60fps was that awkward Halo Infinite demo that spawned countless memes:

(attached image)


Now, most games are cross-platform.
95% of the Steam survey PC market is slower than the PS5/XSeX.

Getting a card that is roughly faster than a 2080 Super would do it.
Now let's do the napkin math: 2080 Super * 1.11 ≈ 2080 Ti, and 2080 Super * 2 = GPUThatShouldBring4k60fps.

GPUThatShouldBring4k60fps = a card that is roughly 80% faster than a 2080 Ti.
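(Same math in one line, taking the 1.11x and 2x ratios above as given:)

\[
\frac{2 \times \text{2080S}}{1.11 \times \text{2080S}} \approx 1.80
\]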

Just stresses how nonsensical the PC Master Race thing becomes. You won't be able to vastly overpower the consoles, but unless you have that weird penis-size-to-fps neural entanglement, why would you?
 
Joined
Jul 18, 2017
Messages
575 (0.23/day)
So the reference cooler is finally on par with high-end AIB coolers. Unlike the angry mobs, I commend Nvidia for not cheaping out on reference cards. Thank you!
 
Joined
Aug 13, 2010
Messages
5,383 (1.08/day)
So the reference cooler is finally on par with high-end AIB coolers. Unlike the angry mobs, I commend Nvidia for not cheaping out on reference cards. Thank you!

There are no reference cards; that's the joke with NVIDIA reference cards. It will now be an FE commanding a higher-than-MSRP price, without an actually cheaper version at that MSRP, just like with the high-end RTX 20 cards.
 