Saturday, August 22nd 2020

NVIDIA GeForce RTX 3090 Founders Edition Potentially Pictured: 3-slot Behemoth!

The rumor mill takes no weekend break, and it has churned out photos of what appears to be an NVIDIA Founders Edition version of the upcoming GeForce RTX 3090 next to the equivalent RTX 2080 FE, with the latter looking like a toy beside the massive triple-slotter. The cooler uses the same design we discussed in detail in June, with the unique dual-fan (one on each face of the card) plus aluminium heatsink arrangement seen in the images below. We also covered alleged PCB photos, in case you missed them, and everything lines up with the most recent leaks. The one new detail here is pricing: the RTX 3090 FE is claimed to cost $1,400, a far cry from the $2,000 mark we saw for certain aftermarket offerings in the making, yet still significantly higher than the previous generation. That is a worrying trend we eagerly await to see justified by performance, before we even get into the case-compatibility concerns raised by the increased length. Either way, if the images below are accurate, we are equally curious about the cooler's capability and how it affects partner solutions and pricing.
Source: Twitter user @GarnetSunset

183 Comments on NVIDIA GeForce RTX 3090 Founders Edition Potentially Pictured: 3-slot Behemoth!

#101
robal
So, as the cooler becomes a larger and larger chunk of the card's cost, we should expect watercooled versions to become cheaper (i.e., not way more expensive than the FE). Right? Right....
#102
kings
Mitsman: Haven't Nvidia x90 cards historically been dual-GPU cards? That could explain the high price and the card's size. I can't see any other reason why they would bring back an x90-series card unless it's a dual GPU.
Very unlikely, but it would be a good plot twist. :cool:

Both AMD and Nvidia have been increasingly distancing themselves from the dual-GPU thing, including SLI/Crossfire.

It's a solution that has many disadvantages. The future will be MCM designs, and that is what both companies are working on.
#103
Dave65
DuxCro: Massive card with a massive price. I doubt AMD will be able to compete. Maybe with RDNA 3, which will bring the chiplet GPU design and hopefully a massive boost in performance.
AMD doesn't need to compete for the three people who can afford that...
#105
lexluthermiester
ZoneDymo: Honestly though, if done neatly that could be a pretty cool case mod, literally cutting into the side panel to have the video card stick out, basically one of those muscle cars but for a PC.
As cool as I could likely make it, the base hardware is already slightly bottlenecking the RTX 2080 I have in it now. While there would still be a boost in GPU performance, the CPU bottleneck would become very pronounced. I think the RTX 2080 is the best GPU one can pair with an unlocked socket 1366 Xeon and expect to get reasonably balanced performance.
kings: Some people are exaggerating a lot; the dimensions don't seem to be very different from most 2080 Ti AIB versions.
While you make a good point, it is big enough that fitting it into some systems will be a challenge, and into others impossible.
#106
valrond
kings: Very unlikely, but it would be a good plot twist. :cool:

Both AMD and Nvidia have been increasingly distancing themselves from the dual-GPU thing, including SLI/Crossfire.

It's a solution that has many disadvantages. The future will be MCM designs, and that is what both companies are working on.
Yes, it has SOME disadvantages, but also a lot of advantages. The reason to kill SLI has a lot to do with the overinflated prices of the 2080 Ti. If SLI still worked in most games like it did in the past, would you rather get a 2080 Ti, or a second 1080 Ti that would be a lot cheaper and much more powerful?
#107
Batailleuse
RedelZaVedno: GeForce GTX 960 = $199 -> fast forward 6 years -> RTX 3060 = $399 (+100%)
GeForce GTX 980 Ti = $649 -> fast forward 6 years -> RTX 3090 = $1,399 (+100%)
RX 470 = $179 -> fast forward 4 years -> RX 5700 = $349 (+98%)
WTF just happened??? A 100% price increase in 6 years... It's not inflation (inflation 2014-2020 = 9.7%), it's GREED Inc.

F that, I'm out of DIY PC building until Leather Jacket One and his niece come to their senses, which will probably never happen.
Well, they inserted more tiers compared to before:

960/970/980
1650/1660/2060/2070/2080/2080ti/titan

That's literally more than double the number of tiers per generation (and I'm not even counting the Super and Ti refreshes when they happen).

Your previous 960 is more akin to either the 1650 Ti or the 1660 ($200-300 bracket).


The same goes for AMD:

5500 XT / 5600 XT / 5700 XT, and they did not even drop a 5800 XT.

But basically your RX 470 would match the 5600 XT, not the 5700 XT, since there was an RX 480.

Pick the tier that matches relative to the rest of the current lineup, not simply by naming conventions.
GeorgeMan: Exactly. I got my 1080 Ti for €670 on an EVGA deal. I'm not paying more for lower-end cards. I'll keep it as long as I can, and if things are still the same by then, I'm out too. Vote with our wallets.
Yeah, but the 1080 Ti was an upgrade over the 1080.

They dropped the 2080 Ti on launch day at an unmatched tier with no prior equivalent; the 1080 Ti is more akin to a 2080 Super, which is actually around the same price.

The 2080 Ti/Titan are purely for bragging. Not that RTX had any use this generation; it's been a total gimmick feature up until now. I have a 2080 Ti and the only game that had it at launch was Control. Too many games said they would support RTX, and maybe they eventually did, but I'm not waiting six months after a game's release just to see RTX; it's either there at launch or not at all for me.

Hopefully that changes with the next-gen consoles, but we'll see.
#108
kings
valrond: Yes, it has SOME disadvantages, but also a lot of advantages. The reason to kill SLI has a lot to do with the overinflated prices of the 2080 Ti. If SLI still worked in most games like it did in the past, would you rather get a 2080 Ti, or a second 1080 Ti that would be a lot cheaper and much more powerful?
SLI/Crossfire, even in its brightest days, always had a lot of problems with support; it was very dependent on per-game driver work and the goodwill of game developers.

Then there was the problem of poor scaling. Adding a second GPU could, at best, bring 40% or 50% more performance, but it was mostly well below that. In other words, you paid for a whole GPU to get half or a third of its performance, depending on the game. And it got worse the more GPUs you added.

There were rare cases where it paid off, but as a general rule, after a few years it was better to sell the card and buy one from the new generation, avoiding a lot of hassle. Not to mention the heat, noise and power consumption that SLI/Crossfire usually caused. Often the GPU on top was constantly throttling because it had no room to breathe, further decreasing the performance gain.

I don't think it has anything to do with the price of the RTX 2080 Ti; Nvidia started down this path long before that. The GTX 1060 in 2016, for example, already lacked SLI support. AMD did not embark on cards over $1,000 and also abandoned dual-GPU.
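
To put those scaling figures in perspective, here is a toy value calculation (the 40-50% numbers are the rough figures above, not benchmarks):

```cpp
#include <cstdio>

int main() {
    // One card = 1.0x performance at 1.0x cost. A second identical card
    // doubles the cost but, per the rough figures above, adds 40-50%
    // performance at best.
    for (double scaling : {0.4, 0.5}) {
        double pair_perf = 1.0 + scaling;        // SLI pair vs. a single card
        double perf_per_money = pair_perf / 2.0; // but you paid for two cards
        printf("+%2.0f%% scaling: %.2fx perf, %.2fx perf per money spent\n",
               scaling * 100.0, pair_perf, perf_per_money);
    }
    return 0;
}
```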
#109
Totally
medi01: Have they ever voiced power consumption figures, by the way?


How dare they.


A lovely strawman.
Oops, typo, fixed. Besides that strawman? When hasn't that been the case in that price segment? That they come within 10-25% of an Nvidia card and price it proportionally? With worse power consumption, might I add? Also, when Nvidia added a tier at the bottom to hide their price hike, what did AMD do? Matched Nvidia's new pricing.
#111
mouacyk
Mitsman: Haven't Nvidia x90 cards historically been dual-GPU cards? That could explain the high price and the card's size. I can't see any other reason why they would bring back an x90-series card unless it's a dual GPU.
It might be so obvious we are not seeing it. Hardware is going MCM. DX12 was meant to support multi-GPU effortlessly, same as Vulkan. So the APIs already have it; games just need to implement it. Maybe NVidia is ready to push, for the glory of 4K and RT?


75% scaling in the best case with the heaviest draw calls: 1070 + 980 Ti in explicit multi-GPU mode.

NVidia silently added checkerboard rendering for multiple GPUs to its drivers 8 months ago.
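
For what it's worth, the plumbing really is already in the APIs. A minimal sketch (illustrative only, not anything from the leak) that asks a Vulkan 1.1 loader for its device groups, the building block explicit multi-GPU rests on:

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Device groups are core in Vulkan 1.1, so request that API version.
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;
    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &count, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(
        count, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &count, groups.data());

    // A group with more than one physical device can back a single
    // VkDevice whose work the application explicitly splits across GPUs.
    for (uint32_t i = 0; i < count; ++i)
        printf("group %u: %u physical device(s)\n", i, groups[i].physicalDeviceCount);

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```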
#113
Space Lynx
Astronaut
mouacyk: Maybe NVidia is ready to push, for the glory of 4K and RT?
for the glory!!!!
theoneandonlymrk: Good luck finding a use for this in 12 years!?

You'd be lucky to get five; things tend to double in performance, making these cards pretty wall mounts.
Normally I would agree with you, but I think those days are gone now. Moving forward after this bump it will be 5% max gains on the fps end, while they improve DLSS, RT and whatever other new gimmicks they come up with to keep us spending money and not worrying about the 5%.
#114
Minus Infinity
Stupid price, stupid size, stupid power consumption. A pathetically desperate effort from Nvidia to cling to the performance crown no matter what. Again, so much time devoted to an ultra-niche product only 0.01% of gamers will actually buy, despite all the keyboard warriors claiming otherwise.
#115
semantics
Minus Infinity: Stupid price, stupid size, stupid power consumption. A pathetically desperate effort from Nvidia to cling to the performance crown no matter what. Again, so much time devoted to an ultra-niche product only 0.01% of gamers will actually buy, despite all the keyboard warriors claiming otherwise.
Halo branding works. Even if halo products lose them money, they raise the perceived worth of products down the line by raising the perceived worth of the brand itself.

It's unclear whether this apparent reach of a product is about trying to justify RTX/4K, since a 1080 Ti still gets you to the top end at 1440p and 4K and upgrading has been a wash with RTX, or whether the reach comes from anticipating actual competition from AMD, even though Nvidia hasn't had competition for the performance crown since the 1080 Ti was released.
#116
Agentbb007
They probably leaked the $2,000 price so we'd be happy when $1,400 was leaked. But still, these prices are getting a bit nuts for a part used to play games. Console gaming is starting to look a lot more attractive to me.
#118
silkstone
I wonder if there are any ITX cases that are big enough to fit this beast.
#119
JalleR
lexluthermiester: That beast will not be going into my Dell T3500. It literally won't fit physically. I'm sure it would work, though. Might be time to build a new system... I've been dragging my feet for almost a year anyway... I've settled on Threadripper, just trying to decide on which one...
Well, as a temp setup while building hard tubing in my primary PC, I put my Asus ROG Strix OC 1080 Ti in a T3500. I had to tweak the closing mechanism for the cards a little, but it worked like a charm :)
#120
medi01
mouacyk: DX12 was meant to support multi-GPU effortlessly, same as Vulkan.
Effortlessly for the card manufacturer (no need to support it in the driver).
Totally: When hasn't that been the case in that price segment?
It is harder to find examples that match your weird take than examples that don't.
The 5700 XT is about 10% slower but 20%+ cheaper than the 2070 Super, and simply faster than the 2070, which it matches price-wise.
There were times when 10-20% slower AMD GPUs were selling for half the NV price.
mouacyk: Maybe NVidia is ready to push, for the glory of 4K and RT?
What the heck, dude, seriously?

There is a baseline of "resolution and framerate acceptable to users", which determines how much complexity can be dedicated to graphics fidelity.
Next-gen consoles are now officially targeting 4K (with 2080-class GPUs).
FPS-wise, many studios will settle for 30.
So achieving 4K 30 fps on PC will be doable with GPUs somewhat faster than those mentioned above.
There will be no need to "overpower" the baseline 4 times over to go 4K.

As for RT, at this point it's very close to a mere marketing push by NV.
This was done without any hardware RT whatsoever and, wait for it, people had to ask whether RT was used or not.
That speaks volumes about the tech itself.

#121
mouacyk
medi01Effortlessly for the card manufacturer. (no need to support it in driver)
They can make it easier to use, with something like GameWorks or some other uniform library.
medi01: There is a baseline of "resolution and framerate acceptable to users", which determines how much complexity can be dedicated to graphics fidelity.
Next-gen consoles are now officially targeting 4K (with 2080-class GPUs).
FPS-wise, many studios will settle for 30.
So achieving 4K 30 fps on PC will be doable with GPUs somewhat faster than those mentioned above.
There will be no need to "overpower" the baseline 4 times over to go 4K.
Most people who care about high-end cards are looking to go past 60 fps at 4K (because we like our 120-144 Hz monitors on PC, mind you). And NVidia likely cares too, because they may really want to deliver on BFGD someday. Put all the pieces together, from API readiness to driver optimizations to tech demos and the new MCM hardware roadmap, and there is little doubt that explicit multi-GPU will be the answer for going beyond 60 fps at 4K. (It's the classic throw-money-at-the-problem solution, because additional hardware does scale; it's just a matter of software support, as demoed by Ashes.)
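
As a back-of-envelope illustration of that "hardware does scale" argument (every number here is an assumption for illustration, not a measurement):

```cpp
#include <cstdio>

int main() {
    const double frame_ms = 25.0; // assume one GPU renders a 4K frame in 25 ms (40 fps)
    const double sync_ms  = 3.0;  // assumed per-frame cross-GPU sync/copy overhead
    for (int gpus = 1; gpus <= 3; ++gpus) {
        // In alternate-frame rendering the GPUs take turns, so the frame
        // interval shrinks roughly linearly, but multi-GPU setups still
        // pay a synchronization cost on every frame.
        double interval_ms = frame_ms / gpus + (gpus > 1 ? sync_ms : 0.0);
        printf("%d GPU(s): ~%.0f fps\n", gpus, 1000.0 / interval_ms);
    }
    return 0;
}
```

Under those assumptions, two GPUs land around 65 fps: sub-linear scaling, but past the 60 fps mark that a single card misses.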
#122
medi01
mouacyk: They can make it easier to use, with something like GameWorks or some other uniform library.
Possibly, although I doubt it. Besides, with only a tiny part of the market using it, it would be wasted effort.
mouacyk: there is little doubt that explicit multi-GPU will be the answer for going beyond 60 fps at 4K
Definitely not.
As with 4K, you could get there by lowering the complexity of the rendered objects.
If devs, for some far-from-obvious reason (and "I have a device that could show more fps" is one hell of a funny argument), decide to target 60 fps, well, you'll have it.

The most recent time someone tried to highlight 60 fps was that awkward Halo Infinite demo that spawned countless memes:



Now, most games are cross-platform.
95% of the Steam survey PC market is slower than the PS5/XSeX.

Getting a card slightly faster than a 2080 Super would do it.
Now let's do the napkin math: 2080 Super x 1.11 = 2080 Ti; 2080 Super x 2 = GPUThatShouldBring4k60fps.

GPUThatShouldBring4k60fps = a card that is roughly 80% faster than a 2080 Ti.
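
Sanity-checking that napkin math, with the relative speeds assumed above:

```cpp
#include <cstdio>

int main() {
    const double r2080sup = 1.0;            // baseline: 2080 Super
    const double r2080ti  = 1.11;           // ~11% faster, per the post
    const double target   = 2.0 * r2080sup; // assumed 4K/60 requirement
    // (2.0 / 1.11 - 1) * 100 = ~80% faster than a 2080 Ti
    printf("target vs 2080 Ti: +%.0f%%\n", (target / r2080ti - 1.0) * 100.0);
    return 0;
}
```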

It just stresses how nonsensical the PC Master Race idea becomes. You won't be able to vastly overpower the consoles, and unless you have that weird penis-size-to-fps neural entanglement, why would you want to?
#123
dicktracy
So the reference cooler is finally on par with high-end AIB coolers. Unlike the angry mobs, I commend Nvidia for not cheaping out on reference cards. Thank you!
#124
dj-electric
dicktracy: So the reference cooler is finally on par with high-end AIB coolers. Unlike the angry mobs, I commend Nvidia for not cheaping out on reference cards. Thank you!
There are no reference cards; that's the joke with NVIDIA "reference" cards. It will again be an FE commanding a higher-than-MSRP price, without an actually cheap version at that MSRP, just like with the high-end RTX 20 cards.
#125
Blueberries
I've been craving another power monster since the 295x2.