Saturday, August 22nd 2020
NVIDIA GeForce RTX 3090 Founders Edition Potentially Pictured: 3-slot Behemoth!
The rumor mill takes no weekend break, and it has churned out photos of what appears to be an NVIDIA Founders Edition version of the upcoming GeForce RTX 3090, pictured next to the equivalent FE RTX 2080, with the latter looking like a toy beside the massive triple-slot card. The cooler follows the same design we discussed in detail in June, with the distinctive obverse dual-fan + aluminium heatsink seen in the images below. We also covered alleged PCB photos, in case you missed them, and everything lines up with the most recent leaks. The only difference here is that pricing for the RTX 3090 FE is claimed to be $1,400: a far cry from the $2,000 mark we saw for certain aftermarket offerings in the making, yet significantly higher than the previous generation. That is a worrying trend we eagerly await to see justified by performance, before we even get into the case-compatibility concerns raised by the increased length. Either way, if the images below are accurate, we are equally curious about the cooling capability and how it affects partner solutions and pricing.
Source:
Twitter user @GarnetSunset
183 Comments on NVIDIA GeForce RTX 3090 Founders Edition Potentially Pictured: 3-slot Behemoth!
Both AMD and Nvidia have been increasingly distancing themselves from the dual-GPU thing, including SLI/Crossfire.
It's a solution that has many disadvantages. The future will be MCM designs and that is what both companies are working on.
960/970/980
1650/1660/2060/2070/2080/2080ti/titan
That's literally more than double the number of reference models per generation (and I'm not even counting the Super and Ti refreshes when they happen)
Your previous 960 is more akin to either a 1650 Ti or a 1660 (the $200-300 bracket)
Same goes with amd
5500xt/5600xt/5700xt they did not even drop a 5800xt
But basically your rx470 would match 5600xt not 5700xt since there was an rx480.
Pick the tier that matches relative to the rest of the current lineup, not simply by naming conventions. Yeah, but the 1080ti was an upgrade to the 1080
They dropped an unmatched 2080ti on launch day with no direct upgrade path; the 1080ti is more akin to a 2080 Super, which is around the same price, actually.
2080ti/titan are purely for bragging. Not that RTX had any use this generation; a total gimmick feature up until now. I have a 2080ti, and the only game that had it at launch was Control. Too many games said they would support RTX, and maybe they eventually got it, but I'm not waiting 6 months after a game's release just to see RTX; it's either there at launch or not at all for me.
Hopefully that changes with the next-gen consoles, but we'll see.
Then there was the problem of poor scalability. Adding a second GPU could, at best, bring 40% or 50% more performance, but mostly it was well below that. That is, you paid for a full GPU to get half or a third of its performance, depending on the game. And it got worse the more GPUs you added.
There could be rare cases where it paid off, but as a general rule, after a few years it was better to sell the card and buy one from the new generation, avoiding a lot of hassle. Not to mention the heat, noise, and power consumption that SLI/Crossfire usually caused. Often the GPU on top was constantly throttling because it had no room to breathe, further decreasing the performance gain.
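The value problem described above can be put in rough numbers. A minimal sketch, assuming the 40-50% best-case scaling figures quoted in the comment (illustrative estimates with a hypothetical card price, not benchmarks):

```python
# Value of adding a second identical GPU under SLI/Crossfire.
# The scaling figures are the comment's rough estimates, not measured data.
card_price = 500.0  # hypothetical price of one card

for scaling in (0.5, 0.4, 0.3):  # best case down to a more typical case
    # You pay full price for the second card but gain only a fraction of it,
    # so the gained performance costs card_price / scaling per card-equivalent.
    effective_cost = card_price / scaling
    print(f"{scaling:.0%} scaling: gained performance costs "
          f"${effective_cost:.0f} per card-equivalent "
          f"(vs ${card_price:.0f} for the first card)")
```

Even at the best-case 50% scaling, the second card's performance costs twice as much per unit as the first card's; at 30% it costs more than three times as much.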
I don't think it has anything to do with the price of the RTX 2080 Ti; Nvidia started down this path long before that. For example, the GTX 1060 in 2016 already lacked SLI support. AMD did not embark on cards over $1000 and also abandoned dual-GPU.
www.videocardbenchmark.net/compare/GeForce-GTX-960-vs-GeForce-GTX-1650-Ti-vs-GeForce-GTX-1660/3114vs4195vs4062
75% scaling in the best case with heaviest draw calls: 1070 + 980 Ti in Explicit Multi-GPU mode.
It's unclear whether this apparent reach of a product comes from trying to justify RTX/4K and get people actually buying new top-end GPUs (a 1080ti still gets you there at 1440p and 4K, so upgrading has been a wash with RTX), or from anticipating actual competition from AMD, even though Nvidia hasn't had competition for the performance crown since the 1080ti was released.
The 5700XT is about 10% slower but 20%+ cheaper than the 2070sup, and simply faster than the 2070, which it matches price-wise.
There were times when AMD GPUs 10-20% slower were selling for half the NV price. What the heck, dude, seriously?
There is a baseline of "resolution and framerate acceptable to the users", which determines how much complexity could be dedicated to graphics fidelity.
Now next gen consoles are officially targeting 4k (with 2080+ kind of GPUs).
FPS wise, many studios will settle with 30.
So achieving 4k 30fps on PC will be doable with GPUs somewhat faster than those mentioned above.
There will be no need to "overpower" baseline 4 times to go 4k.
As for RT, it's very close to mere marketing push by NV at this point.
This was done without any hardware RT whatsoever and, wait for it, people had to ask if RT was used or not.
Speaks volumes about tech itself.
As with 4k, you could get that by lowering complexity of the rendered objects.
If devs, for some far-from-obvious reason (and "I have a device that could show more fps" is one hell of a funny argument), decide to target 60fps, well, you'll have it.
The most recent time someone tried to highlight 60fps was that awkward Halo Infinite demo that spawned countless memes:
Now, most games are cross-platform.
95% of the steam survey PC market is slower than PS5/XSeX.
Getting a card somewhat faster than a 2080sup would do it.
Now let's do the napkin math: 2080sup × 1.11 = 2080Ti; 2080sup × 2 = GPUThatShouldBring4k60fps.
So GPUThatShouldBring4k60fps = a card that is roughly 80% faster than a 2080Ti.
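The napkin math above can be checked in a couple of lines. A quick sketch, treating the 2080 Super as a baseline of 1.0 (the 11% and 2× figures are the comment's own estimates, not benchmarks):

```python
# Relative performance, 2080 Super = 1.0 (comment's estimates, not benchmarks).
r_2080_super = 1.0
r_2080_ti = r_2080_super * 1.11   # 2080 Ti ~11% faster than the 2080 Super
r_target = r_2080_super * 2.0     # card needed for 4K/60, per the comment

# How much faster than a 2080 Ti would the target card need to be?
uplift_over_ti = r_target / r_2080_ti - 1
print(f"{uplift_over_ti:.0%} faster than a 2080 Ti")  # → 80% faster
```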
Just stresses how nonsensical "PC Master Race" becomes. You won't be able to vastly overpower the consoles, and unless you have that weird penis-size-to-fps neural entanglement, why would you?