Thursday, September 10th 2020

NVIDIA GeForce RTX 3090 Looks Huge When Installed

Here's the first picture of an NVIDIA GeForce RTX 3090 Founders Edition card installed in a tower case. The triple-slot card measures 31.3 cm in length and is 13.8 cm tall. Its design is essentially an upscaled version of the RTX 3080's. The card pulls power from a single 12-pin connector, with an included adapter that converts two 8-pin connectors to the 12-pin input. The typical board power of the card is rated at 350 W. The particular card in the leak, posted on the ChipHell forums, appears to be a pre-production unit, as VideoCardz notes, given that parts of its metal superstructure lack the chrome finish of the card NVIDIA CEO Jen-Hsun Huang unveiled on September 1. The RTX 3090 launches on September 24.
Sources: VideoCardz, ChipHell Forums
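For those wondering whether the bundled adapter can actually feed a 350 W card, here is a rough back-of-the-envelope check, assuming the usual nominal ratings of 150 W per PCIe 8-pin connector and 75 W from the slot (actual limits depend on the PSU and cabling):

# Rough power-budget sketch for the RTX 3090 FE and its 12-pin adapter.
# The connector figures below are the usual nominal ratings, not measured values.
PCIE_8PIN_W = 150    # nominal rating of one PCIe 8-pin connector
PCIE_SLOT_W = 75     # power deliverable through the PCIe slot itself
BOARD_POWER_W = 350  # typical board power quoted for the RTX 3090

available = 2 * PCIE_8PIN_W + PCIE_SLOT_W  # two 8-pins feed the 12-pin adapter
print(f"available {available} W vs. board power {BOARD_POWER_W} W "
      f"-> {available - BOARD_POWER_W} W of headroom")

On paper that leaves only about 25 W of headroom over the rated board power.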

122 Comments on NVIDIA GeForce RTX 3090 Looks Huge When Installed

#76
xman2007
From the NVIDIA presentation on Ampere, it seems there is no IPC or architectural increase compared to Turing; in fact, the "huge" performance increase seems to have come from doubling up on the shader units, not from the architecture itself. Obviously there are power consumption benefits, possibly from moving from TSMC 12 nm to Samsung 8 nm (essentially a 10 nm-class node), but aside from that, if Ampere had the same number of shaders as Turing, there would likely be virtually no difference in performance, ray tracing excluded of course.
Posted on Reply
#77
bug
xman2007: From the NVIDIA presentation on Ampere, it seems there is no IPC or architectural increase compared to Turing; in fact, the "huge" performance increase seems to have come from doubling up on the shader units, not from the architecture itself. Obviously there are power consumption benefits, possibly from moving from TSMC 12 nm to Samsung 8 nm (essentially a 10 nm-class node), but aside from that, if Ampere had the same number of shaders as Turing, there would likely be virtually no difference in performance, ray tracing excluded of course.
Is it bad if Ampere is Turing with beefed up RT and tensor cores?
Posted on Reply
#79
xman2007
bug: Is it bad if Ampere is Turing with beefed up RT and tensor cores?
Not at all, I'm just expressing an opinion. There don't seem to be any IPC or architectural improvements; yes, there is more performance, but that seems to have been brought about by the shader increase rather than by a newer/more refined architecture. Still, 2080 SLI-level performance in a single GPU is a massive deal, and it costs less than half of what it took to achieve last generation, so it's still a win.

In fact, if anything they should be lauded for the power efficiency, as it's Turing IPC and shaders with a third less power consumption, probably down to the node shrink and more conservative core/boost clocks, plus, like you said, "beefed up" RT and tensor cores.
Posted on Reply
#80
AddSub
What's the ROP count on these?

...
..
.
Posted on Reply
#81
windwhirl
AddSub: What's the ROP count on these?

...
..
.


Ignore the ones that just have "2020" as the release date. Those are just placeholders.
Posted on Reply
#83
AsRock
TPU addict
bonehead123: wHY OH Why did they put the power connector in the middle of the card, thereby creating moar visible cable clutter, especially with the adapter?

f.A.i.L....
Rahnak: That's where the PCB ends.
And they did not want to use glue again, even more so on the 3080.
Posted on Reply
#84
Cosmocalypse
DuxCro: I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If, as some people on the net say, the RTX 3090 is around 20% faster than the RTX 3080, isn't this card basically an RTX 3080 Ti with a bunch of VRAM slapped on it? But this time, instead of charging $1200 for it, Nvidia had the clever idea of calling it a Titan replacement and charging $300 more, knowing that people who were willing to spend $1200 on the previous top card will spend an extra $300 on this one.

So the RTX x70 segment costs the same $499 again. The RTX x80 segment costs $699 again. But the top segment is now $300 more, with performance gains like those from the 700 to the 900 series and from the 900 to the 1000 series, which used to cost considerably less. The 1080 Ti was $699.
It is a Titan replacement. This is how they always do it. Releasing the Ti version at the same time as the regular card is something that normally doesn't happen (like it did with the 20xx series). In six months they'll release a slightly cut-down 3090 with less VRAM, and that will be the 3080 Ti. They charge a high price for the latest tech and those who can will buy it as early adopters. Like the Titan, it's really a workstation card. Gamers should go for the 3080.
Posted on Reply
#85
lemoncarbonate
I don't know, at first glance the card looks clean, but I'm not digging it when it's installed. Maybe the fan and power connector placement ruin the otherwise clean aesthetic. Just my opinion. I'm more into the classic look, with a plain backplate and fans facing down.

Although the FE cards are never sold in my country anyway.
Posted on Reply
#86
Rob94hawk
bonehead123: wHY OH Why did they put the power connector in the middle of the card, thereby creating moar visible cable clutter, especially with the adapter?

f.A.i.L....
Are you buying the 3090 to look pretty or to give you incredible 4K fps?

I'll take "awesome fps in 4K" for $1,499, Alex.
Posted on Reply
#87
Jcguy
DuxCro: I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If, as some people on the net say, the RTX 3090 is around 20% faster than the RTX 3080, isn't this card basically an RTX 3080 Ti with a bunch of VRAM slapped on it? But this time, instead of charging $1200 for it, Nvidia had the clever idea of calling it a Titan replacement and charging $300 more, knowing that people who were willing to spend $1200 on the previous top card will spend an extra $300 on this one.

So the RTX x70 segment costs the same $499 again. The RTX x80 segment costs $699 again. But the top segment is now $300 more, with performance gains like those from the 700 to the 900 series and from the 900 to the 1000 series, which used to cost considerably less. The 1080 Ti was $699.
Why are you worried about what other people are willing to spend? If you can't afford it, don't buy it. It's that simple.
Posted on Reply
#88
zo0lykas
P4-630: EVGA only gives a 24-month warranty in my country, while most other brands give 36 months.
EVGA gives 3 years, plus another 2 if you register the item, so 5 in total.
Posted on Reply
#89
agent_x007
xman2007: From the NVIDIA presentation on Ampere, it seems there is no IPC or architectural increase compared to Turing; in fact, the "huge" performance increase seems to have come from doubling up on the shader units, not from the architecture itself. Obviously there are power consumption benefits, possibly from moving from TSMC 12 nm to Samsung 8 nm (essentially a 10 nm-class node), but aside from that, if Ampere had the same number of shaders as Turing, there would likely be virtually no difference in performance, ray tracing excluded of course.
Arch changes are kinda like Sandy/Ivy Bridge vs. Haswell (if you like IPC comparisons).
You get more execution hardware (AVX2), with more cache bandwidth so it doesn't starve.
Posted on Reply
#90
BlackWater
The amount of VRAM on the 3090 made me think on the following:

Let's assume most enthusiasts right now game at 1440p/144 Hz, and that Nvidia is doing a strong push for 4K/144 Hz. OK, so far, so good. But even then, we know that 4K doesn't need 24 GB of VRAM. They say the card is capable of "8K", but that's with DLSS upscaling, so we are not talking about actual native 8K rendering. IPC improvements or not, I absolutely don't believe we have the processing power to do 8K yet, and even if we did... we're going to do 8K on what, exactly? After all, this is a PC GPU - how many people are going to attach it to a gigantic 8K TV? And let's not even mention ultra-high-resolution monitors; the very few that exist are strictly professional equipment with five-figure prices...

So, considering that 1440p is 3.7 Mpixels, 4K is 8.3 Mpixels and 8K is 33.2 Mpixels, perhaps a more realistic application for the 3090 is triple-monitor 1440p/4K @ 144 Hz? 3x 1440p is 11.1 Mpixels, which is only slightly above a single 4K display, so the card shouldn't have any trouble driving that, and with DLSS, triple 4K is about 25 Mpixels, which seems somewhat possible - perhaps that's where the 24 GB of VRAM would come into play?
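Spelling that arithmetic out, here's a quick sketch, assuming 2560x1440, 3840x2160 and 7680x4320 for 1440p, 4K and 8K respectively:

# Pixel-count arithmetic for the setups above.
panels = {"1440p": (2560, 1440), "4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in panels.items():
    print(f"{name}: {w * h / 1e6:.1f} Mpixels")

# Triple-monitor setups: three panels side by side.
print(f"3x 1440p: {3 * 2560 * 1440 / 1e6:.1f} Mpixels")  # ~11.1, a bit above a single 4K panel
print(f"3x 4K:    {3 * 3840 * 2160 / 1e6:.1f} Mpixels")  # ~24.9, the really demanding case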

But even then, where are the 4K monitors - at the moment the choice is very limited, and let's be honest, 4K on a 27" panel makes no sense, and there are a few monitors at 40+", which again, for a triple monitor setup doesn't really work either. So, either a wave of new 4K/144 Hz monitors at about 30" is coming, or... the 3090 doesn't really make much sense at all. And I'm not even talking about the price here; it's irrelevant. The question is: why does the 3090 actually exist, and what is the real application for it? Titan replacement or not, Nvidia is strongly pushing the card as a gaming product, which is all fine, but I fail to see the scenario where 24 GB of VRAM is relevant to gaming. Regardless, in about two weeks, benchmarks will tell us all we need to know. :D
Posted on Reply
#91
sk8er
I want a $799/$849 3090 with 12 GB in a 3080 FE-sized, 2-slot cooler (for an ITX build).
With half the VRAM it could be cheaper, and I think 12 GB is just enough for me, for a 49" 32:9 1080p or a triple 16:9 1080p/1440p racing setup.

I can wait till November/December, after all the Navi cards and the 3070 16 GB / 3080 20 GB are released, to decide.
Posted on Reply
#92
havox
BlackWater: But even then, where are the 4K monitors - at the moment the choice is very limited, and let's be honest, 4K on a 27" panel makes no sense, and there are a few monitors at 40+", which again, for a triple monitor setup doesn't really work either. So, either a wave of new 4K/144 Hz monitors at about 30"
40" is also kind of small for 4K. At the distance I'm sitting, 55" hits the right spot for me.
- the owner of an LG OLED55C9 TV
Posted on Reply
#93
P4-630
zo0lykas: EVGA gives 3 years, plus another 2 if you register the item, so 5 in total.
Posted on Reply
#94
BlackWater
havox: 40" is also kind of small for 4K. At the distance I'm sitting, 55" hits the right spot for me.
- the owner of an LG OLED55C9 TV
Yeah, true, but in your use case, do you use the TV more as a TV, or more as a monitor? I'm wondering from the standpoint of someone hooking the 3090 up to a monitor or multiple monitors on a desk, in which case the viewing distance would be... about 0.5 to 1 meter, perhaps? I currently have a 27" 1440p monitor and I sit about 60-70 cm from it, and I was looking at monitors like the Asus PG43UQ - great specs on paper, but it's pretty much the size of a TV, and I just can't see myself, or anyone really, hooking up three of those on a desk... You'd have to sit quite far back, or you would literally have to turn your head left and right constantly, and at that point you're better off just getting a large TV and using it from a distance, as you'd normally use a TV.

So that's what I was thinking originally: if we assume that the 3090 is targeted towards multi-monitor 4K gaming at 120-144 Hz, where are the monitors suitable for that? IMO, the ideal desk implementation of 4K is 30-34", and pretty much all the ones available that we can call 'gaming' monitors are 40+", and some are even straight-up TVs without the TV tuner (the BFGDs)... So I don't really get what exactly the 3090 is supposed to do. If you want to do big-screen gaming in the living room, the 3080 can (allegedly) easily do that, so then what even is the purpose of the 3090? It's not professional or scientific work for sure, since Nvidia is pretty much pushing it as the top-end gaming card. But to me it seems that if you try to figure out what it's supposed to do, it just comes out as a slightly bigger chip than the 3080 with a strangely large amount of VRAM, just so Nvidia can say "look what we can do, lol".
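To put rough numbers on that "ideal size" point, here's a quick pixel-density sketch (panel sizes picked just for illustration):

# Pixel density (PPI) for a few of the panel sizes mentioned above.
import math

def ppi(width_px, height_px, diagonal_in):
    # pixels per inch along the panel diagonal
    return math.hypot(width_px, height_px) / diagonal_in

print(f'1440p @ 27": {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI, the usual desktop baseline
for diag in (27, 32, 43):
    print(f'4K @ {diag}": {ppi(3840, 2160, diag):.0f} PPI')  # 27" is very dense, 43" lands near ~100 PPI

A 43" 4K panel ends up at roughly the same density as a 27" 1440p one, which is basically the trade-off I'm describing.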
Posted on Reply
#95
havox
I'm using it mainly as a PC monitor for games; I don't watch that many movies. 0.5-1 m sounds about right. It's great both in fullscreen mode for shooters and for running other genres like RPGs or MMOs in a 2K window while also having a browser and some other stuff open.

I'm personally fine with Nvidia rebranding big chungus Ampere from a Titan to an xx90. I had issues with the Titan series: the reference-only coolers meant it got outperformed by non-reference-cooler xx80 Ti's at half the price, and it was also available only from the Nvidia store, which isn't available in my country.
But here I can just buy a non-reference triple-slot monstrosity and have peace of mind for a couple of years that, for all the unoptimized Ubisoft garbage, I can set all the graphics sliders to the right and get a minimum of 4K 60 FPS, something the 2080 Ti was not capable of. And if it has twice as much memory as I will ever need for 4K gaming... I can live with that.
Posted on Reply
#96
Gungar
DuxCro: I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If, as some people on the net say, the RTX 3090 is around 20% faster than the RTX 3080, isn't this card basically an RTX 3080 Ti with a bunch of VRAM slapped on it? But this time, instead of charging $1200 for it, Nvidia had the clever idea of calling it a Titan replacement and charging $300 more, knowing that people who were willing to spend $1200 on the previous top card will spend an extra $300 on this one.

So the RTX x70 segment costs the same $499 again. The RTX x80 segment costs $699 again. But the top segment is now $300 more, with performance gains like those from the 700 to the 900 series and from the 900 to the 1000 series, which used to cost considerably less. The 1080 Ti was $699.
I don't understand your question. The RTX 3090 is the 3080 Ti; they renamed it to 3090, maybe because they need the Ti for something else.
Posted on Reply
#97
Pumper
They did not even try to make an unobtrusive cable adapter.
Posted on Reply
#98
bug
Pumper: They did not even try to make an unobtrusive cable adapter.
What would that look like?
Posted on Reply
#99
havox
Pumper: They did not even try to make an unobtrusive cable adapter.
It's not a Titan; the AIBs have got you covered with their 3-fan RGB clown editions xD
Posted on Reply
#100
ratirt
Not sure why people think this card is a Titan. It is called 3090, not Titan, for one thing, and second, the Titans have always had the full chip enabled and this one doesn't (well, most of them did; there was that Titan X and Titan Xp crap NV pulled in the 1080 Ti era).
Posted on Reply