
NVIDIA GeForce RTX 5090 Powered by "GB202" Silicon, 512-bit GDDR7, ASIC Pictured

btarunr

Editor & Senior Moderator
Here is the first picture of what is very likely the GeForce RTX 5090 "Blackwell," the successor to the RTX 4090 "Ada." The picture, its massive GPU, and layout appear to confirm the weekend's bare PCB leak. The RTX 5090 is powered by the "GB202" silicon, the largest gaming GPU based on the "Blackwell" graphics architecture. The silicon in the picture has the ASIC code "GB202-300-A1." From this ASIC code, we can deduce that the RTX 5090 may not max out the silicon (i.e., enable all SMs present on it), as maxed-out NVIDIA ASICs tend to have the variant designation "450."

The "GB202" ASIC is surrounded by sixteen GDDR7 memory chips, which reportedly make up the 32 GB memory size of the RTX 5090. The chip count, coupled with the large GPU package size (high pin count), confirms that the "GB202" features a 512-bit-wide memory bus. Assuming a memory speed of 28 Gbps, this bus should yield a stellar memory bandwidth of 1,792 GB/s. The GPU and memory are surrounded by the card's 24-phase VRM solution, which draws power from a single 16-pin 12V-2x6 power connector. NVIDIA will likely max out the connector's 600 W continuous power-delivery capability and give the card a TGP of around 500-550 W, if not more.
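Taking the article's figures at face value, the bandwidth math checks out; a quick back-of-the-envelope sketch (bus width and per-pin data rate are the article's numbers, the function name is made up):

```python
def gddr_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# 512-bit bus at 28 Gbps GDDR7, as speculated above
print(gddr_bandwidth_gbs(512, 28))  # 1792.0 GB/s
```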



View at TechPowerUp Main Site | Source
 
that is a whole lot wider, gonna guess this will be everyone's favorite space heater.
 
600 watts, if not more? Wow. Looks like those with 850 W or 1000 W PSUs should make sure to add a 1200 W+ PSU alongside their 5090 order.
 
It's very neatly packed... but wouldn't it be better if the components were more spread out? Looks like the thing will run super hot...
 
It's very neatly packed... but wouldn't it be better if the components were more spread out? Looks like the thing will run super hot...

It's designed around the flow-through PCB, which aids cooling performance and in turn can keep components cooler.
 
It's very neatly packed... but wouldn't it be better if the components were more spread out? Looks like the thing will run super hot...
A graphics card built to PCI specifications needs to stay within a certain volume so that it fits in a case. Because of the 512-bit VRAM bus, the GPU package will be quite large, which takes up a large part of the space on the PCB; in practice, that means the components can't be spread out very far.
 
600 watts, if not more? Wow. Looks like those with 850 W or 1000 W PSUs should make sure to add a 1200 W+ PSU alongside their 5090 order.
I have the PSU for that thingy - but I guess I will have to rob a bank before I can afford that GPU ;)
 
that is a whole lot wider, gonna guess this will be everyone's favorite space heater.

Yes, well, hate to be "that guy," but people who can afford these GPUs care little for power consumed and definitely have air conditioning, so the heat being put out is no concern. :)
 
Run it at 0.5-0.7 V, and I bet it's going to be fine on 2-slot air cooling :D
 
One of them "watercool it and put the radiator in the other room" type of deals
 
It's very neatly packed... but wouldn't it be better if the components were more spread out? Looks like the thing will run super hot...
Wires prefer to be short because that improves speed and efficiency. That's why companies take extreme efforts to design and build GPUs and AI processors in 3D (to various degrees, called 2.5D or 2.3D or 2.1D for example). Systems will only get more concentrated, not less.

Yes, well, hate to be "that guy," but people who can afford these GPUs care little for power consumed and definitely have air conditioning, so the heat being put out is no concern. :)
True, and also false. If you buy one, chances are you're going to put it to work 24/7/365. Where I live (at 0.15 €/kWh), that would eat up roughly 2,400 € worth of electricity in three years. Not something to ignore (though highly dependent on country/region, of course).
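The ~2,400 € figure is easy to reproduce; a rough sketch, assuming the card really pulls 600 W around the clock (the rate and duration are the poster's numbers, the helper name is made up):

```python
def energy_cost_eur(watts: float, years: float, eur_per_kwh: float) -> float:
    """Electricity cost of a constant load running 24/7 for the given number of years."""
    hours = years * 365 * 24
    kwh = watts / 1000 * hours
    return kwh * eur_per_kwh

# 600 W, 24/7 for three years, at 0.15 €/kWh
print(round(energy_cost_eur(600, 3, 0.15)))  # 2365 (~2,400 €)
```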
 
600 watts, if not more? Wow. Looks like those with 850 W or 1000 W PSUs should make sure to add a 1200 W+ PSU alongside their 5090 order.
Exactly the opposite. Given the hard limit on the 12VHPWR connector, with a safety factor of 1.1, the card cannot exceed its power rating.

It basically needs the same PSU as this 350 W card.
04-Peak-Power-1.png
 
Exactly the opposite. Given the hard limit on the 12VHPWR connector, with a safety factor of 1.1, the card cannot exceed its power rating.

Are you sure? What technical mechanism limits the short-term current through a connector? Please explain.
Also, enlighten me about the "cannot exceed its power rating" statement. Please explain.
 
512-bit will be the challenge; I think AMD did something like that way back with the R9 290X, and it led to a very hot PCB layout.
Anyway, this GPU's hyper performance will only be matched by its unattainable cost.
 
This thing is massive. Expect prices in the $2000 range. It's only GeForce in name now.
 
This thing is massive. Expect prices in the $2000 range. It's only GeForce in name now.
I wouldn't be surprised if it starts from 2500 EUR (incl. VAT). Even the cheapest 4090 cards still start from 2055 EUR here.
 
This thing is massive. Expect prices in the $2000 range. It's only GeForce in name now.
It's always been GeForce in name only for the Titan and x090 series. It's a prosumer card for data work at home, hence the specs, especially the VRAM. $2,000 or even $4,000 is still cheap to that crowd. The heat doesn't matter, nor does the power consumption, because these will see the same use as the 4090 and prior cards: four to eight of them slapped into a box with waterblocks (Camino had those out before we knew the card's specs and was taking orders), dual PSUs, Threadripper/Xeon platforms with massive internal radiators, or, what's becoming the norm, ports out to external radiators, with the option for a rackmount version of the same product.

All the way back to the 8800 GTX, NVIDIA was saying CUDA was its future. When the first Titan hit, the selling point was its use in professional settings; it wasn't until companies like Falcon NW decided it was SLI on a single card slammed into an ITX case, and people like Linus started buying them up, that its reputation as a "gaming" card took hold.

GeForce doesn't even really mean gaming. Tons of companies deploy GeForce-based laptops that will never game but use the GPUs for other professional work that doesn't need the Quadro drivers or the price associated with them.
 
It's only GeForce in name now.
The Titan(s), for whatever cumulative reasons (surely price for one), apparently didn't sell well enough.

Reintroduce that tier of card as an XX90 and start the price lower, then gradually ratchet it up.
 
Yes, well, hate to be "that guy," but people who can afford these GPUs care little for power consumed and definitely have air conditioning, so the heat being put out is no concern. :)
Already there with the RTX 4090. Another 150 watts? I'm gonna need a bigger fan for the "Sim" room. Maybe a ceiling fan. We'll cross that bridge when we come to it.
 
Are you sure? What technical mechanism limits the short-term current through a connector? Please explain.
Also, enlighten me about the "cannot exceed its power rating" statement. Please explain.
It's all about the reliability of the new connector, which wasn't a concern back in the day with 6+2-pin PCIe connectors.
NVIDIA already hard-locked the power draw, spike-wise, on the 4090 when you bring the PL to 600 W.

04-Peak-Power-600W.png


NVIDIA barely took advantage of the PCI Express slot's power delivery on the RTX 4000 series; they could bring it back for an additional ~50 W contribution from it. But it isn't a game changer, and they probably stopped using it as a safety measure.
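A sketch of the budget this post describes, assuming the 600 W connector rating, the 1.1 transient safety factor mentioned above, and the PCIe slot's 75 W allowance (of which, per this post, only ~50 W would realistically be used; the variable names are mine):

```python
# Hypothetical power-budget sketch using the figures from this thread
CONNECTOR_SUSTAINED_W = 600    # 12V-2x6 / 12VHPWR continuous rating
TRANSIENT_SAFETY_FACTOR = 1.1  # short spikes tolerated above the rating
PCIE_SLOT_W = 75               # PCIe x16 slot allowance per spec

peak_transient_w = CONNECTOR_SUSTAINED_W * TRANSIENT_SAFETY_FACTOR
max_sustained_w = CONNECTOR_SUSTAINED_W + PCIE_SLOT_W  # if the slot were fully used

print(round(peak_transient_w))  # 660
print(max_sustained_w)          # 675
```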
 
Anyone got an idea how much a 5090 will cost?
 
Oh well, my 4090 churns through games at 4K barely using 250-300 W (stock FE clocks and undervolted). Not sure the 5090 will even break a sweat running today's games at a locked 4K 144 Hz; I'd definitely need a 4K 240 Hz monitor to stress the 5090.
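For context on why undervolting cuts power so sharply: dynamic power scales roughly with the square of core voltage at a given clock. A rough sketch; the wattage and voltages below are illustrative assumptions, not measured values, and leakage and clock changes are ignored:

```python
def scaled_power(base_w: float, base_v: float, new_v: float) -> float:
    """Rough dynamic-power estimate: P scales ~quadratically with core voltage
    at a fixed clock (ignores leakage current and frequency changes)."""
    return base_w * (new_v / base_v) ** 2

# e.g. a 450 W card undervolted from ~1.05 V to ~0.95 V (illustrative numbers)
print(round(scaled_power(450, 1.05, 0.95)))  # 368
```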
 