So, I borrowed a friend's Astral RTX 5090 for a few weeks while he was overseas for work. My system otherwise has an Asus TUF OC RTX 4090 (2x HDMI 2.1 & 3x DP).
With HWiNFO64, a wall power gauge on my PSU, and another digital gauge, I can confirm Igor's Lab's findings:
https://en.gamegpu.com/iron/energy-consumption-analysis-rtx-5090-power-up-to-901-vt-v-peak
It regularly jumped into the 630 W+ range, with a few moments very near 900 W.
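If you want to sanity-check those spikes yourself without HWiNFO, nvidia-smi can report board power draw in CSV form. Here is a minimal Python sketch of the polling I'd do; the one-second interval and sample count are my own choices, it obviously needs an NVIDIA driver installed, and it's a rough check rather than anything definitive:

```python
import subprocess
import time

def parse_power_w(line: str) -> float:
    """Parse nvidia-smi's 'NNN.NN W' CSV output into a float of watts."""
    return float(line.strip().split()[0])

def read_power_w() -> float:
    """One-shot board power reading (requires nvidia-smi on PATH)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader"],
        text=True,
    )
    return parse_power_w(out)

def log_peak_power(samples: int = 60, interval_s: float = 1.0) -> float:
    """Poll for a while and return the highest board power draw seen."""
    peak = 0.0
    for _ in range(samples):
        peak = max(peak, read_power_w())
        time.sleep(interval_s)
    return peak
```

Run log_peak_power() during a benchmark pass; a value in the 600-900 W region would match the spikes above. Note the driver's power sensor averages over its own window, so the very shortest transients can still slip past this kind of polling.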
Sure, it is a fancy card and does a great job of moving that heat out of the card, but I have had to make other compromises in the system. Such as loosening my 2x32 GB DDR5-6400 kit from CL30 back to its stock XMP CL32, dialling back the CPU, and so on.
Oh, and I dislike MFG. 1x frame gen is the max I'd ever use, and MFG is only okay-ish when you already have more than enough FPS anyway. I really wish that, for all the power this thing consumes and the heat it radiates into my case, the raw performance were higher..
So, after returning this $7,000 AUD card my friend actually shelled out for (he was due for an upgrade, and he can afford to spend $50k on a PC these days.. the jerk), I am happy to get back my silicon-lottery dual-HDMI 2.1 card, which is open to +33% power and increased stock voltage. In practice I run just +3% power for a 464 W max, with the 4090's curve living between 2985-3120 MHz on the core and +1200 on the memory, plus a custom fan/voltage curve I've used for the last couple of years.
In reality it rarely goes over 400 W at 100% load, with zero coil whine ever. It hits max power very rarely, and only during game benchmark runs or 3DMark tests.
So, I am happy to get back my top-1%-of-4090s card (according to 3DMark), which uses 250 W less power than the 5090 did, while the 5090 offered barely 20% more perf. Even when I frame-cap games to 119 FPS (for my LG C3 42" TV, connected via an HDMI 2.1 cable), any game set to the same FPS has the 5090 running at 550-650 W the whole time, while my 4090 runs at 300-400 W for the same task.
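That 250 W gap is easy to put in dollar terms with a quick back-of-the-envelope calc. The $0.30/kWh tariff below is a hypothetical AUD figure I'm using for illustration, not my actual rate:

```python
def extra_cost_per_hour(watts_a: float, watts_b: float, aud_per_kwh: float) -> float:
    """Extra running cost per hour (AUD) of card A over card B at a given tariff."""
    return (watts_a - watts_b) / 1000.0 * aud_per_kwh

# Mid-range figures from the frame-capped comparison: 5090 ~600 W vs 4090 ~350 W.
print(extra_cost_per_hour(600, 350, 0.30))  # -> 0.075 AUD per gaming hour
```

Small per hour, but a few hours a day over a card's lifespan adds up, before even counting the extra air-conditioning load in summer.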
The CO2 footprint, my bank account, and the other electronics getting fried inside my case are now much happier.. and so are my previous CPU and RAM overclocks. Most concerning were my M.2 SSD temps, which had been throttling to avoid overheating! Dodgy.. even with heatsinks on them, on my fancy Z790 board with its extra cooling bits. With the 4090 back, they are literally half the temperature after an hour of gaming in 4K Stalker 2.
Just like to say: please, Intel and AMD, have a 4090 beater out in the next year or two. I would rather not buy Nvidia again, as it is pathetic how they lie and use shonky graphic art on websites etc. to make it look like a 5070 is as fast as or faster than a 4090. Imagine if I had sold my two-year-old 4090 assuming the much cheaper 5070 would actually beat it. Heck, there needs to be a class action against Nvidia to rein them right back into reality, after massive stock valuation growth built on an unrealistic picture of the company's value.
The false claims, one example being an RTX 5070 supposedly matching the performance of an RTX 4090, backed by graphs missing labels and wildly misleading wordplay.. saying such things at CES this year borders on criminal. They really needed to stand behind their previous 4000-series products: show 4K frames per second with Frame Gen and DLSS off, then compare at 4K in the usual Cyberpunk and other graphically gorgeous games like the new Indiana Jones. Then plonk on FG and MFG to compare at 4K DLSS Quality, then DLAA... and see raw power against a decent 4090, not one stuck in a poorly ventilated single-fan case.
The reality is that a decent 4090 can hit 120 FPS in pretty much every game without FG or anything worse than DLSS Quality, while using 150 W+ less power and putting less heat into the case. And it cost almost 70% less than a new 5090 when I bought it over 2.5 years ago.
Now Nvidia is using extortion, threatening review sites and video channels that want to test the cards how they like, or making them wait a long time and personally purchase the card/s amid weirdly low stock numbers. The reason is 99% to keep their share price pumping up, up and away, to fund their AI R&D and sell a fk load of infrastructure on performance claims that are pure lies, enabled by the lack of independent reviews and of properly supplied parts and technical manuals from Nvidia, so nobody can really see what the cards can do.
Anyway, rambling on here. This is more of a letter to myself, but screw how Nvidia has turned a bit.. evil. Peace to us all.