
Are you camping out / lining up for an RTX 4090?

The 3090 is 350 W and the 4090 is 450 W, which is ~30% more power consumption for ~80% more performance. Extrapolating from that, it's about 65% better performance per watt.

AMD would have to achieve 70-75% better performance per watt to beat nV if they plan on keeping the same power limits they currently have for the 6900/6950 XT, or raise the power limit above what nV has for the 4090, given their promised 50% perf-per-watt improvement.
 
All right, the baseline is still 10% more performance in favour of the RTX 3090 Ti over the Radeon RX 6950 XT.
So we have: ~82% better performance from the RTX 4090 over the RTX 3090,
~50% improvement of a 350-watt Radeon RX 7000 over the Radeon RX 6950 XT,
and ~10% performance inferiority of the RX 6950 XT compared to the RTX 3090 Ti (not speaking about RT, where the performance gap is gigantic).
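Just to make that chain of percentages concrete, a quick back-of-the-envelope sketch. Everything is indexed to the 6950 XT = 100; the only figure I'm adding myself is the ~10% gap between the 3090 Ti and the 3090, which is roughly what TPU's 4K charts show:

```python
# Back-of-the-envelope for the relative-performance chain above.
# Indexed to RX 6950 XT = 100. The 3090 Ti vs 3090 gap (~10% at 4K)
# is my own assumption; the other factors are from the post.

rx6950xt  = 100
rtx3090ti = rx6950xt * 1.10        # 3090 Ti ~10% ahead of the 6950 XT
rtx3090   = rtx3090ti / 1.10       # assumption: 3090 Ti ~10% ahead of 3090
rtx4090   = rtx3090 * 1.82         # 4090 ~82% ahead of the 3090
rx7000    = rx6950xt * 1.50        # rumoured ~50% uplift at 350 W

print(f"RTX 4090 : {rtx4090:.0f}")                 # ~182
print(f"RX 7000  : {rx7000:.0f}")                  # ~150
print(f"4090 lead: {rtx4090 / rx7000 - 1:.0%}")    # ~21%
```

On those assumptions a 350-watt RDNA3 card lands ~20% behind the 4090 in raster, which is why the power-limit question matters so much.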

AMD have the chiplets; pretty sure they can beat Nvidia at the top end if they go crazy with unlimited power budgets!
 
Prices are going up in Denmark: a previous 18.000 dkr listing is now 18.500 dkr. Well, I am sitting this one out.
 
We will see tomorrow; I guess @W1zzard has reviews of the RTX 4090 ready for tomorrow.
80% performance uplift is probably the minimum.
 
Prices are going up in Denmark: a previous 18.000 dkr listing is now 18.500 dkr. Well, I am sitting this one out.
Yes, it is crazy...
 
I'm not even sure I understand why people camp out for stuff; doing without isn't so bad.
I can speak to this. BITD, when the release of The Legend of Zelda: OOT was about to hit, we all knew the rush was going to happen and so did Software Etc. They had a 12 AM door-opener event and the line in the mall was over 300 people long. If you didn't reserve it, you weren't getting one, but even if you had a copy reserved, if you didn't get there soon enough, you had to wait for the next shipment a month later. My wife and I stood in line for a few hours, but it was worth it. Granted, we got there 7 hours early and were 32nd in line. That shipment was 6 boxes of 12 copies each, so you can do the math.

Of course, that was for a game that had a limited supply for the remainder of the year, not a video card. Different time, different situation...

Point being, some things are worth waiting in line for, others are not. A video card? Nah.

80% performance uplift is probably the minimum.
80% is very unlikely. 40% above the 3090 is more realistic.
 
The 3090 is 350 W and the 4090 is 450 W, which is ~30% more power consumption for ~80% more performance. Extrapolating from that, it's about 65% better performance per watt.

AMD would have to achieve 70-75% better performance per watt to beat nV if they plan on keeping the same power limits they currently have for the 6900/6950 XT, or raise the power limit above what nV has for the 4090, given their promised 50% perf-per-watt improvement.
Just wanted to correct myself because my math was completely incorrect.

The 3090 achieves 100 FPS at 350 W: 100/350 = 0.29 perf/W index.
The 4090 achieves 100 × 1.8 = 180 FPS at 450 W: 180/450 = 0.40 perf/W index.

0.40/0.29 = 1.38, which means ~40% better performance per watt for the 4090 compared to the 3090.
If AMD achieves their promised 50% performance-per-watt improvement and has the same 450 W power limit as nV, then they would have pretty much the same performance, considering their ~10% deficit comparing the 6900 XT to the 3090 (Ti).
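For anyone following along, the same arithmetic as a tiny Python sketch. The 100 FPS baseline, the rumoured +80%, and the ~10% deficit are from the post above; the 335 W board power for the 6950 XT is its official spec, and linear FPS-with-power scaling is assumed throughout, as it is in the post:

```python
# Minimal sketch of the perf/W arithmetic above. Assumes FPS scales
# linearly with board power, as the post's extrapolation does.

def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second per watt of board power."""
    return fps / watts

rtx3090 = perf_per_watt(100, 350)          # ~0.286 FPS/W
rtx4090 = perf_per_watt(100 * 1.8, 450)    # 0.400 FPS/W
print(f"4090 vs 3090 perf/W: +{rtx4090 / rtx3090 - 1:.0%}")    # +40%

# RDNA3 projection: 6950 XT (~10% behind a 3090, 335 W official TBP)
# plus the promised +50% perf/W, scaled up to a 450 W power limit:
rx6950xt = perf_per_watt(100 * 0.9, 335)
print(f"RX 7000 at 450 W: {rx6950xt * 1.5 * 450:.0f} FPS vs 4090: 180 FPS")
```

That comes out to roughly 181 vs 180 FPS, i.e. a dead heat in raster if both end up at 450 W.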
 
Nah, my RX 6700 XT is already plenty for my typical usage.

Will maybe line up for an RX 7x00 XT :p
 
I am happy to read the reviews, but I don't buy any GPU over 200-250W max. It is as simple as that. The recent trend for GPUs and CPUs is just ridiculous IMO, like toasters with touchscreens.
 
but I don't buy any GPU over 200-250W max. It is as simple as that.
A lot of people are drawing that line. I too am not willing to use a GPU that has over a 300w draw. And it's not about the money or electricity. I don't want my PC to double as a space heater during use. I have enough problems with heat in my home without PC parts adding to them.
 
A lot of people are drawing that line. I too am not willing to use a GPU that has over a 300w draw. And it's not about the money or electricity. I don't want my PC to double as a space heater during use. I have enough problems with heat in my home without PC parts adding to them.
I had a nightmare of a problem even with a Corsair AX1200 and my 3090 @ 350W draw, remember?

:( so not only is it close to $2000 after taxes and fees out the door for a 4090, but some may have to buy a hardcore PSU too.


Voltage Droop Enough to be Concerned With? - TechPowerUp

---
@lexluthermiester,
I don't want my PC to double as a space heater during use. I have enough problems with heat in my home without PC parts adding to them.
 
Just wanted to correct myself because my math was completely incorrect.

The 3090 achieves 100 FPS at 350 W: 100/350 = 0.29 perf/W index.
The 4090 achieves 100 × 1.8 = 180 FPS at 450 W: 180/450 = 0.40 perf/W index.

0.40/0.29 = 1.38, which means ~40% better performance per watt for the 4090 compared to the 3090.
If AMD achieves their promised 50% performance-per-watt improvement and has the same 450 W power limit as nV, then they would have pretty much the same performance, considering their ~10% deficit comparing the 6900 XT to the 3090 (Ti).

I find this all rather interesting. Although I've been viewing TPU performance charts for some time now, I haven't really bothered looking at P/W stats. So essentially the P/W metric measures efficiency per frame?

Just curious: with previous high-performance graphics cards, have we ever reached 1 W = 1 FPS?

Excuse my constantly wandering inquisitiveness... in a similar speculative breakdown, are there any stats that show how much we're paying for each frame (on average), whereby we could compare the speculated $-per-frame for a 4090 vs, let's say, a 1080 Ti (or pre-1000-series)?
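To illustrate what I mean, a minimal sketch: launch price divided by average FPS. The MSRPs here are the launch list prices; the average-FPS figures are just placeholders you'd swap for TPU's own suite averages:

```python
# Hypothetical $-per-frame comparison. MSRPs are launch list prices;
# the avg-FPS values are PLACEHOLDERS - substitute the game-suite
# averages from TPU's reviews for a real comparison.

cards = {
    # name:        (launch MSRP in $, placeholder avg FPS)
    "GTX 1080 Ti": (699,  60),
    "RTX 4090":    (1599, 150),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:.2f} per average frame")
```

Both cards would need to be benched on the same game suite at the same settings for the $/frame figures to be comparable, of course.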
 
I am happy to read the reviews, but I don't buy any GPU over 200-250W max. It is as simple as that. The recent trend for GPUs and CPUs is just ridiculous IMO, like toasters with touchscreens.

A lot of people are drawing that line. I too am not willing to use a GPU that has over a 300w draw. And it's not about the money or electricity. I don't want my PC to double as a space heater during use. I have enough problems with heat in my home without PC parts adding to them.

My line is two 8-pin PCIe cables and a recommended PSU under 750 W.
 
I had a nightmare of a problem even with a Corsair AX1200 and my 3090 @ 350W draw, remember?
Vaguely.
:( so not only is it close to $2000 after taxes and fees out the door for a 4090, but some may have to buy a hardcore PSU too.
Right. I was referring more to the 4080s, which are the more affordable options for most people.
Ah, ok. You had an older high-end PSU. Your current PSU is perfectly fine for the new gen of RTX cards.
 

You allow your cat to chill on top of your rig, and with huge vents at that? :banghead: Mate, even if a kid/etc puts a smudgy fingerprint on my rig or makes any type of contact with the build's exterior... my OCD kicks in and I'm in panic mode to get them removed from the room lol (or kept at a minimum 2-metre distance).

OK, on close inspection that does look like a dust filter below the top vent... but still...

1 W = 1 FPS in what? The 6900 XT can manage far more than that in Doom Eternal, let alone CS:GO.

I thought these stats were based on the 10 (or more) benched game averages in TPU charts... no? A mix of less and more demanding titles?


I did look at similar ones, but no 900/1000-series cards are listed. Also, these are charted in ascending percentage order... just wondering if there are already charts available with $-per-frame metrics per card (or cheekily expecting someone else to do the maths for us, hehe). Actually, rethinking... I guess modern games demanding way more rendering power will distort those comparisons.
 
Nope, not for me. I've enough cards here; any more and I think the house might start falling in...
 
10 (or more) benched game averages
Games are always evolving to take up the available hardware. The GTX 660 made 61 FPS in Skyrim at 140 W vs the 3060 making 148 FPS* at 170 W. However, in their 1920x1200 testing**, the 660 made a 64 FPS average at 140 W, while the 3060 makes 117 FPS at 170 W. So either the 3060 is ~50% more efficient at contemporary games, or games just haven't caught up yet (quick numbers below).

*Outside source
**Limited data, I ran out of time
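Running those figures as FPS per watt (the 148 FPS Skyrim number is the outside source flagged above):

```python
# The 660-vs-3060 figures from the post, expressed as FPS/W.
# "skyrim" = the old title both cards run; "suite" = each card's
# contemporary game-average at 1920x1200.

def fps_per_watt(fps: float, watts: float) -> float:
    return fps / watts

gtx660_skyrim  = fps_per_watt(61, 140)
gtx660_suite   = fps_per_watt(64, 140)    # game average of its day
rtx3060_skyrim = fps_per_watt(148, 170)   # outside source
rtx3060_suite  = fps_per_watt(117, 170)   # current TPU average

print(f"Skyrim:       +{rtx3060_skyrim / gtx660_skyrim - 1:.0%}")   # ~+100%
print(f"Era averages: +{rtx3060_suite / gtx660_suite - 1:.0%}")     # ~+51%
```

Same-era game averages give the 3060 about a 50% efficiency lead, while the legacy title gives roughly double; that gap is exactly what the "games just haven't caught up yet" reading points at.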
 
Nope, not for me. I've enough cards here; any more and I think the house might start falling in...

Old stuff out, new stuff in.... :D
 