1. I would rather not base anything I say on rumours. The 30-series was said to be super efficient before launch, and look how it turned out: there isn't a single card without an 8-pin power connector in the whole lineup, all the way down to the 3050, and its performance-per-watt is only on par with similar Turing chips. But you are just WRONG. They are not raising the power usage to achieve better performance. Of course it is all rumours at this point, but the 4090 is supposedly almost twice as fast as the 3090. So yeah, a 20% consumption increase for a 100% performance increase is an insane efficiency jump (see the quick sums after point 2). I don't understand how you do not get this.
2. The post you commented on said that there should be a line drawn to how much power any graphics card is allowed to consume. The 4090's rumoured 450 W is way above that line. A car that has 1000 HP and does 10 miles per gallon is more efficient than one that has 150 HP and does 30 mpg, but do you really want to fuel a car that only does 10 mpg? If so, enjoy, but most people don't.
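Just to make the arithmetic behind both points explicit, here is a quick sketch. It only uses the rumoured and made-up figures already quoted above (20% more power for 100% more performance, plus the two hypothetical cars), so treat it as illustration, not measurement:

```python
# Rough numbers behind points 1 and 2 (rumoured/illustrative figures, not measurements).

# Point 1: rumoured 4090 vs 3090 -- ~20 % more power for ~100 % more performance.
power_increase = 1.20
performance_increase = 2.00
perf_per_watt_gain = performance_increase / power_increase
print(f"Rumoured performance-per-watt gain: {perf_per_watt_gain:.2f}x")  # ~1.67x

# Point 2: efficiency vs. absolute consumption, using the car analogy.
# "HP-miles per gallon" is just a crude efficiency proxy here.
cars = {"1000 HP @ 10 mpg": (1000, 10), "150 HP @ 30 mpg": (150, 30)}
for name, (hp, mpg) in cars.items():
    efficiency = hp * mpg              # work you get out of each gallon
    gallons_per_100_miles = 100 / mpg  # what you actually pay at the pump
    print(f"{name}: {efficiency} HP-miles/gal, {gallons_per_100_miles:.1f} gal per 100 miles")
```

The more powerful card (or car) can come out ahead on efficiency and still cost you far more in absolute power (or fuel), which is exactly the distinction point 2 is making.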
Edit: To stay on topic, Alder Lake is said to be super efficient too, but if that efficiency comes at 200+ Watts, I couldn't care less.