
AMD Radeon RX 6000 Series "Big Navi" GPU Features 320 W TGP, 16 Gbps GDDR6 Memory

Joined
Jun 3, 2010
Messages
2,540 (0.50/day)
I think that if they split the instruction pipelines from the shader pipelines, they could run a frontend overclock until the pipelines are full: the GPU would work at not just 2.3 GHz but, say, 3.0 GHz while the shaders are idle. How much it would help can be estimated, since they have pinpointed exactly where the bottlenecks are: 18% idle time with 4 workgroups (just enough work for 1 shader out of each 4096).
 
Joined
Apr 12, 2013
Messages
6,799 (1.68/day)
They sold in their millions, Max-Q was a huge success in the laptop world, despite the high cost.
High costs for whom? I get your point, but Nvidia probably made more money off their Max-Q models, especially for top-tier cards like the 2080, 2070 et al. It's the manufacturer & the buyer who have had to pay through their collective noses, even for reduced performance.
 
Joined
Sep 17, 2014
Messages
21,104 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
The entire "quiet computing" industry is a waste of cash. It doesn't add any performance at all but people pay serious money for it.
The entire "RGBLED" industry is a waste of cash. It doesn't add any performance at all but it costs quite a bit more whilst adding additional software bloat and cable spaghetti.
As you can tell from the current retail market - both of those segments are so successful that they utterly dominate the market and leave almost nothing else available.

Underclocking and undervolting a graphics card is exactly what every laptop manufacturer has ever done. Nvidia went one step further with their Max-Q models and gave people the option to buy far more expensive GPUs than their laptop cooling is capable of, but dialled back to heavily-reduced clocks and TDPs. They sold in their millions, Max-Q was a huge success in the laptop world, despite the high cost.

I think we can agree to disagree because having options on the market is good and more consumer choice is always better than less. At least AMD's graphics driver is an excellent tuning tool for undervolting and underclocking.

Couldn't agree more.

For me the biggest win is silence. I want a quiet rig above everything else, really. I play music and games over speakers. Noticeable fan noise from the case is the most annoying immersion breaker, much more so than the loss of single-digit FPS. Is it worth buying a bigger GPU just to run it at lower power? Probably, yes. It's that, or I can jump through a million hoops trying to dampen the noise coming out of the case... which also adds extra cost without offering the option of more performance should I want it. I haven't lost that option when I buy a bigger GPU.
 
Joined
Dec 26, 2016
Messages
281 (0.10/day)
Processor Ryzen 3900x
Motherboard B550M Steel Legend
Cooling XPX (custom loop)
Memory 32GB 3200MHz cl16
Video Card(s) 3080 with Bykski block (custom loop)
Storage 980 Pro
Case Fractal 804
Power Supply Focus Plus Gold 750FX
Mouse G603
Keyboard G610 brown
Software yes, lots!
I am glad I already upgraded to a 750W PSU in anticipation of the 3080 that I never got ;) Now it will be powering Big Navi if AMD can deliver...
 
Joined
Feb 20, 2019
Messages
7,453 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I'm looking for a 250 W GPU max. The best price/performance at this wattage gets my money. The 3070 looks promising, but I do expect RDNA2 to beat it in performance/watt and performance/dollar, given that it's on a superior node and AMD is the underdog in the GPU game. 52 CUs clocked at 2100 MHz (around 14 TFLOPS) should match the 2080 Ti/3070 and have a favourable performance/watt ratio. It will all come down to pricing. I really hope AMD doesn't get greedy. All these 300-400 W GPUs are a no-go in my eyes. I have no need for expensive room heaters.
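The ~14 TFLOPS figure quoted above can be sanity-checked with a quick sketch, assuming RDNA's usual 64 stream processors per CU and 2 FP32 ops per clock (FMA):

```python
# Sanity check of the ~14 TFLOPS estimate above.
# Assumes 64 stream processors per CU and 2 FP32 ops per clock (FMA).
cus = 52
shaders = cus * 64               # 3328 stream processors
clock_ghz = 2.1
tflops = shaders * 2 * clock_ghz / 1000
print(f"{tflops:.1f} TFLOPS")    # ~14.0
```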

I'll be trying out the RDNA2 cards for the exact same reason as you. 250W max in my HTPC but the reason I'm back to Nvidia in the HTPC at the moment is the AMD HDMI audio driver cutting out with Navi cards. Didn't happen when I swapped to an RX480 or a 2060S, but when I tried a vanilla 5700 the exact same bug reappeared. A microsoft update was the trigger but AMD haven't put out a fix yet and after 3 months I got bored of watching the thread of people complaining on AMD's forum get longer without acknowledgement and moved on.
 
Joined
Jun 3, 2010
Messages
2,540 (0.50/day)
I am glad I already upgraded to a 750W PSU in anticipation of the 3080 that I never got ;) Now it will be powering Big Navi if AMD can deliver...
You'll have 7+ years of safe operation before mean power delivery is down to ~385 W, which is about what a 3080 system averages.
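Taking the 10%-per-year capacity-loss figure at face value (later posts in this thread dispute it), the compounding works out like this:

```python
# Compound 10%-per-year capacity loss, taken at face value
# (a disputed figure). Starting from a 750 W unit:
capacity = 750.0
for year in range(1, 8):
    capacity *= 0.9
    print(f"year {year}: {capacity:.0f} W")
# capacity passes ~385 W between year 6 (~399 W) and year 7 (~359 W)
```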
 
Joined
Dec 26, 2016
Messages
281 (0.10/day)
Processor Ryzen 3900x
Motherboard B550M Steel Legend
Cooling XPX (custom loop)
Memory 32GB 3200MHz cl16
Video Card(s) 3080 with Bykski block (custom loop)
Storage 980 Pro
Case Fractal 804
Power Supply Focus Plus Gold 750FX
Mouse G603
Keyboard G610 brown
Software yes, lots!
You'll have 7+ years of safe operation before mean power delivery is down to ~385 W, which is about what a 3080 system averages.
Sorry, I don't understand what you mean.
 
Joined
Dec 31, 2009
Messages
19,366 (3.68/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
The entire "quiet computing" industry is a waste of cash. It doesn't add any performance at all but people pay serious money for it.
The entire "RGBLED" industry is a waste of cash. It doesn't add any performance at all but it costs quite a bit more whilst adding additional software bloat and cable spaghetti.
As you can tell from the current retail market - both of those segments are so successful that they utterly dominate the market and leave almost nothing else available.

Underclocking and undervolting a graphics card is exactly what every laptop manufacturer has ever done. Nvidia went one step further with their Max-Q models and gave people the option to buy far more expensive GPUs than their laptop cooling is capable of, but dialled back to heavily-reduced clocks and TDPs. They sold in their millions, Max-Q was a huge success in the laptop world, despite the high cost.

I think we can agree to disagree because having options on the market is good and more consumer choice is always better than less. At least AMD's graphics driver is an excellent tuning tool for undervolting and underclocking.
I think my talking point went over your head (seeing some of your talking points).... but we'll agree to disagree. :)
 
Joined
Oct 12, 2005
Messages
682 (0.10/day)
Well, this is not surprising.

Why would AMD leave watts on the table if they can make their cards faster? These 320 W cards exist because people are buying them. People even complain loudly when they can't buy them because they're on back order.

If nobody were buying 250 W+ cards, AMD and Nvidia wouldn't produce them; it's as simple as that. That just shows how little people really care about power consumption in general.

The good thing is there will also be 200 W GPUs with a very good performance increase, for people who want a GPU that consumes less power while still beating the current gen.

But if people want the highest performance possible no matter the cost, why would AMD and Nvidia hold back?

If they could sell a 1000 W card with twice the performance the way they sell the 3080, they would certainly do it.

PSUs lose 10% capacity annually on average.
Any proof of this?

I mean, if that were true, my PSU would have died by now with my current setup. But hey, it's still running strong and rock stable.

That is a myth, or maybe true for a cheap PSU with bad components, but it's certainly not true for a good PSU.
 
Joined
Dec 26, 2016
Messages
281 (0.10/day)
Processor Ryzen 3900x
Motherboard B550M Steel Legend
Cooling XPX (custom loop)
Memory 32GB 3200MHz cl16
Video Card(s) 3080 with Bykski block (custom loop)
Storage 980 Pro
Case Fractal 804
Power Supply Focus Plus Gold 750FX
Mouse G603
Keyboard G610 brown
Software yes, lots!
PSUs lose 10% capacity annually on average.
Most hardware won't get to that age here. I've never had noticeable losses from a PSU before, maybe because I always oversize them and always buy the "better" brands. People tend to save on the PSU because it's an easy way to shave some bucks off the total system cost. I don't. This one is a pretty solid unit from Seasonic. And even if I ever encounter instability during its lifetime, I'm not afraid to take out the good old soldering iron and swap out the big caps that have aged. Anyway, caps mostly age in warm environments, and since I'm not even playing more than 15 h a week, thermal stress, and therefore ageing, should not be a problem during its lifetime.
I think 10% per year is a pretty "worst case" scenario. That may be true for cheap PSUs with even cheaper caps at 24/7 full load in a 50°C environment...
 
Last edited:
Joined
Feb 20, 2019
Messages
7,453 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
PSUs lose 10% capacity annually on average.
That's not necessarily a false statement, but it doesn't account for the huge variety in PSU quality, the way different platform designs age differently, or the quality of the power grid, which effectively wears PSUs out over time.

I think 10% loss per year is a safe bet for a worst-case scenario, but there are plenty of people and independent tests proving that decade-old PSUs are still capable of delivering all or nearly all of their rated power. PSU components are overprovisioned when new so that, as the capacitors and other components wear out, the unit still meets its rated specification during the warranty period. I forget where I read it, but I seem to recall a review of a decade-old OCZ 700 W supply that had been in nearly 24/7 operation, yet it still hit its rated specs without any problems. The temperature it ran at was much higher (but still in spec), the ripple was worse than when it was new (but still in spec), and it shut down when tested at 120% load, something it managed to cope with when new.

I would not use a decade-old PSU for a new build with high-end parts, but at the same time I would expect a new 750 W PSU to still deliver 750 W seven years from now.
 
Joined
Jan 14, 2019
Messages
10,065 (5.15/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Going by an electricity price of 20p per kWh (which is relatively expensive for the UK) and an average game time of 2 hours per day, a 300 W difference in total system power consumption is going to cost £43.80 per year, which comes to £3.65 per month! So everybody stop crying about bills!
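The arithmetic above checks out under the same assumptions (20p/kWh, 2 h/day, 300 W delta):

```python
# Yearly cost of a 300 W difference at 2 h/day and £0.20 per kWh
delta_kw = 0.3
hours_per_year = 2 * 365                 # 730 h
kwh = delta_kw * hours_per_year          # 219 kWh
yearly = kwh * 0.20                      # £43.80
print(f"£{yearly:.2f}/year, £{yearly / 12:.2f}/month")
```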

On the other hand, I'd be happy if more AIBs (other than EVGA) adopted the idea of AIO water-cooled graphics cards, especially with Ampere and RDNA2. It's not only good for these hungry GPUs; using the radiator as exhaust helps keep other components cool as well.
 
Joined
Jul 9, 2015
Messages
3,413 (1.05/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Efficiency is one thing, power consumption is another.

NVIDIA's RTX 3080 is a much more power-efficient design than anything that came before (at 1440p and 4K), as our review clearly demonstrates.



One other metric for discussion, however, is power envelope. So, for anyone who has concerns regarding energy consumption, and wants to reduce overall power consumption for environmental or other concerns, one can always just drop a few rungs in the product stack for an RTX 3070, or the (virtual) RTX 3060 or AMD equivalents, which will certainly deliver even higher power efficiency, within a smaller envelope.

We'll have to wait for reviews of other cards in NVIDIA's product stack (not to mention all of AMD's offerings, such as this leaked card), but it seems clear that this generation will deliver higher performance at the same power level as older generations. You may have to drop down the product stack, yes, but if performance is higher at the same envelope, you get a better-performing RTX 2080 in a 3070 at the same power envelope, a better-performing RTX 2070 in an RTX 3060, and so on.

These are two different concepts, and I can't agree with anyone talking about inefficiency. The numbers, in a frame/watt basis, don't lie.

Remember that these charts are based on GPU performance in a bunch of games, but power consumption in just one game.
Actual perf/watt might be well off the claimed figure due to this uncertainty.
 
Joined
Dec 31, 2009
Messages
19,366 (3.68/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Remember that these charts are based on GPU performance in a bunch of games, but power consumption in just one game.
Actual perf/watt might be well off the claimed figure due to this uncertainty.
One of the first good points you've brought up in a while. I thought the power testing was at least across a few games... :(

That said, when the new AMD card is released, it will be apples to apples if only across one title.
 

Nkd

Joined
Sep 15, 2007
Messages
364 (0.06/day)
What's wrong with my numbers? Igor writes 320 W TBP for the FE Navi 21 XT and 355 W for AIB variants ("The 6800 XT is hot, up to 355 watts++"). That translates into 400 W+ peak power draw.
No! That is the entire board and all its power, lol. Reference cards set to reference specs will never go above the power limit unless you manually adjust it in software. Plug and play, they will maintain that power limit and stay under it. That is how it works. You are comparing against AIB cards that might have a higher power limit unlocked and consume more under peak load.
 
Last edited:
Joined
Dec 30, 2010
Messages
2,116 (0.43/day)
If TGP is 320 W then peak power draw must be north of 400 W, just like the 3080 and 3090. That's really, really bad. A single decent GPU should not peak over 300 W; that's the datacenter rule of thumb, and it's being trampled by Ampere and RDNA2. How long will an air-cooled 400 W GPU last? I'm having a hard time believing there will be many fully functioning air-cooled Big Navis/3080s/3090s around in 3-5 years' time. Maybe that's the intent; 1080 Tis are still killing new sales.

Come on, you've got a slider in your driver now that limits the power your GPU can draw. If you think it's pushing out too many watts, slide it down. If you think you shouldn't be pushing for more framerate than your screen can handle, turn on Vsync or FreeSync. If you think it's consuming too much, undervolt and underclock it. You have so much freedom with today's cards. For a long time I ran an RX 580 at 1200 MHz with an undervolt, since the performance difference is only a few percent but the power reduction was huge.

Vega, too, is very efficient at certain clocks and voltages, until AMD decided to push Vega to compete with the 1080, beyond its efficiency curve. It's like a Ryzen 2700X: from 4 GHz and above you need more and more voltage to clock higher, until the extra voltage needed for just a few more MHz is out of all proportion. Makes no sense.



Here's an old screenie of my RX 580 running at 300 W power consumption. Really, if your hardware is capable of it, it shouldn't cause issues. I'm running with Vsync anyway, capped at 70 Hz. It's not using 300 W sustained here, more like 160 to 190 W in gaming. The same goes for the new AMD cards: their envelope is set at up to xxx watts, and you can play/tweak/tune it if desired.
 
Joined
Jun 13, 2019
Messages
487 (0.27/day)
System Name Fractal
Processor Intel Core i5 13600K
Motherboard Asus ProArt Z790 Creator WiFi
Cooling Arctic Cooling Liquid Freezer II 360
Memory 16GBx2 G.SKILL Ripjaws S5 DDR5 6000 CL30-40-40-96 (F5-6000J3040F16GX2-RS5K)
Video Card(s) PNY RTX A2000 6GB
Storage SK Hynix Platinum P41 2TB
Display(s) LG 34GK950F-B (34"/IPS/1440p/21:9/144Hz/FreeSync)
Case Fractal Design R6 Gunmetal Blackout w/ USB-C
Audio Device(s) Steelseries Arctis 7 Wireless/Klipsch Pro-Media 2.1BT
Power Supply Seasonic Prime 850w 80+ Titanium
Mouse Logitech G700S
Keyboard Corsair K68
Software Windows 11 Pro
Does nobody care about electricity bills anymore, or do most simply not have to pay the bills? Who would buy these cards?

I don't think I've ever once thought of the electricity bill when it comes to computers, except when it comes to convincing the wife that upgrading will actually SAVE us money. "Honey, it literally pays for itself!"

We have an 18k BTU mini-split that runs in our 1300 sq. ft. garage virtually 24/7. The 3-5 PC's in the home that run at any given moment are NOTHING compared to that.
 
Joined
Dec 26, 2016
Messages
281 (0.10/day)
Processor Ryzen 3900x
Motherboard B550M Steel Legend
Cooling XPX (custom loop)
Memory 32GB 3200MHz cl16
Video Card(s) 3080 with Bykski block (custom loop)
Storage 980 Pro
Case Fractal 804
Power Supply Focus Plus Gold 750FX
Mouse G603
Keyboard G610 brown
Software yes, lots!
I don't think I've ever once thought of the electricity bill when it comes to computers, except when it comes to convincing the wife that upgrading will actually SAVE us money. "Honey, it literally pays for itself!"

We have an 18k BTU mini-split that runs in our 1300 sq. ft. garage virtually 24/7. The 3-5 PC's in the home that run at any given moment are NOTHING compared to that.

Haha, you're American; power is practically free in the US. I pay €0.30 per kWh, so at least for servers etc. I have to keep an eye on power consumption.
But for desktops etc. I don't care, since more power often means more performance, and high power is only drawn under heavy load, which is only minutes when working or 2 h a night when gaming...
 
Joined
Apr 24, 2020
Messages
2,572 (1.73/day)
I think that if they split the instruction pipelines from the shader pipelines, they could run a frontend overclock until the pipelines are full: the GPU would work at not just 2.3 GHz but, say, 3.0 GHz while the shaders are idle. How much it would help can be estimated, since they have pinpointed exactly where the bottlenecks are: 18% idle time with 4 workgroups (just enough work for 1 shader out of each 4096).

Shaders run instructions. I'm not entirely sure what you mean by this.

Currently, RDNA (and GCN) splits instructions into two categories: scalar and vector. "Scalar" instructions handle branching and looping for the most part (booleans are often a scalar 64-bit or 32-bit value), while "vector" instructions are replicated across 32 (64 on GCN) copies of the program.
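A toy model of that scalar/vector split (illustrative only — not real RDNA ISA, and the function names are made up):

```python
# Toy model of the scalar/vector split described above.
WAVE_SIZE = 32  # lanes per wavefront on RDNA (64 on GCN)

def scalar_branch(cond: bool) -> bool:
    """Scalar unit: one evaluation, shared by the whole wavefront."""
    return cond

def vector_add(a: list, b: list) -> list:
    """Vector op: the same instruction replicated across every lane."""
    return [x + y for x, y in zip(a, b)]

a = list(range(WAVE_SIZE))   # per-lane data
b = [10] * WAVE_SIZE

if scalar_branch(len(a) == WAVE_SIZE):  # branch decided once, not per lane
    out = vector_add(a, b)              # executed by all 32 lanes in lockstep

print(out[:4])  # first four lanes: [10, 11, 12, 13]
```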
 
Joined
Mar 10, 2010
Messages
11,878 (2.29/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Haha, you're American; power is practically free in the US. I pay €0.30 per kWh, so at least for servers etc. I have to keep an eye on power consumption.
But for desktops etc. I don't care, since more power often means more performance, and high power is only drawn under heavy load, which is only minutes when working or 2 h a night when gaming...
It's a concern for me; UK power isn't cheap. Having said that, as you say, power use depends on load, and few cards run flat out for much of the day; even Folding@home or mining doesn't max out a card's power use in reality.
Still, some games are going to cook people while gaming. A warm winter, perhaps; hopefully that lotto ticket's not as shit as all my last ones.
 
Joined
Apr 24, 2020
Messages
2,572 (1.73/day)
Haha, you're American; power is practically free in the US. I pay €0.30 per kWh, so at least for servers etc. I have to keep an eye on power consumption.
But for desktops etc. I don't care, since more power often means more performance, and high power is only drawn under heavy load, which is only minutes when working or 2 h a night when gaming...

I mean, if you care a lot about power consumption, you could just game at 1080p instead of 4K (as an example). All of these components draw power based on the complexity of the computation. If you lower the complexity (by lowering the resolution, or graphical quality in other ways), then you'll use less power.

All of these GPUs idle at levels we can pretty much ignore.



Even RX Vega's 14 W idle is only €0.0042 per hour. It only ramps up to max power if you give it a game, or another load, that requires that kind of power draw. Cap your framerate to lower values (especially if you have VSync/GSync), etc.

On the other hand, I don't think most people give their computers' power consumption much thought. But it's not like these things are running full tilt all the time. If you really cared about power, there are plenty of things you could do right now, today, with your current GPU to reduce power consumption.
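The idle-cost figure above checks out (14 W at €0.30/kWh):

```python
# Hourly cost of a 14 W idle draw at €0.30 per kWh
idle_kw = 0.014
cost_per_hour = idle_kw * 0.30
print(f"€{cost_per_hour:.4f}/hour")  # €0.0042/hour
```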
 
Joined
Jun 22, 2014
Messages
430 (0.12/day)
System Name Desktop / "Console"
Processor Ryzen 5950X / Ryzen 5800X
Motherboard Asus X570 Hero / Asus X570-i
Cooling EK AIO Elite 280 / Cryorig C1
Memory 32GB Gskill Trident DDR4-3600 CL16 / 16GB Crucial Ballistix DDR4-3600 CL16
Video Card(s) RTX 4090 FE / RTX 2080ti FE
Storage 1TB Samsung 980 Pro, 1TB Sabrent Rocket 4 Plus NVME / 1TB Sabrent Rocket 4 NVME, 1TB Intel 660P
Display(s) Alienware AW3423DW / LG 65CX Oled
Case Lian Li O11 Mini / Sliger CL530 Conswole
Audio Device(s) Sony AVR, SVS speakers & subs / Marantz AVR, SVS speakers & subs
Power Supply ROG Loki 1000 / Silverstone SX800
VR HMD Quest 3
I just don't get this undervolting crowd... You pay for a certain level of performance, but due to power you strip a fair amount of performance off of it to run lower. Why not pay less for a card down the stack with less power use and use it to its fullest potential? I don't buy a performance coupe and decide to chip it to lower the horsepower because I don't like the gas mileage... o_O :kookoo::wtf:
I was under the impression from watching Optimum Tech that the point of undervolting is to achieve the lowest stable power draw at the same performance/clocks, the idea being that the stock voltage curve 'overfeeds' the cards, causing throttling etc. I was thinking of following his guide and entertaining the idea this gen for the sake of learning and getting first-hand experience, just to try and save some heat/noise, but not at the expense of performance. (Please correct me if I'm off here?)

Does nobody care about electricity bills anymore, or do most simply not have to pay the bills? Who would buy these cards?
To me it has always been more about controlling noise than energy cost.
 
Joined
Apr 24, 2020
Messages
2,572 (1.73/day)
I just don't get this undervolting crowd... You pay for a certain level of performance, but due to power you strip a fair amount of performance off of it to run lower. Why not pay less for a card down the stack with less power use and use it to its fullest potential? I don't buy a performance coupe and decide to chip it to lower the horsepower because I don't like the gas mileage... o_O :kookoo::wtf:

You pay for a bigger, wider GPU. Effectively, you're paying for the silicon (and for the size of a working die: the larger the die, the harder it is to produce and, naturally, the more expensive it is).

Whether you run it at maximum power, or minimum power, is up to you. Laptop chips, such as the Laptop RTX 2070 Super, are effectively underclocked versions of the desktop chip. The same thing, just running at lower power (and greater energy efficiency) for portability reasons. Similarly, a mini-PC user may have a harder time cooling down their computer, or maybe a silent-build wants to reduce the fan noise.

A wider GPU (ex: 3090) will still provide more power-efficiency than a narrower GPU (ex: 3070), even if you downclock a 3090 to 3080 or 3070 levels. More performance at the same levels of power, that's the main benefit of "more silicon".

--------

Power consumption scales with something like the voltage cubed (!!!): dynamic power goes roughly as V² × frequency, and frequency scales roughly linearly with voltage. If you reduce voltage by 10%, you get something like 30% less power draw. Dropping 10% of your voltage costs about 10% of your frequency, but you drop in power usage by a far greater amount.
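A quick sketch of that scaling, under the stated assumptions (dynamic power ~ V² × f, frequency roughly linear in voltage; both are simplifications):

```python
def relative_power(v_scale: float) -> float:
    """Power relative to stock, for a given voltage scale factor.
    Assumes frequency scales linearly with voltage (a simplification)."""
    f_scale = v_scale
    return v_scale**2 * f_scale   # V^2 * f  ->  ~V^3

drop = 1 - relative_power(0.90)   # 10% undervolt
print(f"~{drop:.0%} less power for ~10% less frequency")  # ~27% less power
```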
 