
Possible Radeon RX 6700 XT Specs Surface, 12GB the New Mid-Range Memory Size?

Joined
Apr 7, 2011
Messages
1,336 (0.37/day)
System Name Desktop
Processor Intel i7-3930k @ 4GHz Undervolted
Motherboard ASUS Sabertooth X79
Cooling Intel AIO
Memory 8x4GB DDR3 1866MHz
Video Card(s) EVGA GTX 970 SC
Storage Crucial MX500 1TB + 2x WD RE 4TB HDD
Display(s) HP ZR24w
Case Fractal Define XL Black
Audio Device(s) Schiit Modi Uber/Sony CDP-XA20ES/Pioneer CT-656>Sony TA-F630ESD>Sennheiser HD600
Power Supply Corsair HX850
Mouse Logitech G603
Keyboard Logitech G613
Software Windows 10 Pro x64
Here are the MSRP prices at the time; inflation-adjusted prices are in brackets.
The HD 5870's MSRP was $379 (inflation-adjusted: $460), the Eyefinity edition was $479 ($582), and the dual-GPU HD 5970 was $599 ($727).
The GTX 970's MSRP was $329 (inflation-adjusted: $362), the GTX 980 cost $549 ($604), the GeForce GTX 980 Ti cost $649 ($714), and the GTX Titan X cost $999 ($1,099).
There were already graphics cards that cost more than $500, and that's before adjusting for inflation.
Expecting the TOP cards (RTX 3090 or RX 6900 XT) to sell for only $500 is insane.

I don't really care about MSRP; I only care how much I pay in the end, and I provided the prices I paid.

And the inflation argument that gets brought up all the time is BS and does not apply to every region.
 
Joined
Jan 24, 2011
Messages
128 (0.04/day)
I don't really care about MSRP; I only care how much I pay in the end, and I provided the prices I paid.

And the inflation argument that gets brought up all the time is BS and does not apply to every region.
And I don't really care what you paid for your card when I don't even know if you bought it at release, a year later, or on a discount at the time!

That inflation was for the USA, because both AMD and Nvidia are US companies, and it's not BS; show me a country where the chips or cards are designed and produced that hasn't had inflation over the past 10 years!
Just making a 250 mm² chip at TSMC today costs a lot more than it did 5 or 10 years ago; just look at some info about the wafer cost of different processes. Link
HD 5870 -> 40 nm, 334 mm² -> foundry price per wafer: $2,274 -> cost per chip: $2,274 / 111 good chips per wafer ≈ $20.5
GTX 970 -> 28 nm, 398 mm² for the full GM204 chip -> foundry price per wafer: $2,981 -> cost per chip: $2,981 / 93 good chips per wafer ≈ $32.1
RX 5700 XT -> 7 nm, 251 mm² -> foundry price per wafer: $9,346 -> cost per chip: $9,346 / 173 good chips per wafer ≈ $54
The smallest chip costs by far the most to make from a wafer.
You can't blame the price increase on corporate greed alone.
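For reference, the cost-per-chip arithmetic above can be reproduced directly from the quoted wafer prices and good-die counts (a minimal sketch; the inputs are the figures from this post, not official foundry data):

```python
# Cost per good die = foundry wafer price / good dies per wafer,
# using the numbers quoted in the post above (assumed, not official figures).
wafers = {
    "HD 5870 (40 nm, 334 mm^2)":   {"wafer_price": 2274, "good_dies": 111},
    "GTX 970 (28 nm, 398 mm^2)":   {"wafer_price": 2981, "good_dies": 93},
    "RX 5700 XT (7 nm, 251 mm^2)": {"wafer_price": 9346, "good_dies": 173},
}

for name, w in wafers.items():
    print(f"{name}: ${w['wafer_price'] / w['good_dies']:.1f} per good die")
```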
 
Joined
May 10, 2020
Messages
362 (1.44/day)
Processor Ryzen 9 3900X
Motherboard Asus ROG Strix B550-F Gaming
Cooling Be Quiet! Dark Rock 4
Memory 32 Gb G.Skill TridentZ RGB 3600CL16
Video Card(s) Zotac RTX 3080 AMP Holo
Storage 1 Tb Samsung 970 EVO NVMe + 1 Tb SSD WD Blue
Display(s) MSi Optix G27 + Samsung C24RG50
Case Corsair Carbide SPEC-06
Power Supply EVGA G3 750W
Mouse Razer Basilisk
Keyboard Razer Ornata Chroma
Benchmark Scores 3dMark TimeSpy - 16852(CPU 12525 - GPU 17959) Cinebench R20 - 515/7331
Joined
Jan 24, 2011
Messages
128 (0.04/day)
Certainly not JUST on that, but surely ALSO on that.
It's hard to tell, because I don't know what margins they sell each GPU at now versus what they sold at in the past.
I won't say every single GPU is or will be overpriced (RDNA2 and Ampere), but some certainly look like they are.
 

ARF

Joined
Jan 28, 2020
Messages
1,304 (3.68/day)
And I don't really care what you paid for your card when I don't even know if you bought it at release, a year later, or on a discount at the time!

That inflation was for the USA, because both AMD and Nvidia are US companies, and it's not BS; show me a country where the chips or cards are designed and produced that hasn't had inflation over the past 10 years!
Just making a 250 mm² chip at TSMC today costs a lot more than it did 5 or 10 years ago; just look at some info about the wafer cost of different processes. Link
HD 5870 -> 40 nm, 334 mm² -> foundry price per wafer: $2,274 -> cost per chip: $2,274 / 111 good chips per wafer ≈ $20.5
GTX 970 -> 28 nm, 398 mm² for the full GM204 chip -> foundry price per wafer: $2,981 -> cost per chip: $2,981 / 93 good chips per wafer ≈ $32.1
RX 5700 XT -> 7 nm, 251 mm² -> foundry price per wafer: $9,346 -> cost per chip: $9,346 / 173 good chips per wafer ≈ $54
The smallest chip costs by far the most to make from a wafer.
You can't blame the price increase on corporate greed alone.

HD 5870: $20.5 of $339 ≈ 6% of the retail price
GTX 970: $32.1 of $329 ≈ 10% of the retail price
RX 5700 XT: $54 of $399 ≈ 13% of the retail price

You are right - the die's share of the final consumer price keeps increasing, which means that if all the other components in the bill of materials cost the same as they did in 2010, then the manufacturer's profit margin is lower today.
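Those shares check out as simple ratios (a quick sketch using the figures as quoted above; note the earlier post lists the HD 5870 at $379 rather than $339):

```python
# Die cost as a share of retail price, using the figures quoted above.
cards = [
    ("HD 5870",    20.5, 339),
    ("GTX 970",    32.1, 329),
    ("RX 5700 XT", 54.0, 399),
]
for name, die_cost, retail in cards:
    print(f"{name}: {die_cost / retail:.1%} of the ${retail} retail price")
```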
 
Joined
Feb 13, 2014
Messages
381 (0.15/day)
Processor Intel i5 - 4590
Motherboard Gigabyte Ga- Z97x- Gaming 7
Cooling Raijintek Themis
Memory G.Skill Sniper ( 2 x 4GB ) 1866Mhz CL9
Video Card(s) Gigabyte GTX 970 WindForce
Storage Crucial MX100 256GB + Hitachi Deskstar NAS 3 TB
Display(s) Dell UltraSharp U2414h
Case Be Quiet! Silent Base 800
Audio Device(s) Onboard ( ALC1150 ) / Creative Gigaworks T20 Series II
Power Supply Superflower Leadex Gold II 650W
Mouse Logitech G203
Keyboard Mechanical
Software Windows 8.1 ( 64 Bit )
Not too sure about this one... VRAM requirements are heavily exaggerated these days, and its RT performance will not be enough to sway 3060/Ti/70 buyers, especially for Cyberpunk. It will have to be pushed down at least one budget tier ($300-350) to duke it out with the 3050.
Exactly what I thought about the 6800/XT cards as well; I have no idea why they didn't release 8 GB versions of those, and a 6 GB version of this - it only works against them.

Imagine a considerably cheaper 6800 XT with 8 GB of RAM up against Nvidia. Most gamers play either high-refresh 1080p or 1440p, where 8 GB is plenty, and the price drop would make it a vastly better product.
The same goes for this 6700 XT.
 

ARF

Joined
Jan 28, 2020
Messages
1,304 (3.68/day)
Exactly what I thought about the 6800/XT cards as well; I have no idea why they didn't release 8 GB versions of those, and a 6 GB version of this - it only works against them.

Imagine a considerably cheaper 6800 XT with 8 GB of RAM up against Nvidia. Most gamers play either high-refresh 1080p or 1440p, where 8 GB is plenty, and the price drop would make it a vastly better product.
The same goes for this 6700 XT.

No, 8 GB would make it a mid-range product - there are already titles that require 10 GB or more, and with true next-gen games the VRAM requirements will only rise.

Nvidia's approach is wrong because it sacrifices other things, like texture resolution, and makes the games look terrible.
 
Joined
Feb 20, 2019
Messages
2,002 (2.88/day)
System Name Flavour of the month. I roll through hardware like it's not even mine (it often isn't).
Processor 3900X, 3600XT, 2700U
Motherboard Aorus X570 Elite, B550 DS3H
Cooling Alphacool CPU+GPU soft-tubing loop (Laing D5 360mm+140mm), AMD Wraith Prism
Memory 32GB Patriot 3600CL17, 32GB Corsair LPX 3200CL16, 16GB HyperX 2400CL14
Video Card(s) 2070S, 5700XT, Vega10
Storage 1TB WD S100G, 2TB Adata SX8200 Pro, 1TB MX500, 500GB Hynix 2242 bastard thing, 16TB of rust + backup
Display(s) Dell SG3220 165Hz VA, Samsung 65" Q9FN 120Hz VA
Case NZXT H440NE, Silverstone GD04 (almost nothing original left inside, thanks 3D printer!)
Audio Device(s) CA DacMagic+ with Presonus Eris E5, Yamaha RX-V683 with Q Acoustics 3000-series, Sony MDR-1A
Power Supply BeQuiet StraightPower E9 680W, Corsair RM550, and a 45W Lenovo DC power brick, I guess.
Mouse G303, MX Anywhere 2, Another MX Anywhere 2.
Keyboard CM QuickFire Stealth (Cherry MX Brown), Logitech MX Keys (not Cherry MX at all)
Software W10
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
40 CUs means we're just getting another 5700 XT with some architecture/clock improvements and raytracing. If that's the case, I'm expecting a $399 2080 Super rival with a 192-bit memory bus using 14 Gbps GDDR6 (a rough bandwidth check on that config is sketched below).

I'd like to be wrong about Navi 22 having only 40 CUs, but AMD has historically made very big cuts to its next-smallest silicon. The only reason to hope it's more than 40 CUs is that the XBSX has 52 CUs, and presumably that's not cut down from Navi 21's 80 CUs.
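As a sanity check on that rumored memory configuration, peak GDDR6 bandwidth is just the bus width (in bytes) times the per-pin data rate. A minimal sketch, where the 192-bit line is the rumor discussed above and the 256-bit line is the 5700 XT's known configuration:

```python
# Peak GDDR6 bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gbps.
def gddr6_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return (bus_width_bits / 8) * data_rate_gbps

print(gddr6_bandwidth_gbs(192, 14))  # rumored Navi 22 config -> 336.0 GB/s
print(gddr6_bandwidth_gbs(256, 14))  # RX 5700 XT             -> 448.0 GB/s
```

That roughly 25% bandwidth cut is presumably what the rumored large on-die cache would have to compensate for.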
 
Joined
Dec 14, 2011
Messages
347 (0.10/day)
Location
South-Africa
Processor i5 4670k @ 4.2 Ghz
Motherboard MSi Z87 GD65 Gaming Motherboard
Cooling Zalman CNPS11 Extreme
Memory 16GB DDR3 Corsair Vengeance @ 2400Mhz
Video Card(s) MSI GTX660Ti 2GB
Storage 250GB Samsung 850 EVO SSD
Display(s) Dell S3220DGF
Case Corsair 750D Airflow Edition
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer Mamba Chroma (2015)
Keyboard Corsair K70 MK.2 Low-Profile Rapidfire
Software Microsoft Windows 10 Enterprise 64-bit
You guys are engaging in wishful thinking about these prices and about how product segmentation and competition work.
Right now, the cheapest next-gen products sit at around the $500 level. Next, Nvidia will launch the 3060 Ti at $399, so AMD will have to counter. So if the 6700 XT has 3060 Ti (2080 Super) performance, it will land around the $399 mark, probably a little less. Next, both will launch $299 products, and then something around $200 - there could be more tiers, since these are price-oriented customers.
Bottom line: both Nvidia and AMD will have products competing in all performance/price segments. They will not leave a $200 price gap between lines, as they would be losing a whole tier of sales.
This is how commerce and competition work under capitalism.

I suppose you automatically assume everyone's salary also increases to compensate for the magical thing called "inflation".

I remember when there was none: everyone who worked a job could afford a home, everyone could afford a car, etc. I wonder who was behind this? *hand rubbing intensifies*
 
Joined
Feb 13, 2014
Messages
381 (0.15/day)
Processor Intel i5 - 4590
Motherboard Gigabyte Ga- Z97x- Gaming 7
Cooling Raijintek Themis
Memory G.Skill Sniper ( 2 x 4GB ) 1866Mhz CL9
Video Card(s) Gigabyte GTX 970 WindForce
Storage Crucial MX100 256GB + Hitachi Deskstar NAS 3 TB
Display(s) Dell UltraSharp U2414h
Case Be Quiet! Silent Base 800
Audio Device(s) Onboard ( ALC1150 ) / Creative Gigaworks T20 Series II
Power Supply Superflower Leadex Gold II 650W
Mouse Logitech G203
Keyboard Mechanical
Software Windows 8.1 ( 64 Bit )
No, 8 GB would make it a mid-range product - there are already titles that require 10 GB or more, and with true next-gen games the VRAM requirements will only rise.

Nvidia's approach is wrong because it sacrifices other things, like texture resolution, and makes the games look terrible.
I disagree with that, but it doesn't even matter; what I asked for is simply the option at a lower price - not replacing the current 16 GB cards, but offering both versions.
 
Joined
Mar 2, 2019
Messages
116 (0.17/day)
System Name My PC
Processor AMD 2700 x @ 4.1Ghz
Motherboard Gigabyte X470 Aorus Gaming
Cooling Zalman CNPS20X
Memory Corsair Vengeance LPX Black 32GB DDR4
Video Card(s) Sapphire Radeon RX 570 PULSE
Storage Adata Ultimate SU800
Case Phanteks Eclipse P500A
Audio Device(s) Logitech G51
Power Supply Seasonic Focus GX, 80+ Gold, 550W
Keyboard Roccat Vulcan 121
I suppose you automatically assume everyone's salary also increases to compensate for the magical thing called "inflation".

I remember when there was none: everyone who worked a job could afford a home, everyone could afford a car, etc. I wonder who was behind this? *hand rubbing intensifies*

How on Earth did you get from my post that I assume that? I was just coldly assessing the two-vendor GPU market and product segmentation.
I do not condone the prices that Nvidia and AMD are asking - but the blame is on the people who pay them. Look at the scalper situation: it exists because people are willing to pay even more than the already absurd launch prices.
 
Joined
Jan 14, 2019
Messages
402 (0.55/day)
Location
United Kingdom
System Name Nebulon-B
Processor AMD Ryzen 9 5950X
Motherboard ASUS TUF Gaming B550M-Plus (WiFi)
Cooling Corsair H100i Platinum RGB
Memory 2x 16 GB Corsair Dominator Platinum RGB 3200 MHz DDR4
Video Card(s) ASUS ROG Strix Radeon RX 5700 XT
Storage 512 GB ADATA SU900 SATA-III SSD, 2 TB Hitachi 3.5" SATA-III 7200 rpm HDD
Display(s) Samsung C24F396
Case AeroCool Aero One Mini Eclipse
Audio Device(s) Genius SP-HF160 speakers, AKG Y50 headphones
Power Supply Seasonic Prime Ultra Platinum 550W
Mouse Cherry MW 8
Keyboard MagicForce 68
Software Windows 10
So technically, it's a 5700 (XT) with moderate raytracing support. I guess it was a good idea to buy a 5700 XT last week, despite my issues with its heat output.
 
Joined
Feb 20, 2019
Messages
2,002 (2.88/day)
System Name Flavour of the month. I roll through hardware like it's not even mine (it often isn't).
Processor 3900X, 3600XT, 2700U
Motherboard Aorus X570 Elite, B550 DS3H
Cooling Alphacool CPU+GPU soft-tubing loop (Laing D5 360mm+140mm), AMD Wraith Prism
Memory 32GB Patriot 3600CL17, 32GB Corsair LPX 3200CL16, 16GB HyperX 2400CL14
Video Card(s) 2070S, 5700XT, Vega10
Storage 1TB WD S100G, 2TB Adata SX8200 Pro, 1TB MX500, 500GB Hynix 2242 bastard thing, 16TB of rust + backup
Display(s) Dell SG3220 165Hz VA, Samsung 65" Q9FN 120Hz VA
Case NZXT H440NE, Silverstone GD04 (almost nothing original left inside, thanks 3D printer!)
Audio Device(s) CA DacMagic+ with Presonus Eris E5, Yamaha RX-V683 with Q Acoustics 3000-series, Sony MDR-1A
Power Supply BeQuiet StraightPower E9 680W, Corsair RM550, and a 45W Lenovo DC power brick, I guess.
Mouse G303, MX Anywhere 2, Another MX Anywhere 2.
Keyboard CM QuickFire Stealth (Cherry MX Brown), Logitech MX Keys (not Cherry MX at all)
Software W10
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
So technically, it's a 5700 (XT) with moderate raytracing support. I guess it was a good idea to buy a 5700 XT last week, despite my issues with its heat output.
An overclocked 5700XT with potentially 64-128MB of cache. Until we get a 6700XT and a 5700XT and clock them the same, nobody outside of AMD will know for sure what the architectural improvement between RDNA1 and RDNA2 really is.
 
Joined
Jan 14, 2019
Messages
402 (0.55/day)
Location
United Kingdom
System Name Nebulon-B
Processor AMD Ryzen 9 5950X
Motherboard ASUS TUF Gaming B550M-Plus (WiFi)
Cooling Corsair H100i Platinum RGB
Memory 2x 16 GB Corsair Dominator Platinum RGB 3200 MHz DDR4
Video Card(s) ASUS ROG Strix Radeon RX 5700 XT
Storage 512 GB ADATA SU900 SATA-III SSD, 2 TB Hitachi 3.5" SATA-III 7200 rpm HDD
Display(s) Samsung C24F396
Case AeroCool Aero One Mini Eclipse
Audio Device(s) Genius SP-HF160 speakers, AKG Y50 headphones
Power Supply Seasonic Prime Ultra Platinum 550W
Mouse Cherry MW 8
Keyboard MagicForce 68
Software Windows 10
An overclocked 5700XT with potentially 64-128MB of cache. Until we get a 6700XT and a 5700XT and clock them the same, nobody outside of AMD will know for sure what the architectural improvement between RDNA1 and RDNA2 really is.
Worst case scenario: if it only has the same IPC, but runs at higher clocks, even that's an achievement, I guess.

I'm also curious about the actual power consumption / heat output figures, as I think that's the main area where the 5700 series needs some serious improvement (mine especially).
 
Joined
Nov 26, 2020
Messages
106 (2.08/day)
Location
Germany
System Name Meeeh
Processor 8700K at 5.2 GHz
Memory 32 GB 3600/CL15
Video Card(s) Asus RTX 3080 TUF OC @ +175 MHz
Storage 1TB Samsung 970 Evo Plus
Display(s) 1440p, 165 Hz, IPS
Now THIS is what I'm talkin' about. I'm excited. Hope it turns out to be true AND affordable/readily available...

Yeah good luck with that :D

No, 8 GB would make it a mid-range product - there are already titles that require 10 GB or more, and with true next-gen games the VRAM requirements will only rise.

Nvidia's approach is wrong because it sacrifices other things, like texture resolution, and makes the games look terrible.

Lmao, no there aren't.

AC Valhalla uses 6 GB at 4K completely maxed out.

Next-gen consoles don't even have 10 GB for the GPU; they have 16 GB of shared memory, and most of it will be used by the system and background/OS, meaning 4-8 GB tops will be used for the GPU.

The 6700 XT is not a 4K-capable card, or anywhere close; it's a 1440p card at best, so 12 GB of VRAM will be pointless for 99.9% of games anyway. The GPU is too weak to max settings out, and that lowers VRAM usage.
 
Joined
Mar 23, 2005
Messages
3,887 (0.67/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 1700X @ stock - (Ryzen 7 5700X - Waiting)
Motherboard ASRock Fatal1ty X370 GAMING X AM4 (ROG Crosshair VIII Dark Hero - Waiting)
Cooling Corsair H115i PRO RGB, 280mm Radiator, Dual 140mm ML Series PWM Fans
Memory G.Skill TridentZ 32GB (2 x 16GB) DDR4 3200 (Maybe get another 2 for 64GB Total)
Video Card(s) Sapphire Radeon RX 580 8GB Nitro+ SE + (Radeon 6700XT or 6600XT - Price Dependent)
Storage Corsair Force MP500 480GB M.2 (OS) + Force MP510 480GB M.2 (Steam/Games)
Display(s) Asus 27" (MG278Q) 144Hz WQHD 1440p + 1 x Asus 24" (VG245H) FHD 75Hz 1080p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ ASUS Xonar DGX PCI-E GX2.5 Audio Engine Sound Card
Power Supply Corsair TX750W Power Supply
Mouse Razer DeathAdder PC Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G15 Classic Gaming Keyboard
Software Windows 10 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Yeah good luck with that :D



Lmao, no there aren't.

AC Valhalla uses 6 GB at 4K completely maxed out.

Next-gen consoles don't even have 10 GB for the GPU; they have 16 GB of shared memory, and most of it will be used by the system and background/OS, meaning 4-8 GB tops will be used for the GPU.

The 6700 XT is not a 4K-capable card, or anywhere close; it's a 1440p card at best, so 12 GB of VRAM will be pointless for 99.9% of games anyway. The GPU is too weak to max settings out, and that lowers VRAM usage.
Most people play on 1080p screens, many have moved up to the better 1440p, and a very small niche plays at 4K.
I think AMD made a mistake by not launching the RX 6700 XT (a 1440p mainstream GPU) along with the 6800 XT and 6900 XT. If AMD wanted only three GPUs at launch, they should have held back the non-XT 6800, because Nvidia is eating the mainstream market and AMD has no GPUs available to serve it. A major miscalculation.
 
Joined
Feb 20, 2019
Messages
2,002 (2.88/day)
System Name Flavour of the month. I roll through hardware like it's not even mine (it often isn't).
Processor 3900X, 3600XT, 2700U
Motherboard Aorus X570 Elite, B550 DS3H
Cooling Alphacool CPU+GPU soft-tubing loop (Laing D5 360mm+140mm), AMD Wraith Prism
Memory 32GB Patriot 3600CL17, 32GB Corsair LPX 3200CL16, 16GB HyperX 2400CL14
Video Card(s) 2070S, 5700XT, Vega10
Storage 1TB WD S100G, 2TB Adata SX8200 Pro, 1TB MX500, 500GB Hynix 2242 bastard thing, 16TB of rust + backup
Display(s) Dell SG3220 165Hz VA, Samsung 65" Q9FN 120Hz VA
Case NZXT H440NE, Silverstone GD04 (almost nothing original left inside, thanks 3D printer!)
Audio Device(s) CA DacMagic+ with Presonus Eris E5, Yamaha RX-V683 with Q Acoustics 3000-series, Sony MDR-1A
Power Supply BeQuiet StraightPower E9 680W, Corsair RM550, and a 45W Lenovo DC power brick, I guess.
Mouse G303, MX Anywhere 2, Another MX Anywhere 2.
Keyboard CM QuickFire Stealth (Cherry MX Brown), Logitech MX Keys (not Cherry MX at all)
Software W10
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Worst case scenario: if it only has the same IPC, but runs at higher clocks, even that's an achievement, I guess.

I'm also curious about the actual power consumption / heat output figures, as I think that's the main area where the 5700 series needs some serious improvement (mine especially).
Oof, I missed this from November. I think you just need to undervolt. My 5700 XT managed to drop 40 W for a 50 MHz (2.5%) clock reduction, and at that power draw it's a pretty good performance-per-watt match for the 2060S I had, too.

I've saved a few clock/voltage profiles for my 5700 XT, but I commonly run at a near-silent 1666 MHz / 914 mV, which uses 115 W for the GPU core, so I'm guessing the total board power is 145-150 W or so - that's 75 W less than stock. I've probably lost 15% performance, but it doesn't bother me, as I'm usually capped by my TV's vsync and not the GPU.

AMD's preferences, reflected in its guidelines to partners, are that they don't care about cool or quiet - every GPU they've put out in the last decade has been clocked to within an inch of its life, right at the ugly, steep end of the voltage curve. Most samples don't need anything like that voltage anyway, and the curve is so steep at AMD's recommended clocks that even a 5% clock reduction can yield absolutely massive reductions in power consumption.
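That steep voltage/frequency relationship is why modest clock cuts pay off so disproportionately: dynamic power scales roughly with frequency times voltage squared. A minimal sketch, where the 1666 MHz / 914 mV point is from the post above but the stock reference point is an assumed, typical-looking value rather than a measured one:

```python
# Dynamic power relative to a reference point, using the rough P ~ f * V^2 relation.
def relative_dynamic_power(freq_mhz: float, volts: float,
                           ref_freq_mhz: float, ref_volts: float) -> float:
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# Assumed stock-ish reference: ~1900 MHz at ~1.2 V (illustrative, not measured).
print(relative_dynamic_power(1666, 0.914, 1900, 1.2))  # ~0.51 -> roughly half the core power
```

The exact numbers matter less than the shape: because voltage enters squared, shaving the last couple of hundred millivolts off the top of the curve saves far more power than the small clock reduction costs in performance.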

Most people play on 1080p screens, many have moved up to the better 1440p, and a very small niche plays at 4K.
I think AMD made a mistake by not launching the RX 6700 XT (a 1440p mainstream GPU) along with the 6800 XT and 6900 XT. If AMD wanted only three GPUs at launch, they should have held back the non-XT 6800, because Nvidia is eating the mainstream market and AMD has no GPUs available to serve it. A major miscalculation.
You seem to be mistaken about how companies operate. They are not trying to make the best graphics card for us; they are trying to make the most dollars for themselves.

The profit margins on a vanilla 6800 are probably three times the profit margins on a 6700-series card. AMD pays TSMC for a wafer and can get, say, 100 Big Navi dies out of it, or 150 Little Navi dies out of it. Except AMD can sell Big Navi to partners for $300 to go on $100 PCBs with $25 coolers, or they can sell Little Navi to partners for $100 to go on $75 PCBs with $15 coolers. What would you do if you were AMD?

Just be thankful we're getting GPUs at all - AMD can get about three 5950Xs from a wafer for every Big Navi, and those sell for way more money and don't carry the same PCB or cooling costs either.
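To put numbers on that trade-off, here is the wafer math using only the hypothetical die counts and chip prices from the post above (illustrative assumptions, not actual AMD figures; the wafer price is the $9,346 7 nm figure quoted earlier in the thread):

```python
# Revenue per wafer under the hypothetical figures from the post above.
wafer_cost = 9346  # assumed 7 nm wafer price quoted earlier in the thread

for name, dies_per_wafer, price_per_die in [("Big Navi", 100, 300),
                                            ("Little Navi", 150, 100)]:
    revenue = dies_per_wafer * price_per_die
    print(f"{name}: ${revenue} per wafer, ${revenue - wafer_cost} over the wafer cost")
```

Under those assumptions, a wafer's worth of Big Navi brings in twice the revenue of a wafer's worth of Little Navi.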
 
Joined
Jan 14, 2019
Messages
402 (0.55/day)
Location
United Kingdom
System Name Nebulon-B
Processor AMD Ryzen 9 5950X
Motherboard ASUS TUF Gaming B550M-Plus (WiFi)
Cooling Corsair H100i Platinum RGB
Memory 2x 16 GB Corsair Dominator Platinum RGB 3200 MHz DDR4
Video Card(s) ASUS ROG Strix Radeon RX 5700 XT
Storage 512 GB ADATA SU900 SATA-III SSD, 2 TB Hitachi 3.5" SATA-III 7200 rpm HDD
Display(s) Samsung C24F396
Case AeroCool Aero One Mini Eclipse
Audio Device(s) Genius SP-HF160 speakers, AKG Y50 headphones
Power Supply Seasonic Prime Ultra Platinum 550W
Mouse Cherry MW 8
Keyboard MagicForce 68
Software Windows 10
Oof, I missed this from November. I think you just need to undervolt. My 5700 XT managed to drop 40 W for a 50 MHz (2.5%) clock reduction, and at that power draw it's a pretty good performance-per-watt match for the 2060S I had, too.

I've saved a few clock/voltage profiles for my 5700 XT, but I commonly run at a near-silent 1666 MHz / 914 mV, which uses 115 W for the GPU core, so I'm guessing the total board power is 145-150 W or so - that's 75 W less than stock. I've probably lost 15% performance, but it doesn't bother me, as I'm usually capped by my TV's vsync and not the GPU.

AMD's preferences, reflected in its guidelines to partners, are that they don't care about cool or quiet - every GPU they've put out in the last decade has been clocked to within an inch of its life, right at the ugly, steep end of the voltage curve. Most samples don't need anything like that voltage anyway, and the curve is so steep at AMD's recommended clocks that even a 5% clock reduction can yield absolutely massive reductions in power consumption.
To be honest, with a -25% power target I only lose about 7% performance, which is barely noticeable. The fan curve is a bigger issue: it is so relaxed that I get the same temperatures regardless of clock speed or power consumption. Things would probably be much better with a custom curve. If only I didn't use all my PC time to play games. :rolleyes::D
 
Joined
Feb 20, 2019
Messages
2,002 (2.88/day)
System Name Flavour of the month. I roll through hardware like it's not even mine (it often isn't).
Processor 3900X, 3600XT, 2700U
Motherboard Aorus X570 Elite, B550 DS3H
Cooling Alphacool CPU+GPU soft-tubing loop (Laing D5 360mm+140mm), AMD Wraith Prism
Memory 32GB Patriot 3600CL17, 32GB Corsair LPX 3200CL16, 16GB HyperX 2400CL14
Video Card(s) 2070S, 5700XT, Vega10
Storage 1TB WD S100G, 2TB Adata SX8200 Pro, 1TB MX500, 500GB Hynix 2242 bastard thing, 16TB of rust + backup
Display(s) Dell SG3220 165Hz VA, Samsung 65" Q9FN 120Hz VA
Case NZXT H440NE, Silverstone GD04 (almost nothing original left inside, thanks 3D printer!)
Audio Device(s) CA DacMagic+ with Presonus Eris E5, Yamaha RX-V683 with Q Acoustics 3000-series, Sony MDR-1A
Power Supply BeQuiet StraightPower E9 680W, Corsair RM550, and a 45W Lenovo DC power brick, I guess.
Mouse G303, MX Anywhere 2, Another MX Anywhere 2.
Keyboard CM QuickFire Stealth (Cherry MX Brown), Logitech MX Keys (not Cherry MX at all)
Software W10
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
To be honest, with a -25% power target I only lose about 7% performance, which is barely noticeable. The fan curve is a bigger issue: it is so relaxed that I get the same temperatures regardless of clock speed or power consumption. Things would probably be much better with a custom curve. If only I didn't use all my PC time to play games. :rolleyes::D
I mean if you are already underclocking, just let the card run hot. The silicon is rated for 105C (hot spot) and I let mine run at 90C+

Having a cool card in the 60-70C range is nice if you're aiming for max performance, as the boost algorithms will see thermal headroom and use it to boost more. If your GPU isn't going to boost higher because of manually reduced power limits, let it run up to its rated temperature and enjoy the slower fans.
 
Joined
Jan 14, 2019
Messages
402 (0.55/day)
Location
United Kingdom
System Name Nebulon-B
Processor AMD Ryzen 9 5950X
Motherboard ASUS TUF Gaming B550M-Plus (WiFi)
Cooling Corsair H100i Platinum RGB
Memory 2x 16 GB Corsair Dominator Platinum RGB 3200 MHz DDR4
Video Card(s) ASUS ROG Strix Radeon RX 5700 XT
Storage 512 GB ADATA SU900 SATA-III SSD, 2 TB Hitachi 3.5" SATA-III 7200 rpm HDD
Display(s) Samsung C24F396
Case AeroCool Aero One Mini Eclipse
Audio Device(s) Genius SP-HF160 speakers, AKG Y50 headphones
Power Supply Seasonic Prime Ultra Platinum 550W
Mouse Cherry MW 8
Keyboard MagicForce 68
Software Windows 10
I mean if you are already underclocking, just let the card run hot. The silicon is rated for 105C (hot spot) and I let mine run at 90C+

Having a cool card in the 60-70C range is nice if you're aiming for max performance, as the boost algorithms will see thermal headroom and use it to boost more. If your GPU isn't going to boost higher because of manually reduced power limits, let it run up to its rated temperature and enjoy the slower fans.
The GPU isn't an issue. It runs in the low-to-mid 70s (hot spot never exceeding 95 C) even on default settings. I'm more worried about the VRAM, which can get above 90 C after some extended gaming.
 
Joined
Feb 20, 2019
Messages
2,002 (2.88/day)
System Name Flavour of the month. I roll through hardware like it's not even mine (it often isn't).
Processor 3900X, 3600XT, 2700U
Motherboard Aorus X570 Elite, B550 DS3H
Cooling Alphacool CPU+GPU soft-tubing loop (Laing D5 360mm+140mm), AMD Wraith Prism
Memory 32GB Patriot 3600CL17, 32GB Corsair LPX 3200CL16, 16GB HyperX 2400CL14
Video Card(s) 2070S, 5700XT, Vega10
Storage 1TB WD S100G, 2TB Adata SX8200 Pro, 1TB MX500, 500GB Hynix 2242 bastard thing, 16TB of rust + backup
Display(s) Dell SG3220 165Hz VA, Samsung 65" Q9FN 120Hz VA
Case NZXT H440NE, Silverstone GD04 (almost nothing original left inside, thanks 3D printer!)
Audio Device(s) CA DacMagic+ with Presonus Eris E5, Yamaha RX-V683 with Q Acoustics 3000-series, Sony MDR-1A
Power Supply BeQuiet StraightPower E9 680W, Corsair RM550, and a 45W Lenovo DC power brick, I guess.
Mouse G303, MX Anywhere 2, Another MX Anywhere 2.
Keyboard CM QuickFire Stealth (Cherry MX Brown), Logitech MX Keys (not Cherry MX at all)
Software W10
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
The GPU isn't an issue. It runs in the low-to-mid 70s (hot spot never exceeding 95 C) even on default settings. I'm more worried about the VRAM, which can get above 90 C after some extended gaming.
That's absolutely fine. 106C is the temperature target for the VRAM, just like 105C is the temperature target for the hotspot. If it's not running that hot you're either leaving performance on the table, or running the fans louder than they need to be.

Nothing wrong with either of those, but I'm just saying if the noise bothers you there's no need to worry about the temps you're talking about. It took me a few months of Navi to get used to temperatures that were 20C higher than we were used to seeing with Nvidia or previous-gen AMD, but those cards also ran this hot - they just didn't report it.
 
Joined
Jan 14, 2019
Messages
402 (0.55/day)
Location
United Kingdom
System Name Nebulon-B
Processor AMD Ryzen 9 5950X
Motherboard ASUS TUF Gaming B550M-Plus (WiFi)
Cooling Corsair H100i Platinum RGB
Memory 2x 16 GB Corsair Dominator Platinum RGB 3200 MHz DDR4
Video Card(s) ASUS ROG Strix Radeon RX 5700 XT
Storage 512 GB ADATA SU900 SATA-III SSD, 2 TB Hitachi 3.5" SATA-III 7200 rpm HDD
Display(s) Samsung C24F396
Case AeroCool Aero One Mini Eclipse
Audio Device(s) Genius SP-HF160 speakers, AKG Y50 headphones
Power Supply Seasonic Prime Ultra Platinum 550W
Mouse Cherry MW 8
Keyboard MagicForce 68
Software Windows 10
That's absolutely fine. 106C is the temperature target for the VRAM, just like 105C is the temperature target for the hotspot. If it's not running that hot you're either leaving performance on the table, or running the fans louder than they need to be.

Nothing wrong with either of those, but I'm just saying if the noise bothers you there's no need to worry about the temps you're talking about. It took me a few months of Navi to get used to temperatures that were 20C higher than we were used to seeing with Nvidia or previous-gen AMD, but those cards also ran this hot - they just didn't report it.
I thought 95 C was the max rated temperature for GDDR6 chips. In that case, I really have nothing to complain about with the "dreaded" Asus Strix 5700 XT. :)
 
Joined
Mar 23, 2005
Messages
3,887 (0.67/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 1700X @ stock - (Ryzen 7 5700X - Waiting)
Motherboard ASRock Fatal1ty X370 GAMING X AM4 (ROG Crosshair VIII Dark Hero - Waiting)
Cooling Corsair H115i PRO RGB, 280mm Radiator, Dual 140mm ML Series PWM Fans
Memory G.Skill TridentZ 32GB (2 x 16GB) DDR4 3200 (Maybe get another 2 for 64GB Total)
Video Card(s) Sapphire Radeon RX 580 8GB Nitro+ SE + (Radeon 6700XT or 6600XT - Price Dependent)
Storage Corsair Force MP500 480GB M.2 (OS) + Force MP510 480GB M.2 (Steam/Games)
Display(s) Asus 27" (MG278Q) 144Hz WQHD 1440p + 1 x Asus 24" (VG245H) FHD 75Hz 1080p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ ASUS Xonar DGX PCI-E GX2.5 Audio Engine Sound Card
Power Supply Corsair TX750W Power Supply
Mouse Razer DeathAdder PC Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G15 Classic Gaming Keyboard
Software Windows 10 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Oof, I missed this from November. I think you just need to undervolt. My 5700 XT managed to drop 40 W for a 50 MHz (2.5%) clock reduction, and at that power draw it's a pretty good performance-per-watt match for the 2060S I had, too.

I've saved a few clock/voltage profiles for my 5700 XT, but I commonly run at a near-silent 1666 MHz / 914 mV, which uses 115 W for the GPU core, so I'm guessing the total board power is 145-150 W or so - that's 75 W less than stock. I've probably lost 15% performance, but it doesn't bother me, as I'm usually capped by my TV's vsync and not the GPU.

AMD's preferences, reflected in its guidelines to partners, are that they don't care about cool or quiet - every GPU they've put out in the last decade has been clocked to within an inch of its life, right at the ugly, steep end of the voltage curve. Most samples don't need anything like that voltage anyway, and the curve is so steep at AMD's recommended clocks that even a 5% clock reduction can yield absolutely massive reductions in power consumption.


You seem to be mistaken about how companies operate. They are not trying to make the best graphics card for us; they are trying to make the most dollars for themselves.

The profit margins on a vanilla 6800 are probably three times the profit margins on a 6700-series card. AMD pays TSMC for a wafer and can get, say, 100 Big Navi dies out of it, or 150 Little Navi dies out of it. Except AMD can sell Big Navi to partners for $300 to go on $100 PCBs with $25 coolers, or they can sell Little Navi to partners for $100 to go on $75 PCBs with $15 coolers. What would you do if you were AMD?

Just be thankful we're getting GPUs at all - AMD can get about three 5950Xs from a wafer for every Big Navi, and those sell for way more money and don't carry the same PCB or cooling costs either.
I'll repeat myself: AMD miscalculated by neglecting the mainstream market for both CPUs and graphics. All they had to do was launch one 8-core/16-thread mainstream Ryzen 5700XT CPU and one mainstream RX 6700 XT GPU to serve that market segment. Instead, they opened it up to Intel and Nvidia. The 6-core/12-thread Ryzen 5 doesn't count in this scenario.

AMD would have been better off releasing the RX 6800 XT, the RX 6900 XT, and an RX 6700 XT for the mainstream market. Right now, they've left that segment exposed, and Nvidia is eating it right up with its entire RTX 3060 lineup. Mainstream is where you grow market share and a positive market image. Anyhow, seeing how Sony and MS have 80% of AMD's 7 nm capacity, that explains why mobile chips, RDNA2 graphics cards, and Zen 3 CPUs are rather difficult to find - all three are fighting over less than 20% of the capacity.
 
Joined
Feb 20, 2019
Messages
2,002 (2.88/day)
System Name Flavour of the month. I roll through hardware like it's not even mine (it often isn't).
Processor 3900X, 3600XT, 2700U
Motherboard Aorus X570 Elite, B550 DS3H
Cooling Alphacool CPU+GPU soft-tubing loop (Laing D5 360mm+140mm), AMD Wraith Prism
Memory 32GB Patriot 3600CL17, 32GB Corsair LPX 3200CL16, 16GB HyperX 2400CL14
Video Card(s) 2070S, 5700XT, Vega10
Storage 1TB WD S100G, 2TB Adata SX8200 Pro, 1TB MX500, 500GB Hynix 2242 bastard thing, 16TB of rust + backup
Display(s) Dell SG3220 165Hz VA, Samsung 65" Q9FN 120Hz VA
Case NZXT H440NE, Silverstone GD04 (almost nothing original left inside, thanks 3D printer!)
Audio Device(s) CA DacMagic+ with Presonus Eris E5, Yamaha RX-V683 with Q Acoustics 3000-series, Sony MDR-1A
Power Supply BeQuiet StraightPower E9 680W, Corsair RM550, and a 45W Lenovo DC power brick, I guess.
Mouse G303, MX Anywhere 2, Another MX Anywhere 2.
Keyboard CM QuickFire Stealth (Cherry MX Brown), Logitech MX Keys (not Cherry MX at all)
Software W10
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I'll repeat myself: AMD miscalculated by neglecting the mainstream market for both CPUs and graphics. All they had to do was launch one 8-core/16-thread mainstream Ryzen 5700XT CPU and one mainstream RX 6700 XT GPU to serve that market segment. Instead, they opened it up to Intel and Nvidia. The 6-core/12-thread Ryzen 5 doesn't count in this scenario.

AMD would have been better off releasing the RX 6800 XT, the RX 6900 XT, and an RX 6700 XT for the mainstream market. Right now, they've left that segment exposed, and Nvidia is eating it right up with its entire RTX 3060 lineup. Mainstream is where you grow market share and a positive market image. Anyhow, seeing how Sony and MS have 80% of AMD's 7 nm capacity, that explains why mobile chips, RDNA2 graphics cards, and Zen 3 CPUs are rather difficult to find - all three are fighting over less than 20% of the capacity.
They haven't neglected the mainstream market - they're clearing inventories of Zen2. Can't buy a Ryzen 7 5700X because it doesn't exist yet? Buy a 3700X until something better comes along. The discounts are deep and it's not like this is new behaviour - AMD, Intel, and Nvidia have been following this pattern pretty solidly for almost 20 years.

Why should they appease the low-margin mainstream market segment? You can't buy a product that competes with Zen 3 yet, and old Zen 2 inventory is doing just fine against 10th Gen Intel. As for mainstream GPUs, it's not like you can buy anything faster than a vanilla 1650 for reasonable money right now, no matter what segment it's released in, so asking AMD to sacrifice the huge margins on high-end CPUs and GPUs is basically begging for charity.

When they launch midrange first, it's the exception to the rule: sacrificing profits to regain market share, spurred on by fierce competition. At the moment, AMD has the highest market share they've had in ages, perhaps ever, and the weakest competition. They don't need to compromise their profits by selling premium-cost, supply-constrained TSMC output at low-margin mainstream prices, especially when they have Zen 2 inventory they've already paid for just sitting around in warehouses waiting to be turned into revenue.

Capitalism is what brought us CPU progress, and you can't have it both ways; for nice things to come to us at great prices, companies have to be competitive and profitable. Right now, AMD is instantly selling everything they make at pretty much whatever price they ask for it. This is their reward for pulling ahead of the competition. It is the foundation of free-market capitalism, and yes, for plebs like us it does temporarily suck - but it's the natural course of things, driven by fundamental rules of economics. It's happened before and it'll happen again. You just have to look at the 30-year history of the x86 home PC to see that this is nothing new and nothing special.
 