
GALAX GeForce RTX 3090 Hall Of Fame (HOF) Edition GPU Benched with Custom 1000 W vBIOS

Joined
Feb 26, 2016
Messages
548 (0.18/day)
Location
Texas
System Name O-Clock
Processor Intel Core i9-9900K @ 52x/49x 8c8t
Motherboard ASUS Maximus XI Gene
Cooling EK Quantum Velocity C+A, EK Quantum Vector C+A, CE 280, Monsta 280, GTS 280 all w/ A14 IP67
Memory 2x16GB G.Skill TridentZ @3900 MHz CL16
Video Card(s) EVGA RTX 2080 Ti XC Black
Storage Samsung 983 ZET 960GB, 2x WD SN850X 4TB
Display(s) Asus VG259QM
Case Corsair 900D
Audio Device(s) beyerdynamic DT 990 600Ω, Asus SupremeFX Hi-Fi 5.25", Elgato Wave 3
Power Supply EVGA 1600 T2 w/ A14 IP67
Mouse Logitech G403 Wireless (PMW3366)
Keyboard Logitech G910 Stickerbombed
Software Windows 10 Pro 64 bit
Benchmark Scores https://hwbot.org/search/submissions/permalink?userId=92615&cpuId=5773
And my Galax 3090 locked to 1800 MHz only uses 230 W.

These cards are NOT efficient when ramped up.
Just a thought: what if we liquid cooled the graphics cards? Maybe the point of diminishing returns is where, under load, the GPU sits in the mid-40s °C, and the clocks and voltages needed to hit that are probably the most optimal point before noticeable diminishing returns set in. NVIDIA had a really good cooler design, then decided to overclock the absolute hell out of these cards so AMD wouldn't stand a chance, which is exactly why NVIDIA's Ampere cards don't have as good a performance per watt as they could.

On an off-topic note, AMD's 6700 XT is not going to be a reliable card. Its base clock is 2300+ MHz. Yes, base clock. That card will degrade super fast.
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,731 (3.41/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165Hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
IIRC since Pascal, keeping the card as cool as possible has had an effect on performance. The cooler the card is, the higher it can boost and the more efficiently it can use the power budget available to it. I think the best cards, like those with waterblocks already on them, are generally overbuilt and have a BIOS that will let the card do more than any other. That doesn't mean it will draw any less power if you keep it cold though...
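The boost behavior described here can be pictured with a toy model: the card sheds boost bins as temperature rises past a knee point. Every number below (1950 MHz peak, 15 MHz bins, 5 °C steps, 40 °C knee) is an illustrative assumption — real GPU Boost tables are per-card and not published.

```python
# Toy model of temperature-dependent GPU boost: the card drops one boost
# bin (~15 MHz) for every few degrees above a knee temperature. All
# numbers are illustrative assumptions, not measured values.
def boost_clock(temp_c, max_boost_mhz=1950, bin_mhz=15, step_c=5, knee_c=40):
    """Return the sustained boost clock (MHz) at a given GPU temperature."""
    if temp_c <= knee_c:
        return max_boost_mhz
    bins_lost = (temp_c - knee_c) // step_c
    return max_boost_mhz - bins_lost * bin_mhz

for t in (35, 45, 55, 65, 75):
    print(f"{t} C -> {boost_clock(t)} MHz")
```

Under this sketch a card held in the mid-40s keeps nearly all of its boost, while the same card at 75 °C gives up roughly a hundred MHz — which is the "cooler card boosts higher" effect, without the power draw going down.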
 
Joined
Nov 11, 2016
Messages
3,065 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200 CL34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11

Efficiency increases mildly at lower operating temperatures, but overall, water cooling a card only improves clocks by about 30-45 MHz at the same power consumption, and that already accounts for the extra power headroom freed up by removing the stock fans.

Another fun thing about NVIDIA's top-of-the-line GPUs (1080 Ti, 2080 Ti and 3090) is that they always have XOC BIOSes that bypass the thermal and power limits; you can flash one onto your GPU, run some fun benchmarks, then flash back to the original BIOS for gaming.
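A back-of-envelope calculation shows why cooling alone buys only a few tens of MHz at a fixed power limit: cooling mainly cuts leakage power, and the freed watts go into dynamic power, which scales roughly linearly with clock at a fixed voltage. Every figure below (350 W limit, 35 W vs. 28 W leakage, 1900 MHz) is an assumed, illustrative value, not a measurement.

```python
# Freed leakage budget -> small clock bump at a fixed board power limit.
# All numbers below are illustrative assumptions.
P_TOTAL = 350.0       # board power limit, W
P_LEAK_HOT = 35.0     # assumed leakage at ~70 C on air
P_LEAK_COLD = 28.0    # assumed leakage at ~40 C under water
F_HOT = 1900.0        # sustained clock on air, MHz

p_dyn_hot = P_TOTAL - P_LEAK_HOT
p_dyn_cold = P_TOTAL - P_LEAK_COLD
f_cold = F_HOT * p_dyn_cold / p_dyn_hot   # same voltage, more dynamic budget

print(f"+{f_cold - F_HOT:.1f} MHz at the same power limit")  # ~+42 MHz here
```

With these assumed numbers the gain lands right in the 30-45 MHz window quoted above; a bigger leakage reduction (e.g. sub-ambient cooling) would stretch it further.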
 
Joined
Mar 6, 2012
Messages
566 (0.13/day)
Processor i5 4670K - @ 4.8GHZ core
Motherboard MSI Z87 G43
Cooling Thermalright Ultra-120 *(Modded to fit on this motherboard)
Memory 16GB 2400MHZ
Video Card(s) HD7970 GHZ edition Sapphire
Storage Samsung 120GB 850 EVO & 4X 2TB HDD (Seagate)
Display(s) 42" Panasonic LED TV @120Hz
Case Corsair 200R
Audio Device(s) Xfi Xtreme Music with Hyper X Core
Power Supply Cooler Master 700 Watts
What kind of tin foil hats are you guys wearing, I wonder? Everything is a conspiracy for you people. Are you telling me the 6800 XT, 6900 XT and PS5 are unreliable? Their base clocks are almost 2300 MHz, and apart from the PS5, all the RDNA2 GPUs boost above 2300 MHz in normal use cases.

And I would really like to know, when you say "degrade super fast", what kind of window are we talking about: one year, two years, or maybe six months?
 
Joined
Dec 26, 2020
Messages
363 (0.30/day)
System Name Incomplete thing 1.0
Processor Ryzen 2600
Motherboard B450 Aorus Elite
Cooling Gelid Phantom Black
Memory HyperX Fury RGB 3200 CL16 16GB
Video Card(s) Gigabyte 2060 Gaming OC PRO
Storage Dual 1TB 970evo
Display(s) AOC G2U 1440p 144hz, HP e232
Case CM mb511 RGB
Audio Device(s) Reloop ADM-4
Power Supply Sharkoon WPM-600
Mouse G502 Hero
Keyboard Sharkoon SGK3 Blue
Software W10 Pro
Benchmark Scores 2-5% over stock scores
Degrade super fast? I hope you know degradation is driven by too much voltage at a given temperature and how the chip is used. Plenty of 6800, 6800 XT and 6900 XT cards run at 2300-2400 MHz out of the box. Not to forget that apparently yields on the Navi 22 die are pretty amazing, meaning the 6700 XT might easily go over 2500-2600 MHz at the same voltage. AMD wouldn't design chips to degrade after a few months (besides, that would make the power draw insane).
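The "voltage at a given temperature" point can be made concrete with a classic lifetime model, Black's equation for electromigration: MTTF ∝ J⁻ⁿ · exp(Ea/kT). The exponent n and activation energy Ea below are textbook-style assumptions for illustration, not AMD process data.

```python
import math

# Illustrative silicon-lifetime sketch based on Black's equation for
# electromigration: MTTF ~ J^-n * exp(Ea / kT). Constants are assumed
# textbook-style values, not vendor data.
K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def relative_mttf(j_rel, temp_c, n=2.0, ea_ev=0.7, ref_temp_c=60.0):
    """Lifetime relative to a reference part at ref_temp_c and unit current density."""
    t = temp_c + 273.15
    t_ref = ref_temp_c + 273.15
    return j_rel ** (-n) * math.exp(ea_ev / K_B_EV * (1.0 / t - 1.0 / t_ref))

# ~10% more current density (roughly tracking a voltage bump) and 15 C hotter
# than the reference part:
print(relative_mttf(1.10, 75.0))
```

In this sketch the hotter, harder-driven part retains only around 30% of the reference lifetime — which is why degradation arguments hinge on voltage and temperature, not clock speed by itself.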
 
Joined
Dec 29, 2010
Messages
3,455 (0.71/day)
Processor AMD 5900x
Motherboard Asus x570 Strix-E
Cooling Hardware Labs
Memory G.Skill 4000c17 2x16gb
Video Card(s) RTX 3090
Storage Sabrent
Display(s) Samsung G9
Case Phanteks 719
Audio Device(s) Fiio K5 Pro
Power Supply EVGA 1000 P2
Mouse Logitech G600
Keyboard Corsair K95
It's part of some smear campaign they got going.

 
Joined
Feb 26, 2016
Messages
548 (0.18/day)
Location
Texas
System Name O-Clock
Processor Intel Core i9-9900K @ 52x/49x 8c8t
Motherboard ASUS Maximus XI Gene
Cooling EK Quantum Velocity C+A, EK Quantum Vector C+A, CE 280, Monsta 280, GTS 280 all w/ A14 IP67
Memory 2x16GB G.Skill TridentZ @3900 MHz CL16
Video Card(s) EVGA RTX 2080 Ti XC Black
Storage Samsung 983 ZET 960GB, 2x WD SN850X 4TB
Display(s) Asus VG259QM
Case Corsair 900D
Audio Device(s) beyerdynamic DT 990 600Ω, Asus SupremeFX Hi-Fi 5.25", Elgato Wave 3
Power Supply EVGA 1600 T2 w/ A14 IP67
Mouse Logitech G403 Wireless (PMW3366)
Keyboard Logitech G910 Stickerbombed
Software Windows 10 Pro 64 bit
Benchmark Scores https://hwbot.org/search/submissions/permalink?userId=92615&cpuId=5773
Relatively speaking, I would bet that most 6700 XTs won't last as long as the other RX 6000 GPUs, since the 6700 XT is clocked significantly higher. By "degrade" I mean I'd expect at least some of the 6700 XTs that make it into gamers' hands to no longer meet spec after six months to a year of daily gaming loads — say four or more hours a day, not a "game one day, rest for six" pattern, since I know those people exist too. I expect significantly higher RMA rates for the 6700 XT than for the RX 6800.

This goes for almost anything: CPUs, RAM, etc. Take the RX 6000 GPUs as a whole. According to TechPowerUp's GPU database, all RX 6000 GPUs are built on TSMC's 7 nm node; the 6800 and up use Navi 21, while the 6700 XT uses Navi 22. The transistors are the same — AMD just puts fewer of them in Navi 22. The higher you clock a GPU, the fewer dies qualify for it; that process is called binning. The chips that can handle higher speeds won't fail as quickly, because while everything on the same node degrades at roughly the same rate, there is a spec, and anything that meets or exceeds spec will be fine. But if you spec a GPU at a significantly higher clock, you raise the silicon-quality bar needed to reach it, and that's where it gets tricky. It goes one of two ways: either the lower-clocked GPU is rock solid with near-zero failure rates and the higher-clocked GPU is merely stable enough, or the lower-clocked GPU is stable enough and the higher-clocked GPU has noticeable failure rates. At the same voltage, the higher you clock a product, the less stable it is. Sure, you can go from 2200 MHz to 2300 MHz on the same voltage; that just means 2200 MHz didn't actually need that much voltage. Higher clocks usually mean more voltage and current, and in the end you want more cores, not more clock speed. For example, a hypothetical Navi GPU with 5120 cores at 1500 MHz would be far more reliable than one with 2560 cores at 3000 MHz, because past a certain point overclocking runs into diminishing returns.

There are obviously more factors in play (cooling, current draw, etc.), but all I was saying is that I expect the 6700 XT to have higher failure rates than the rest of the RX 6000 lineup.

Also, the 6800 XT and 6900 XT don't have base clocks that high; they sit around 1800-1900 MHz base clock — not game clock.
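The "more cores, not more clock" argument above can be sketched with a toy power model: take throughput as cores × clock and dynamic power as cores × V² × f, with voltage assumed to rise with clock. The core counts, clocks and voltages below are illustrative, not real Navi figures.

```python
# Toy comparison of "wide and slow" vs "narrow and fast" at equal
# throughput (cores * clock). Dynamic power ~ cores * V^2 * f; the
# voltages are assumed illustrative values, not measured ones.
def rel_power(cores, f_mhz, volts):
    return cores * volts ** 2 * f_mhz

wide   = rel_power(cores=5120, f_mhz=1500, volts=0.80)  # wide and slow
narrow = rel_power(cores=2560, f_mhz=3000, volts=1.10)  # narrow and fast

# Equal throughput (5120*1500 == 2560*3000), very different power:
print(f"narrow/fast uses {narrow / wide:.2f}x the power of wide/slow")
```

Because voltage enters squared, the narrow/fast configuration burns nearly twice the power here for the same nominal throughput — the efficiency (and stress) penalty of chasing clocks instead of width.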

Smear campaign? Um... where are you getting this bullshit from?

If yields are better on Navi 22, then alright, it won't degrade as quickly as I thought; I assumed it was the same Navi 21 die. But generally speaking, higher clock speeds mean higher voltage, and that's why I said it would degrade quicker — because of the higher voltage.
 
From your bullshit posts.
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.18/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700 MHz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Philips 32" 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Philips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Okay, enough. Behave or there will be a thread cleanup and infractions.

Either state something as an opinion and others can respect it, or state a fact with linked evidence. Then stop acting like children over it.
 