
Dual Slot GPUs

Joined
Jan 11, 2013
Messages
1,237 (0.30/day)
Location
California, unfortunately.
System Name Sierra ~ Server
Processor Core i5-11600K ~ Core i3-12100
Motherboard Asus Prime B560M-A AC ~ MSI PRO B760M-P
Cooling CM 212 Black RGB Edition ~ Intel Stock Cooler
Memory 64GB (2x 32GB) DDR4-3600 ~ 32GB (4x 8GB) DDR4-3200
Video Card(s) XFX Radeon RX 6950 XT ~ EVGA GeForce GTX 970
Storage 4TB Samsung 990 Pro with Heatsink NVMe SSD ~ 2TB Kingston NV1 NVMe SSD + 500GB WD Blue SATA SSD
Display(s) 2x Dell S2721QS 4K 60Hz ~ N/A
Case Asus Prime AP201 - Open Frame Chassis
Power Supply Thermaltake GF1 850W ~ Thermaltake Smart 500W
Software Windows 11 Pro ~ Proxmox VE
Benchmark Scores Laptops: Dell Latitude E7270, Dell Latitude 14 Rugged 5420.
Why is it so, so hard to find true dual-slot video cards? Literally the only higher-end GPUs I see online that actually use just two slots and don't interfere with the third slot (where I happen to have a USB-C PCIe card) are used OEM cards pulled from Dell and similar prebuilts.
 
Joined
Jun 2, 2017
Messages
7,906 (3.15/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Why is it so, so hard to find true dual-slot video cards? Literally the only higher-end GPUs I see online that actually use just two slots and don't interfere with the third slot (where I happen to have a USB-C PCIe card) are used OEM cards pulled from Dell and similar prebuilts.
There are 3060s and 6600s that are dual slot.
 
Joined
Jan 3, 2015
Messages
2,881 (0.85/day)
System Name The beast and the little runt.
Processor Ryzen 5 5600X - Ryzen 9 5950X
Motherboard ASUS ROG STRIX B550-I GAMING - ASUS ROG Crosshair VIII Dark Hero X570
Cooling Noctua NH-L9x65 SE-AM4a - NH-D15 chromax.black with IPPC Industrial 3000 RPM 120/140 MM fans.
Memory G.SKILL TRIDENT Z ROYAL GOLD/SILVER 32 GB (2 x 16 GB and 4 x 8 GB) 3600 MHz CL14-15-15-35 1.45 volts
Video Card(s) GIGABYTE RTX 4060 OC LOW PROFILE - GIGABYTE RTX 4090 GAMING OC
Storage Samsung 980 PRO 1 TB + 2 TB - Samsung 870 EVO 4 TB - 2 x WD RED PRO 16 TB + WD ULTRASTAR 22 TB
Display(s) Asus 27" TUF VG27AQL1A and a Dell 24" for dual setup
Case Phanteks Enthoo 719/LUXE 2 BLACK
Audio Device(s) Onboard on both boards
Power Supply Phanteks Revolt X 1200W
Mouse Logitech G903 Lightspeed Wireless Gaming Mouse
Keyboard Logitech G910 Orion Spectrum
Software WINDOWS 10 PRO 64 BITS on both systems
Benchmark Scores See more about my 2 in 1 system here: kortlink.dk/2ca4x
My little RTX A2000 is a dual-slot card. That said, there's a reason why dual-slot cards are becoming rarer, especially at the high end: modern cards are getting more and more power hungry and need better cooling. A dual-slot cooler simply doesn't cut it any more. For dual-slot cards you need to look at the mid-range to low-end cards.

It's funny how things have changed on high-end cards.

EVGA GTX 1080 Ti FTW3 Gaming = dual-slot card
EVGA RTX 3080 = 3-slot card
ASUS RTX 4090 = 4-slot card
All high-end cards, but just getting bigger every gen, it seems.
 
Joined
May 8, 2016
Messages
1,741 (0.60/day)
System Name BOX
Processor Core i7 6950X @ 4,26GHz (1,28V)
Motherboard X99 SOC Champion (BIOS F23c + bifurcation mod)
Cooling Thermalright Venomous-X + 2x Delta 38mm PWM (Push-Pull)
Memory Patriot Viper Steel 4000MHz CL16 4x8GB (@3240MHz CL12.12.12.24 CR2T @ 1,48V)
Video Card(s) Titan V (~1650MHz @ 0.77V, HBM2 1GHz, Forced P2 state [OFF])
Storage WD SN850X 2TB + Samsung EVO 2TB (SATA) + Seagate Exos X20 20TB (4Kn mode)
Display(s) LG 27GP950-B
Case Fractal Design Meshify 2 XL
Audio Device(s) Motu M4 (audio interface) + ATH-A900Z + Behringer C-1
Power Supply Seasonic X-760 (760W)
Mouse Logitech RX-250
Keyboard HP KB-9970
Software Windows 10 Pro x64
3080 (Ti) Founders Edition?
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
My little RTX A2000 is a dual-slot card. That said, there's a reason why dual-slot cards are becoming rarer, especially at the high end: modern cards are getting more and more power hungry and need better cooling. A dual-slot cooler simply doesn't cut it any more. For dual-slot cards you need to look at the mid-range to low-end cards.

It's funny how things have changed on high-end cards.

EVGA GTX 1080 Ti FTW3 Gaming = dual-slot card
EVGA RTX 3080 = 3-slot card
ASUS RTX 4090 = 4-slot card
All high-end cards, but just getting bigger every gen, it seems.
The bit that gets me is that a big brute of a card like a 4090 is just as skinny without its cooler as a little tiddler of a low-profile card without its cooler; it just has a bigger surface area. Some components might stick up a little more, but that's it, the boards will be similar thicknesses.
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
25,866 (3.79/day)
Location
Alabama
System Name Rocinante
Processor I9 14900KS
Motherboard EVGA z690 Dark KINGPIN (modded BIOS)
Cooling EK-AIO Elite 360 D-RGB
Memory 64GB Gskill Trident Z5 DDR5 6000 @6400
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 500GB 980 Pro | 1x 1TB 980 Pro | 1x 8TB Corsair MP400
Display(s) Odyssey OLED G9 G95SC
Case Lian Li o11 Evo Dynamic White
Audio Device(s) Moondrop S8's on Schiit Hel 2e
Power Supply Bequiet! Power Pro 12 1500w
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Akko Crystal Blues
VR HMD Quest 3
Software Windows 11
Benchmark Scores I dont have time for that.
My little RTX A2000 is a dual-slot card. That said, there's a reason why dual-slot cards are becoming rarer, especially at the high end: modern cards are getting more and more power hungry and need better cooling. A dual-slot cooler simply doesn't cut it any more. For dual-slot cards you need to look at the mid-range to low-end cards.

It's funny how things have changed on high-end cards.

EVGA GTX 1080 Ti FTW3 Gaming = dual-slot card
EVGA RTX 3080 = 3-slot card
ASUS RTX 4090 = 4-slot card
All high-end cards, but just getting bigger every gen, it seems.

Even this isn't wholly accurate. Announcements like the Ada A6000 and the current A6000 (3090 Ti) are blower-style 2-slot cards, though you'll have to be willing to spend $5k on one.
 
Joined
Jan 3, 2015
Messages
2,881 (0.85/day)
System Name The beast and the little runt.
Processor Ryzen 5 5600X - Ryzen 9 5950X
Motherboard ASUS ROG STRIX B550-I GAMING - ASUS ROG Crosshair VIII Dark Hero X570
Cooling Noctua NH-L9x65 SE-AM4a - NH-D15 chromax.black with IPPC Industrial 3000 RPM 120/140 MM fans.
Memory G.SKILL TRIDENT Z ROYAL GOLD/SILVER 32 GB (2 x 16 GB and 4 x 8 GB) 3600 MHz CL14-15-15-35 1.45 volts
Video Card(s) GIGABYTE RTX 4060 OC LOW PROFILE - GIGABYTE RTX 4090 GAMING OC
Storage Samsung 980 PRO 1 TB + 2 TB - Samsung 870 EVO 4 TB - 2 x WD RED PRO 16 TB + WD ULTRASTAR 22 TB
Display(s) Asus 27" TUF VG27AQL1A and a Dell 24" for dual setup
Case Phanteks Enthoo 719/LUXE 2 BLACK
Audio Device(s) Onboard on both boards
Power Supply Phanteks Revolt X 1200W
Mouse Logitech G903 Lightspeed Wireless Gaming Mouse
Keyboard Logitech G910 Orion Spectrum
Software WINDOWS 10 PRO 64 BITS on both systems
Benchmark Scores See more about my 2 in 1 system here: kortlink.dk/2ca4x
The bit that gets me is that a big brute of a card like a 4090 is just as skinny without its cooler as a little tiddler of a low-profile card without its cooler; it just has a bigger surface area. Some components might stick up a little more, but that's it, the boards will be similar thicknesses.
It's the cooling needed. The more power used, the bigger the cooler needs to be.

Even this isn't wholly accurate. Announcements like the Ada A6000 and the current A6000 (3090 Ti) are blower-style 2-slot cards, though you'll have to be willing to spend $5k on one.
Look at the power consumption. A-series cards are better power optimized, or better-binned GPUs, than the RTX gaming cards. The A6000 is rated at 300 watts, while the 3090 is 350 watts and the 3090 Ti/4090 are 450 watts stock. Then add the max power target of up to 600 watts for the 4090. I'm not sure whether the A6000's power target can be raised, though. But my point is that the 4090 can technically consume up to twice the power of the A6000 if it's allowed to, and that needs big lumps of a cooler or water cooling.
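For what it's worth, here's a quick back-of-the-envelope sketch of those ratios in Python (using only the wattages quoted above; the 600 W figure is the 4090's maximum power target, and the A6000 is left at its 300 W rating since I don't know whether its limit can be raised):

```python
# Board power figures quoted above, in watts (stock ratings).
stock_power_w = {"RTX A6000": 300, "RTX 3090": 350, "RTX 3090 Ti / RTX 4090": 450}
max_power_4090_w = 600  # maximum power target mentioned for the 4090

a6000_w = stock_power_w["RTX A6000"]
for card, watts in stock_power_w.items():
    print(f"{card}: {watts} W stock ({watts / a6000_w:.2f}x the A6000)")

# With the power target maxed out, the 4090 can draw twice what the A6000 is rated for.
print(f"RTX 4090 at max power target: {max_power_4090_w} W "
      f"({max_power_4090_w / a6000_w:.1f}x the A6000)")
```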
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
25,866 (3.79/day)
Location
Alabama
System Name Rocinante
Processor I9 14900KS
Motherboard EVGA z690 Dark KINGPIN (modded BIOS)
Cooling EK-AIO Elite 360 D-RGB
Memory 64GB Gskill Trident Z5 DDR5 6000 @6400
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 500GB 980 Pro | 1x 1TB 980 Pro | 1x 8TB Corsair MP400
Display(s) Odyssey OLED G9 G95SC
Case Lian Li o11 Evo Dynamic White
Audio Device(s) Moondrop S8's on Schiit Hel 2e
Power Supply Bequiet! Power Pro 12 1500w
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Akko Crystal Blues
VR HMD Quest 3
Software Windows 11
Benchmark Scores I dont have time for that.
It's the cooling needed. The more power used, the bigger the cooler needs to be.


Look at the power consumption. A-series cards are better power optimized, or better-binned GPUs, than the RTX gaming cards. The A6000 is rated at 300 watts, while the 3090 is 350 watts and the 3090 Ti/4090 are 450 watts stock. Then add the max power target of up to 600 watts for the 4090. I'm not sure whether the A6000's power target can be raised, though. But my point is that the 4090 can technically consume up to twice the power of the A6000 if it's allowed to, and that needs big lumps of a cooler or water cooling.

Oh, for sure, I was splitting hairs, I guess, but I was just mentioning (poorly) that they're physically the same and have the same boost targets as the FE cards, though I can't say for sure yet for the Ada A6Ks, obviously. Just that it's possible. I'm honestly kind of curious whether the coolers would bolt to vanilla PCBs, though if you're spending that much money, what's the point anymore?
 
Joined
Apr 16, 2010
Messages
3,456 (0.68/day)
Location
Portugal
System Name LenovoⓇ ThinkPad™ T430
Processor IntelⓇ Core™ i5-3210M processor (2 cores, 2.50GHz, 3MB cache), Intel Turbo Boost™ 2.0 (3.10GHz), HT™
Motherboard Lenovo 2344 (Mobile Intel QM77 Express Chipset)
Cooling Single-pipe heatsink + Delta fan
Memory 2x 8GB KingstonⓇ HyperX™ Impact 2133MHz DDR3L SO-DIMM
Video Card(s) Intel HD Graphics™ 4000 (GPU clk: 1100MHz, vRAM clk: 1066MHz)
Storage SamsungⓇ 860 EVO mSATA (250GB) + 850 EVO (500GB) SATA
Display(s) 14.0" (355mm) HD (1366x768) color, anti-glare, LED backlight, 200 nits, 16:9 aspect ratio, 300:1 co
Case ThinkPad Roll Cage (one-piece magnesium frame)
Audio Device(s) HD Audio, RealtekⓇ ALC3202 codec, DolbyⓇ Advanced Audio™ v2 / stereo speakers, 1W x 2
Power Supply ThinkPad 65W AC Adapter + ThinkPad Battery 70++ (9-cell)
Mouse TrackPointⓇ pointing device + UltraNav™, wide touchpad below keyboard + ThinkLight™
Keyboard 6-row, 84-key, ThinkVantage button, spill-resistant, multimedia Fn keys, LED backlight (PT Layout)
Software MicrosoftⓇ WindowsⓇ 10 x86-64 (22H2)
Idle fan-stop is a staple feature now.
It isn't easy to cover all the warranty bases when you have to cool a chip that will wobble between 20W and 250W+ (we're talking high-end here), plus years of use with dust gathering.
Add to that the fad of very airflow-restrictive case fronts, solid side panels and, in some examples, covered tops, and you have to deal with a hot box that distracts you from the inferno inside with RGB.
Meanwhile, your card has to give you stable fps.

Dual-slot was common back when cards were considered excessive as they approached 300W, but the Antec 902, the CM 690 II and the HAF were around back then too.
 
Joined
May 8, 2021
Messages
1,978 (1.84/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Why is it so, so hard to find true dual-slot video cards? Literally the only higher-end GPUs I see online that actually use just two slots and don't interfere with the third slot (where I happen to have a USB-C PCIe card) are used OEM cards pulled from Dell and similar prebuilts.
Because they need bigger heatsinks to dissipate the heat. They could get away with being smaller, but then they would end up noisy; or they could have copper-only heatsinks, but then they would cost way too much.
 
Joined
Jun 21, 2021
Messages
2,654 (2.57/day)
System Name daily driver Mac mini M2 Pro
Processor Apple Silicon M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple Silicon M2 Pro (16-core GPU)
Storage Apple proprietary 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Ventura 13.6 (including latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12. I'm not interested in benchmarking.)
For sure, increased power consumption requires bigger cooling solutions for add-in cards.

There's probably another factor in play.

Better silicon will provide the same amount of performance at lower power levels, or more performance at the same power. That's performance-per-watt, and higher performance-per-watt is desirable.

The issue in 2022 is that there is a bigger market for high performance-per-watt: the data center. All of NVIDIA's best silicon is ending up in their data center business, whose customers are far more sensitive to performance-per-watt metrics.

The average DIY PC gamer doesn't really care if their graphics card draws 300, 350, 400 watts.

Same with VRMs, memory chips, etc. These companies are binning all silicon and the best samples are going elsewhere, not into discrete graphics cards for the DIY market.

Hell, even the big system builders like HP, Dell, Lenovo might be getting better GPU silicon so they can sell a 2-slot desktop PC to the General Accounting Office or NOAA.

The CPU landscape is similar. There are OEM-only versions of CPUs that are lower powered. Sometimes the silicon is locked so it can't go above a certain power threshold because the customer in question is sensitive to energy efficiency.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
It´s the cooling needed. The more power used, the bigger the cooler needs to be.
Yes obviously, I know. I think you misunderstood the point that I was making. nvm.
 
Joined
May 8, 2021
Messages
1,978 (1.84/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
For sure, increased power consumption requires bigger cooling solutions for add-in cards.

There's probably another factor in play.

Better silicon will provide the same amount of performance at lower power levels, or more performance at the same power. That's performance-per-watt, and higher performance-per-watt is desirable.

The issue in 2022 is that there is a bigger market for high performance-per-watt: the data center. All of NVIDIA's best silicon is ending up in their data center business, whose customers are far more sensitive to performance-per-watt metrics.

The average DIY PC gamer doesn't really care if their graphics card draws 300, 350, 400 watts.

Same with VRMs, memory chips, etc. These companies are binning all silicon and the best samples are going elsewhere, not into discrete graphics cards for the DIY market.

Hell, even the big system builders like HP, Dell, Lenovo might be getting better GPU silicon so they can sell a 2-slot desktop PC to the General Accounting Office or NOAA.

The CPU landscape is similar. There are OEM-only versions of CPUs that are lower powered. Sometimes the silicon is locked so it can't go above a certain power threshold because the customer in question is sensitive to energy efficiency.
I would argue that the silicon is basically the same; nVidia just doesn't crank clocks as high for those cards, and that leads to massive power savings but tiny losses in performance.
 
Joined
Nov 26, 2021
Messages
1,340 (1.53/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
7,508 (3.68/day)
Location
Winnipeg, Canada
Processor AMD R9 5900X
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Aqua Elite 360 V3 1x TL-B12, 2x TL-C12 Pro, 2x TL K12
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, Asus Hyper M.2, 2x SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact RGB
Audio Device(s) JBL 2.1 Deep Bass
Power Supply EVGA SuperNova 750w G+, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
My 980 Classified is a dual slot card. Love that thing.. wish it had a bit more vram though. Maybe some extra horsepower too. Then there is my Fermi.. big triple slot triple fan dinosaur. It is only slightly larger than my 2.7 slot Ampere.

Remember when SLi was still fresh, and you were running a pair of single slot cards? I miss those days.

I do agree though, seeing a flagship card with a block on it does make it look a bit funny, but kinda sexeh too..
 
Joined
May 8, 2020
Messages
578 (0.40/day)
System Name Mini efficient rig.
Processor R9 3900, @4ghz -0.05v offset. 110W peak.
Motherboard Gigabyte B450M DS3H, bios f41 pcie 4.0 unlocked.
Cooling some server blower @1500rpm
Memory 2x16GB oem Samsung D-Die. 3200MHz
Video Card(s) RX 6600 Pulse w/conductonaut @65C hotspot
Storage 1x 128gb nvme Samsung 950 Pro - 4x 1tb sata Hitachi 2.5" hdds
Display(s) Samsung C24RG50FQI
Case Jonsbo C2 (almost itx sized)
Audio Device(s) integrated Realtek crap
Power Supply Seasonic SSR-750FX
Mouse Logitech G502
Keyboard Redragon K539 brown switches
Software Windows 7 Ultimate SP1 + Windows 10 21H2 LTSC (patched).
Benchmark Scores Cinebench: R15 3050 pts, R20 7000 pts, R23 17800 pts, r2024 1050 pts.
There are 2-slot cards out there, even RTX 4090s. The issue is the cooling: to keep a 400W card under 80°C you really need beefy cooling (and if it's a traditional air cooler, then the card is going to be either extra long or extra thicc).
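As a rough illustration of why that's hard, a tiny Python sketch of the thermal budget (the 25 °C ambient is an assumption, and this ignores the hotspot-vs-edge temperature spread):

```python
# Thermal budget for the "400 W under 80 C" example above.
ambient_c = 25.0        # assumed room/case intake temperature
target_c = 80.0         # temperature ceiling from the post
board_power_w = 400.0   # heat the cooler has to move

# Required GPU-to-ambient thermal resistance: theta = delta_T / P.
theta_required = (target_c - ambient_c) / board_power_w
print(f"The cooler needs roughly {theta_required:.3f} C/W or better")  # ~0.14 C/W
```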
 
Joined
Jun 21, 2021
Messages
2,654 (2.57/day)
System Name daily driver Mac mini M2 Pro
Processor Apple Silicon M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple Silicon M2 Pro (16-core GPU)
Storage Apple proprietary 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Ventura 13.6 (including latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12. I'm not interested in benchmarking.)
I would argue that the silicon is basically the same; nVidia just doesn't crank clocks as high for those cards, and that leads to massive power savings but tiny losses in performance.

Performance-per-watt is a curve; it's not linear. Anyone who has tried overclocking silicon should know this: CPU, GPU, memory.

Increasing your GPU power by 20% to get 5% more fps will give you a higher raw score. It also decreases performance-per-watt. NVIDIA's datacenter customers will want optimal performance-per-watt. If they need more performance, they can step up to a higher level product or just buy more compute units.
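A minimal sketch of that trade-off, using the illustrative +20% power for +5% fps figures (the baseline numbers are made up for the example):

```python
# Hypothetical baseline: 100 fps at 300 W board power.
baseline_fps, baseline_watts = 100.0, 300.0

# The overclock described above: +5% frame rate for +20% power.
oc_fps = baseline_fps * 1.05
oc_watts = baseline_watts * 1.20

ppw_stock = baseline_fps / baseline_watts
ppw_oc = oc_fps / oc_watts

print(f"stock:       {ppw_stock:.3f} fps/W")
print(f"overclocked: {ppw_oc:.3f} fps/W")
print(f"perf-per-watt lost: {(1 - ppw_oc / ppw_stock) * 100:.1f}%")  # ~12.5%
```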

Overclocking a graphics card is something PC gamers do because you can't buy four RTX 3050 cards for gaming in place of one 3090.

If you're not forcing more electricity into the GPU, you don't need crazy thermal solutions like 4-slot heatsinks.

Also datacenter GPUs live in air conditioned server rooms. They already have the benefit of being shepherded by IT professionals and server administrators who will focus on proper operating conditions. Acoustics aren't really a concern either.

PC gamers generally have their computers nearby, in the worst case actually on their desk. So consumer discrete graphics cards are typically operated by people who often don't put much thought into providing proper airflow and adequate ventilation and put their equipment in places like living rooms, bedrooms, and offices where acoustics are a factor.
 
Joined
Jan 14, 2019
Messages
9,828 (5.11/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
My 980 Classified is a dual slot card. Love that thing.. wish it had a bit more vram though. Maybe some extra horsepower too. Then there is my Fermi.. big triple slot triple fan dinosaur. It is only slightly larger than my 2.7 slot Ampere.

Remember when SLi was still fresh, and you were running a pair of single slot cards? I miss those days.

I do agree though, seeing a flagship card with a block on it does make it look a bit funny, but kinda sexeh too..
Well, as much as I appreciate a female body with slight curves, I prefer my GPUs on the small and efficient side. :D
 
Joined
May 8, 2021
Messages
1,978 (1.84/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Performance-per-watt is a curve; it's not linear. Anyone who has tried overclocking silicon should know this.

Increasing your GPU power by 20% to get 5% more fps will give you a higher raw score. It also decreases performance-per-watt. NVIDIA's datacenter customers will want optimal performance-per-watt. If they need more performance, they can step up to a higher level product or just buy more units.
It might actually be worse than 20% power for 5% performance. You can read about it here:

Basically, heat output, or rather power consumption, increases linearly with current (roughly how much energy the chip draws) and with frequency, but it increases with the square of voltage. On top of that, if you want higher frequencies and you're really pushing the chip, you need increasingly more voltage to stabilize those speeds. Semiconductor efficiency also drops a bit when chips run hot, so it's not quite that simple, and the higher you go, the more problems you have. Every new node tries to solve all of these problems, or at least some of them, but then again, chip makers like Intel, nVidia and AMD just like to crank clocks higher and higher with each node, so you don't get lower absolute wattage, but you may get more performance per watt at the same wattage.

Unfortunately, many computer parts today are cranked to the moon and beyond, and losing even the last 5% of clock speed can yield a 15-20% reduction in power consumption; a 10% performance loss can lead to savings in the 30-40% ballpark. If things are really that stupid, then it just makes sense for the consumer to do something, and yes, it really does matter, because energy costs money and its price has gone up several times over this year. Also, like never before, we need to switch to greener energy ASAP, because burning dead liquefied dinosaurs is awful for the environment and public health, and it has turned out to be a major geopolitical risk that can start a war or ruin economies overnight.
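A rough sketch of that relation in Python (dynamic power roughly proportional to C·V²·f; the voltage/frequency points are made up purely for illustration, not a real V/F curve):

```python
def dynamic_power(freq_mhz: float, volts: float, capacitance: float = 1.0) -> float:
    """Relative dynamic power: roughly proportional to C * V^2 * f."""
    return capacitance * volts ** 2 * freq_mhz

# Hypothetical operating points: stock is pushed up the V/F curve,
# the undervolted point gives up about 5% clock speed.
stock = dynamic_power(2800, 1.050)
undervolted = dynamic_power(2660, 0.975)

saving_pct = (1 - undervolted / stock) * 100
print(f"~5% lower clocks, ~{saving_pct:.0f}% less dynamic power")  # lands around 18%
```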

Overclocking a graphics card is something PC gamers do because you can't buy four RTX 3050s for gaming in place of one 3090.
And I wonder who killed SLI??? Anyway, most people can only afford an RTX 3060; it was the best-selling GPU from the Ampere line-up. That's all nice, but the most popular cards people use right now are the GTX 1060, GTX 1650 and RTX 2060, and those three alone account for over 17% of all gamers. Performance per dollar matters a lot, so does performance per watt, and so does the fact that buying high-end cards makes little sense when the next generation can basically bring the same performance for half the price. You can try to justify the RTX 3090 as much as you want, but it's a dumb card and makes no sense.

And if you're not forcing more electricity into the GPU, you don't need such crazy thermal solutions.

Also datacenter GPUs live in air conditioned server rooms. They already have the benefit of being shepherded by IT professionals and server administrators who will focus on proper operating conditions. PC gamers generally have their computers nearby, in the worst case actually on their desk. So consumer discrete graphics cards are typically operated by people who often really don't put much thought into providing proper airflow and adequate ventilation and put their equipment in places like living rooms, bedrooms, and offices.
And you basically end up with skimped cooling solutions and hotter-running GPUs, because quietness isn't important, and that's basically what has been happening for over a decade with pro cards. Many of them just plain suck, because they are either loud or run very hot, and since there are no aftermarket coolers, you can't just buy a Zotac or MSI version with an actually decent heatsink either. So that efficiency advance only led to smaller heatsinks, but neither lower temps nor less noise.
 
Joined
Jun 21, 2021
Messages
2,654 (2.57/day)
System Name daily driver Mac mini M2 Pro
Processor Apple Silicon M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple Silicon M2 Pro (16-core GPU)
Storage Apple proprietary 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Ventura 13.6 (including latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12. I'm not interested in benchmarking.)
It might actually be worse than 20% power for 5% performance.

Whether the actual figure is 20%, 22%, or 26.73% isn't important.

The point is that overclocking silicon is a poor strategy from a performance-per-watt perspective, especially for a sustained workload. In the context of this particular thread, you should be looking for a peak load that sits near the top of that curve.

Remember that performance-per-watt doesn't address fan acoustics. Decibels are a logarithmic scale, so a 10 dB(A) difference is a tenfold increase in sound intensity, and a 3 dB(A) difference is roughly twice the intensity. So overclocking your GPU and having your fans run at 100% burns more GPU electricity (which is money) as well as more fan electricity (more money).
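For reference, the decibel arithmetic as a tiny Python snippet (standard conversion from a dB difference to a sound-intensity ratio; the sample deltas are arbitrary):

```python
def intensity_ratio(delta_db: float) -> float:
    """Sound-intensity ratio for a given dB(A) difference: 10 dB = 10x."""
    return 10 ** (delta_db / 10)

for delta_db in (3, 6, 10):
    print(f"+{delta_db} dB(A) -> {intensity_ratio(delta_db):.1f}x the sound intensity")
# +3 dB(A) -> 2.0x, +6 dB(A) -> 4.0x, +10 dB(A) -> 10.0x
```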

(truncated for brevity)

And I wonder who killed SLI???
Well, it certainly wasn't the GPU makers. They had a vested interest in selling more cards obviously. The onus was on the individual game developers to make SLI work properly on each title. In some notable cases, SLI made performance worse. Properly configuring SLI for the consumer wasn't a cakewalk either.

By 2017, SLI's heyday had passed. Graphics card performance was improving to the point where rasterization could be handled by one card making SLI unattractive.

Anyway, most people can only afford an RTX 3060; it was the best-selling GPU from the Ampere line-up. That's all nice, but the most popular cards people use right now are the GTX 1060, GTX 1650 and RTX 2060, and those three alone account for over 17% of all gamers.

(truncated for brevity)

You can try to justify the RTX 3090 as much as you want, but it's a dumb card and makes no sense.
It's natural that the entry-level graphics card from two generations ago is the most common one on the Steam Hardware Survey, just like there are more five-year-old Toyota Celicas on the road than brand-new Mercedes-Benz S-Class V12 sedans.

For some people, the price of the S-Class isn't really a significant dent in their disposable income.

Remember that not everyone who buys a 3090 is going to game with it. Video games are made on PCs. And people not involved in gaming use the card too.

The thread discussion was the dearth of 2-slot cards for the DIY market. As several of us mentioned earlier, the main reason for this is increased power consumption in today's graphics card products which requires more hefty cooling solutions.

You can get a 2-slot 3090, but it will be liquid-cooled, either as a hybrid with an AIO radiator (and an on-board fan for the VRM and VRAM) or with a waterblock for a custom cooling loop. I had an Alphacool full-length waterblock on a 2070 Super FE in a 2-slot NZXT H210 case. It worked great and ran much quieter than the stock cooler (which was also 2 slots thick). Most of the AIB coolers for the 2070 Super were thicker than 2 slots, and once I got the FE, I understood why. Two slots isn't realistically adequate for a graphics card with a >225W TDP.

So, yes, one can buy an off-the-shelf 2-slot GPU today. It'll be super expensive and comes with the caveat that there will be a 240mm AIO radiator dangling off the end. There are also a handful of entry-level GPUs available in 2-slot configurations.

I have one: EVGA GeForce RTX 3050 8GB XC Gaming. It's in a build (the aforementioned NZXT H210) that I don't use for gaming but it works great.

And you basically end up with skimped cooling solutions and hotter-running GPUs, because quietness isn't important, and that's basically what has been happening for over a decade with pro cards. Many of them just plain suck, because they are either loud or run very hot, and since there are no aftermarket coolers, you can't just buy a Zotac or MSI version with an actually decent heatsink either. So that efficiency advance only led to smaller heatsinks, but neither lower temps nor less noise.

Again, performance-per-watt is a different way to look at the situation. Apple does this with their iPhones. Those guys are laser-focused on performance-per-watt because most of their business comes from the iPhone, and on the computer side, more than 85% of Macs sold are notebook models.

You won't get 4090 benchmark results from a Mac but for sure their performance-per-watt metrics crush anything in the x86/x64 PC consumer market.
 
Joined
Jan 14, 2019
Messages
9,828 (5.11/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
Well, it certainly wasn't the GPU makers. They had a vested interest in selling more cards obviously. It was really up to the individual game developers to make SLI work. In some notable cases, SLI made performance worse. Properly configuring SLI for the consumer wasn't a cakewalk either.
That. I personally think SLi and CF were never a thing anyway due to terrible scaling. You always got more performance out of a single $200 GPU than two $100 ones, not to mention that your VRAM was never doubled, either.
 
Joined
Jun 21, 2021
Messages
2,654 (2.57/day)
System Name daily driver Mac mini M2 Pro
Processor Apple Silicon M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple Silicon M2 Pro (16-core GPU)
Storage Apple proprietary 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Ventura 13.6 (including latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12. I'm not interested in benchmarking.)
That. I personally think SLi and CF were never a thing anyway due to terrible scaling. You always got more performance out of a single $200 GPU than two $100 ones, not to mention that your VRAM was never doubled, either.

SLI was a thing for a handful of tinkerers, the type of people who often like to participate on PC forums. From a value perspective (cost, time to configure, etc.) SLI was very poor, but the people who loved it didn't listen to the naysayers at the time.

In various discussions about GPU overclocking I've been rather dismissive of the idea of spending much time tweaking settings. However there are plenty of folks here who are still super gung ho about OC-ing their modern CPUs and GPUs.

Anyhow, most of today's two-slot GPUs end up in OEM builds (HP, Dell, Lenovo, etc.) or in some professional computing products. Joe Consumer really wants more gaming performance at the expense of electricity, heat, and size. Not everyone can afford it, but look at the 4090's near-instantaneous sellout and immediate scalping despite the fact that the 4090 is way more expensive than the 3090, there's a recession, and there's zero mining demand.
 
Joined
May 8, 2021
Messages
1,978 (1.84/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Whether the actual figure is 20%, 22%, or 26.73% isn't important.
How is it not? It just shows how far cards are cranked past the point of diminishing returns, and it's all just ridiculous.


Well, it certainly wasn't the GPU makers. They had a vested interest in selling more cards obviously. The onus was on the individual game developers to make SLI work properly on each title. In some notable cases, SLI made performance worse. Properly configuring SLI for the consumer wasn't a cakewalk either.
Not exactly correct. You see, they also have to sell their pro cards, which have really high margins; one pro card's margin might equal 10-20 gamer cards. So they sure as heck want to move as many pro cards as possible, and SLI or NVLink actually worked beautifully there and still does. Meanwhile, SLI on consumer cards for gaming never really worked right after 3dfx's SLI. Sure, it was difficult for game devs to implement in games, that's true. Some even skimp on really basic optimization as it is, and unlike what some people think, game company margins aren't great and they're in a super volatile market, so they have to do many things to avoid ending up upside down. But then again, nVidia had an interest in not making SLI work well in games, because people could have bought two cheaper cards instead of one more expensive one, or even worse, pros could use high-end gaming cards in SLI for a fraction of the cost and not pay those high margins for pro cards. So I just don't think nVidia really wanted to make SLI work truly well; they made it kinda cool, but not cool enough to replace some of their other products. nVidia has also been fighting for years against people using GeForce cards instead of Quadros for work, and basically had to come up with a bunch of small things that made Quadros more sensible for work; it failed, and the culmination was outright locking down vBIOS cross-flashing and later making it physically impossible. I think nVidia secretly hating SLI actually happened.


Remember that not everyone who buys a 3090 is going to game with it. Video games are made on PCs. And people not involved in gaming use the card too.
And that's when you are supposed to use an RTX A-series or Radeon Pro card. They used to use the same hardware and still do, but their drivers are different, and Pro cards render depth in games properly and more accurately, plus you get more VRAM and some other smaller things that could make those cards better for game dev.


The thread discussion was the dearth of 2-slot cards for the DIY market. As several of us mentioned earlier, the main reason for this is increased power consumption in today's graphics card products which requires more hefty cooling solutions.
I think that was me, lol. If you only want modern features and still-decent performance, you can get something less than a BFGPU or a high-end SKU, especially if you only game.

You can get a 2-slot 3090 but it will be liquid cooled, either as a hybrid with an AIO radiator (and an on-board fan for the VRM & VRAM) or as a waterblock for a custom cooling loop. I had an Alphacool full-length waterblock on a 2070 Super FE in a 2-slot NZXT H210 case. It worked great and ran much quieter than the stock cooler (which was also 2-slots thick). Most of the AIB coolers for 2070 Super were thicker than 2 slots and once I got the FE, I understood why. Two slots isn't realistically adequate for a graphics card GPU with a >225W TDP.
For 225 watts it could be enough, not for more though. Anyway, you can just buy lower tier GPU from same gen or just undervolt what you want to buy.

So, yes, one can buy an off-the-shelf 2-slot GPU today. It'll be super expensive and comes with the caveat that there will be a 240mm AIO radiator dangling off the end. There are also a handful of entry-level GPUs available in 2-slot configurations.

I have one: EVGA GeForce RTX 3050 8GB XC Gaming. It's in a build (the aforementioned NZXT H210) that I don't use for gaming but it works great.
I really hate it when people call it an "entry-level" card. It sure is the cheapest Ampere card you can get, but it's great at gaming. It can do 60 fps at 1440p, usually at high settings, sometimes ultra, very rarely at medium. 1080p runs at ultra with 60-100+ fps. It's a beast of a card and nothing like the previous xx50-tier cards, which retrospectively sucked donkey's arse.
 
Joined
Jan 11, 2013
Messages
1,237 (0.30/day)
Location
California, unfortunately.
System Name Sierra ~ Server
Processor Core i5-11600K ~ Core i3-12100
Motherboard Asus Prime B560M-A AC ~ MSI PRO B760M-P
Cooling CM 212 Black RGB Edition ~ Intel Stock Cooler
Memory 64GB (2x 32GB) DDR4-3600 ~ 32GB (4x 8GB) DDR4-3200
Video Card(s) XFX Radeon RX 6950 XT ~ EVGA GeForce GTX 970
Storage 4TB Samsung 990 Pro with Heatsink NVMe SSD ~ 2TB Kingston NV1 NVMe SSD + 500GB WD Blue SATA SSD
Display(s) 2x Dell S2721QS 4K 60Hz ~ N/A
Case Asus Prime AP201 - Open Frame Chassis
Power Supply Thermaltake GF1 850W ~ Thermaltake Smart 500W
Software Windows 11 Pro ~ Proxmox VE
Benchmark Scores Laptops: Dell Latitude E7270, Dell Latitude 14 Rugged 5420.
Because they need bigger heatsinks to dissipate the heat. They could get away with being smaller, but then they would end up noisy; or they could have copper-only heatsinks, but then they would cost way too much.

Yes, I get that, but if Dell can make a perfectly functional video card that fits in two slots, then Asus, MSI and whoever else can too. They just don't.

The reference RX 6800 and RTX 3070 are dual-slot cards too.
I have an RX 6600 already, but I need a higher-performance card.
 
Joined
Jun 21, 2021
Messages
2,654 (2.57/day)
System Name daily driver Mac mini M2 Pro
Processor Apple Silicon M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple Silicon M2 Pro (16-core GPU)
Storage Apple proprietary 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Ventura 13.6 (including latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12. I'm not interested in benchmarking.)
How is it not? It just shows how far cards are cranked past the point of diminishing returns, and it's all just ridiculous.

We're talking in general concepts here. We're not trying to prove what the actual curve looks like; it would vary from GPU to GPU anyhow, as well as from sample to sample. If it were 26.16% on a 3060 but 25.98% on a 6800 XT, it wouldn't matter.

I think nVidia secretly hating SLI actually happened.

Well, NVIDIA got their wish: NVLink doesn't even exist on the 4090 cards.

For 225 watts it could be enough, but not for more. Anyway, you can just buy a lower-tier GPU from the same gen, or undervolt the one you want to buy.
Absolutely. However, regardless of the wattage, it still needs to fit in a 2-slot space. Remember the thread topic? TWO SLOTS.

You can stick a 400W graphics card in a 2-slot space. Just not any card, and not most cards. But there are two-slot cards in the marketplace.

I really hate it when people call it an "entry-level" card. It sure is the cheapest Ampere card you can get, but it's great at gaming. It can do 60 fps at 1440p, usually at high settings, sometimes ultra, very rarely at medium. 1080p runs at ultra with 60-100+ fps. It's a beast of a card and nothing like the previous xx50-tier cards, which retrospectively sucked donkey's arse.

Entry level is a common term used in multiple industries (not just PC graphics) to describe the least expensive tier of product in a full stack. It doesn't mean "sucky". Entry level is a standard business term. It's even used for job listings.

We could be talking about food processors, digital SLRs, or fountain pens. Would you like us to call it something else? "Mass market" is a synonym. Would that make you happy?

In the context of the 30 Series, the RTX 3050 in my daily-driver Windows PC is the entry-level model. It replaced a Radeon RX 550 2GB, which was also the entry-level model of AMD's 500 Series. They both serve their purposes. Some people game with them, others do not. If they are happy with how it performs in whatever their particular use case is, then things are fine.

Remember that not everyone games with discrete video cards. I assure you that here in my country, there are tens of thousands of PCs sitting in government buildings, corporate offices, and research labs that will never download and run a single byte of code from Steam, the Epic Games Store, Ubisoft, whatever. Many of those also use entry-level graphics cards, just like my RX 550.

The problem is I can't keep track of which TPU participant hates which term. So the next time I bring up "entry level" in a TPU discussion, I will have forgotten how much you hate the term. Sorry in advance because I know I will use it again.
 