
NVIDIA GeForce 456.38 WHQL Released: Ampere Support, SLI Finally Dead

Joined
Aug 20, 2007
Messages
20,778 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64

ARF

Joined
Jan 28, 2020
Messages
3,945 (2.55/day)
Location
Ex-usa
That's the difference between implicit and explicit: DX12 offers explicit multi-GPU rendering.
So if you have two Ampere cards attached to your motherboard, and a game that is properly optimised to work in a multi-GPU configuration, then it will be fine.
 
Joined
Feb 18, 2005
Messages
5,238 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
Looks like more people are confused than me. It's a simple question:

I have 19 DirectX 11 games in my Steam library that I know support SLI and have a SLI profile.

If I buy 2x RTX 3090's for that magic 4K 120FPS (minimum) number, will I still get proper SLI support in my 19 DX11 games? Or has NVIDIA purposely disabled that on the 3000 series, so I would only get SLI support on 2000-series GPUs and below?

And the answer is equally simple, and already stated. Read it again. And do not spread FUD about NVIDIA "disabling" anything.

Also still driver errors with call of duty warzone.

That's because the game is a buggy POS. But as long as people like you keep rewarding EA for their failures by buying their buggy POS games, and blaming GPU manufacturers instead of EA, EA will happily take your money and keep churning out buggy POS games. So complain elsewhere, because you're part of the problem.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,945 (2.60/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Razer Huntsman Tournament Edition
Software Windows 11 Pro 64-Bit
I'm curious.

I guess pretty soon advances in fabrication will become minuscule, if not cease completely, because you can only shrink transistors so much.

When that happens, it will be impossible to keep creating larger and larger dies, because that would require wafers of impossible sizes. The video card itself also only allows certain die dimensions.

The only way forward would be to stack GPU "cores" like CPU cores, in which case it will become impossible to share the computational workload the way it's currently done, which means an alternative to SLI would be required.

How will NVIDIA/AMD/Intel/Apple/Qualcomm/Samsung/ARM/MediaTek/etc. approach this conundrum? Or will all game engines implement DirectX 12/Vulkan multi-GPU scaling? Last I heard, it's extremely difficult to implement and only relatively large game development studios can afford it.

You would be surprised. There is a lot of research going on into possible new transistor architectures to keep shrinking. The next one that most fabs are going to use for 3nm and smaller is the Gate-All-Around FET (GAAFET). After that we will probably be looking at carbon nanotubes or nanosheets, etc., depending on how effective GAAFETs end up being at 3nm and 1.2(7)nm. We are in a similar situation with the FinFET as we were with the planar FET as we got to sub-20nm nodes. The conducting path from source to drain is getting so small that even when the transistor is off, some electrons may still flow, which we don't want. This is a leakage issue, a problem caused by the size of an electron and the ever-decreasing size of the channel.
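To put a rough number behind that leakage point: the standard textbook expression for subthreshold ("off-state") leakage shows it growing exponentially as the threshold voltage is scaled down along with the channel. This is the generic long-channel form, not tied to any particular node or to the exact compact models the fabs use:

```latex
I_{\text{sub}} \propto e^{\,(V_{GS}-V_{th})/(n V_T)}\left(1 - e^{-V_{DS}/V_T}\right),
\qquad V_T = \frac{kT}{q}
```

With the thermal voltage $V_T \approx 26\,\mathrm{mV}$ at room temperature, every reduction in $V_{th}$ of a few tens of millivolts multiplies off-state current, which is why shrinking channels (and the FinFET-to-GAAFET transition to regain gate control) matter so much.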
 
Joined
Aug 20, 2007
Messages
20,778 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
The difference between implicit and explicit. DX12 gives explicit multi-GPU rendering.
So, when you have two Ampere cards attached to your mobo, proper game that is optimised to work

The last sentence is key.

DX12 switches the onus of development for mgpu to game devs, not driver devs. This effectively kills it outside triple-A game studios.
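To make the "onus on game devs" point concrete, here's a toy sketch (plain Python, no real graphics API; `GpuNode` and the frame loop are invented stand-ins) of the kind of bookkeeping an engine must now do itself under explicit mGPU, e.g. classic alternate-frame rendering (AFR), which the driver used to handle transparently under implicit SLI:

```python
# Toy illustration of alternate-frame rendering (AFR) scheduling under
# explicit multi-GPU. No graphics API is used; in DX12/Vulkan the engine
# would record and submit command lists per adapter node and track fences.

class GpuNode:
    def __init__(self, index):
        self.index = index
        self.completed_frames = []

    def render(self, frame):
        # Stand-in for submitting this frame's work to this GPU node.
        self.completed_frames.append(frame)

def render_frames(num_frames, num_gpus=2):
    """Assign frame i to GPU (i % num_gpus), as classic AFR does."""
    nodes = [GpuNode(i) for i in range(num_gpus)]
    for frame in range(num_frames):
        nodes[frame % num_gpus].render(frame)
    return nodes

nodes = render_frames(6)
print([n.completed_frames for n in nodes])  # [[0, 2, 4], [1, 3, 5]]
```

Even this trivial round-robin hides the hard parts (cross-GPU resource copies, fence synchronization, frame pacing), which is exactly the work that now lands on each game engine rather than on NVIDIA's driver team.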
 

ARF

Joined
Jan 28, 2020
Messages
3,945 (2.55/day)
Location
Ex-usa
The last sentence is key.

DX12 switches the onus of development for mgpu to game devs, not driver devs. This effectively kills it outside triple-A game studios.

Well, given the rumours that Nvidia's next-gen Hopper designs will be MCM, then SLi will work in one form or another.
 
Joined
Feb 18, 2005
Messages
5,238 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
The last sentence is key.

DX12 switches the onus of development for mgpu to game devs, not driver devs. This effectively kills it outside triple-A game studios.

I guess my mistake was expecting users on a supposed "tech enthusiast" forum to understand the basics of multi-GPU, when what they think is "multi-GPU make game go brrrrr". Imagine if they cared enough to educate themselves!

Well, given the rumours that Nvidia's next-gen Hopper designs will be MCM, then SLi will work in one form or another.

Yes, in exactly the same way that AMD's intercore interconnects on Zen are like SLI!

Which is to say, completely different in every way except the most basic description.
 
Joined
Aug 20, 2007
Messages
20,778 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
I feel you, but gave up educating a while ago when I discovered flat earthers and those types roaming here... yikes.

Anyhow, back to the topic before we drift.
 
Joined
Jun 2, 2017
Messages
7,920 (3.15/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Apparently you can't read either. The changelog states "Implicit SLI Disabled on NVIDIA Ampere GPUs". It does not state "Implicit SLI Disabled on NVIDIA Ampere GPUs except in DirectX 11".

There is no SLI on Ampere. There is multi-GPU support if and only if it's implemented via a graphics API. DirectX 11 does not have multi-GPU functionality, therefore no DirectX 11 games can have any sort of multi-GPU support on Ampere.
Does the 3090 not support NVLink, just like the 2080 Ti?
 
Joined
Apr 18, 2013
Messages
1,260 (0.31/day)
Location
Artem S. Tashkinov
You would be surprised. There is a lot of research going on into possible new transistor architectures to keep shrinking. The next one that most fabs are going to use for 3nm and smaller is the Gate-All-Around FET (GAAFET). After that we will probably be looking at carbon nanotubes or nanosheets, etc., depending on how effective GAAFETs end up being at 3nm and 1.2(7)nm. We are in a similar situation with the FinFET as we were with the planar FET as we got to sub-20nm nodes. The conducting path from source to drain is getting so small that even when the transistor is off, some electrons may still flow, which we don't want. This is a leakage issue, a problem caused by the size of an electron and the ever-decreasing size of the channel.

There's a limit to miniaturization and your reply doesn't address it in the slightest. There will come a time when nothing can be done to make transistors smaller and faster, and performance can only be increased by stacking dies, at the expense of ever-increasing power consumption.
 
Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
I know that there are tons of people that think Multi GPU is passe but at the end of the day it is my computer and my money to do with as I please.

That is true. Just don't complain because there is no support when you're told there is no support. Not directed at you. Royal and stuff.
 
Joined
Aug 20, 2007
Messages
20,778 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
Does the 3090 not support NVLink, just like the 2080 Ti?

Physically, yes, but the software to use SLI on the 3090 has never existed. I'm sure you could use the link somehow, but not for SLI.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,945 (2.60/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Razer Huntsman Tournament Edition
Software Windows 11 Pro 64-Bit
There's a limit to miniaturization and your reply doesn't address it in the slightest. There will come a time when nothing can be done to make transistors smaller and faster, and performance can only be increased by stacking dies, at the expense of ever-increasing power consumption.

We are still pretty far out from actually having to worry about that. I was expecting us to really have to worry about it once we got below 5nm, but that doesn't seem to be the case. Fabs are coming out with different ways of doing stacked-die packaging. Zero real implementation so far, though. TSMC's 3DFabric could be pretty promising.
 
Joined
Feb 19, 2013
Messages
14 (0.00/day)
@Assimilator such hostility in your answers, but thank you for answering nonetheless. Apologies that I'm not 100% up to speed on what explicit vs implicit SLI/multi-GPU means; Google didn't provide any clear answers before I posted. I also don't know how framing my question (of NVIDIA disabling SLI for the 3000 series) as a question counts as spreading "FUD" like you mentioned. I didn't explicitly say NVIDIA did this; I was simply asking if that's what the driver release notes meant, since I was a bit confused. I have 2x RTX 2080 Ti's in SLI/NVLink. I was thinking of going to 2x RTX 3090's to push 4K 120FPS (minimum) on my LG OLED, but it looks like I'll just be sticking to 1x RTX 3090 instead.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Well obviously for Ampere GPUs, this is what this article is about, why would anyone even think I am talking about previous series...
You yourself asked why they were removing SLI support from previous ones.

And let's be clear: any older games you had that benefited from SLI would not need it anymore with this one card.
 
Joined
Apr 18, 2013
Messages
1,260 (0.31/day)
Location
Artem S. Tashkinov
We are still pretty far out from actually having to worry about that. I was expecting us to really have to worry about it once we got to below 5nm, but that doesn't seem to be the case. There are fabs coming out with different ways of doing the stacked dies packaging. Zero real implementation so far though.

5nm CPUs are already in production (Apple's A14X) and I've already started worrying. Also, I'm not an engineer, but AMD's 7nm CPUs/GPUs have become a lot harder to cool than their larger-node counterparts. It seems to me we've already started hitting a brick wall in terms of efficiency and heat dissipation, and it could get even worse going forward.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,945 (2.60/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Razer Huntsman Tournament Edition
Software Windows 11 Pro 64-Bit
5nm CPUs are already in production (Apple's A14X) and I've already started worrying. Also, I'm not an engineer, but AMD's 7nm CPUs/GPUs have become a lot harder to cool than their larger-node counterparts. It seems to me we've already started hitting a brick wall in terms of efficiency and heat dissipation, and it could get even worse going forward.

Design plays a huge role in that, though. Plus there are a lot of crazy things you can do to a FinFET to improve efficiency and the heat issue. The lithography method used helps a lot. There is a pretty big shift in the industry to EUV now for some 7nm nodes and smaller.

At my job we are dealing with 12nm and soon 7nm. Our dies are also smaller than 30mm^2.

 

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
16,062 (2.26/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/5za05v
 
Joined
Apr 30, 2020
Messages
855 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 16Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
I guess my mistake was expecting users on a supposed "tech enthusiast" forum to understand the basics of multi-GPU, when what they think is "multi-GPU make game go brrrrr". Imagine if they cared enough to educate themselves!



Yes, in exactly the same way that AMD's intercore interconnects on Zen are like SLI!

Which is to say, completely different in every way except the most basic description.

If you were a PC enthusiast you wouldn't want SLI or CrossFire to die at all. It's gamers who want SLI and CrossFire to die.
I guess my mistake was expecting users on a supposed "tech enthusiast" forum to understand the basics of multi-GPU, when what they think is "multi-GPU make game go brrrrr". Imagine if they cared enough to educate themselves!



Yes, in exactly the same way that AMD's intercore interconnects on Zen are like SLI!

Which is to say, completely different in every way except the most basic description.

Are you saying the next gpu would have a central hub for chiplets ?
 
Joined
Feb 19, 2013
Messages
14 (0.00/day)

Thanks. That article actually answers my question more clearly than this TechPowerUp article, and even tells you the exact games that would work in DX12 multi-GPU mode on the 3090 and future GPUs.

Key points:

Existing SLI driver profiles will continue to be tested and maintained for SLI-ready RTX 20 Series and earlier GPUs.

For GeForce RTX 3090 and future SLI-capable GPUs, SLI will only be supported when implemented natively within the game.


AKA, if I buy 2x RTX 3090's for SLI, I can kiss goodbye to using both GPUs in the 19 DX11 SLI-supported games in my Steam library. Therefore I'm only getting one RTX 3090 and downsizing my system to something smaller such as ITX or mATX.
 
Joined
Aug 31, 2016
Messages
104 (0.04/day)
You yourself asked why they were removing SLI support from previous ones.

And let's be clear: any older games you had that benefited from SLI would not need it anymore with this one card.

Not very good at reading I see. Try again:

I don't get it. I understand that SLI is not going to be supported in future games, nobody expects that at this point, but to disable all support for existing ones?

"Existing games". Not much to do with "previous GPU generations". But whatever, let's call it a misunderstanding.

The other thing I am not even going to comment on, too much impotency in one sentence.
 
Joined
Feb 1, 2013
Messages
1,248 (0.30/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
Does anyone with tweaking and driver experience know why NVIDIA is preventing implicit SLI on the 3090, since it should come essentially for free? Is this an artificial limitation, or is there something in Ampere that would make implicit SLI impossible or too much work to keep supporting?
 
Joined
Jan 19, 2018
Messages
183 (0.08/day)
Processor AMD 5800X
Motherboard MSI X570 Tomahawk
Memory G.Skill 32GB
Software Windows 10
That's because the game is a buggy POS. But as long as people like you keep rewarding EA for their failures by buying their buggy POS games, and blaming GPU manufacturers instead of EA, EA will happily take your money and keep churning out buggy POS games. So complain elsewhere, because you're part of the problem.
...Activision publishes Warzone.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,945 (2.60/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Razer Huntsman Tournament Edition
Software Windows 11 Pro 64-Bit
Holy shit some of you people are dumb.

NVIDIA is not "taking SLI away" on Ampere, because SLI on Ampere was never there to begin with.



But the second card will do nothing.

Unless you have a specific game that you know implements mGPU support via a graphics API, there is no point in buying more than one RTX 3000-series card.


And the answer is equally simple, and already stated. Read it again. And do not spread FUD about NVIDIA "disabling" anything.

Also still driver errors with call of duty warzone.

That's because the game is a buggy POS. But as long as people like you keep rewarding EA for their failures by buying their buggy POS games, and blaming GPU manufacturers instead of EA, EA will happily take your money and keep churning out buggy POS games. So complain elsewhere, because you're part of the problem.

Well, that didn't age well. :laugh: :laugh: :laugh: :laugh:

Call of Duty Warzone is not an EA title. Get off the high horse you seem to be on right now.
 
Joined
Aug 24, 2004
Messages
217 (0.03/day)
Well, if they could come up with a way to utilize all the CPUs with useless integrated graphics, that would be great.
 