
NVIDIA GeForce 456.38 WHQL Released: Ampere Support, SLI Finally Dead

Joined
Aug 20, 2007
Messages
13,748 (2.84/day)
System Name Pioneer
Processor Intel i9 9900k
Motherboard ASRock Z390 Taichi
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory G.SKILL TridentZ Series 32GB (4 x 8GB) DDR4-3200 @ 14-14-14-34-2T
Video Card(s) EVGA GeForce RTX 2080 SUPER XC ULTRA
Storage Mushkin Pilot-E 2TB NVMe SSD
Display(s) LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) VGA HDMI->Panasonic SC-HTB20/Schiit Modi MB/Asgard 2 DAC/Amp to AKG Pro K7712 Headphones
Power Supply Seasonic Prime Titanium 750W
Mouse ROCCAT Kone EMP
Keyboard WASD CODE 104-Key w/ Cherry MX Green Keyswitches, Doubleshot Vortex PBT White Translucent Keycaps
Software Windows 10 Enterprise (Product of work, yes it's legit)
Benchmark Scores www.3dmark.com/fs/23478641 www.3dmark.com/spy/13863605 www.3dmark.com/pr/306218

ARF

Joined
Jan 28, 2020
Messages
1,295 (4.25/day)
The difference is between implicit and explicit: DX12 gives explicit multi-GPU rendering.
So, when you have two Ampere cards attached to your mobo and a proper game that is optimised to work with a multi-GPU configuration, it will be fine.
 
Joined
Feb 18, 2005
Messages
3,190 (0.55/day)
Location
Ikenai borderline!
Looks like more people are confused than I am. It's a simple question:

I have 19 DirectX 11 games in my Steam library that I know support SLI and have a SLI profile.

If I buy 2x RTX 3090s for that magic 4K 120 FPS (minimum) number, will I still get proper SLI support on my 19 DX11 games? Or has NVIDIA purposely disabled that on the 3000 series, so I would only get SLI support on 2000-series GPUs and below?
And the answer is equally simple, and already stated. Read it again. And do not spread FUD about NVIDIA "disabling" anything.

Also still driver errors with call of duty warzone.
That's because the game is a buggy POS. But as long as people like you keep rewarding EA for their failures by buying their buggy POS games, and blaming GPU manufacturers instead of EA, EA will happily take your money and keep churning out buggy POS games. So complain elsewhere, because you're part of the problem.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,431 (3.32/day)
Location
Longmont, CO
System Name GPU and more SSD for 2021
Processor Intel Core i7 8700k @ 4.8GHz 1.28v
Motherboard MSi Z370 Gaming Pro Carbon AC
Cooling EK Supremacy w/ EK Coolstream PE360 & 6x Corsair LL120s - Hardline tubing
Memory Corsair Vengeance Pro RGB 32GB 3200 14-14-14-34
Video Card(s) RTX3080(Ti) or Radeon 6800XT -> EK Waterblock
Storage 2 1TB NVME SSDs
Display(s) Dell S3220DGF 32" 1440p Freesync 2 (G-Sync) HDR 165Hz | 2x Asus VP249QGR 144Hz IPS
Case Lian Li PC-011D
Audio Device(s) Realtek 1220 w/ Sennheiser Game Ones
Power Supply Seasonic Flagship Prime Ultra Platinum 850
Mouse Razer Viper
Keyboard Razer Huntsman Tournament Edition
Software Windows 10 Pro 64-Bit
I'm curious.

I guess pretty soon advances in fabrication will turn minuscule, if not cease completely, because you can only shrink transistors so much.

When that happens, it will be impossible to create larger and larger dies because it would require wafers of impossible sizes. Also, the video card itself only allows certain die dimensions.

The only way forward would be to stack GPU "cores" like CPU cores, in which case it will become impossible to share the computational workload as it's currently done, which means an alternative to SLI would be required.

How will NVIDIA/AMD/Intel/Apple/Qualcomm/Samsung/ARM/MediaTek/etc. approach this conundrum? Or will all game engines implement DirectX12/Vulkan multi-GPU scaling? Last time I heard it's extremely difficult to implement and only relatively large game development studios can afford it.
You would be surprised. There is a lot of research going on into possible new transistor architectures to keep shrinking. The next one that most fabs will be using for 3nm and smaller is the Gate-All-Around FET. After that we will probably be looking at carbon nanotubes or nanosheets, etc., depending on how effective GAAFETs end up being at 3nm and 1.2(7)nm. We are in a similar situation with FinFETs to the one we were in with planar FETs as we got to sub-20nm nodes. The conducting path from source to drain is getting so small that even when the transistor is off, some electrons may still flow, which we don't want. This is a leakage issue, a consequence of the size of an electron versus the ever-decreasing size of the channel.
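The leakage mechanism above can be sketched with the standard first-order subthreshold-current model; the device constants below (I0, n, and the Vth values) are illustrative assumptions, not numbers for any real node:

```python
import math

def subthreshold_current(v_gs, v_th, i0=1e-7, n=1.5, v_t=0.026):
    """First-order subthreshold leakage: I = I0 * exp((Vgs - Vth) / (n * Vt)).
    i0 and n are made-up device constants; v_t is the thermal voltage at ~300 K."""
    return i0 * math.exp((v_gs - v_th) / (n * v_t))

# With the gate "off" (Vgs = 0), a lower threshold voltage -- which shrinking
# tends to force -- increases off-state leakage exponentially:
leak_high_vth = subthreshold_current(0.0, 0.45)
leak_low_vth = subthreshold_current(0.0, 0.30)
print(leak_low_vth / leak_high_vth)  # ~47x more leakage for a 150 mV lower Vth
```

The point is the exponential: each step down in threshold voltage multiplies off-state leakage rather than adding to it, which is why channel control (FinFET, then GAA) matters so much at small nodes.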
 
Last edited:
The difference between implicit and explicit. DX12 gives explicit multi-GPU rendering.
So, when you have two Ampere cards attached to your mobo, proper game that is optimised to work
The last sentence is key.

DX12 switches the onus of development for mgpu to game devs, not driver devs. This effectively kills it outside triple-A game studios.
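For the curious, "explicit" means the application itself enumerates the GPUs and decides which one gets which work. This is not real D3D12/Vulkan API code, just a toy Python sketch of the alternate-frame-rendering split an engine now has to implement on its own (the GPU names are made up):

```python
# Toy model of explicit multi-GPU alternate-frame rendering (AFR):
# the *application* assigns each frame to a GPU; no driver magic involved.
gpus = ["GPU0", "GPU1"]  # adapters the app enumerated itself (illustrative)

def assign_frames(num_frames, gpus):
    """Round-robin frames across GPUs, as an engine doing explicit AFR would."""
    schedule = {}
    for frame in range(num_frames):
        schedule[frame] = gpus[frame % len(gpus)]
    return schedule

schedule = assign_frames(6, gpus)
print(schedule)  # {0: 'GPU0', 1: 'GPU1', 2: 'GPU0', ...}
```

The round-robin is the trivial part; the reason small studios skip it is that the engine now also owns cross-GPU resource copies, synchronisation, and frame pacing, all of which the driver used to handle implicitly.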
 

ARF

The last sentence is key.

DX12 switches the onus of development for mgpu to game devs, not driver devs. This effectively kills it outside triple-A game studios.
Well, given the rumours that Nvidia's next-gen Hopper designs will be MCM, SLI will work in one form or another.
 
The last sentence is key.

DX12 switches the onus of development for mgpu to game devs, not driver devs. This effectively kills it outside triple-A game studios.
I guess my mistake was expecting users on a supposed "tech enthusiast" forum to understand the basics of multi-GPU, when what they think is "multi-GPU make game go brrrrr". Imagine if they cared enough to educate themselves!

Well, given the rumours that Nvidia's next-gen Hopper designs will be MCM, then SLi will work in one form or another.
Yes, in exactly the same way that AMD's intercore interconnects on Zen are like SLI!

Which is to say, completely different in every way except the most basic description.
 
I feel you, but gave up educating a while ago when I discovered flat earthers and those types roaming here... yikes.

Anyhow, back to the topic before we drift.
 
Joined
Jun 2, 2017
Messages
3,113 (2.44/day)
System Name Best AMD Computer
Processor AMD TR4 1920X
Motherboard MSI X399 SLI Plus
Cooling Alphacool Eisbaer 420 x2 Noctua XPX Pro TR4 block
Memory Gskill RIpjaws 4 3000MHZ 48GB
Video Card(s) Sapphire Vega 64 Nitro, Gigabyte Vega 64 Gaming OC
Storage 6 x NVME 480 GB, 2 x SSD 2TB, 5TB HDD, 2 TB HDD, 2x 2TB SSHD
Display(s) Acer 49BQ0k 4K monitor
Case Thermaltake Core X9
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Corsair HX1200!
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 10 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 24955 Time Spy: 13500
Apparently you can't read either. The changelog states "Implicit SLI Disabled on NVIDIA Ampere GPUs". It does not state "Implicit SLI Disabled on NVIDIA Ampere GPUs except in DirectX 11".

There is no SLI on Ampere. There is multi-GPU support if and only if it's implemented via a graphics API. DirectX 11 does not have multi-GPU functionality, therefore no DirectX 11 games can have any sort of multi-GPU support on Ampere.
Does the 3090 not support NVLink, just like the 2080 Ti?
 
Joined
Apr 18, 2013
Messages
910 (0.33/day)
You would be surprised. There is a lot of research going on into possible new transistor architectures to keep shrinking. The next one that most fabs will be using for 3nm and smaller is the Gate-All-Around FET. After that we will probably be looking at carbon nanotubes or nanosheets, etc., depending on how effective GAAFETs end up being at 3nm and 1.2(7)nm. We are in a similar situation with FinFETs to the one we were in with planar FETs as we got to sub-20nm nodes. The conducting path from source to drain is getting so small that even when the transistor is off, some electrons may still flow, which we don't want. This is a leakage issue, a consequence of the size of an electron versus the ever-decreasing size of the channel.
There's a limit to miniaturization and your reply doesn't deal with it in the slightest. There will come a time when nothing can be done to make transistors smaller and faster, and performance can be increased only by stacking dies, at the expense of ever-increasing power consumption.
 
Last edited:
Joined
Mar 10, 2015
Messages
3,523 (1.69/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
I know that there are tons of people that think Multi GPU is passe but at the end of the day it is my computer and my money to do with as I please.
That is true. Just don't complain that there is no support when you're told there is no support. Not directed at you specifically; royal "you" and all that.
 
Does the 3090 not support NVlink just like the 2080TI.
Physically, yes, but the software to run SLI over it has never existed for the 3090. I'm sure you could use the link for something, but not for SLI.
 

MxPhenom 216

ASIC Engineer
There's a limit to miniaturization and your reply doesn't deal with it in the slightest. There will come a time when nothing can be done to make transistors smaller and faster, and performance can be increased only by stacking dies, at the expense of ever-increasing power consumption.
We are still pretty far out from actually having to worry about that. I was expecting us to really have to worry about it once we got below 5nm, but that doesn't seem to be the case. Fabs are coming out with different ways of doing stacked-die packaging. No real implementation so far, though. TSMC's 3DFabric could be pretty promising.
 
Last edited:
Joined
Feb 19, 2013
Messages
11 (0.00/day)
@Assimilator such hostility in your answers, but thank you for answering nonetheless. Apologies that I'm not 100% up to speed on what explicit vs implicit SLI/multi-GPU means; Google didn't provide any clear answers before I posted. I also don't see how framing my question (of NVIDIA disabling SLI for the 3000 series) in the form of a question counts as spreading "FUD" like you mentioned. I didn't explicitly say NVIDIA did this; I was simply asking if that's what the driver release notes meant, since I was a bit confused. I have 2x RTX 2080 Tis in SLI/NVLink. I was thinking of going 2x RTX 3090s to push 4K 120 FPS (minimum) on my LG OLED, but it looks like I'll just be sticking to 1x RTX 3090 instead.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,503 (2.99/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 1x 6TB WD Black; 2x 4TB WD Black; 1x400GB VelRptr; 1x 3TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Well obviously for Ampere GPUs, this is what this article is about, why would anyone even think I am talking about previous series...
You yourself asked why they were removing SLI support from previous ones.

And let's be clear, any older games you had that benefited from SLI would not need it anymore with this one card.
 
We are still pretty far out from actually having to worry about that. I was expecting us to really have to worry about it once we got below 5nm, but that doesn't seem to be the case. Fabs are coming out with different ways of doing stacked-die packaging. No real implementation so far, though.
5nm CPUs are already in production (Apple's A14X) and I've already started worrying. Also, I'm not an engineer, but AMD's 7nm CPUs/GPUs have become a lot harder to cool than their larger-node counterparts. It seems to me we've already started hitting a brick wall in terms of efficiency and heat dissipation, and it could get even worse going forward.
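The cooling difficulty is mostly a power-density story: a shrink cuts die area faster than it cuts power, so watts per mm² climb even when total power falls. A back-of-the-envelope sketch, with all numbers invented for the example:

```python
def power_density(watts, area_mm2):
    """Power density in W/mm^2 -- the quantity the cooler actually fights."""
    return watts / area_mm2

# Hypothetical chip: a shrink cuts die area from 200 mm^2 to 120 mm^2,
# but total power only drops from 150 W to 130 W.
old = power_density(150, 200)   # 0.75 W/mm^2
new = power_density(130, 120)   # ~1.08 W/mm^2
print(new / old)                # ~1.44x more heat per unit area
```

Same heatsink, less silicon to spread the heat through: that is why a lower-wattage 7nm part can still be harder to keep cool than its bigger predecessor.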
 
Last edited:

MxPhenom 216

ASIC Engineer
5nm CPUs are already in production (Apple's A14X) and I've already started worrying. Also, I'm not an engineer, but AMD's 7nm CPUs/GPUs have become a lot harder to cool than their larger-node counterparts. It seems to me we've already started hitting a brick wall in terms of efficiency and heat dissipation, and it could get even worse going forward.
Design plays a huge role in that, though. Plus there are a lot of crazy things you can do to a FinFET to improve efficiency and the heat issue. The lithography method used helps a lot. There is a pretty huge shift in the industry to EUV now for some 7nm nodes and smaller.

At my job we are dealing with 12nm and soon 7nm. Our dies are also smaller than 30 mm^2.

 
Joined
Nov 11, 2004
Messages
7,116 (1.21/day)
Location
Formosa
System Name Overlord Mk MXVI
Processor AMD Ryzen 7 3800X
Motherboard Gigabyte X570 Aorus Master
Cooling Corsair H115i Pro
Memory 32GB Viper Steel 3600 DDR4 @ 3800MHz 16-19-16-19-36
Video Card(s) Gigabyte RTX 2080 Gaming OC 8G
Storage 1TB WD Black NVMe (2018), 2TB Viper VPN100, 1TB WD Blue 3D NAND
Display(s) Asus PG27AQ
Case Corsair Carbide 275Q
Audio Device(s) Corsair Virtuoso SE
Power Supply Corsair RM750
Mouse Logitech G500s
Keyboard Wooting Two
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/33u9si
 
Joined
Apr 30, 2020
Messages
173 (0.82/day)
System Name Old one
Processor Phenom II x6 1035t
Motherboard Gigabyte ud3
Cooling Scythe mugen 2
Memory G.Skill 1600MHz 9-9-9-24
Video Card(s) Sapphire HD 3870
Storage Western Digital 500GB HDD
Display(s) Clear tunes TV
Case Coolermaster original 690
Power Supply Some dual fan 650 watts with a ratsnest
Mouse Ps/2 unknown
Keyboard Unknown, it just works!
I guess my mistake was expecting users on a supposed "tech enthusiast" forum to understand the basics of multi-GPU, when what they think is "multi-GPU make game go brrrrr". Imagine if they cared enough to educate themselves!



Yes, in exactly the same way that AMD's intercore interconnects on Zen are like SLI!

Which is to say, completely different in every way except the most basic description.
If you were a PC enthusiast you wouldn't want SLI or Xfire to die at all. It's gamers who want SLI and Xfire to die.
I guess my mistake was expecting users on a supposed "tech enthusiast" forum to understand the basics of multi-GPU, when what they think is "multi-GPU make game go brrrrr". Imagine if they cared enough to educate themselves!



Yes, in exactly the same way that AMD's intercore interconnects on Zen are like SLI!

Which is to say, completely different in every way except the most basic description.
Are you saying the next GPU would have a central hub for chiplets?
 
Thanks. That article actually answers my question more clearly than this TechPowerUp article, and even tells you the exact games that work in DX12 multi-GPU mode on the 3090 and future GPUs.

Key points:

Existing SLI driver profiles will continue to be tested and maintained for SLI-ready RTX 20 Series and earlier GPUs.

For GeForce RTX 3090 and future SLI-capable GPUs, SLI will only be supported when implemented natively within the game.


AKA, if I buy 2x RTX 3090s for SLI, I can kiss SLI support on the 19 DX11 SLI-supported games in my Steam library goodbye, so I'm only getting one RTX 3090 and downsizing my system to something smaller such as ITX or mATX.
 
Joined
Aug 31, 2016
Messages
73 (0.05/day)
You yourself asked why they were removing SLI support from previous ones.

And let's be clear, any older games you had that benefited from SLI would not need it anymore with this One card.
Not very good at reading I see. Try again:

I don't get it. I understand that SLI is not going to be supported in future games, nobody expects that at this point, but to disable all support for existing ones?
"Existing games". Not much to do with "previous GPU generations". But whatever, let's call it a misunderstanding.

The other thing I am not even going to comment on, too much impotency in one sentence.
 
Last edited:
Joined
Feb 1, 2013
Messages
633 (0.22/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz 1.288v
Motherboard EVGA Z370 Micro
Cooling Custom 480mm H2O, Raystorm Pro, Nemesis GTX, EK-XRES
Memory 2x8GB Trident Z 4133-C16-2T 1.45v
Video Card(s) MSI Seahawk EK X 1080Ti 2114/12420 1.081v
Storage Samsung 970 EVO 500GB, 860 QVO 2TB
Display(s) XB271HU 165Hz
Case FT03-T
Audio Device(s) SBz
Power Supply SS-850KM3
Mouse G502
Keyboard G710+
Software Gentoo 64-bit, Windows 7/10 64-bit
Benchmark Scores Always only ever very fast
Does anyone with tweaking and driver experience know why NVIDIA is preventing implicit SLI on the 3090, since it should come essentially for free? Is this an artificial limitation, or is there something in Ampere that would make implicit SLI impossible or too much work to maintain?
 
Joined
Jan 19, 2018
Messages
158 (0.15/day)
Location
SoCal
Processor AMD 3700X
Motherboard MSI B450 Gaming Pro Carbon
Memory G.Skill 16GB (2x8) 3600
Video Card(s) Powercolor Vega 56
Software Windows 10
That's because the game is a buggy POS. But as long as people like you keep rewarding EA for their failures by buying their buggy POS games, and blaming GPU manufacturers instead of EA, EA will happily take your money and keep churning out buggy POS games. So complain elsewhere, because you're part of the problem.
...Activision publishes Warzone.
 

MxPhenom 216

ASIC Engineer
Holy shit some of you people are dumb.

NVIDIA is not "taking SLI away" on Ampere, because SLI on Ampere was never there to begin with.



But the second card will do nothing.

Unless you have a specific game that you know implements mGPU support via a graphics API, there is no point in buying more than one RTX 3000-series card.

And the answer is equally simple, and already stated. Read it again. And do not spread FUD about NVIDIA "disabling" anything.

Also still driver errors with call of duty warzone.
That's because the game is a buggy POS. But as long as people like you keep rewarding EA for their failures by buying their buggy POS games, and blaming GPU manufacturers instead of EA, EA will happily take your money and keep churning out buggy POS games. So complain elsewhere, because you're part of the problem.
Well that didn't age well. :laugh: :laugh: :laugh: :laugh:

Call of Duty Warzone is not an EA title. Get off the high horse you seem to be on right now.
 
Joined
Aug 24, 2004
Messages
173 (0.03/day)
Well, if they could come up with a way to utilize all the CPUs with otherwise-unused integrated graphics, that would be great.
 