NVIDIA SLI GeForce RTX 2080 Ti and RTX 2080 with NVLink

Deleted member 177333

Guest
You know, I ran a single GPU for a really long time. When I finally got out of college and started making some money, I was able to move to Crossfire with AMD cards, and that was a great experience. Then I started moving into the high end, first with 780 Ti SLI, then Titan X Maxwell SLI, and now 1080 Ti SLI. I personally LOVE the technology and have had excellent experiences with each setup.

I've been gaming in 4K for about 3-4 years, either on a true 4K monitor (currently a Philips 40'' 4K that I've had for a couple of years) or on a 1080p monitor using DSR to upscale. It was the move to UHD resolutions that really pushed me into the SLI arena with high-end cards, as it was the only way to get a smooth 60 fps in most games at that resolution with high image-quality settings.

For the most part, the games I play support SLI, but I'm noticing more and more that developers simply don't want to support it because of the small percentage of folks who use it, and the fault is two-fold. One part is on developers, like Ubisoft for example, who don't want to implement SLI because of the development time, yet spend tons of money on multiple forms of DRM for their games (and the games still get cracked, lol). The other is on NVidia, for the reason noted in the review: back in the 900 series days, even the 960 had SLI. NVidia keeps removing SLI support from the mid-range cards and making it a luxury item, so fewer and fewer users have it, which makes developers less and less willing to implement it. NVidia needs to fix their side, and the developers need to make more effort on behalf of their customers as well. With NVLink now, where the VRAM will stack, that would have helped push SLI adoption for mid-range users bigtime, if only NVidia had planned for it.

It's really a great technology from my many years of experience using it - I would love to see it come back on mid-range cards so more people could make use of it at affordable pricing. After all, I started XFire on a pair of 4870 cards back when I didn't have a lot of $ and had a fantastic experience, and that's what got me into using it more. I'd love to see other people be able to enjoy it as well.
 

phill

Moderator
Staff member
Joined
Jun 8, 2011
Messages
15,973 (3.39/day)
Location
Somerset, UK
System Name Not so complete or overkill - There are others!! Just no room to put! :D
Processor Ryzen Threadripper 3970X
Motherboard Asus Zenith 2 Extreme Alpha
Cooling Lots!! Dual GTX 560 rads with D5 pumps for each rad. One rad for each component
Memory Viper Steel 4 x 16GB DDR4 3600MHz not sure on the timings... Probably still at 2667!! :(
Video Card(s) Asus Strix 3090 with front and rear active full cover water blocks
Storage I'm bound to forget something here - 250GB OS, 2 x 1TB NVME, 2 x 1TB SSD, 4TB SSD, 2 x 8TB HD etc...
Display(s) 3 x Dell 27" S2721DGFA @ 7680 x 1440P @ 144Hz or 165Hz - working on it!!
Case The big Thermaltake that looks like a Case Mods
Audio Device(s) Onboard
Power Supply EVGA 1600W T2
Mouse Corsair thingy
Keyboard Razer something or other....
VR HMD No headset yet
Software Windows 11 OS... Not a fan!!
Benchmark Scores I've actually never benched it!! Too busy with WCG and FAH and not gaming! :( :( Not OC'd it!! :(
I do wish SLI was a load better than it is, but if the developers don't put support for it in their games, then we have no hope. I do sometimes ask myself why I have an expensive PC with only one GPU rather than going with a console (I know there's a big performance gap, but still). When we have all these fancy liquid-cooled systems with 3 or 4 cards on one side and another 2 or 3 cards on the other, for the cost of the build, is it really worth it??

I used to use a lot of SLI and Crossfire and had no real bad experiences with either. I liked having them both: they meant faster frame rates, and more tweaking for me, which I always like doing..

I do agree with @Razrback16 that if Nvidia takes it away from the lower-priced cards, that will limit who has it even more. We all know that if a game doesn't support it, then you're only as fast as your slowest card, but when it works, on Nvidia or AMD cards, it's an utter joy.
 
Deleted member 177333

Guest
Excellent points phill -

NVidia, especially with their pricing on Turing, is making it really hard on PC gamers. It's like they're trying to chase us away or something. They make the latest line so unaffordable that sales are undoubtedly going to drop (we just don't know to what degree, but we already know their stock has dropped about 2% since the reviews came out for Turing), then they further limit SLI to the big-dawg models of Turing, and then, just to put the cherry on top, they make the new SLI bridge around $80 for a device that probably cost $5 to make, like W1zzard said in the review.

I hope AMD is ready to take advantage of this because NVidia is really kinda shooting themselves in the foot bigtime with this release. IMO anyway.
 

phill

Moderator
Staff member
We are in a crazy world at the moment, and this Nvidia monopoly isn't good for anyone.. AMD, we need you back to show Nvidia how to make cheap gaming cards, and how you do this next series of cards will make or break you, I think... I hope you're seeing what Nvidia are doing and do something completely opposite...

Make us proud again, AMD... Please don't be stupid.... :)
 
Joined
Sep 17, 2014
Messages
20,944 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Like what? The much-vaunted DX12 multi-GPU, which never materialised, as the review notes?
Of course it never will; we have SLI. Now with NVLink, so it sounds fancier when you drop $2,500.

Don't you realize it doesn't matter what name you give it; devs have to do most of the work regardless, helped by bags of money + Nvidia engineers if need be. It's always a collaborative effort, and that alone makes it destined to fail. SLI is a niche, and the moment they killed SLI in the midrange was the moment it literally just ended.

Nobody optimizes for a niche, and this review shows what that looks like.
 

HTC

Joined
Apr 1, 2008
Messages
4,604 (0.78/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
I'm really intrigued here: how can SLIed cards perform better with all tested games vs. with only the games that scale?

Something's off, unless I'm seriously misreading the performance summary.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I'm really intrigued here: how can SLIed cards perform better with all tested games vs. with only the games that scale?

Something's off, unless I'm seriously misreading the performance summary.
Not sure how you are misreading them. Higher percentage value = higher performance

Put down your thoughts, I'd be happy to elaborate
 

HTC

Not sure how you are misreading them. Higher percentage value = higher performance

Put down your thoughts, I'd be happy to elaborate

It seems I misinterpreted the chart: my bad.

A question, if I may: does that 215% in the 4K results for the SLIed 2080 Ti mean it actually scales past 100%?
 

W1zzard

Administrator
Staff member
It seems I misinterpreted the chart: my bad.

A question, if I may: does that 215% in the 4K results for the SLIed 2080 Ti mean it actually scales past 100%?
215% relative to a single GTX 1080 Ti (which is the 100% score)
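In other words, the chart divides each setup's result by the baseline card's, so a value above 200% just means "more than twice a single 1080 Ti", not super-linear SLI scaling. A minimal sketch of the arithmetic, with made-up FPS numbers (not the review's data):

```cpp
#include <cstdio>

int main() {
    // Hypothetical averages purely for illustration -- not TPU's numbers.
    double baseline_fps = 60.0;  // single GTX 1080 Ti = the 100% mark
    double sli_fps      = 129.0; // an RTX 2080 Ti SLI result
    // Chart value = candidate / baseline, so >100% of *another* card
    // is possible even when SLI scaling itself is well under 2x.
    printf("chart value: %.0f%%\n", sli_fps / baseline_fps * 100.0); // 215%
    return 0;
}
```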
 

HTC

215% relative to a single GTX 1080 Ti (which is the 100% score)

I thought the baseline was the 2080 Ti: my bad, again.
 

phill

Moderator
Staff member
If only SLI or Crossfire/X scaled well, or the developers would make the most of why people have PCs, then we'd have such a different setup for GPUs.. That said, with Nvidia only putting SLI on their top-end cards and charging a small fortune for those, AMD have the ability and the chance to really make a difference for the rest of the people who don't wish to spend $1,600 to $2,400 or so (£1,500+) on GPUs... As said before, please AMD, make us proud.....
 

W1zzard

Administrator
Staff member
If only SLI or Crossfire/X scaled well, or the developers would make the most of why people have PCs, then we'd have such a different setup for GPUs.. That said, with Nvidia only putting SLI on their top-end cards and charging a small fortune for those, AMD have the ability and the chance to really make a difference for the rest of the people
At Computex I sat in a small meeting with 4 other journalists and AMD head honchos. David Wang made it clear that he does not believe multi-chip modules (similar to Ryzen) are viable for GPUs, specifically because of the difficulties of getting multi-GPU to work.
 

phill

Moderator
Staff member
It's a real shame, considering it used to be something that was worked on and in some cases scaled so very well. You'd have thought that, with the cost of GPUs now, this might have been given more consideration.. Otherwise, why don't we just grab a console and be done with it? :(
 

HTC

At Computex I sat in a small meeting with 4 other journalists and AMD head honchos. David Wang made it clear that he does not believe multi-chip modules (similar to Ryzen) are viable for GPUs, specifically because of the difficulties of getting multi-GPU to work.

IMO, the only way they'll manage to make it work is if, and that's a big if, however many small chips there are appear as one big chip to the OS and drivers: only then will they manage to pull it off.

If they do somehow manage it, nVidia is screwed, because their whole design revolves around a very big monolithic chip (I'm talking high-end stuff here, not mid-range and lower). Essentially, nVidia would be in the position Intel is in now, minus the 10 nm woes (again, referring to the higher-end stuff only).
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
This explains why you don't see flexible NVLink bridges printed on polyethylene substrates as those would need to be single-layer, running the entire width of the NVLink connector and resembling an ugly ribbon cable, such as IDE, hampering airflow for the Founders Edition axial-flow cooler.
Ummm, what? I seriously doubt it would obstruct airflow, and at least a ribbon is flexible; it's not like your placement of the cards needs to revolve around the inflexible NVLink adapter. I think the reality is that the adapter would be wider if it were a ribbon because of the size of the connector. I honestly think there is absolutely no good reason to make it solid other than to have a [bad] reason to charge 80 dollars for it. Also, it's certainly not worth it considering SLI scaling seems to still be garbage.

Side note, I remember Crossfire scaling better than this when I used CFX with two 6870s. Maybe this should be contrasted with how AMD's cards are scaling, or did they ditch CFX support?
 
Joined
Mar 18, 2015
Messages
2,960 (0.89/day)
Location
Long Island
I do wish SLI was a load better than it is, but if the developers don't put support for it in their games, then we have no hope. I do sometimes ask myself why I have an expensive PC with only one GPU rather than going with a console (I know there's a big performance gap, but still). When we have all these fancy liquid-cooled systems with 3 or 4 cards on one side and another 2 or 3 cards on the other, for the cost of the build, is it really worth it??

I think there's more going on here than we realize.

-When I got my 780s, it was just after the Ti surfaced and prices went in the toilet. In addition to the price drop, I got $200 worth of game coupons. So I bought one and my son bought me one (an Xmas gift that I paid for)... and when all was said and done, it was cheaper and a lot faster than a single 780 Ti.
-Later, when my son did his "I just graduated kolludge and got a job" build, we did the reverse. He bought a 970 and I bought one, along with a crapload of game coupons. On average, using TPU's test suite, the SLI option was 40% faster. Some games didn't support it or scaled poorly, which critics love to banter about, but neither of us cared, as we were getting well above 60 fps in those games. What we cared about was that 42 fps on the 980 was over 60 fps on the twin 970s. That made the game playable... having to play at 78 fps on an unsupported game versus 87 on a 980 was a "who cares".

In essence, with no competition from AMD, nVidia is basically competing with itself... and it's far more profitable to sell one x80 than two x60s or x70s. I also remember looking and saying, 'wait a minute, why does nVidia have a thermal limit on the x70 that is 5°C lower than the x80? If they vary at all, shouldn't it be the other way around?' Then again, why is it that average scaling at 1080p is a measly 18% or so, at 1440p it's up to 34%, and at 2160p it's up over 50% (some games did 95-100%)? By keeping scaling reasonable at 4K, with x80 performance at 4K under 60 fps, they had good reason to leave 4K SLI a viable option. And finally, you can no longer buy twin x70s for the price of an x80.

Twin 560 Tis were cheaper and faster than the 580
Twin 650 Tis were much cheaper than the 680 and performed about the same
Twin 780s and 970s we already covered.

It seemed to me that nVidia was intentionally nerfing the x70's performance to create a wider gap between the 70 and 80 models so as to make purchasing an 80 more attractive. This option only became possible due to the lack of competition from AMD in this price/performance niche. They had little fear of losing sales to AMD... so why compete with themselves when they could just make the x70 or x70 SLI less attractive?
 

lilkwarrior

New Member
Joined
Oct 8, 2018
Messages
2 (0.00/day)
Atrocious scaling, as expected. SLI is an ancient technology which should have kicked the bucket long ago and made way for better alternatives.
I'm disappointed. I hoped NVLink would be more driver/engine-independent than SLI was, but it appears to just be SLI on steroids.

I'll just stick to a single Vega this generation. Maybe in 2020 they'll figure out multi-GPU again.
NVLINK maximizes communication between the two cards, while DX12 or Vulkan handles the multi-GPU part as far as gaming is concerned. It's why they didn't carry Quadro's NVLINK memory pooling over to the GeForce cards; DX12 & Vulkan are supposed to be doing that.

Now that DX12 & Vulkan have ray tracing (their first killer feature besides Vulkan being cross-platform ready) and Windows 8 has reached mainstream end-of-life, Nvidia & everyone involved are hoping modern PC gamers will increasingly embrace DX12 or Vulkan, including their multi-GPU capabilities.
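To illustrate what "explicit multi-GPU" actually asks of engines: in DX12, a linked NVLink/SLI pair shows up as one device with multiple nodes, and the app has to schedule work per node itself. A minimal C++ sketch (assumptions: default adapter, no error handling; the per-node rendering and cross-node copies that make it useful are left out, and this is not the review's or any engine's actual code):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // Returns >1 when the driver exposes the linked pair as one
    // multi-node device (the SLI/NVLink case).
    UINT nodeCount = device->GetNodeCount();

    for (UINT node = 0; node < nodeCount; ++node) {
        D3D12_COMMAND_QUEUE_DESC desc = {};
        desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        desc.NodeMask = 1u << node; // bit mask selects one physical GPU
        ComPtr<ID3D12CommandQueue> queue;
        device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
        // From here the app must split frames (AFR, split-frame, etc.),
        // place resources via CreationNodeMask/VisibleNodeMask, and
        // synchronize copies between nodes itself -- none of it is free.
    }
    return 0;
}
```

Which is exactly why "devs have to do most of the work" keeps coming up in this thread: the API gives control, not scaling.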

Ummm, what? I seriously doubt it would obstruct airflow, and at least a ribbon is flexible; it's not like your placement of the cards needs to revolve around the inflexible NVLink adapter. I think the reality is that the adapter would be wider if it were a ribbon because of the size of the connector. I honestly think there is absolutely no good reason to make it solid other than to have a [bad] reason to charge 80 dollars for it. Also, it's certainly not worth it considering SLI scaling seems to still be garbage.

Side note, I remember Crossfire scaling better than this when I used CFX with two 6870s. Maybe this should be contrasted with how AMD's cards are scaling, or did they ditch CFX support?
NVLINK, which has been available in far more capable form on Quadro and higher cards for years, has never had a ribbon form, & it would not make sense given what the technology is. There would not be room to place the hardware needed for the far more efficient intercommunication between GPUs that it provides.

Don't you realize it doesn't matter what name you give it; devs have to do most of the work regardless, helped by bags of money + Nvidia engineers if need be. It's always a collaborative effort, and that alone makes it destined to fail. SLI is a niche, and the moment they killed SLI in the midrange was the moment it literally just ended.

Nobody optimizes for a niche, and this review shows what that looks like.
Nvidia & AMD expect developers to leverage DX12's & Vulkan's explicit multi-GPU modes, not their obsolete, proprietary methods.

Now that Nvidia has done their part (providing NVLINK, consumer GPUs w/ dedicated deep-learning anti-aliasing & ray-tracing cores, and influencing the standardization of ray tracing for Vulkan & DX12) and Microsoft has done theirs (WindowsML, DXR, and Windows 8 reaching end-of-life), devs should now do their part, as all major engines have versions that support ray tracing & explicit multi-GPU mode out of the box.
 
Joined
Aug 20, 2007
Messages
20,787 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
Devs should now do their part, as all major engines have versions that support ray tracing & explicit multi-GPU mode out of the box.

Forgive my skepticism... but as a part-time developer myself, good luck there...
 

Spyle

New Member
Joined
Oct 22, 2018
Messages
2 (0.00/day)
You acknowledge that "2080 Ti is able to saturate PCI-Express gen 3.0 x8", then proceed to test SLI in an x8/x8 configuration?
 

Aquinus

Resident Wat-man
You acknowledge that "2080 Ti is able to saturate PCI-Express gen 3.0 x8", then proceed to test SLI in an x8/x8 configuration?
The 8600K doesn't have enough PCIe lanes for two cards running at x16, as the CPU only has 16 lanes available, so two x8 slots are literally all you can use on an Intel MSDT platform. The reality is that most people have a setup like this. Even a guy I know with a 2080 Ti has this kind of setup. I suspect that W1zz isn't going to change his entire testing setup to something unrealistic for most consumers, as most consumers (even those buying a 2080 Ti) aren't likely to have a HEDT build.

So, while you're right that each 2080 Ti could use 16 lanes, the reality is that most builds don't have that many PCIe lanes, and reviews really should capture realistic real-world usage. Realistic doesn't mean a $5,000 USD build, even more so when HEDT CPUs don't really do these GPUs justice with their lower clocks and higher core counts.
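For a sense of scale, here's a back-of-envelope sketch using the published PCIe 3.0 figures (8 GT/s per lane, 128b/130b encoding); the NVLink rate in the comment is Nvidia's quoted Turing figure, included as a point of comparison rather than anything measured here:

```cpp
#include <cstdio>

int main() {
    // PCIe 3.0 spec: 8 GT/s per lane, 128b/130b encoding.
    double per_lane = 8.0 * 128.0 / 130.0 / 8.0; // ~0.985 GB/s per lane
    printf("PCIe 3.0 x16: %.1f GB/s per direction\n", 16 * per_lane); // ~15.8
    printf("PCIe 3.0 x8:  %.1f GB/s per direction\n",  8 * per_lane); // ~7.9
    // For comparison, Nvidia quotes the 2080 Ti's NVLink at roughly
    // 50 GB/s per direction; inter-GPU SLI traffic rides the bridge,
    // while everything else still shares the PCIe slot.
    return 0;
}
```

So dropping to x8 halves slot bandwidth to roughly 7.9 GB/s each way, which is the crux of the x8/x8 argument later in the thread.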

With that said, PCIe bandwidth doesn't even begin to explain why performance with multiple GPUs is so terrible with the 2080 Ti.
 

Spyle

New Member
With that said, PCIe bandwidth doesn't even begin to explain why performance with multiple GPUs is so terrible with the 2080 Ti.

But it may not be that terrible; it could be the PCIe bandwidth holding it back. I recently went from an X99 platform to Z390. While my new CPU is far better and I gained significant performance in BF1, other games that were GPU-bottlenecked took around a 10% hit because I was now running x8/x8. And this is on Titan X Pascal, so I imagine with faster cards the lack of PCIe lanes becomes even more apparent.
 

claydough

New Member
Joined
Dec 24, 2018
Messages
1 (0.00/day)
The article acknowledges use cases like 3D Vision in multi-monitor Surround.
(wider is better, WSGF)

As a fan of both, I look to these SLI benchmarks to incorporate as much as possible into their findings (ya know, for us users who might benefit via stereo renders across 3 QHD monitors!), where NVLink's bandwidth sounds potentially revolutionary.

In which case...
Why do these reviews never acknowledge that there are actually TWO FLAVORS OF SLI?
3D "or"
Accelerated Surround!

For which case...
I can't think of a single title I play that doesn't benefit!
Where the scaling, bad or not, often makes the difference in making 3D Vision in Surround "possible".

Where scaling isn't nearly as important as "finally playable" for a fan of 3D Vision in NV Surround.
For which there are die-hard communities relying on homebrewed solutions to support games where devs leave off.
Like Hayden at WSGF, or Helix for 3D Vision support!

We ain't going anywhere!

SO?
How about a lil representation (love) in your reviews?
Include Surround SLI in these benchmark reviews where SLI is covered?
Then u could even tout a comprehensive review!
 

cars10

New Member
Joined
Jan 12, 2019
Messages
8 (0.00/day)
An SLI benchmark bottlenecked at x8/x8. You guys should be ashamed.

as most consumers (even those buying a 2080 Ti) aren't likely to have a HEDT build.

PRECISELY because of BOGUS reviews like this one from TechPowerUp. If people knew more about the x8/x8 PCIe lane bottleneck, they would likely use a HEDT platform, and people who have money for SLI and the 2080 Ti are already swimming in cash, obviously. It is only out of ignorance, and the knowledge that clock speed > core count, that they go for the mainstream platform.

The LEAST they could do is write a big fat red amendment to the review: "NOTE: This setup is bottlenecked!". Probably best to change the charts too, so they say ("only games that scale but are bottlenecked").

This sheds a bad light on SLI and makes me absolutely livid. If the review doesn't get amended, I am sharing this to every tech-related subreddit I can think of.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,801 (3.87/day)
Location
Worcestershire, UK
Processor Rocket Lake Core i5 11600K @ 5 Ghz with PL tweaks
Motherboard MSI MAG Z490 TOMAHAWK
Cooling Thermalright Peerless Assassin 120SE + 4 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel 4133Mhz DDR4 @ 3600Mhz CL14@1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Mouse Coolermaster Storm Octane wired
Keyboard Element Gaming Carbon Mk2 Tournament Mech
Software Win 10 Home x64
An SLI benchmark bottlenecked at x8/x8. You guys should be ashamed.
That's the NVLink protocol, nothing to do with the PCIe bus.
 

cars10

New Member
That's the NVLink protocol, nothing to do with the PCIe bus.

What are you talking about? Scaling has everything to do with the PCIe bus. The big question was whether NVLink would alleviate it when running x8/x8, and there is evidence that it does not (for example, check out the GamersNexus reviews).
 