
NVIDIA SLI GeForce RTX 2080 Ti and RTX 2080 with NVLink

W1zzard

In our Turing NVLink review, we test RTX 2080 Ti SLI against RTX 2080 SLI in 23 games and also include Pascal's GTX 1080 Ti SLI numbers. Our results confirm that NVLink does give higher performance than previous generations' SLI, even though game support could be better.

 
I'm already laughing at the guy whose name I forgot, with the Trump icon, who said "I already ordered two of them. You are just too poor to admit that this generation rocks".

Sure, dude. Most console ports run worse with SLI than with a single card, and sometimes worse than a 1080 Ti alone.
These cards are a waste. The 2080 Ti is only there for 4K, but that's all.
 
Atrocious scaling, as expected. SLI is an ancient technology which should have kicked the bucket long ago and made way for better alternatives.
 
Like what? The much-vaunted DX12 multi-GPU, which never materialised as the review notes?

Of course it never will; we have SLI. Now with NVLink, so it sounds fancier when you drop $2,500.
 
Of course it never will; we have SLI. Now with NVLink, so it sounds fancier when you drop $2,500.
Witcher 3 seems to scale wonderfully... I blame the developers, because certain games scale fine.
 
Can we get an SLI performance graph as a function of CPU frequency? Isn't it already CPU-starved?
 
Don't know why, but that review avatar made me think this was a sewing machine review (oh, a new Pfaff)... But yeah, I had to chuckle a bit at 1080p with RTX cards in SLI. You could just leave that out altogether (a lot of your good work gone in vain); no one in their right mind should buy one, let alone two, of these cards and run them at such a puny resolution. Maybe 8K, or some sort of higher-than-4K surround, would be a better showcase, but 4K and 1440p would have been enough.

Edit: for SLI, a frametime analysis, or at least minimum-FPS 1%/0.1% percentiles, would have been good to see. And of course RX Vega 64 CF numbers.
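For anyone curious how those lows are computed: you sort the per-frame times and average the worst slice, then convert back to FPS. A minimal C++ sketch (all the names are mine, and the sample frametimes are made up):

#include <algorithm>
#include <functional>
#include <iostream>
#include <numeric>
#include <vector>

// Average the slowest `fraction` of frames (0.01 for the "1% low")
// and convert the mean frametime back into an FPS figure.
double percentile_low_fps(std::vector<double> frametimes_ms, double fraction) {
    // Sort descending so the slowest frames come first.
    std::sort(frametimes_ms.begin(), frametimes_ms.end(), std::greater<double>());
    size_t count = std::max<size_t>(1, static_cast<size_t>(frametimes_ms.size() * fraction));
    double worst_avg_ms = std::accumulate(frametimes_ms.begin(),
                                          frametimes_ms.begin() + count, 0.0) / count;
    return 1000.0 / worst_avg_ms;
}

int main() {
    // Hypothetical capture: mostly ~16 ms frames with two stutter spikes.
    std::vector<double> frametimes = {16.6, 16.8, 16.5, 40.2, 16.7, 55.1, 16.6, 16.9};
    std::cout << "1% low: " << percentile_low_fps(frametimes, 0.01) << " FPS\n";
}

The average FPS of a capture like that looks fine, but the 1% low exposes the spikes, which is exactly why the metric matters for AFR stutter.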
 
Blame the death of AFR SLI on shading methods that rely on data shared between frames. A common culprit is temporal anti-aliasing, which does wonders for removing pixel crawl in motion and is becoming more popular, but it relies on the previous frame to be effective. That throws a mega wrench in the gears of AFR SLI.
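To make the frame-to-frame dependency concrete, here's a rough C++ sketch of the data flow (the types and names are illustrative placeholders, not any real engine's API):

#include <cstdio>

// Illustrative stand-in for a GPU-resident image, tagged with the GPU
// that produced it.
struct Texture { int gpu; };

// A TAA resolve blends this frame's color with the accumulated history
// from frame N-1 (reprojected via motion vectors, omitted here). The
// only point that matters: frame N consumes frame N-1's output.
Texture taa_resolve(const Texture& current_color, const Texture& history) {
    return Texture{current_color.gpu};
}

int main() {
    Texture history{0};
    for (int frame = 1; frame <= 4; ++frame) {
        int gpu = frame % 2;  // AFR: even/odd frames render on different GPUs
        Texture color{gpu};
        if (history.gpu != gpu)
            std::printf("frame %d: history must be copied from GPU %d to GPU %d\n",
                        frame, history.gpu, gpu);
        history = taa_resolve(color, history);
    }
}

Under AFR the history buffer always lives on the other card, so the driver either copies it across the bridge every single frame (eating the scaling) or the game has to break TAA.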
 
@sweet I was about to write something similar. ;)

RE: SLI on the Ti, it looks like "The more you buy, the lower your frames".
 
Where are the Vega CF numbers? They would make NVLink look less "astounding", wouldn't they?
I was thinking the same. These numbers by themselves don't really prove anything.
 
Hmm, an interesting read, as I have been reading a lot about this new NVLink on the pro cards and RTX cards and wanted to see some solid numbers. I did not know, however (guess I just missed it previously), that the RTX 2070 lacks support... So at this point only the RTX 2080 and Ti variant support it, which is a real letdown. The 2070 seemed like the logical spot for people wanting budget high performance versus those on the high end wanting only the best, since I figured two 2070s would eclipse the 2080 and get close to the 2080 Ti for a little less money (or for those in the future wanting more performance without replacing their current card).

Either way, I may still get a 2080 Ti, but I will not be investing in SLI/NVLink (meaning more than one card) anymore unless something I play constantly really demands it.
 
W1zzard, could you include 5k numbers too? (perhaps instead of 1080p)

Spending $2.5k on GPUs deserves a nice monitor too, and I suspect scaling might be good there.
 
The number of results where 2080 SLI performs worse than a single 2080, or even 1080Ti in some cases, is staggering... absolutely worthless unless you only play the handful of games that actually support it properly.
 
I'm already laughing at the guy whose name I forgot, with the Trump icon, who said "I already ordered two of them. You are just too poor to admit that this generation rocks".

Sure, dude. Most console ports run worse with SLI than with a single card, and sometimes worse than a 1080 Ti alone.
These cards are a waste. The 2080 Ti is only there for 4K, but that's all.

Agreed. I saw his post and didn't notice the Trump icon, or forgot whether it was positive or negative about Trump. Anyway, I didn't remember it, because whether I like his icon has nothing to do with me totally disagreeing with him. I believe in that same thread I mostly crapped on the cards, and one guy (maybe him, or another?) said he was excited to get a couple, so I was nice and replied that I was sure he'd like them, and congrats, but that I had doubts and was waiting on reviews... My inner being is very happy I got a cheap 1080 Ti a couple of months back. I got to use it a couple of months longer than the 2080 has been out, and I spent less than half of what the 2080 Ti is going for.
 
Dual 2080 Ti in NVLink SLI is only worth it at 4K or higher, and only in certain games. Newer games from Fall 2018 into next year MAY scale better than current or older titles, because devs are either on a time crunch or just don't bother coding their games to support multi-GPU scaling, regardless of how potent the game engine is. Still, spending ~$2,500 for two of those cards plus the NVLink bridge just to get the "best gaming experience" is a little overkill, IMO. I would just settle for a single non-FE 2080 Ti and a decent 4K monitor, and tweak the settings to get that sweet, buttery-smooth 60 FPS in all games. $1,200 for the FE variant doesn't really make a difference unless one has really, really deep pockets and nothing to lose...
 
Like what? The much-vaunted DX12 multi-GPU, which never materialised as the review notes?
Wasn't Deus Ex: Mankind Divided supposed to have received DX12 mGPU support with a patch?
Was DX:MD tested in DX11 or DX12?
 
The new NVLink bridge is an assault on your wallet and an affront to decency.

Typical Nvidia!
That's why I swore never to buy an Nvidia graphics card ever again. Damn, I even had second thoughts about getting a Nintendo Switch, since it has an Nvidia chipset, but the fact that Diablo 3 will launch on the Switch convinced me in the end.

However, this is going to be the only exception I am ever going to make.
Nvidia needs to die already. It has become way too greedy.
 
Wasn't Deus Ex: Mankind Divided supposed to have received DX12 mGPU support with a patch?
Was DX:MD tested in DX11 or DX12?
DX12, of course, and we used the latest patch.
 
Nice review, but what about the frame times? Does NVLink improve frame pacing, or is mGPU still the same stuttery mess as before?
 
I would like to see titles like Crysis, Crysis 2, Crysis 3, The Witcher 3, and GTA 5 at 8K with two 2080 Tis. I am considering getting two Tis and an 8K monitor from Dell, so please, I need help to decide. PLEASE make some tests.
 
I'm disappointed. I hoped NVLink would be more driver/engine-independent than SLI was, but it appears to just be SLI on steroids.

I'll just stick to a single Vega this generation. Maybe in 2020 they'll figure out multi-GPU again.
 
Did you actually check whether both cards are being used (e.g., in the Win10 Task Manager or GPU-Z) when SLI is enabled?
I noticed some irregularities with my dual 980 Tis in SLI in some games when using DX12 and the latest Nvidia driver, which resulted in only one card being used for rendering (with lower clocks due to SLI). With DX11 in the same game, both cards are used for rendering. Maybe this explains the drop in performance in some games when SLI is used.
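If anyone wants to verify this outside Task Manager, NVIDIA's NVML library (the same interface nvidia-smi uses) can poll per-GPU load. A minimal sketch, assuming the nvml.h header from the CUDA toolkit and linking against the NVML library:

#include <cstdio>
#include <nvml.h>

int main() {
    // Initialize NVML and walk every NVIDIA GPU in the system.
    if (nvmlInit() != NVML_SUCCESS) return 1;

    unsigned int count = 0;
    nvmlDeviceGetCount(&count);

    for (unsigned int i = 0; i < count; ++i) {
        nvmlDevice_t dev;
        if (nvmlDeviceGetHandleByIndex(i, &dev) != NVML_SUCCESS) continue;

        char name[NVML_DEVICE_NAME_BUFFER_SIZE];
        nvmlDeviceGetName(dev, name, NVML_DEVICE_NAME_BUFFER_SIZE);

        // util.gpu = percent of time the GPU was busy in the last sample window.
        nvmlUtilization_t util;
        if (nvmlDeviceGetUtilizationRates(dev, &util) == NVML_SUCCESS)
            std::printf("GPU %u (%s): core %u%%, memory %u%%\n",
                        i, name, util.gpu, util.memory);
    }

    nvmlShutdown();
    return 0;
}

If the second card sits near 0% core load while SLI is enabled, you're seeing exactly the one-card rendering behavior described above.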
 
It's not that SLI technology has slipped; it's that graphics card performance grows by leaps and bounds every generation while everything else has stagnated. With CPUs, generation-to-generation increases are in the single digits, while GPU generations bring 30-50% increases and no one blinks. We are not seeing the scaling we used to see with SLI because a) performance is bottlenecked by other componentry, and b) with no competition from AMD, Nvidia is competing with itself, sacrificing profits when two lesser cards are used in SLI instead of the pricey flagship models.
 