Wednesday, July 11th 2018

An Anthem for SLI: Bioware's New Universe in 60 FPS 4K Run on Two NVIDIA GeForce GTX 1080Ti GPUs

Lo and behold: SLI working properly. That was my first reaction whilst reading up on this potential news piece (which, somewhat breaking the fourth wall, actually did end up as one). I'm likely not alone in that thought; it's been a while since we last heard of any relevant dual graphics card configuration delivering a performance improvement, as developers seem to be throwing dreams of any "Explicit Multi-GPU" tech out the window. This slight deviation from the news story aside, though: Anthem needed two of the world's fastest GPUs running in tandem to deliver a 4K, 60 FPS experience.

Naturally, this doesn't mean that much by now: performance will improve, optimizations will happen - perhaps even a watering-down of the graphics will happen (to be fair, we have seen that before, so the precedent is there). We know that. Still, it does speak volumes that that kind of graphics power was needed. Then again, SLI'd GTX 1080Ti graphics cards for 4K at 60 FPS really isn't that extravagant: remember that the Cyberpunk 2077 demo from E3 ran at 1080p on a single such graphics card. Anthem used double the graphics power to push through a fourfold resolution increase - not too shabby. Anthem is just seven months away from release (February 22nd), though, while all bets are still off for Cyberpunk 2077. Still, both games look glorious, and Bioware's Anthem really does showcase the Frostbite engine as never seen before. Digital Foundry even seems to think that the showcased demo wasn't running with the full suite of effects they observed on their playthrough at E3 - screen-space reflections were absent, for one. It seems the PC version of the game could look even better than it does right now. Here's to that.

Sources: TechRadar, Digital Foundry

81 Comments on An Anthem for SLI: Bioware's New Universe in 60 FPS 4K Run on Two NVIDIA GeForce GTX 1080Ti GPUs

#51
cadaveca
My name is Dave
JismPUBG did excellent with 2 RX 580's. Noticeable difference when going from 1 to 2 cards.
SLI doesn't work with this app. Not sure why Crossfire is relevant... the SLI profile turns SLI OFF. It's funny, because I have dual 1080's and one is simply collecting dust. I re-install it every once in a while to see if it works with the games I play, and it does with a couple, but not the ones where I really want it to. I mean, I am gaming @ 2560x1600 at 60 Hz, which is still too much res for many games and a single 1080. That's what makes SLI so frustrating for me... I have the second card, but it doesn't do much of anything when I need it to.
ImsochoboPUBG is one of the few exceptions where I actually think nothing bad can happen with multi-GPU, because the game itself has microstutters and whatnot.
I've used SLI and CF and the microstutter, oh my god, it's terrible.

Last SLI was with Maxwell and last CF was 6970's.

Yay, I have 200 FPS, still feels worse than 80 FPS...

Neither vendor seems to be able to do anything about it.
Microstutter these days is not as bad as it used to be, and isn't so jarring, but is definitely still a problem at times. I will never forget having dual 2900XT, having one die, and when I RMA'd the card, I found one GPU just FELT better than two. I started looking into it and found what is now called microstutter to be a problem, long before anyone else was complaining about it... seems I'm very sensitive to it. That said, since I've been aware of it since then, I've paid close attention to how this all works out and there definitely has been progress but no, it is not gone completely.
Posted on Reply
#52
Jism
AMD has a few profiles which you can set, and some options on the way CF behaves. There are various approaches to splitting frame rendering between 2 GPUs; some have advantages, some fewer. But for the majority of games there are plenty of profiles to select from. For just 60 Hz at 2560x1080 even a single RX 580 will do the job, but when going 120~144 Hz, that's where SLI/CF really comes into play. The two-card route is for when you want the best and the maximum the hardware can do.

If I'm not mistaken, weren't the AMD CrossFire chipsets back in the day better compared to their counterparts from Nvidia?
Posted on Reply
#53
RealNeil
XaledI have some bad news for you guys.
You have spoken, and we are all fooling ourselves. It's settled.
Posted on Reply
#54
Unregistered
Good to see - I've been running multi GPU solutions for a number of years. I ran Crossfire cards for a while, then when I switched over to NVidia, I ran 780 Ti SLI, Titan X Maxwell SLI, and now 1080 Ti SLI, and have overall had a great experience. I simply can't game at the image quality & framerates I expect without multiple video cards; GPUs simply aren't powerful enough with just one. It's good to see companies making an effort to help us gamers out with SLI support.

After Arkham Knight's disaster, I stopped buying graphically demanding games if they don't support SLI. Gotta vote with your wallet. :)

Thanks for the article.
Posted on Reply
#55
RealNeil
Razrback16I stopped buying graphically demanding games if they don't support SLI. Gotta vote with your wallet. :)
Me too. Most of mine do use SLI. (although my shooters are not that demanding)
Posted on Reply
#56
Upgrayedd
XaledI have some bad news for you guys. Multi-GPU is useless because you can't get a consistent FPS rate even from a single GPU. You either get inconsistent frame rates from both cards, which leads to stuttering, or a so-called solution: cap the frame rate to the lower frame rate, which changes constantly and in its own way leads to lag. In both scenarios gameplay loses the fluidity and smoothness of a single GPU. So good luck fooling yourselves forever.
Do you have SLI? 'Cause there's some people here saying it's fine, over multiple generations.
Seems to be a lot of inexperienced, word-of-mouth SLI haters, but I'm an idiot for wanting to try.
XaledOr maybe low-profile cards expose the lie of multi-GPU and the bridge more clearly, because they are more prone to FPS drops that become much worse when using a multi-GPU solution whose *supposed* main purpose is increasing FPS, therefore increasing gaming fluidity.
Mid-range SLI was eating into Nvidia's high-end sales, so they stopped that. It really is that simple.
Posted on Reply
#58
Unregistered
UpgrayeddMid-range SLI was eating into Nvidia's high-end sales, so they stopped that. It really is that simple.
Hah, yeah, when I did Crossfire many years back, that's what I did - I didn't make a lot of money back then, so I had to go budget as much as possible. I'd usually run a couple of $150-200 cards together and net some really good performance with them. By the time I switched to NVidia, they were starting to remove SLI on the middle-tier cards.
Posted on Reply
#59
Upgrayedd
Xx Tek Tip xX4k 45ish average on far cry 5 benchmark with a 1080 ti ftw3 and 6600k 4.5 ultra - perfectly playable and sli isn't worth it.

babeltechreviews.com/gtx-1080-ti-sli-performance-25-games/3/

44min
51avg
60max
V sync enabled.
1080 ti sli is useless until 4k144hz drops - it won't drive it though.
They have a newer SLI review with the 1070 Ti. They say the HB bridge improves stuttering.
I have G-Sync and I absolutely hate going below 50 FPS still. There's no tearing, but it's not nearly as smooth as 80+ FPS.
Posted on Reply
#60
razaron
dinmasterThought with DX12, it would put any GPUs together as a pool, and the memory as well. Something along the lines of any GPUs working together to render games, no need for SLI or Crossfire... Was I wrong? Can't remember.
How Vulkan and DX12 work in general:
  • Start the program and grab a list of all available physical devices (IGP, discrete GPU, etc.)
  • Query the physical devices to find one that supports the features you need, then create a logical device out of it.
  • Attach a command pool (the thing you've heard of in articles that makes DX12 multithreaded) to the logical device.
  • Record work into command buffers allocated from the command pool, then submit those buffers to the device's queue to get the physical device to do work. Rinse and repeat.
The way multi-GPU works is that you just create more than one logical device, each with its own command pool. Then you can just spread the work between the different command pools.
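
To make that concrete, here's a minimal, hypothetical Vulkan sketch of the per-GPU setup described above (the struct and function names are mine, not from any real engine). It assumes a valid VkInstance already exists and that queue family 0 supports graphics on every device; a real renderer would query queue families and check every VkResult.

```cpp
// Minimal sketch (not a full renderer): one logical device + command pool per GPU.
// Assumption: queue family 0 supports graphics on every device; error handling omitted.
#include <vulkan/vulkan.h>
#include <vector>

struct GpuContext {
    VkPhysicalDevice physical;
    VkDevice         logical;
    VkQueue          queue;
    VkCommandPool    cmdPool;
};

std::vector<GpuContext> createPerGpuContexts(VkInstance instance) {
    // Grab the list of all physical devices (IGP, discrete GPU, etc.).
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> physicals(count);
    vkEnumeratePhysicalDevices(instance, &count, physicals.data());

    std::vector<GpuContext> gpus;
    for (VkPhysicalDevice phys : physicals) {
        // Create a logical device with a single graphics queue.
        float priority = 1.0f;
        VkDeviceQueueCreateInfo queueInfo{VK_STRUCTURE_TYPE_DEVICE_QUEUE_CREATE_INFO};
        queueInfo.queueFamilyIndex = 0;      // assumption: family 0 does graphics
        queueInfo.queueCount       = 1;
        queueInfo.pQueuePriorities = &priority;

        VkDeviceCreateInfo devInfo{VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO};
        devInfo.queueCreateInfoCount = 1;
        devInfo.pQueueCreateInfos    = &queueInfo;

        GpuContext ctx{};
        ctx.physical = phys;
        vkCreateDevice(phys, &devInfo, nullptr, &ctx.logical);
        vkGetDeviceQueue(ctx.logical, 0, 0, &ctx.queue);

        // Attach a command pool to the logical device; command buffers recorded
        // from this pool get submitted to ctx.queue to make this GPU do work.
        VkCommandPoolCreateInfo poolInfo{VK_STRUCTURE_TYPE_COMMAND_POOL_CREATE_INFO};
        poolInfo.queueFamilyIndex = 0;
        vkCreateCommandPool(ctx.logical, &poolInfo, nullptr, &ctx.cmdPool);

        gpus.push_back(ctx);                 // work can now be split across gpus[i]
    }
    return gpus;
}
```

From there, "spreading the work" just means recording command buffers from each context's pool and submitting them to that context's queue.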

The reason devs don't really use it is that this is all new and they're still trying to get to grips with it.
Different GPUs have different features available and perform differently, so spreading work evenly between them usually ends up slower. So you have to load-balance the work you send to each GPU, which takes a bunch of testing to get just right.
Then there's the problem of cross-GPU communication. Sometimes work depends on the results of previous work (e.g. postprocessing) which could be on a different GPU, requiring communication between GPUs, which adds latency. That latency could end up making things slower than doing everything on a single GPU, because GPUs are extremely fast and any time spent outside them (e.g. copying 3D model data from RAM to VRAM) is wasted time. Ideally the program would require zero cross-GPU communication, but that would take dev resources to redesign their renderer code.

Over time, as devs learn more and tools get better, DX12/Vulkan (which are very similar) will end up replacing DX11/OpenGL* and we'll live in a magical world of cross-vendor multi-GPU.
Of course the closet Mormons say that will never happen, but they said the same about DX9, DX10, DX11, dual cores, quad cores and now 6+ cores. So they can safely be ignored.

*Khronos said Vulkan isn't supposed to replace OpenGL, but that's what will happen. Graphics programming is commonly referred to as black magic or a dark art by other programmers because, while it may be easy to do a basic "hello world", it's extremely time consuming getting good at it. The reason for that is that pre-DX12, the API was a black box (a bloated finite state machine that does mysterious things) which was interpreted by the driver, another black box (also bloated, because it's interpreting a bloated thing), which in turn relays commands to the GPU, another (you guessed it) black box.
The way you learn a black box is by throwing it a bunch of inputs and seeing what it outputs. This is time consuming enough for a single black box, but when you have 3 layers of black boxes, each interchangeable, you can imagine how this becomes a dark art.
With DX12/Vulkan, instead of a bloated FSM they give you a model of a GPU (which accurately maps to how modern GPUs work) and give you functions to directly interact with different parts of that model, removing its black-box nature.
Since the model maps almost one-to-one to the physical GPU, drivers can be much leaner (which is why AMD is so good now; they aren't hampered by their bad drivers anymore). The driver ends up being just a glorified conversion layer which you can ignore because it's not doing anything special, removing its black-box nature.
And finally, since the model maps so closely to the physical GPU and the driver is ignorable, the GPU also stops being a black box.
Writing a basic "hello world" is harder in DX12/Vulkan because you can essentially interact with any part of the GPU in any order. However, once you've written that "hello world" program you will have learnt most of the API. I've messed around with Vulkan and I'd compare it to C. It doesn't hold your hand, but it does make everything very transparent, so you can easily tell what your code is doing at the hardware level. It even feels the same with all the memory management.
Posted on Reply
#61
TheinsanegamerN
UpgrayeddPlz elaborate. Games are still releasing with SLI support. coming out with SLI support a year later
FTFY. SLI support at launch is practically non-existent, more than 50% scaling within the first year is rare, a lot of games simply never get SLI support, and SLI is still completely MIA on DX12 games.

SLI is a far cry from the golden days of 2010, when every game was at least somewhat supported, and proper SLI support was the norm by the 3 month mark.
Posted on Reply
#62
Upgrayedd
TheinsanegamerNFTFY. SLI support at launch is practically non-existent, more than 50% scaling within the first year is rare, a lot of games simply never get SLI support, and SLI is still completely MIA on DX12 games.

SLI is a far cry from the golden days of 2010, when every game was at least somewhat supported, and proper SLI support was the norm by the 3 month mark.
Lol "FTFY" ... did you even look at the 3 games I mentioned?
Posted on Reply
#63
Xaled
UpgrayeddDo you have SLI? 'Cause there's some people here saying it's fine, over multiple generations.
Seems to be a lot of inexperienced, word-of-mouth SLI haters, but I'm an idiot for wanting to try.
Yes, I tried both, and tried to use it with 120 Hz monitors, but the result was in the best cases an easily noticeable lack of smoothness compared to a single GPU. I used to play many games at a constant 125 FPS and 250 FPS (old games at low settings) on 120 Hz monitors and got used to that smoothness, and I can notice and get bothered by anything less smooth, which was really obvious with GTX 680 SLI. And the guys here who claim their SLI setups are working: it looks like they're working because they are using high-end cards, and the stuttering and lag issues are less noticeable because they are getting higher framerates, BUT they are still there. There is either stuttering (when getting low framerates, or with low to mid-range cards) or lag (with high framerates, when using high-end cards). The lag is noticeable and obvious, especially when playing games online. So maybe they aren't noticing it because they are casual gamers and not used to playing at high framerates.
Posted on Reply
#64
B-Real
RavenmasterI've been running SLI setups since the GTX 280 days. I've tried dual, triple, quad SLI. Dual SLI seems to be the sweet spot. With dual SLI you'll see around 60-80% performance boost as opposed to just using 1 card in many games. With triple SLI, the 3rd card usually only gives an extra 15-20% performance boost on top of dual SLI. Then a 4th card usually only gives you an extra 4% performance on top of triple SLI. So 3-way and 4-way SLI are absolutely not worth it but 2-way SLI gives you quite a huge boost in FPS.
www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/20.html

70% when only normally scaling games are included. This means you lose 30% of the money you invested in a second card. And if you take all games into consideration (it's not certain that your beloved upcoming game will get "better" two-card support), it drops to nearly 50%, meaning HALF of your money is thrown into the dustbin when buying a second card. So to the one who says it's dead: it's really dead. Awful scaling, more heat, more noise. The only acceptable scaling would be 90% for scaling games, and 70-75% including the non- (or less-)scaling games.
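
To put that math in concrete terms, here's a rough back-of-the-envelope sketch (the card price is an assumed placeholder, not a figure from this thread):

```cpp
// Back-of-envelope sketch of the scaling-vs-cost argument above.
// The 70% / 50% scaling figures come from the thread; the card price is illustrative.
#include <cstdio>

int main() {
    const double cardCost = 700.0;            // assumed price of one card
    const double scalings[] = {0.70, 0.50};   // scaling-only average, all-games average

    for (double s : scalings) {
        // Two cards cost 2x as much but deliver only (1 + s)x the frames of one card.
        double wastedOnSecondCard = (1.0 - s) * cardCost;  // share of card #2 that adds no frames
        double perfPerDollarRatio = (1.0 + s) / 2.0;       // system perf-per-dollar vs. one card
        printf("%.0f%% scaling: ~$%.0f of the second card buys no extra frames; "
               "perf-per-dollar drops to %.0f%% of a single card\n",
               s * 100.0, wastedOnSecondCard, perfPerDollarRatio * 100.0);
    }
    return 0;
}
```
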
hyp36rmaxWhat games do you play? What's your monitor setup look like?
Then wait for a better GPU. SLI and CF are both awful.
Posted on Reply
#65
trog100
Razrback16Good to see - I've been running multi GPU solutions for a number of years. I ran Crossfire cards for a while, then when I switched over to NVidia, I ran 780 Ti SLI, Titan X Maxwell SLI, and now 1080 Ti SLI, and have overall had a great experience. I simply can't game at the image quality & framerates I expect without multiple video cards; GPUs simply aren't powerful enough with just one. It's good to see companies making an effort to help us gamers out with SLI support.

After Arkham Knight's disaster, I stopped buying graphically demanding games if they don't support SLI. Gotta vote with your wallet. :)

Thanks for the article.
that about sums it up.. vote with your wallet.. if the game doesnt support sli dont buy it.. i am sure the sli support situation would improve dramatically if more folks did this.. :)

as for it not being cost effective.. since when has top end performance ever been cost effective.. if you want the best performance available it costs..

trog
Posted on Reply
#66
newtekie1
Semi-Retired Folder
Vote with your wallet only works if a significant number of the people buying the product stop buying it, but the fact is SLI, and even Crossfire, users make up a very small percentage of the people buying these games. So the game companies don't even notice the lost sales.

This is the first generation that I have not run an SLI setup since I first started running SLI with 7800GTs over a decade ago, and I've run SLI with every GPU generation since then. But the support for it in games, and a large part of this is due to nVidia dropping SLI support on lower-end cards, just isn't there anymore. So I opted to wait until I could afford the one big GPU and buy it, instead of buying a lower-tier GPU and then buying a second a few months later.
Posted on Reply
#67
hyp36rmax
XaledYes, I tried both, and tried to use it with 120 Hz monitors, but the result was in the best cases an easily noticeable lack of smoothness compared to a single GPU. I used to play many games at a constant 125 FPS and 250 FPS (old games at low settings) on 120 Hz monitors and got used to that smoothness, and I can notice and get bothered by anything less smooth, which was really obvious with GTX 680 SLI. And the guys here who claim their SLI setups are working: it looks like they're working because they are using high-end cards, and the stuttering and lag issues are less noticeable because they are getting higher framerates, BUT they are still there. There is either stuttering (when getting low framerates, or with low to mid-range cards) or lag (with high framerates, when using high-end cards). The lag is noticeable and obvious, especially when playing games online. So maybe they aren't noticing it because they are casual gamers and not used to playing at high framerates.
Curious what resolution and monitors you're using at 120hz. When was the last time you used an SLI Setup? (GTX 680?) We've had major advances in minimizing latency and stuttering since then.

SLI and Crossfire user here with triple 1440P @144hz (7680x1440P @144hz) monitors and 4K @60hz monitors. Although SLI and Crossfire were more prevalent generations ago, especially with low and mid-tier GPU's, a setup like this is really only recommended for users at the bleeding edge pushing hardware and graphics to the limit today. SLI, Crossfire, multi-GPU isn't for everyone. If anything, there are major diminishing returns for users playing below 1440P @144hz with current titles; you just won't max out the GPU's enough to warrant that second video card. Hence the complaints from people with negative multi-GPU experiences. (Game selection is another factor.)

I personally recommend going the SLI, Crossfire, multi-GPU route only if you're playing at 4K+ resolutions, high refresh, and multi-monitor setups with two of the highest-tier GPU's. This guarantees you the best performance with all the games on the market regardless of support. I would never recommend users go with mid or low-tier SLI, Crossfire, multi-GPU setups, since you should be purchasing the single best GPU you can afford if you're playing below 1440P @144hz (including 1080P @240hz).

My point is to focus on building a balanced system, matching your video card to the monitor setup for the best results. Does it hurt to pick up a GTX 1080Ti for a 1080P monitor? Not at all, but don't expect increased performance at that resolution and refresh rate by adding another one.
B-RealThen wait for a better GPU. SLI and CF are both awful.
Even if you have the highest-tier GPU available? Yeah, OK... Try using a single GPU to run a setup with triple 1440P @144hz (7680x1440P @144hz)... (That's a long wait.) Really curious what monitor, resolution, refresh rate, and game selection you're playing at to say it's awful... Would love to know your current experience. I'll wait...
Posted on Reply
#68
Upgrayedd
XaledYes, I tried both, and tried to use it with 120 Hz monitors, but the result was in the best cases an easily noticeable lack of smoothness compared to a single GPU. I used to play many games at a constant 125 FPS and 250 FPS (old games at low settings) on 120 Hz monitors and got used to that smoothness, and I can notice and get bothered by anything less smooth, which was really obvious with GTX 680 SLI. And the guys here who claim their SLI setups are working: it looks like they're working because they are using high-end cards, and the stuttering and lag issues are less noticeable because they are getting higher framerates, BUT they are still there. There is either stuttering (when getting low framerates, or with low to mid-range cards) or lag (with high framerates, when using high-end cards). The lag is noticeable and obvious, especially when playing games online. So maybe they aren't noticing it because they are casual gamers and not used to playing at high framerates.
Since your SLI days they have released two new SLI bridges to combat the issues...
Most people talking smack on SLI here are talking about the old SLI hardware, haven't seen anyone talking bad about their HB SLI bridge setup, just the stuff of yesteryear.

@B-Real wow those 1080 reviews are more than 2 years old lmao we need new cards so bad.

So what are we all going to do when they keep releasing SLI games in 6 months? Keep claiming it's dead? Keep asking people to prove to them that SLI games are still coming out because they don't have the patience to look it up themselves? Looking at you @cadaveca...
Posted on Reply
#69
cadaveca
My name is Dave
UpgrayeddSince your SLI days they have released two new SLI bridges to combat the issues...
Most people talking smack on SLI here are talking about the old SLI hardware, haven't seen anyone talking bad about their HB SLI bridge setup, just the stuff of yesteryear.

@B-Real wow those 1080 reviews are more than 2 years old lmao we need new cards so bad.

So what are we all going to do when they keep releasing SLI games in 6 months? Keep claiming it's dead? Keep asking people to prove to them that SLI games are still coming out because they don't have the patience to look it up themselves? Looking at you @cadaveca...
I have both the SLI+ and HB-SLI bridge, as well as a lit HB SLI bridge, all MSI, to go with my MSI cards on what was supposed to be an MSI motherboard. The perfect example I have of SLI NOT working is PUBG... when Crossfire works fine. That's crap support. That's why I have the complaints I do… because I've been using both SLI and Crossfire since you could (A8N SLI board, IIRC), and today support is far worse than it used to be, even with titles that apparently have support. What's more concerning is that SLI works fine for me in benchmarks... just not games. I'd like support in Destiny 2 as well... but again, it doesn't work (just tried again today, I get graphical anomalies).

So I'm left with a question: maybe my bridge is bad? OK, try a new one. Nope, still doesn't work. Try a new board... nope. New platform (I have them all)? Nope. Maybe it's a bad card? Nope... cards work fine on their own.

So, why do I have problems with getting SLI to work? I'm the guy with many machines and a store's-worth of hardware on my shelf, and nothing gets things working right. I use two different configs for monitors... 3x 1200p and a single 2560x1600... I got into multi-GPU because 2560x1600 required it. Then I went with multi-monitor because of Eyefinity… that was crap too... swapped to SLI and things were better... and then things went downhill from there.

So great, it works for you, but it doesn't work for me. How is that possible? We use our PCs differently. To me, for SLI to be a success (or Crossfire, for that matter), we should all have the same experience, but we don't. It's too bad.
Posted on Reply
#70
hyp36rmax
cadavecaI have both the SLI+ and HB-SLI bridge, as well as a lit HB SLI bridge, all MSI, to go with my MSI cards on what was supposed to be an MSI motherboard. The perfect example I have of SLI NOT working is PUBG... when Crossfire works fine. That's crap support. That's why I have the complaints I do… because I've been using both SLI and Crossfire since you could (A8N SLI board, IIRC), and today support is far worse than it used to be, even with titles that apparently have support. What's more concerning is that SLI works fine for me in benchmarks... just not games. I'd like support in Destiny 2 as well... but again, it doesn't work (just tried again today, I get graphical anomalies).

So I'm left with a question: maybe my bridge is bad? OK, try a new one. Nope, still doesn't work. Try a new board... nope. New platform (I have them all)? Nope. Maybe it's a bad card? Nope... cards work fine on their own.

So, why do I have problems with getting SLI to work? I'm the guy with many machines and a store's-worth of hardware on my shelf, and nothing gets things working right. I use two different configs for monitors... 3x 1200p and a single 2560x1600... I got into multi-GPU because 2560x1600 required it. Then I went with multi-monitor because of Eyefinity… that was crap too... swapped to SLI and things were better... and then things went downhill from there.

So great, it works for you, but it doesn't work for me. How is that possible? We use our PCs differently. To me, for SLI to be a success (or Crossfire, for that matter), we should all have the same experience, but we don't. It's too bad.
PUBG is a terrible example for SLI/Crossfire support. I agree it was working when V1.0 dropped for both Crossfire and SLI, but the update after that killed any existence of both for me. I have SLI GTX 1080Ti FTW3's and Crossfire VEGA 64's. Had to run that game in 4K and/or 2560x1440P since that game also has terrible support for Triple monitors. With that said a single high-tier GPU was good enough.

What issues are you experiencing with Destiny 2? Seems to work for my setup with both VEGA 64's and GTX 1080Ti's.

Are you playing at 60hz for your triples and 1600P? I'd argue your setup is probably still good for a single high tier gpu. I don't imagine you would be stressing a single GTX 1080(Ti) and VEGA 64 at that resolution and refresh of 60hz. What other games are you trying to run with multi-gpu?

For reference I also own a GTX 1070 FTW, GTX 970 FTW, VEGA 56 NANO, Crossfire FURY X's, Crossfire VAPOR-X 290X's 8GB, Crossfire HD7970's that i've tested with my setups.
Posted on Reply
#71
cadaveca
My name is Dave
hyp36rmaxPUBG is a terrible example for SLI/Crossfire support. I agree it was working when V1.0 dropped for both Crossfire and SLI, but the update after that killed any existence of both for me. I have SLI GTX 1080Ti FTW3's and Crossfire VEGA 64's. Had to run that game in 4K and/or 2560x1440P since that game also has terrible support for Triple monitors. With that said a single high-tier GPU was good enough.

What issues are you experiencing with Destiny 2? Seems to work for my setup with both VEGA 64's and GTX 1080Ti's.

Are you playing at 60hz for your triples and 1600P? I'd argue your setup is probably still good for a single high tier gpu. I don't imagine you would be stressing a single GTX 1080(Ti) and VEGA 64 at that resolution and refresh of 60hz. What other games are you trying to run with multi-gpu?

For reference I also own a GTX 1070 FTW, GTX 970 FTW, VEGA 56 NANO, Crossfire FURY X's, Crossfire VAPOR-X 290X's 8GB, Crossfire HD7970's that i've tested with my setups.
The real problem with SLI, as I came to find out (and is present in my cards), is that if your cards reach differing clocks due to Boost, it throws frame sync off in a big way, and can lead to post-processing graphical glitches, such as lighting and fog issues (half the frame is out of sync). In the past, NVidia restricted cards to the same clocks, but they do not any more, and this causes problems. I found I could mitigate some of these problems by having the slower card as primary, but if your secondary card runs slower than the primary by more than about 39 MHz, you start getting bad glitches.

This is such a basic issue, one that NVidia dealt with before but which has now come back, and I think they stopped dealing with it because benchmarks look better when your cards run at the maximum clocks possible (IMHO).
Posted on Reply
#72
hyp36rmax
cadavecaThe real problem with SLI, as I came to find out (and is present in my cards), is that if your cards reach differing clocks due to Boost, it throws frame sync off in a big way, and can lead to post-processing graphical glitches, such as lighting and fog issues (half the frame is out of sync). In the past, NVidia restricted cards to the same clocks, but they do not any more, and this causes problems. I found I could mitigate some of these problems by having the slower card as primary, but if your secondary card runs slower than the primary by more than about 39 MHz, you start getting bad glitches.

This is such a basic issue, one that NVidia dealt with before but which has now come back, and I think they stopped dealing with it because benchmarks look better when your cards run at the maximum clocks possible (IMHO).
I agree this really is a problem; I'm not sure if Nvidia designed their drivers to simply use the lowest-spec GPU as the base when using SLI. Generally you want to use two identical cards for the best performance. You really should have considered that. This is unlike AMD's Crossfire, which does exactly that, since you're allowed to Crossfire with a lower tier from the same family. Example: Fury+Fury X, VEGA 56+VEGA 64.

EDIT: To be clear, your cards should be matching, including the core and memory clocks. Same exact card and SKU.

Another thing to think about is that SLI/Crossfire currently have limitations with post-processing such as temporal AA, which can cripple performance. Some games have that on by default without any means to disable it, such as Gears of War 4. Disabling certain post-processing should increase your performance.

It's little details like this that are the reason SLI, Crossfire, and multi-GPU are more of an advanced-user feature that most people are not ready for or familiar with.
Posted on Reply
#73
RealNeil
cadavecaThe real problem with SLI, as I came to find out (and is present in my cards), is that if your cards reach differing clocks due to Boost, it throws frame sync off in a big way, and can lead to post-processing graphical glitches, such as lighting and fog issues (half the frame is out of sync).
Would this be so with matching GPUs?
Posted on Reply
#74
cadaveca
My name is Dave
RealNeilWould this be so with matching GPUs?
Yeah, I guess it's possible. I have one plain Gaming 1080 and one GamingX 1080, and one card runs 1836 MHz, the other 1949 MHz. It took me a long time to figure out it was the clock difference that was causing a fair amount of the issues I have. These are the same card, really, same PCB etc... just different clocks.
Posted on Reply
#75
RealNeil
OK, all of my SLI setups are using the exact same model number cards (1070, 1070 Ti, 1080 FE).
They use the same clocks.
Maybe that's why I'm not seeing the stuttering that you are?
Posted on Reply