
Debating multi gpu? Just do it!

@Vayra86 Not at idle.

And trog, have you ever tested any of your claims?
 
See, that is where the assumption goes wrong: you think that dynamic power targets actually make it the same as running a single GPU, but this is not true. SLI systems ALWAYS draw more power by default, because there are two GPUs running. At idle you draw roughly twice the GPU idle power; at load you have a higher base power draw, regardless of how you limit it. I'm fine with you doing whatever you do, but I'm not fine with going along with the assumptions you attach to that, because they are simply wrong.
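A rough back-of-envelope sketch of that point (every wattage below is an illustrative assumption, not a measurement):

```python
# Back-of-envelope power model for single vs. dual GPU.
# Every wattage here is an illustrative assumption, not a measured value.

def system_power(base_w, gpu_idle_w, gpu_load_w, n_gpus, load=False):
    """Total system draw with n identical GPUs installed."""
    per_gpu = gpu_load_w if load else gpu_idle_w
    return base_w + n_gpus * per_gpu

# Hypothetical numbers: 60 W platform, 15 W idle / 190 W load per card.
print(system_power(60, 15, 190, 1))             # single GPU, idle:  75 W
print(system_power(60, 15, 190, 2))             # dual GPU, idle:    90 W
print(system_power(60, 15, 190, 2, load=True))  # dual GPU, load:   440 W
```

The GPU portion of idle draw doubles, even though total system idle rises by less than 2x.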

That's not actually always the case, as this review shows: http://www.eteknix.com/amd-radeon-r9-380-2gb-crossfirex-graphics-cards-review/
Crossfire R9 380s using less power than a Titan X and producing less heat under load, while delivering comparable frame rates.

It was actually this that finally twisted my arm into taking the risk with crossfire.
 

Attachments

  • power3.jpg (power consumption graph) · 22.9 KB
  • temps1.jpg (temperature graph) · 24.9 KB
That's not actually always the case, as this review shows: http://www.eteknix.com/amd-radeon-r9-380-2gb-crossfirex-graphics-cards-review/
Crossfire R9 380s using less power than a Titan X and producing less heat under load, while delivering comparable frame rates.

It was actually this that finally twisted my arm into taking the risk with crossfire.

So now you go off comparing wildly different architectures, single card Nvidia versus Crossfired AMD, and the biggest die versus a mid-range die.

Also, if anything, the first graph actually shows how right I am. Look at the idle power draw of the single Titan X versus the CF setup. CF is higher at idle; the *bigger* chip is less power hungry at idle. Then look at the performance differences, and how the small gap at 1080p gets wider and wider as the resolution goes up, until you see performance gaps of 30% or greater appear...
 
Guys, let's send OP a watt meter so he can measure his idle because them graphs are so backwards.

Using that logic I should've gotten a second GTX 660 because "it's less power consumption than a GTX 780 at idle, and I'll get so much SLI support."
 
So now you go off comparing wildly different architectures, single card Nvidia versus Crossfired AMD, and the biggest die versus a mid-range die.

Also, if anything, the first graph actually shows how right I am. Look at the idle power draw of the single Titan X versus the CF setup. CF is higher at idle; the *bigger* chip is less power hungry at idle.

I compared to the Titan X because it's the closest match in performance throughout that review, and also a vastly more efficient architecture, yet it only beats this dual-card setup by 14 W at idle.

If I didn't have eight LED fans in push-pull on my radiators, I would save those 14 W.

The point being: at idle the difference is negligible, while at load it's actually a saving to use the dual-card setup and get very similar performance.
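For scale, a hedged back-of-envelope on what a constant 14 W gap costs over a year; the tariff below is an assumed figure, not a quoted UK rate:

```python
# Yearly cost of a constant 14 W idle-power difference.
# The electricity price is an assumption for illustration only.
WATTS = 14
PRICE_PER_KWH_GBP = 0.14  # assumed tariff

def yearly_cost_gbp(watts, hours_per_day):
    kwh = watts * hours_per_day * 365 / 1000
    return kwh * PRICE_PER_KWH_GBP

print(f"24 h/day idle: £{yearly_cost_gbp(WATTS, 24):.2f}")  # ~£17.17
print(f" 4 h/day idle: £{yearly_cost_gbp(WATTS, 4):.2f}")   # ~£2.86
```

At realistic usage it really does come to a few pounds a year.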
 
Not always, but certainly most of the time. You don't add another 250 W card and expect 10 W more... What you have linked is anomalous.
That's not actually always the case, as this review shows: http://www.eteknix.com/amd-radeon-r9-380-2gb-crossfirex-graphics-cards-review/
Crossfire R9 380s using less power than a Titan X and producing less heat under load, while delivering comparable frame rates.

It was actually this that finally twisted my arm into taking the risk with crossfire.
In one benchmark... cool.

Idle power does NOT double assuming ULPS (Ultra Low Power State) is enabled. Essentially, the second card shuts down to only a few watts until it is needed.
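For anyone who wants to check whether ULPS is on for an AMD setup, a sketch that scans the display-adapter registry class for the EnableUlps value. The key path and value name are the ones commonly cited for AMD drivers; treat them as assumptions and verify against your own driver version:

```python
# Windows-only sketch: look for AMD's EnableUlps flag (1 = ULPS on).
# The class GUID below is the standard display-adapter device class;
# the EnableUlps value name is the commonly cited AMD driver setting.
import winreg

CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
             r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as class_key:
    index = 0
    while True:
        try:
            sub = winreg.EnumKey(class_key, index)  # e.g. "0000", "0001"
        except OSError:
            break  # no more adapter subkeys
        index += 1
        try:
            with winreg.OpenKey(class_key, sub) as adapter:
                value, _ = winreg.QueryValueEx(adapter, "EnableUlps")
                print(f"{sub}: EnableUlps = {value}")
        except OSError:
            pass  # this subkey has no EnableUlps value
```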

Part of the issue is the performance-per-watt argument. Because multi-GPU scaling is almost never perfect, performance per watt typically goes down, not up, when you add a second card.
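A minimal sketch of that argument with made-up numbers: because scaling sits below 2.0x while load power roughly doubles, frames per watt drop relative to one card.

```python
# Performance per watt: one mid-range card vs. two in CrossFire/SLI.
# The fps and wattage figures are illustrative assumptions.
one_card_fps, one_card_w = 35.0, 150.0
print(f"single card: {one_card_fps / one_card_w:.3f} fps/W")

for scaling in (1.9, 1.6, 1.3):      # real-world scaling is rarely 2.0x
    fps = one_card_fps * scaling
    watts = 2 * one_card_w           # load power roughly doubles
    print(f"dual @ {scaling:.1f}x: {fps / watts:.3f} fps/W")
```

Even at an optimistic 1.9x scaling, fps/W lands below the single card (0.222 vs 0.233 here).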


In the end, this is a to-each-their-own type thing. I, like many others, choose to stay away from multi-GPU configs unless they are actually needed (read: 4K, 3x 2560x1440, or if you are an FPS snob). Otherwise, in MOST cases, power consumption and noise are lower with a single card.

at load it's actually a saving to use the dual-card setup and get very similar performance.
In one benchmark bubba....
 
Just. Don't. Do. It.
 
I compared to the Titan X because it's the closest match in performance throughout that review, and also a vastly more efficient architecture, yet it only beats this dual-card setup by 14 W at idle.

If I didn't have eight LED fans in push-pull on my radiators, I would save those 14 W.

The point being: at idle the difference is negligible, while at load it's actually a saving to use the dual-card setup and get very similar performance.

Trust me bro, I've been there. I went from GTX 660 x2 to a GTX 770. The 10-15% performance gain is most certainly not worth the drawbacks of dual GPU, and the price aspect gets mitigated entirely by the higher power draw + motherboard/PSU requirements and the immense time investment of getting shit to run well on release day of new games.

Basically there is only one real incentive to go dual GPU: you actually *need* the horsepower for playable frames at a higher resolution. In any other use case, it's just better in every way to stay single GPU and go for the strongest card you can afford. There is also a better upgrade path in single cards: you lose a LOT of resale value with 2x a lower-tier card versus 1x a higher-tier one. A single high-end GPU gives you access to more regular upgrades at a high resale value.

I remember being just like you when I had just SLI'd the 660s. It feels great while you are actively looking at FPS meters, but then you start focusing on actual gaming, and you notice the stutters, you notice the noise, and you notice how support is lacking in new games.
 
To be honest, I couldn't care less about power consumption. I don't know how accurate that review is and I don't care either... I have a 750 W PSU running all my hardware just fine, and a 1200 W in my spares box should I need it. Yes, I pay my own bills... No, saving a few pennies and pounds a year doesn't concern me.

In my case, I do have a 2560x1440 monitor with a 60 Hz refresh, so the extra oomph is necessary for me... Assuming I actually want to max out the quality settings while maintaining 60 fps, which I do.

This thread quickly became all about the potential and assumed issues with crossfire/sli setups, when in fact my intention was simply to report that in my own testing, with the games I've listed, I found no issues, and that in my case this setup has done exactly what I wanted it to do.
 
You should care how accurate the review you used to defend your position/talking point was... you don't find it odd that a 380X jumped up quite a bit while your card didn't (from the same reviewer)? Something is up with that result there.

I run a single 980 Ti at 2560x1440... A 980 would easily do that too (as would a single 390/390X/Fury/Nano/Fury X)... I run at 60 FPS+ while maxing out the games I play.

See, what you do not seem to understand is that your own single experience isn't The Gospel. We are all happy it worked out for you, but, without a doubt, people have issues with multiple GPU setups. It isn't as bad as some make it out to be, but it isn't always plug and play. If you can handle the increased power consumption, heat, noise, potential lack of scaling, and cost, then by all means, go for it. But make no mistake about it, there are MORE issues with multiple GPUs than there are with a single one on many fronts.
 
 
To be honest, I couldn't care less about power consumption. I don't know how accurate that review is and I don't care either... I have a 750 W PSU running all my hardware just fine, and a 1200 W in my spares box should I need it. Yes, I pay my own bills... No, saving a few pennies and pounds a year doesn't concern me.

In my case, I do have a 2560x1440 monitor with a 60 Hz refresh, so the extra oomph is necessary for me... Assuming I actually want to max out the quality settings while maintaining 60 fps, which I do.

This thread quickly became all about the potential and assumed issues with crossfire/sli setups, when in fact my intention was simply to report that in my own testing, with the games I've listed, I found no issues, and that in my case this setup has done exactly what I wanted it to do.

Do you see the irony? The *reason* people go SLI/Crossfire should never be cost or performance per dollar, because there is no real price advantage here, even though benchmark comparisons plus price tags tell you otherwise. The cost advantage exists ONLY at the actual purchase of the cards. Beyond that, you slowly lose the money you saved (you speak of pennies, but the reason you went Crossfire is that you didn't want to shell out for a better single card, so your entire motivation is exactly what you are denying it to be...).

I'm not here to bash on your purchase, I'm here to point out the fallacy of reasoning that is happening here. If it works for you that's great, and nothing changes that.
 
You should care how accurate the review you used to defend your position/talking point was... you don't find it odd that a 380X jumped up quite a bit while your card didn't (from the same reviewer)? Something is up with that result there.

I run a single 980 Ti at 2560x1440... A 980 would easily do that too (as would a single 390/390X/Fury/Nano/Fury X)... I run at 60 FPS+ while maxing out the games I play.

See, what you do not seem to understand is that your own single experience isn't The Gospel. We are all happy it worked out for you, but, without a doubt, people have issues with multiple GPU setups. It isn't as bad as some make it out to be, but it isn't always plug and play. If you can handle the increased power consumption, heat, noise, potential lack of scaling, and cost, then by all means, go for it.

I do understand that my experience is not gospel; that's why I keep referring to it as "my experience" and why I haven't blasted anyone for having a preference for single-card setups.

As I said, cost was a major factor for me; the cheapest (and worst-performing) card you listed there is the R9 390, which here in the UK is £360.

I would have had precisely that much to spend had I sold my original 380 and put it towards the amount my second one cost, yet I wouldn't be getting the performance I'm getting now, so to me that would have been a bad choice.

Sure, I could've put the money I spent on an Enthoo and the 500 GB EVO towards it as well and bought a 980 Ti, but then I'd still want to buy those things next month and would still only be getting roughly the same performance I'm getting now.

I've saved money, got a build that performs how I want it to, and I'm not sat here thinking "I still need to buy this, that and the other" just for the sake of buying a £600 single GPU to start with.
 
Another thing to note is that adding a GPU does not scale up VRAM capacity. Right now SLI and CF have to duplicate memory contents across both cards, which means you only get 2 GB of memory from 2x 2 GB cards.

Newer games tend to use a lot more video memory. Even if adding a second card gives you enough raw GPU power to run at max settings for quite a while, VRAM capacity will likely become an issue. For example, GTX 580s in SLI would effectively have 1.5 GB of VRAM. A lot of games now don't perform well with less than 2 GB.
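A minimal sketch of that arithmetic, assuming the usual alternate-frame-rendering duplication:

```python
# Effective VRAM under SLI/CrossFire-style rendering: each card holds
# a full copy of the frame data, so capacity is the smallest card's
# memory, not the sum across cards.
def effective_vram_gb(cards_gb):
    return min(cards_gb)  # duplication: no pooling across cards

print(effective_vram_gb([2, 2]))      # 2x 2 GB cards -> 2 GB usable
print(effective_vram_gb([1.5, 1.5]))  # GTX 580 SLI   -> 1.5 GB usable
```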
 
I'll freely admit Crossfire isn't really worth it, and I've always had Crossfire since a pair of 4850s. Quite a few AAA titles are horribly lacking in functionality with multiple GPUs. I have run Crossfire for years, and probably will continue to do so, but I also Crossfire a pair of high-end cards, instead of mid-range cards, so I'm not necessarily dependent on it. I do it because it looks sexy in my rig, which is stupid, but I like the look of dual GPUs.

With a game that isn't supported, I simply disable Crossfire for that particular title (I'm looking at you, NFS16), and any decent high-end card is sufficient to push playable frame rates.

I would never recommend it to anyone unless you just want to screw around with it, but NEVER count on it for stable frame rates on a high-res monitor; it is simply too hit or miss.

The back and forth can continue until everyone is blue in the face, but the simple fact of the matter is that Crossfire is great for Benchmarks and certain titles, but not for a rig that would be crippled using only a single card out of the pair.

Word to the wise, get the single fastest card you can afford, so that you can depend on the frame rates it should provide.

JAT
 
Another thing to note is that adding a GPU does not scale up VRAM capacity. Right now SLI and CF have to duplicate memory contents across both cards, which means you only get 2 GB of memory from 2x 2 GB cards.

Newer games tend to use a lot more video memory. Even if adding a second card gives you enough raw GPU power to run at max settings for quite a while, VRAM capacity will likely become an issue. For example, GTX 580s in SLI would effectively have 1.5 GB of VRAM. A lot of games now don't perform well with less than 2 GB.

I thought about that in advance and got the 4 GB versions; at only about £20 more each, it was definitely worth the extra investment.

The problem as I see it: yes, this thread was meant to be myopic. It's about my own personal experience, which I came to through my own personal circumstances. But there are so many people saying "crossfire has issues", "90% of games don't work", "it doesn't produce playable frame rates"... In my experience, however limited, that simply is not the case, so it's misleading for people considering crossfire, for whatever reason, to be told categorically that it is.

At the same time, though, this was never meant to be the definitive answer to whether or not crossfire is worth it; what's worth it or not is a very personal thing, and for me it was.
 
Well...

1. "Crossfire has issues" is true...it has more issues than single card. Period. THE END. (same goes for SLI)
2. "90% of games don't work" is bologna and whoever said that is flat out wrong. I would bet the majority of games have multiple GPU scaling in some form. Some however, do not. But those are in the minority.
3. "doesn't produce playable frame rates" is also bologna and whoever said that is flat out wrong as well.

Don't defend the BS... defend what actually needs to be defended.

While it isn't supposed to be the definitive answer, you are telling everyone, because of YOUR experience, to do it / that it is OK and there are no issues.
Don't let anyone put you off buying a multi-GPU setup if it's more financially viable for you to do so; from what I've seen, the scare stories about it not working are just that, scare stories!
They are NOT stories... we are informing people of REAL potential issues with that type of configuration. I agree with the end message, but the way it's being delivered leaves a lot to be desired.
 
I thought about that in advance and got the 4 GB versions; at only about £20 more each, it was definitely worth the extra investment.

At the same time, though, this was never meant to be the definitive answer to whether or not crossfire is worth it; what's worth it or not is a very personal thing, and for me it was.
I've never run SLI/CF myself, but that's a situation where multi GPUs would make sense. If your GPU is getting old and doesn't have enough raw power but does have enough VRAM, adding a second card is pretty viable. You risk SLI/CF issues, but you're also paying a lot less for a used second card (compared to a new, more powerful card).

But personally, I don't think I'll be going SLI/CF anytime soon. Trying to cool the cards just looks like a nightmare.
 
What I was getting at, which should have been quite clear in my first post and thread title, is that anyone already considering crossfire should not be put off by the many (many!) people saying the majority of games don't work and it won't produce good results in those that do work.

I'm sure crossfire does have issues, just not in anything I've tried so far, but it's well known that games at release often do have issues.

But, and it's a big but, anyone considering crossfire is likely aware of that and willing to accept they may have to wait for patches/profiles; nothing is stopping them lowering the resolution/settings for a month and running in single-card mode if necessary.

I just think it's important for people considering it to know that the rubbish about 90% of games having issues is just that: rubbish.
 
Yeah, you've hit two of the nails directly on the head. It depends on the game, and whether a single card is enough already. If one is enough, then you actually add a bit of latency using two cards. This became really evident to me when I had 2900 XT Crossfire on the same monitor: I got decent frames with two cards, but things were smoother and better matched to my inputs with just one card, even though I only had half the FPS, most times around 30 FPS.

The whole frame pacing thing was well documented. Since then, that particular issue has become less of a problem, but it still exists depending on the driver used.
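For anyone curious how frame pacing is usually quantified: average FPS can look identical while frame-to-frame times swing wildly. A minimal sketch with synthetic timestamps (purely illustrative):

```python
# Microstutter in a nutshell: same average FPS, very different pacing.
# The timestamps are synthetic, just to show the measurement.
import statistics

def frame_times_ms(timestamps_ms):
    """Deltas between consecutive frame timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

steady = [i * 20.0 for i in range(11)]  # a frame every 20 ms

jitter, t = [], 0.0
for i in range(11):                     # frames alternating 5 ms / 35 ms apart
    jitter.append(t)
    t += 5.0 if i % 2 == 0 else 35.0

for name, ts in (("steady", steady), ("microstutter", jitter)):
    ft = frame_times_ms(ts)
    print(f"{name}: {1000 / statistics.mean(ft):.0f} avg fps, "
          f"frame-time stdev {statistics.pstdev(ft):.1f} ms")
```

Both traces report 50 average fps, but the second alternates 5 ms and 35 ms frames, which is exactly the stutter described above.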

And that's the other factor.. which drivers are in use.


Since there are all these factors that can come into play, and none of them are an issue when one card is used, I do have to stand my ground on my position of not recommending multi-GPU usage. One card usually just works fine no matter what game you are playing, and leads to a better overall experience, provided your GPU can push your monitor. That's why, when I see threads asking what parts to use, my first question is always "which monitor?", because that is really the biggest factor to consider when building a system. Well, at least in the way I do things it is.

You do also need to keep in mind I now have about a decade of multi-GPU use documented here on these forums. It is only recently, after having tried newer cards from both AMD and NVIDIA, that I simply gave up for the most part. I still like to play racing games on three monitors, and that's why I have dual GTX 980s.
While I do respect your opinion, I have to say 2900 XT Crossfire latency issues aren't representative of what I'd get if I Crossfired two R9 390s. Both NV and AMD have worked very hard on latency within their GPU architectures to reduce it for VR. It's still there, yes, but nowhere near as bad, and good enough for LiquidVR with a Pro Duo, for example.
 
I thought about that in advance and got the 4 GB versions; at only about £20 more each, it was definitely worth the extra investment.

The problem as I see it: yes, this thread was meant to be myopic. It's about my own personal experience, which I came to through my own personal circumstances. But there are so many people saying "crossfire has issues", "90% of games don't work", "it doesn't produce playable frame rates"... In my experience, however limited, that simply is not the case, so it's misleading for people considering crossfire, for whatever reason, to be told categorically that it is.

At the same time, though, this was never meant to be the definitive answer to whether or not crossfire is worth it; what's worth it or not is a very personal thing, and for me it was.


and for me.. :)

first off i bought a ready made PC with one 970 card.. i had no intentions of doing much with it except using it for general purpose stuff.. but me being me i soon added a few bits.. one more 970 card and a better psu and cpu cooler.. the altered machine performed very nicely with not the slightest sli problem.. all in all a very cost effective and satisfactory upgrade..

i then for no valid reason other than i like messing with things and buying new stuff took one more upgrade step.. a pair of 980ti cards.. one would not have been an upgrade so it had to be two or nothing.. i did have slight problems with cooling but that is only because my case is now over full with toasty hardware..

but i dont have the slightest problems with how sli works.. a pic of my machine.. its a little overfull of stuff but i dont really want a larger case and my cooling issues are all sorted nicely..

i speak from my own experience.. which has been sli positive.. my own experience does make me think most of the negative comments dont have much validity behind them.. but i think the same thing about most of the stuff i read on the internet.. :)

i aint overly bothered about taking forum flack so dont mind voicing my own opinion which is always based on my own experience.. not on what i read.. he he

internals-2.jpg


trog
 
and for me.. :)

trog
Glad it worked out for you! One problem I noticed with open-air cooler designs like those (I have an EVGA 980 Ti) is that CPU cooling takes a huge hit if the CPU cooler sits above the GPU. I use an H60 for CPU cooling, and after moving the radiator to the front as an intake, CPU temperatures dropped by about 20 °C. It's not a big deal because the CPU is an i5-6600K that barely draws power anyway, but it's nice to see it in the 30s °C under load instead of over 50 °C.
 
Trog, you sure do like to show off your rat's nest of cables.
 