
Help getting more performance from CROSSFIRE

Yes exactly, my min FPS is same as single card but at least my avg fps is around FPS with everything to Max....

well... I guess I'm just gonna keep looking... I'd like you guys to tell me how much of a performance decrease you think I'm getting because my mobo runs 2x PCI-E x8 when in CrossFire (I get PCI-E x16 with a single card, though)...
 
try a game that loads the CPU,

Metro doesn't effectively use more than 1.5 cores; it's still coded off the STALKER engine, which only uses 1 core.

In Battlefield: Bad Company 2 I can hit 800w power draw day in, day out with my system. To the point that, since I play it as often as I do, I had to back my overclocks down to stay stable.

Most other games I use are fine.

But again, you're comparing against a 1250w unit with six 12v rails that's really only a 1000w unit; pull 250w off those rails, averaged across, and remember that's the peak, not the continuous rating. On top of that, the design is so inefficient it didn't even pass basic 80 Plus. I'm guessing it's the power supply, but if anyone else has a better idea, feel free to give the guy another answer that makes as much sense.

Still, obviously... you seem to know much more than I do about PSUs... please point me towards what you would suggest for my current rig... including a link would be helpful ;)
 
Antec, Corsair, Enermax; others can fill you in on which units from those.

I know personally from Corsair that the HX and AX lines are top notch. There are many good units out there, but the three companies above tend to put out the best:
Corsair: HX, AX
Antec: Signature series
Enermax: not sure which units; the Modu+ units, I believe, but I can't remember for sure.
 
Ok thx. Also guys, I've noticed that I get momentary little freezes when I play Crysis with V-Sync on... it goes away when I turn it off...

And I have some weird artifacts in Crysis every now and then... sometimes (especially in tree leaves) I'll see gold sparkles appear while moving the camera... In Crysis Warhead, there's one type of tree that gets surrounded with bright white lines around its edges... weird stuff...

I reckon it could be anything...
 
Ok thx. Also guys, I've noticed that I get momentary little freezes when I play Crysis with V-Sync on... it goes away when I turn it off...

You need a steady 60+ FPS to have V-Sync on and not get stuttering. V-Sync means it is syncing the FPS with your monitor's refresh rate, which is 60 Hz.
 
You need a steady 60+ FPS to have V-Sync on and not get stuttering. V-Sync means it is syncing the FPS with your monitor's refresh rate, which is 60 Hz.

Yeah, if you can't get 60 FPS you get a little freeze whilst it skips a frame or two.
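The stutter described here falls straight out of frame timing: on a 60 Hz panel, v-sync holds every frame until a refresh tick, so a frame that misses the ~16.7 ms budget is displayed for two refresh periods. A minimal sketch of that arithmetic, assuming plain double buffering (no triple buffering):

```python
import math

# V-sync frame timing on a 60 Hz display, double buffered:
# a frame is held until the next refresh tick, so its displayed
# interval is the render time rounded UP to a whole refresh period.
REFRESH_HZ = 60
frame_budget_ms = 1000 / REFRESH_HZ        # ~16.7 ms per frame

def displayed_interval_ms(render_ms):
    ticks = math.ceil(render_ms / frame_budget_ms)
    return ticks * frame_budget_ms

# Just inside the budget: smooth 60 fps.
print(displayed_interval_ms(16.0))         # ~16.7 ms
# Just outside it: the frame sits for two refreshes, an
# effective 30 fps for that moment, felt as a little freeze.
print(displayed_interval_ms(17.0))         # ~33.3 ms
```

This is also why the freezes vanish with v-sync off: frames are shown the moment they finish instead of waiting for the next tick, at the cost of tearing.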

@EROCKER: Good to see the old avatar is back
 
Yeah, if you can't get 60 FPS you get a little freeze whilst it skips a frame or two.

It's still frustrating for me, though... because I was certain I would get a steady 60 FPS with two Radeon 6950s in CrossFire on only one monitor... :confused:

And I've been tweaking settings on this computer since December 2010, constantly reading forums and overclocking... and at this point in time... let's just say this 6950 CrossFire fail just gets me mad :banghead:

Because I've spent enough hours tweaking this beast... not to mention all the money involved... while Xbox 360 users don't have any fussing around... just put the disc in and play... You'd think that us PC owners, who go the extra mile (tech-wise, money-wise, overclocking-wise...), would outperform the Xbox 360 by far...

Arrrrrrrr... I've had it... I'm gonna play some Halo Reach on my Xbox 360... I just can't look at my PC anymore lolzzz :shadedshu
 
try a game that loads the CPU,

Metro doesn't effectively use more than 1.5 cores; it's still coded off the STALKER engine, which only uses 1 core.

In Battlefield: Bad Company 2 I can hit 800w power draw day in, day out with my system. To the point that, since I play it as often as I do, I had to back my overclocks down to stay stable.

Most other games I use are fine.

But again, you're comparing against a 1250w unit with six 12v rails that's really only a 1000w unit; pull 250w off those rails, averaged across, and remember that's the peak, not the continuous rating. On top of that, the design is so inefficient it didn't even pass basic 80 Plus. I'm guessing it's the power supply, but if anyone else has a better idea, feel free to give the guy another answer that makes as much sense.

I'm sorry, but I just ran FurMark and three threads of LinX and I'm topping out at around 500w at the wall, including my monitor. I'm just not sure where you are seeing these numbers. Like I said, I don't see how you could get a similar system to draw anywhere near 850w.

I think the OP is just dealing with application/driver issues.
 
I'm sorry, but I just ran FurMark and three threads of LinX and I'm topping out at around 500w at the wall, including my monitor. I'm just not sure where you are seeing these numbers. Like I said, I don't see how you could get a similar system to draw anywhere near 850w.

I think the OP is just dealing with application/driver issues.

With one card or two? The OP has two, plus the shader unlock. It's virtually impossible to pull just 500w with two cards; W1zz's own numbers show 190w per card in FurMark, before the unlock, never mind my own numbers.
 
With one card or two? The OP has two, plus the shader unlock. It's virtually impossible to pull just 500w with two cards; W1zz's own numbers show 190w per card in FurMark, before the unlock, never mind my own numbers.

That's with two cards. I am using two unlocked 6950s at stock voltage. So each card with a 6970 BIOS draws 190w in a worst-case scenario (FurMark). What, 250w with an insane overclock that would annoy the heck out of me on the stock cooler? Let's just say it becomes 250w per card, worst case. What does a massively overclocked CPU draw? 150w on air, tops? The rest of your PC (fans, hard drives, USB, motherboard) maybe another 50w at most? That's 700w with everything fully stressed. You wouldn't even see that in real-world situations unless you wanted to run LinX and FurMark at the same time, and that is still quite a way from 850w.

Here is where TweakTown saw 550w with two 6970s in CrossFire and an i7 980X overclocked to 4.2GHz: http://www.tweaktown.com/articles/3741/amd_radeon_hd_6970_2gb_video_card_in_crossfire/index16.html
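The component figures in the post above can be totalled quickly. A back-of-envelope sketch; the per-component watts are the thread's rough worst-case estimates, not measurements, and the 85% efficiency figure is my own assumption:

```python
# Worst-case DC power budget for a 6950 CrossFire system, using the
# rough per-component ceilings discussed in this thread.
budget_w = {
    "gpu 0 (6950 @ 6970 BIOS, FurMark)": 250,
    "gpu 1 (6950 @ 6970 BIOS, FurMark)": 250,
    "cpu (heavy overclock, on air)":     150,
    "board, fans, drives, USB":           50,
}

dc_total = sum(budget_w.values())      # what the PSU must deliver
wall_total = dc_total / 0.85           # assumed ~85% PSU efficiency

print(f"worst-case DC load: {dc_total} W")      # 700 W
print(f"at the wall:        {wall_total:.0f} W")
```

Worth remembering when comparing numbers in this thread: a meter at the wall reads the AC draw, which is higher than the DC load by the PSU's efficiency losses, so a 700w DC worst case can show up as roughly 820w at the wall.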

I don't think this has anything to do with the OP's "problem"; that is most likely software-related. It might not be a bad idea to replace his PSU with something more reliable one of these days, but I would be shocked if that were to solve his "problem".
 
I don't think this has anything to do with the OP's "problem"; that is most likely software-related. It might not be a bad idea to replace his PSU with something more reliable one of these days, but I would be shocked if that were to solve his "problem".


Yeah, I agree, probably software, but I suppose the power issue shouldn't be discounted either. It would be the last thing I considered.

But it was 190w without the unlocked shaders, as far as I understood (pretty sure, LINK).

If you are only pulling 500W...OMG, how well do those cards clock? :laugh:
 
Yeah, I agree, probably software, but I suppose the power issue shouldn't be discounted either. It would be the last thing I considered.

But it was 190w without the unlocked shaders, as far as I understood (pretty sure, LINK).

If you are only pulling 500W...OMG, how well do those cards clock? :laugh:

I thought you were quoting W1zzard's 6950 BIOS mod article. So yeah, it's 252w in FurMark with a 6970 BIOS, and 189w on a stock 6950, which at stock voltages is probably closer to what my cards see.

Playing with the voltage and core clock, I saw my UPS reporting that I was drawing a little over 600w in just GPUTool fullscreen, but it took 1.3v.
 
I'm wondering, bababooey: do you have the PowerTune slider at the default 0% or at +20% with your cards? If I set mine to 0%, yes, they use a lot less power, but it also throttles my GPU clock speeds back in both Metro 2033 and Battlefield: Bad Company 2 (the only two games I tested PowerTune with). My overclocks, if set at 0%, will drop nearly 60-70MHz on the core or more; since +20% ups the max draw, I peak a lot higher to maintain those clock speeds. Not to mention having my clock speeds bounce up and down causes a lot of jitter for me in those titles: a much larger FPS spread between min, max, and avg.
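The jitter complaint can be put in numbers with a toy model: suppose PowerTune at 0% pulls the core down about 70 MHz some fraction of the time, and frame rate scales roughly linearly with core clock. Every number below is illustrative, not measured:

```python
# Toy model of PowerTune throttling widening the FPS spread.
base_clock = 880     # MHz, hypothetical overclocked target
throttled  = 810     # MHz after a ~70 MHz PowerTune drop
base_fps   = 60.0    # fps at the full overclock

def fps_at(clock_mhz):
    # Crude assumption: fps scales linearly with core clock.
    return base_fps * clock_mhz / base_clock

# Say 3 frames in 10 are rendered while throttled.
samples = [fps_at(base_clock)] * 7 + [fps_at(throttled)] * 3
print(min(samples), max(samples), sum(samples) / len(samples))
```

The average only dips slightly, but the min/max gap opens up; that bouncing between the two rates is what reads as jitter.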
 
Alright guys, I'm back with some more questions... Obviously, I agree with crazyeyesreaper... my PSU is definitely not a "top brand"... I didn't know at the time that the PSU could be so radically important in a rig. That was my bad...

Meanwhile, I went into the BIOS and CCC to find different options that might help my cause. In my BIOS, I see a PCI-E frequency option (which is right now at 100... should I raise that number, and if so, would it help increase performance?)

I also have a CPU Spread Spectrum option and a PCI Spread Spectrum option... I can leave them at Auto, Disabled, or Enabled. What should they be?

As for a 2x Radeon 6950 CrossFire setup, is it better for me to play on a 120Hz monitor, or should I be fine with my 60Hz? I figured maybe owning a 120Hz monitor would allow me to turn off V-Sync and let the FPS go to 100 or so without having to worry about screen tearing...

Let me know what you think about the questions above...
 
Alright guys, I'm back with some more questions... Obviously, I agree with crazyeyesreaper... my PSU is definitely not a "top brand"... I didn't know at the time that the PSU could be so radically important in a rig. That was my bad...

Meanwhile, I went into the BIOS and CCC to find different options that might help my cause. In my BIOS, I see a PCI-E frequency option (which is right now at 100... should I raise that number, and if so, would it help increase performance?)

I also have a CPU Spread Spectrum option and a PCI Spread Spectrum option... I can leave them at Auto, Disabled, or Enabled. What should they be?

As for a 2x Radeon 6950 CrossFire setup, is it better for me to play on a 120Hz monitor, or should I be fine with my 60Hz? I figured maybe owning a 120Hz monitor would allow me to turn off V-Sync and let the FPS go to 100 or so without having to worry about screen tearing...

Let me know what you think about the questions above...

Leave the PCI-E frequency at 100

Spread spectrum should be disabled

A 120Hz monitor won't matter with V-Sync off. With V-Sync off, you get all the FPS your system can give.
 
Meanwhile, I went into the BIOS and CCC to find different options that might help my cause. In my BIOS, I see a PCI-E frequency option (which is right now at 100... should I raise that number, and if so, would it help increase performance?)

I also have a CPU Spread Spectrum option and a PCI Spread Spectrum option... I can leave them at Auto, Disabled, or Enabled. What should they be?

As erocker said, leave your PCI-E at 100MHz.

But before you try anything else, I'd put everything at stock clocks and see how Metro 2033 runs then.

Only if it runs fine on stock clocks would I suggest you change overclock settings...
 
Have you made sure that in CCC it is set to use the application's settings and not CCC's? I know this can drastically screw up the FPS. Always use the application's settings and not CCC's; well, not always, but 99% of the time.

For example, when I had my two HD 6950s, if I ran Metro with CCC's settings I'd get 20 to 35 FPS, but if I used the application's settings I got 50 to 60 FPS (with V-Sync forced, of course).

The only thing I used CCC for was morphological AA, and that depends on the game.
 
Leave the PCI-E frequency at 100

Spread spectrum should be disabled

A 120Hz monitor won't matter with V-Sync off. With V-Sync off, you get all the FPS your system can give.

Yeah, but with V-Sync off I get more FPS than my refresh rate, which is 60Hz, right? So with 120Hz, wouldn't all those frames fit instead of tearing?
 
Yeah, I've done some further testing in Crysis, and it ain't that bad... When I activate V-Sync, I'm locked at 50 FPS, but that's not too bad considering all the settings are maxed. Also, I sometimes have drops to 30 FPS... which is annoying... but it happens only when I get into an intense area of a map... which is normal for a beast of a game like Crysis...

I've tested Crysis Warhead, and I get the same performance as in Crysis. Thanks for the info about the PCI spread spectrum; I'll go ahead and turn that off.
 
Yeah, but with V-Sync off I get more FPS than my refresh rate, which is 60Hz, right? So with 120Hz, wouldn't all those frames fit instead of tearing?

That's not how it works. With V-Sync off, the refresh rate doesn't matter in the way you need/want it to.
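To see why a higher refresh rate doesn't stop tearing with v-sync off: a tear appears whenever a buffer swap lands while the panel is mid-scanout, and with v-sync off, swaps land at arbitrary points. A rough simulation, assuming scanout occupies the whole refresh interval (blanking ignored) and perfectly even frame times:

```python
# Count tears per second with v-sync off: every swap that lands
# mid-scanout splits the displayed image between two frames.
def tears_per_second(fps, refresh_hz, duration_s=10.0):
    frame_t = 1.0 / fps
    refresh_t = 1.0 / refresh_hz
    tears = 0
    t = 0.0
    while t < duration_s:
        phase = (t % refresh_t) / refresh_t   # where in scanout the swap hits
        if 0.0 < phase < 1.0:                 # mid-scanout -> visible tear
            tears += 1
        t += frame_t
    return tears / duration_s

print(tears_per_second(100, 60))    # ~100 tears/s on a 60 Hz panel
print(tears_per_second(100, 120))   # still ~100 tears/s at 120 Hz
```

The tear count tracks the frame rate, not the refresh rate; a 120Hz panel makes each tear visible for half as long, but it doesn't make the extra frames "fit".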
 
Where can I find good-quality PSUs for a good price? I've seen some of you guys have a link to a certain "Heatware" website... how does that work?

Can I find someone there with a decent power supply to sell me, or to trade?
 
Where can I find good-quality PSUs for a good price? I've seen some of you guys have a link to a certain "Heatware" website... how does that work?

Can I find someone there with a decent power supply to sell me, or to trade?

Heatware is a site that tracks trades and sales only. It doesn't serve any purpose other than as a guide for those wanting to know a bit about a user's history of deals.


You can buy stuff at the store...online...:p...many forums have for sale sections...
 
Where can I find good-quality PSUs for a good price? I've seen some of you guys have a link to a certain "Heatware" website... how does that work?

Can I find someone there with a decent power supply to sell me, or to trade?

I have a Corsair TX750 that I bought brand new from Newegg on 3/20, used for about 2-3 weeks. PM me if you're interested.

Edit: I just noticed you're now on a 1200W, so my offer might be too low for you
 
Here is a good comparison of total system power consumption for a few different setups, including 6970 CrossFire and a pretty heavily overclocked 6990: Link. Even the overclocked 6990 tops out at 554w.
 
From your link, bababooey:

HWC said:
In addition, anything more than 400W of draw could spell trouble for some multi rail power supplies with 18A rails. For these cases, we recommend that you be very careful with which connectors are used for the HD 6990.
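For reference, the 18 A rail figure in that quote converts to watts with P = V × I. A quick check (the 400w draw is the number from the quote):

```python
# One 12 V rail rated at 18 A:
rail_amps = 18
rail_watts = 12 * rail_amps          # P = V * I
print(rail_watts)                    # 216 W per rail

# A 400 W card load therefore spans nearly two such rails,
# which is why connector placement matters on multi-rail units.
print(400 / rail_watts)              # ~1.85 rails' worth
```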
 