
AMD CrossFireX Scaling is Actually Pretty Terrible with Mixed RX Vega

Well, this is why I'm insisting on a single-GPU rule. I'd rather spend 800€ on a top-of-the-line graphics card than pay 400€ now and another 400€ next year for a second identical card just to hope it scales somewhere. I've never been let down. I've had other issues, but never performance (or scaling) ones.
 
Just because it's advertised as "functional", that doesn't mean it'll perform well. It's not like multi-GPU was released just last year; it has been around for decades. And we all know it only really works well when identical cards are paired.
Note to TPU staff: stop testing stuff that doesn't put out nice numbers.
 
Well, this is why I'm insisting on a single-GPU rule. I'd rather spend 800€ on a top-of-the-line graphics card than pay 400€ now and another 400€ next year for a second identical card just to hope it scales somewhere. I've never been let down. I've had other issues, but never performance (or scaling) ones.
That's been most people's rule.

That doesn't change the fact that mixed CrossFire existed previously and worked a lot better. I'd hardly call it functional either. Another rushed implementation.
 
Would you guys give poor @RejZoR a break? Geez. :D


To me it feels like RTG is committed to killing CrossFire support from now on. They have made their intent clear. Honestly, I strongly feel that RTG, then under Raja's lead, was trying to exit consumer graphics as a whole. Raja was probably betting big on the AI market.
 
My 2nd V64 just came in today; hardware is installed and testing, but I have to go away on a trip tonight, so I will not get to play with the new driver until Sunday. ;)

Perfect timing other than that!
 
While the article is informative and good, and bravo for doing it and everything, I do not understand why the author keeps judging a FIRST BETA driver as if it were the FINAL driver. The way this article is written, it's like "This is it, folks. This is the best it can get. Ever. And it is disappointing."

I am sorry if that's the impression you got, but this is not a small thing to be brushed under the rug. A launch driver, beta or otherwise, announced alongside a major feature comes with expectations from the company, and this was a case of internal and beta testing not being done well.
 
Nvidia is slowly killing SLI too; these technologies have no future. It would have been better if AMD had just dropped support altogether and not even attempted to waste precious resources on this.
 
Good thing they are showing Raja the door. Every day Vega is becoming uglier and uglier. Polish a turd 1000 times and it will still remain a turd.

For people saying hybrid CrossFire isn't a thing, you are dead wrong. People even ran a Fury X and a Fury in CrossFire, and those differ in clock speed as well as in shader and texture unit counts.


You want proof? Here you go, from AMD RTG employee Matt himself on AMD's own community forum:

https://community.amd.com/thread/186648


More proof:
http://www.overclock.net/t/1611844/r9-fury-x-r9-fury-crossfire/0_100


Something is seriously wrong with Vega's design. I am sure at this point there is no use patching a sinking ship.


In 2015, right after the Fury X launch, RTG was spun off within AMD, then led by Raja. Now, in 2017, with Raja delivering two consecutive flops in Polaris and Vega, he is shown the door. I hope Lisa Su can correct RTG's path before it is too late.

Yeah, exactly why Apple is the one demanding AMD fulfill their contract for Vega to ensure an end-of-year launch of those new iMacs that use it. Understood...
 
People will bitch about anything. They have BOTH had their share of issues. In my personal experience, I have had better luck with NVIDIA. Certainly, others have had better luck with AMD. What is amazing is how polarizing the issue is when reality dictates NONE OF US have a FUCKING CLUE which drivers are ACTUALLY better.

They are both pretty damn good. It's just that people like crapping on AMD more.
 
I'd be interested in seeing a budget Vega card with GDDR5.

As discussed in another thread, that's never going to happen due to the increased power consumption of GDDR5 over HBM. The end result would be a Vega that is both slower and more power hungry (on the order of tens of watts) than HBM Vega.
 
Back then, mixing two different cards was commonplace. Now? No one is going to do that kind of setup, let alone build SLI or CFX gaming PCs...
 
I am sorry if that's the impression you got, but this is not a small thing to be brushed under the rug. A launch driver, beta or otherwise, announced alongside a major feature comes with expectations from the company, and this was a case of internal and beta testing not being done well.
It's the impression the article tries to give. From "a level of lenience" the article goes straight to "no excuse", for a feature that is there for users to check out or just play with, not one advertised as working flawlessly. It's not even mentioned, if I am not missing something. Does AMD advertise Vega 56 + Vega 64 CrossFire? If not, then assuming Vega is still a GCN card and that CrossFire should work as it did on older cards, it's not exactly something I expect to read in a tech site's article. Maybe in the comments section.
 
They are both pretty damn good. It's just that people like crapping on AMD more.
Lol, please....

The amount of shit people peddle knows no brand boundaries. There is misplaced hate everywhere, mostly from the clueless muppets that loooove to post here.
 
80% scaling for normal CrossFire, I have heard.

Is this some sort of FUD, putting "CrossFire" and "terrible scaling" into the same sentence? Why mix a Vega 56 and a Vega 64? Does it work well with a 1070 and a 1080?
 
80% scaling for normal CrossFire, I have heard.

Is this some sort of FUD, putting "CrossFire" and "terrible scaling" into the same sentence? Why mix a Vega 56 and a Vega 64? Does it work well with a 1070 and a 1080?
Because it was a feature they said would work, and it did work a lot better with Fury... as was mentioned previously in the thread.

So they were right... it works. But many titles show no scaling or even negative scaling (the pair ends up slower than a single card). Hopefully they can improve upon it or ditch it.
 
Others are seeing different results with 2 x V64, so for the moment, mixing them does not seem to work well.
 
Why on Earth would you mix a Vega 64 and a 56? Just because it works, it doesn't mean it'll work well. Dual-card setups NEVER worked well with two different cards. Why is this a shock to people some 20 years after dual-card setups first existed?
I have a 290X and a 290; in CFX they work as well as a 295X2, no complaints here.
 
80% scaling for normal CrossFire, I have heard.

Is this some sort of FUD, putting "CrossFire" and "terrible scaling" into the same sentence? Why mix a Vega 56 and a Vega 64? Does it work well with a 1070 and a 1080?
There is no support for SLI on Nvidia 10-series cards.
 
For anyone who is not sure: I have been running CrossFire on all of my cards since the 6850. My latest setup is an RX 480 and an RX 470 in CrossFire at 1305 MHz core and 1750 MHz memory. Before anyone who doesn't have that setup tells me the 480 is running at the 470's level: there is a program called Sapphire Trixx that lets you match clock speeds across your CrossFire cards.

I am at work so I can't post any of my 3DMark scores, but I am pretty sure I get around 7900 in Time Spy (DX12). I also see benchmarks of 80 FPS in Deus Ex: Mankind Divided, Shadow of Mordor is consistently at 120-130 FPS, Sleeping Dogs is at 180 FPS, Ashes of the Singularity is consistently in the mid 60s, and I have gotten averages of 79 FPS in Total War: Warhammer. I always turn everything to the highest settings and do not use AA or ambient occlusion. This is all with a 2560x1440 27" monitor and an R7 1700.

It is my humble opinion that those CrossFire results are down to drivers more than anything else. There is also the fact that DX12 lets you use whatever video resources you have, regardless of AMD or NVIDIA or the series of card(s). The problem is that it is left to the game's developer to implement and refine; as an example, you have to go into the script for Total War: Warhammer and enable two GPUs. Proof that most complaints about PC game performance should be directed at the developer is a game like Viking: Battle for Asgard, which is locked at 30 FPS regardless of your hardware.

People also forget that, like a fine wine, AMD's drivers get better with time. When I had my 7950 CrossFire setup I did not feel a need to change until Polaris, and I will tell you that the thing I like most about Polaris is the power draw versus the Tahiti GPU cards.
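To illustrate that DX12 point, here is a minimal, untested C++ sketch (my own illustration, not code from any actual game engine) of what "left to the developer" means in practice: under D3D12 the application itself enumerates every GPU through DXGI and creates a device per adapter, mixed vendors and models included, and any division of work between those devices is then entirely the engine's job.

```cpp
// Minimal sketch (untested, illustrative only) of DX12 explicit adapter
// enumeration: the application walks every GPU DXGI reports and creates a
// D3D12 device per adapter. Splitting rendering work between those devices
// afterwards is up to the engine/developer, not the driver.
// Build note: link against dxgi.lib and d3d12.lib.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;

    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the software (WARP) adapter

        // Any D3D12-capable adapter is usable here, regardless of vendor,
        // model, or whether the cards match each other.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            wprintf(L"Created device on adapter %u: %s\n", i, desc.Description);
            devices.push_back(device);
        }
    }

    // How frames or workloads are divided across `devices` (AFR, split-frame,
    // offloading post-processing, etc.) is the game engine's responsibility.
    return 0;
}
```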
 
For anyone who is not sure: I have been running CrossFire on all of my cards since the 6850. My latest setup is an RX 480 and an RX 470 in CrossFire at 1305 MHz core and 1750 MHz memory. Before anyone who doesn't have that setup tells me the 480 is running at the 470's level: there is a program called Sapphire Trixx that lets you match clock speeds across your CrossFire cards.

I am at work so I can't post any of my 3DMark scores, but I am pretty sure I get around 7900 in Time Spy (DX12). I also see benchmarks of 80 FPS in Deus Ex: Mankind Divided, Shadow of Mordor is consistently at 120-130 FPS, Sleeping Dogs is at 180 FPS, Ashes of the Singularity is consistently in the mid 60s, and I have gotten averages of 79 FPS in Total War: Warhammer. I always turn everything to the highest settings and do not use AA or ambient occlusion. This is all with a 2560x1440 27" monitor and an R7 1700.

It is my humble opinion that those CrossFire results are down to drivers more than anything else. There is also the fact that DX12 lets you use whatever video resources you have, regardless of AMD or NVIDIA or the series of card(s). The problem is that it is left to the game's developer to implement and refine; as an example, you have to go into the script for Total War: Warhammer and enable two GPUs. Proof that most complaints about PC game performance should be directed at the developer is a game like Viking: Battle for Asgard, which is locked at 30 FPS regardless of your hardware.

People also forget that, like a fine wine, AMD's drivers get better with time. When I had my 7950 CrossFire setup I did not feel a need to change until Polaris, and I will tell you that the thing I like most about Polaris is the power draw versus the Tahiti GPU cards.
I don't think anyone claimed CrossFire doesn't work in general. That's obviously not the case.
But afaict, between matching the right cards (both price- and performance-wise), needing to power (and house) two cards, finding the right driver, and occasionally needing a third-party application (Trixx), it's more hassle than people usually care for in its current form.
 
There isn't?
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/20.html

Also, please edit your posts to add... don't triple post. :)

Got it! I will
I don't think anyone claimed CrossFire doesn't work in general. That's obviously not the case.
But afaict, between matching the right cards (both price- and performance-wise), needing to power (and house) two cards, finding the right driver, and occasionally needing a third-party application (Trixx), it's more hassle than people usually care for in its current form.

I understand what you are saying, but there were a couple of comments in this thread stating that CrossFire is a waste of time.

I think anyone who takes the time to read threads like this is the kind that tries to get the most performance out of their machines. All I wanted to establish is that I have never had a problem with my CrossFire setups.

As far as SLI goes, I do feel like a noob, because my MB did come with an HB SLI bridge.
 