
ASUS Radeon R9 Fury STRIX 4 GB

The STRIX GTX 980 is an OC version, while the STRIX Fury is running at stock clocks. Still, a +50 MHz OC on the 980 won't change much, tbh.
The STRIX 980 is a STRIX version and so is the STRIX Fury; whether or not they could factory-overclock it is not our problem. It's just a shame that W1zzard won't add at least the GeForce 980 STRIX to the comparison, as I believe it would be very helpful for all of us buying a card in that $550-600 range.

I am grateful for the review, but anyway:
 
won't add at least the GeForce 980 STRIX to the comparison
There is also the issue that I don't have the 980 STRIX anymore, so I wouldn't be able to test it on new drivers.
 
One other thing I disliked:
AMD claims that the pump noise has been totally eliminated in a new batch of these cards.
I don't think AMD claim any such thing. From their statement on the matter:
adjustments in the sound baffling adhesive compound were applied in the assembly of the high speed cooling pump to address the specific sound a few end users experienced as problematic. This improved the acoustic profile of the pump, and repeat testing shows the specific pitch/sound in question was largely reduced through adjustments to the sound-baffling adhesive compound in the pump.
How much probably depends on the user and the specific card in question, but it falls short of the claim you are making on AMD's behalf.
 
LOL that Far Cry 4 anomaly :laugh:
Far Cry 4 seems to be CPU limited on AMD cards to around 78 FPS, which causes some large random performance swings at the "wall".
I've rerun FC4 on Fury and Fury X and updated the graphs accordingly.
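To illustrate the effect, here's a toy sketch, not the actual test methodology; the ~78 FPS wall is from the post above, and the GPU throughput and stall odds are made up:

```python
import random

# Toy model of a CPU-limited benchmark: the GPU could render faster than the
# CPU can feed it, so measured FPS sits at the CPU "wall" plus run-to-run noise.
CPU_WALL_FPS = 78.0   # approximate CPU limit mentioned above
GPU_FPS = 95.0        # hypothetical uncapped GPU throughput (assumption)

def benchmark_run(frames=500):
    total_time = 0.0
    for _ in range(frames):
        gpu_time = 1.0 / GPU_FPS
        cpu_time = 1.0 / CPU_WALL_FPS
        if random.random() < 0.02:               # occasional CPU stall
            cpu_time *= random.uniform(2.0, 4.0)
        total_time += max(gpu_time, cpu_time)    # each frame waits for the slower side
    return frames / total_time

# Averages scatter randomly a few FPS below the wall from run to run.
print(["%.1f" % benchmark_run() for _ in range(5)])
```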
 
Well, it's hard for me to say: good card, worth it.
 
There is also the issue that I don't have the 980 STRIX anymore, so I wouldn't be able to test it on new drivers.
Thank you for the quick response; I really appreciate it and your tests, especially the relative performance charts, which were among the first on the scene, if not the first.
More questions: are you considering adding FCAT tests in the near future, and why or why not? Do you think they are important, since they also show the user experience, not only average FPS?
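To illustrate what I mean, a minimal sketch with invented frame times (real FCAT data comes from a capture card reading colored overlay bars, not software timestamps):

```python
# A run can post a respectable average FPS while still stuttering visibly,
# which is exactly what frame-time metrics are meant to expose.
frame_times_ms = [16.7] * 95 + [50.0] * 5   # mostly smooth, five bad frames

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
p99_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99) - 1]

print(f"average: {avg_fps:.1f} FPS")               # ~54 FPS, looks fine on a bar graph
print(f"99th-percentile frame time: {p99_ms} ms")  # 50 ms spikes = visible stutter
```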
 

Hey, can you maybe add Assetto Corsa and DiRT Rally to the list of games tested, or at least one of them? They are probably the two best games when it comes to physics in the racing genre, so it would be really useful for us sim racers.
Thank you.
 
"I wish more AMD partners built R9 Fury cards because this SKU is definitely a better rounded package than the R9 Fury X."

Hit the nail on the head. I'd go further and say that the Fury is a better card than the Fury X. It may not perform as well, but it's positioned much better, so it actually offers a compelling alternative to the GTX 980. I don't know what, if anything, AMD can do about the price, because Fiji is already hella expensive.

It feels to me that AMD should've skipped the Fury X altogether (at least for now, while they're still working out the overclocking) and released the plain Fury as their first Fiji SKU. A few months down the line, they could've released a better Fury X (similar to how NVIDIA released the 980 Ti) and a lot of people would've been a lot happier.
 
FCAT tests in the near future
I have no plans for FCAT testing; I'd rather provide a large selection of games and resolutions, and other reviews have FCAT data. You should always consider multiple reviews anyway.

Hey, can you maybe add Assetto Corsa and DiRT Rally
I'm thinking about F1 2015, which has the new version of the EGO engine.
 
Great review and great card. If NVIDIA wants to release yet another cut-down GM200 card to better compete with this, they have no names left (well, maybe a GTX 980 with 2560 shaders, like the old Fermi GTX 560 Ti 448). Most likely they'll just lower the price of the GTX 980, if that's even needed.

All Fury cards will be custom PCBs; there is no reference PCB for the Fury. The Fury X PCBs are all manufactured by AMD (Sapphire, actually) and given to the other companies for distribution, so they can't even use that PCB for their Fury cards. Well, Sapphire can.

AFAIK the reference PCB is the same as the one seen in the Fury X; see the AnandTech Tri-X review:
http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/2
 
Funny how much the review style comes into question after so many years, just to try and paint AMD in a better light. The card isn't even bad; it's definitely better all around than the Fury X, or at least more interesting. Really glad to see multi-monitor power usage is down so much; it almost seems like it took a third-party vendor to fix that...
 
Possibly dumb question, but then why is the Fury X's multi-monitor draw still double this card's? I've also seen in other reviews that the Fury X was still quite a bit higher in multi-monitor. It just doesn't seem like AMD themselves fixed the core issue.
 
Possibly dumb question, but then why is the Fury X's multi-monitor draw still double this card's? I've also seen in other reviews that the Fury X was still quite a bit higher in multi-monitor. It just doesn't seem like AMD themselves fixed the core issue.
Most reviews I've seen put multi-monitor draw at around 20-25 watts on both the Fury and the Fury X. Do you have a particular review in mind that says otherwise? The reduced multi-monitor idle usage is definitely a perk.
 
Possibly dumb question, but then why is the Fury X's multi-monitor draw still double this card's? I've also seen in other reviews that the Fury X was still quite a bit higher in multi-monitor. It just doesn't seem like AMD themselves fixed the core issue.
The pump
 
Forget about birdie :pimp: he's an AMD fanboy.

Actually, I've never owned a single AMD GPU, but I'd like to see credit given where it's due.

Riva TNT2 -> GeForce4 MX 440 8x -> GeForce FX 5600 (owned for just a month - I hated it) -> GeForce 6600 -> GeForce 7600 GT -> GeForce 8800 GT -> (currently) Gigabyte GeForce GTX 660. I intend to buy a GeForce 960 Ti if it gets released; if not, I will wait for Pascal.

The reason why I've always avoided ATI/AMD is their awful drivers. They still are awful.
 
Yes, and only Sapphire is using it, because they were the only ones to produce it in the first place. I addressed this in the original post.

Yeah, I did that spoiler thing to hide my own mistake :laugh:

Is there any news from the other AIBs? I have a hard time believing that XFX, PowerColor, HIS, MSI, Gigabyte, etc. will skip this card.
 
I'm not sure how I should handle non-reference cards, even if I wanted to. Nearly every manufacturer sends me their cards, so do I include them all? Or just certain manufacturers? Which ones?

I personally agree with the suggestion to somehow incorporate them. You could add the best one or two in the noise section, or just the closest one to that model (Strix vs. Strix). Perhaps they don't even need to be added to the graph; just mention them in the text.

One thing I'd love to see for the best coolers (Strix, MSI Gaming, etc.) is ways to quiet down the cooler without affecting performance. Perhaps with a 75-80 °C target temperature.
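This is essentially what a custom fan curve in MSI Afterburner does; here's a minimal sketch of the idea as a closed-loop controller, where read_gpu_temp and set_fan_pwm are hypothetical stand-ins and the target/gain values are my own assumptions:

```python
import time

TARGET_C = 78            # aim inside the 75-80 °C window suggested above
MIN_PWM, MAX_PWM = 25, 100
GAIN = 0.5               # PWM % adjustment per °C of error per tick (assumed)

def fan_loop(read_gpu_temp, set_fan_pwm):
    """Let the GPU run warmer so the fans can stay slower and quieter,
    ramping up only when the temperature drifts past the target."""
    pwm = MIN_PWM
    while True:
        error = read_gpu_temp() - TARGET_C          # positive = too hot
        pwm = max(MIN_PWM, min(MAX_PWM, pwm + GAIN * error))
        set_fan_pwm(pwm)
        time.sleep(2.0)                             # slow loop avoids fan hunting
```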

GPUs are so loud at stock that even people who don't care that much about noise would rather pay $30 extra to upgrade to a Strix.
 
Despite that amazing 10+2-phase VRM, ASUS set the power limit at a mere 216 W for the Fury Strix. A custom BIOS is needed to unleash this beast.
 
Despite that amazing 10+2-phase VRM, ASUS set the power limit at a mere 216 W for the Fury Strix. A custom BIOS is needed to unleash this beast.

Power phases don't always allow far higher performance; they can be quite a sales gimmick - I know, having owned some cards with silly phase counts. If the chip already runs at the top end for its architecture, better power circuitry really means the card runs within its power limit more efficiently. In other words, if the Fiji chip is near 100% of its 'theoretical' performance, the added power circuitry helps it use power more effectively, thus drawing less power.
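As a back-of-the-envelope illustration of that efficiency argument (the current and resistance figures are assumptions for illustration, not ASUS's specs, and switching losses are ignored):

```python
# Conduction loss per VRM phase = I_phase^2 * R, so total loss is
# phases * (I_total / phases)^2 * R: more phases, less wasted power.
TOTAL_CURRENT_A = 200.0   # assumed GPU core current under load
R_PHASE_OHM = 0.005       # assumed effective resistance per phase

for phases in (4, 6, 10):
    i_phase = TOTAL_CURRENT_A / phases
    loss_w = phases * i_phase ** 2 * R_PHASE_OHM
    print(f"{phases:2d} phases: {i_phase:5.1f} A/phase, {loss_w:5.1f} W lost")
# 4 phases: 50 W, 6 phases: 33 W, 10 phases: 20 W - diminishing but real gains
```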

This shows quite a good comparison of what benefit the power phases give:

[attached chart: power consumption comparison, ASUS Strix vs. reference-clocked Sapphire R9 Fury]


The Strix draws 29 watts less (7% lower draw) than the reference-clocked Sapphire card.
 
[attached chart: power consumption comparison, ASUS Strix vs. reference-clocked Sapphire R9 Fury]

The Strix draws 29 watts less (7% lower draw) than the reference-clocked Sapphire card.

It's clear that you don't know about the Sapphire card's 300 W power limit. The higher power limit allows the card to stay at its boost clock in more circumstances, hence the higher power consumption. Not to mention the difference in stock voltage, btw.
 
Actually, I've never owned a single AMD GPU, but I'd like to see credit given where it's due.

Riva TNT2 -> GeForce4 MX 440 8x -> GeForce FX 5600 (owned for just a month - I hated it) -> GeForce 6600 -> GeForce 7600 GT -> GeForce 8800 GT -> (currently) Gigabyte GeForce GTX 660. I intend to buy a GeForce 960 Ti if it gets released; if not, I will wait for Pascal.

The reason why I've always avoided ATI/AMD is their awful drivers. They still are awful.

Is it not a tad ignorant to claim "awful drivers" if you've never used the product?
What makes you even think they are awful?
I've been using my HD 6950 for years now and have had no issues of any kind with the drivers
(unlike my previous NVIDIA 8800 GTS (G92) and the 7900 GTO before it, where settings would reset, the control panel crashed whenever I tried to start it, and I had to download extra software to be able to tweak anything; personally, I always found NVIDIA's software to feel a lot more crude and unsophisticated compared to CCC, an opinion born of experience).
 
It's clear that you don't know about the Sapphire card's 300 W power limit. The higher power limit allows the card to stay at its boost clock in more circumstances, hence the higher power consumption. Not to mention the difference in stock voltage, btw.

I intentionally mentioned the reference Sapphire card because it's a better comparison for the power phase efficiency of the Strix. At 1000 MHz, the Sapphire reference card draws more power than the Strix does at 1020(?) MHz. That's what good power phases are for. It's clear you don't understand the point of a good power circuit.
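To make that disagreement concrete, here's a toy model of what a board power limit does to sustained clocks. The 216 W / 300 W limits and the 1020 MHz boost clock come from this thread; the power-at-boost figure is purely an assumption, and real boost behavior also depends on voltage, temperature, and workload:

```python
def sustained_mhz(power_limit_w, boost_mhz=1020, base_mhz=800,
                  power_at_boost_w=250):
    """Crude model: power scales roughly with clock over this narrow range,
    so the card throttles below boost once the configured limit bites."""
    if power_at_boost_w <= power_limit_w:
        return boost_mhz                      # limit never bites, full boost
    return max(base_mhz, boost_mhz * power_limit_w / power_at_boost_w)

for limit in (216, 300):   # the Strix and Sapphire limits quoted above
    print(f"{limit} W limit -> ~{sustained_mhz(limit):.0f} MHz sustained")
```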
 