Thursday, June 18th 2015

ASUS Tames AMD's Feisty Grenada Silicon, Intros 0 dBA Idle STRIX Graphics Cards

ASUS has managed to tame AMD's feisty "Grenada" silicon, which powers the Radeon R9 390 and Radeon R9 390X, announcing two high-end graphics cards based on its new triple-fan STRIX DirectCU 3 cooling solution. The cooler switches its fans off when the GPU is idling (common desktop / light-3D loads), and spools them up only under heavy 3D loads. The company claims these will be the quietest R9 390 series cards you can buy.

The STRIX DirectCU 3 cooler is the same one pictured cooling the GeForce GTX 980 Ti STRIX, which we spotted at Computex. It features a huge monolithic aluminium fin-stack heatsink, to which heat drawn from the GPU is fed by four 10 mm-thick nickel-plated copper heat pipes; the stack is ventilated by three 100 mm spinners. The heatsink also makes contact with the card's 8-phase VRM, and a base-plate draws heat from the sixteen GDDR5 memory chips that make up the 8 GB of memory. The R9 390 STRIX offers a factory OC of 1050 MHz (vs. 1000 MHz reference), while the R9 390X STRIX offers 1070 MHz (vs. 1050 MHz reference). The memory ticks at 6.00 GHz on both cards. ASUS didn't announce pricing.
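In control terms, the 0 dBA idle feature amounts to a fan curve with a zero-RPM zone and a hysteresis band, so the fans don't rapidly toggle around a single trip point. A rough sketch of that logic follows; the temperature thresholds and duty figures are illustrative assumptions on our part, not ASUS's actual firmware values.

```python
def fan_duty(temp_c, fans_on, spin_up=60.0, spin_down=50.0):
    """Return (duty, fans_on) for a zero-RPM-idle cooler.

    Fans stay off (0 dBA) until the GPU exceeds spin_up; once running,
    they only stop again below spin_down. The gap between the two
    thresholds (hysteresis) keeps the fans from rapidly toggling on and
    off around a single trip point.
    """
    if not fans_on and temp_c < spin_up:
        return 0.0, False                 # idle / light 3D: fans stay off
    if fans_on and temp_c < spin_down:
        return 0.0, False                 # load gone, cooled down: stop fans
    # Under load: ramp linearly from 30% duty at spin_down to 100% at 85 C.
    duty = 0.3 + 0.7 * (temp_c - spin_down) / (85.0 - spin_down)
    return min(1.0, duty), True
```

The two-threshold design means a card hovering near the cut-off stays silent instead of pulsing its fans on and off.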

21 Comments on ASUS Tames AMD's Feisty Grenada Silicon, Intros 0 dBA Idle STRIX Graphics Cards

#1
GhostRyder
Not a bad-looking design (similar to that 3-fan GTX 960). I think I love the outputs available on it more than anything!
#2
RejZoR
I don't get this gimmick. If any of the manufacturers paid some attention to the fan curves, we'd have had absolutely silent graphics cards for a few years already. Instead, on my massive WindForce 3X, the fan curve was carved by an idiot. Typical summer idle is 55°C, meaning anything below this should be minimum RPM (in my case 30% speed, which is absolutely silent), and fan speed should only increase from there as temperature rises. Not starting already at 40°C and becoming unbearably noisy when it hits just 60°C. It's idiotic. These GPUs can handle 85°C with ease. So why the hell make cards with such massive coolers so noisy?

I'm currently running my HD 7950 core at 1.2 GHz using 1.3 V and VRAM at 6 GHz, and the card hardly makes much noise, yet the temperature remains below 80°C. If I can pull this off in my office, surely a multibillion-dollar company like ASUS or Gigabyte can as well. But they just don't bother doing it right.
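The curve I'm describing is simple enough to express. A rough sketch, with the numbers from my setup (any vendor could tune these for their own cooler):

```python
def proposed_curve(temp_c, idle_temp=55.0, max_temp=85.0,
                   min_speed=0.30, max_speed=1.00):
    """Fan speed fraction: hold the (inaudible) minimum speed up to the
    typical idle temperature, then ramp linearly to full speed at the
    temperature the silicon is rated to handle."""
    if temp_c <= idle_temp:
        return min_speed
    if temp_c >= max_temp:
        return max_speed
    frac = (temp_c - idle_temp) / (max_temp - idle_temp)
    return min_speed + frac * (max_speed - min_speed)
```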
#3
the54thvoid
Meh.

I just can't get excited about it. Much like I wasn't excited about the 280X cards, or the GTX 770s. Old tech is old tech.
#4
hojnikb
RejZoR, post: 3300250, member: 1515:
I don't get this gimmick. If any of the manufacturers paid some attention to the fan curves, we'd have had absolutely silent graphics cards for a few years already. Instead, on my massive WindForce 3X, the fan curve was carved by an idiot. Typical summer idle is 55°C, meaning anything below this should be minimum RPM (in my case 30% speed, which is absolutely silent), and fan speed should only increase from there as temperature rises. Not starting already at 40°C and becoming unbearably noisy when it hits just 60°C. It's idiotic. These GPUs can handle 85°C with ease. So why the hell make cards with such massive coolers so noisy?

I'm currently running my HD 7950 core at 1.2 GHz using 1.3 V and VRAM at 6 GHz, and the card hardly makes much noise, yet the temperature remains below 80°C. If I can pull this off in my office, surely a multibillion-dollar company like ASUS or Gigabyte can as well. But they just don't bother doing it right.
So... when are we expecting to see some shoe-eating madness? :)
#5
Joss
This should have been the 290X launch 18 months ago.
#6
RejZoR
hojnikb, post: 3300266, member: 148747:
So... when are we expecting to see some shoe-eating madness? :)
For as much as they're asking for R9 390Xs, I'm not eating any shoes.
#7
the54thvoid
Joss, post: 3300269, member: 152251:
This should have been the 290X launch 18 months ago.
The maturity of the 28 nm process means TSMC have dialled in the power consumption, allowing higher clocks for less heat and power.
It's still a 290X though, being sold at the same price...
#8
GhostRyder
the54thvoid, post: 3300318, member: 79251:
The maturity of the 28 nm process means TSMC have dialled in the power consumption, allowing higher clocks for less heat and power.
It's still a 290X though, being sold at the same price...
Yeah, but they are a lot better than was expected, and better than even I thought the end result would be.

I don't applaud the rebrand, but it seems the process has improved a decent amount to achieve those numbers. Sadly, I wish we had GCN 1.2 on those cards for its other improvements as well as the power draw. But at least this is a decent step up (especially considering it's lower power while carrying 4 more GB of RAM and a higher core clock of 1100 MHz on that model). The process matured quite a bit for that improvement.
#9
$ReaPeR$
It's not bad, but it's not exciting either. Old tech is old tech, even if it gets the job done.
#10
Joss
the54thvoid, post: 3300318, member: 79251:
The maturity of the 28 nm process means TSMC have dialled in the power consumption, allowing higher clocks for less heat and power.
I'm talking about non-reference cards with proper coolers being readily available.
#11
the54thvoid
GhostRyder, post: 3300329, member: 149328:
Yeah, but they are a lot better than was expected, and better than even I thought the end result would be.

I don't applaud the rebrand, but it seems the process has improved a decent amount to achieve those numbers. Sadly, I wish we had GCN 1.2 on those cards for its other improvements as well as the power draw. But at least this is a decent step up (especially considering it's lower power while carrying 4 more GB of RAM and a higher core clock of 1100 MHz on that model). The process matured quite a bit for that improvement.
To quote a very relevant statement from your link's conclusion:
You can tweak any 290X close to the 390X performance wise, the faster memory won't make a big difference as the 290X already had a ton of memory bandwidth
The MSI Gaming is 1100 MHz, which is 10% above Hawaii speeds. So it trades blows with the card that replaced the 780 Ti (arguably, Nvidia's aim with GM204 was to drastically reduce power consumption while matching GK110 performance).

So R9 390X versus GTX 980 is really R9 290X versus GTX 780 Ti all over again. Yawn (for both). One caveat: the GTX 980 (the GK110 performance rival) draws 171 W in your linked review, while the 390X draws 258 W. So the 390X equals 980 performance using roughly 50% more power. Is that really an achievement?
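For what it's worth, the arithmetic on those two review figures is easy to check (a throwaway sketch, nothing more):

```python
# Sanity check on the power comparison, using the figures quoted from the
# linked review: 171 W for the GTX 980, 258 W for the R9 390X.
gtx_980_w = 171
r9_390x_w = 258

extra = (r9_390x_w - gtx_980_w) / gtx_980_w
print(f"R9 390X draws {extra:.0%} more power than the GTX 980")
# prints: R9 390X draws 51% more power than the GTX 980
```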

And before the light brigade charges in with crap about how TDP doesn't matter: AMD themselves are certainly espousing Fiji's power/performance metrics, even more so with the Nano.

No, these cards are dull. Fury is (looking to be) awesome. Even if it's only on par with the 980 Ti, I think it's a prestige part. Six more days, then we can talk about 980 Ti / Fury X battles.
#12
GhostRyder
the54thvoid, post: 3300342, member: 79251:
To quote a very relevant statement from your link's conclusion:

The MSI Gaming is 1100 MHz, which is 10% above Hawaii speeds. So it trades blows with the card that replaced the 780 Ti (arguably, Nvidia's aim with GM204 was to drastically reduce power consumption while matching GK110 performance).

So R9 390X versus GTX 980 is really R9 290X versus GTX 780 Ti all over again. Yawn (for both). One caveat: the GTX 980 (the GK110 performance rival) draws 171 W in your linked review, while the 390X draws 258 W. So the 390X equals 980 performance using roughly 50% more power. Is that really an achievement?

And before the light brigade charges in with crap about how TDP doesn't matter: AMD themselves are certainly espousing Fiji's power/performance metrics, even more so with the Nano.

No, these cards are dull. Fury is (looking to be) awesome. Even if it's only on par with the 980 Ti, I think it's a prestige part. Six more days, then we can talk about 980 Ti / Fury X battles.
I was not claiming it to be amazing or super interesting, just noting that it's better than it used to be, at least all around. Just talking straight core and memory clocks, all my 290Xs are at 1125/1475, so mine outperform even the MSI. However, they probably use a significantly larger amount of power to do that. I am not claiming this is the greatest thing since sliced bread, just that for what you're getting it's not a bad deal, including that it's improved a bit more than an exact copy (which is where a lot of the debate on how bad the 3XX series is has centered).

I as well am mostly interested in the Fury more than anything else in the series. However, it's nice to see there were at least some improvements down the line instead of just a different sticker on the box.

I agree. I think the GTX 980 (well, the whole series) was aimed at reducing power over its predecessor rather than offering serious improvements in performance (which was reserved for the GTX 980 Ti/Titan X).
#13
the54thvoid
GhostRyder, post: 3300343, member: 149328:
I was not claiming it to be amazing or super interesting, just noting that it's better than it used to be, at least all around. Just talking straight core and memory clocks, all my 290Xs are at 1125/1475, so mine outperform even the MSI. However, they probably use a significantly larger amount of power to do that. I am not claiming this is the greatest thing since sliced bread, just that for what you're getting it's not a bad deal, including that it's improved a bit more than an exact copy (which is where a lot of the debate on how bad the 3XX series is has centered).

I as well am mostly interested in the Fury more than anything else in the series. However, it's nice to see there were at least some improvements down the line instead of just a different sticker on the box.

I agree. I think the GTX 980 (well, the whole series) was aimed at reducing power over its predecessor rather than offering serious improvements in performance (which was reserved for the GTX 980 Ti/Titan X).
Don't get me wrong - I'm not having any fight with you. But really, the silicon is refined to its maximum, and it's been almost 2 years (Fall 2013?). You want to be impressed by maturity - go back to the days of the Fermi GTX 480 and watch the GTX 580 rise from the ashes (metaphorically, though some would say literally!). A chip that some people said couldn't be done.

It's fine if people want to be marginally impressed by it, but in terms of performance I'd expect nothing less. It performs better with less power than the 290X - good. But compared to its green competitor, it's still a power monster, so from that perspective - no change from 1-2 years ago.

But lest I be branded by the circling red talons - I can't wait for the reviews of Fury X. I want to see it perform better than GM200. I want to know its noise (pump/fan) and its gaming pedigree. I hope for AMD's sake it's a game changer (it already is with HBM), but really - a shot in the arse for Nvidia would be good.
#14
xkm1948
Enough with the rebranded stuff, give us Fury X already!
#15
HumanSmoke
GhostRyder, post: 3300329, member: 149328:
Yeah, but they are a lot better than was expected, and better than even I thought the end result would be.
I don't applaud the rebrand, but it seems the process has improved a decent amount to achieve those numbers.
The OC of the MSI card makes the result more marked (and the differing driver used might also contribute). When the clocks are normalized, the difference is marginal.


HardOCP's power consumption figures are at odds with Hilbert's, too.

#16
Tallencor
TPU's First Patreon
Can two of these 390Xs be CrossFired and use 16 GB of memory total with DX12?
#17
Nordic
RejZoR, post: 3300250, member: 1515:
I don't get this gimmick. If any of the manufacturers paid some attention to the fan curves, we'd have had absolutely silent graphics cards for a few years already. Instead, on my massive WindForce 3X, the fan curve was carved by an idiot. Typical summer idle is 55°C, meaning anything below this should be minimum RPM (in my case 30% speed, which is absolutely silent), and fan speed should only increase from there as temperature rises. Not starting already at 40°C and becoming unbearably noisy when it hits just 60°C. It's idiotic. These GPUs can handle 85°C with ease. So why the hell make cards with such massive coolers so noisy?

I'm currently running my HD 7950 core at 1.2 GHz using 1.3 V and VRAM at 6 GHz, and the card hardly makes much noise, yet the temperature remains below 80°C. If I can pull this off in my office, surely a multibillion-dollar company like ASUS or Gigabyte can as well. But they just don't bother doing it right.
How good is your case airflow? How hot is your climate? I bet the GPU vendors take this into account and make a fan curve that will be effective for most users. You, knowing your hardware configuration, can perfect your fan curve. This is one of the benefits of owning a PC.
#18
btarunr
Editor & Senior Moderator
Tallencor, post: 3300445, member: 143014:
Can two of these 390Xs be CrossFired and use 16 GB of memory total with DX12?
You can CrossFire up to four of these cards.
#19
RejZoR
james888, post: 3300460, member: 96457:
How good is your case airflow? How hot is your climate? I bet the GPU vendors take this into account and make a fan curve that will be effective for most users. You, knowing your hardware configuration, can perfect your fan curve. This is one of the benefits of owning a PC.
A miniATX case with the graphics card crammed in next to the PSU, at the bottom of the case, like 2 cm above the floor of the case. Ambient temperature right now: 30°C. Not exactly ideal conditions, and yet I can hardly hear it. If you stick such a card in a larger case, it would work even better...
#20
Casecutter
HumanSmoke, post: 3300371, member: 98425:
HardOCP's power consumption figures are at odds with Hilbert's, too.
Brent goes through the clock-for-clock exercise to disprove any difference from using the extra memory... at the lower speed, at 1440p. We knew that answer, Brent. I was hoping to see power figures. I suppose that to truly find any ASIC quality differences they'd need an 8 GB 290X (hopefully from the same AIB); that would've shown whether there was any proof in AMD's pudding of enhanced efficiency (or of the TDP dropping from 290 W to the now-claimed 275 W).

Edit: OK, just looking at W1zzard's numbers, there's no way they can claim that lower TDP...
#21
GhostRyder
HumanSmoke, post: 3300371, member: 98425:
The OC of the MSI card makes the result more marked (and the differing driver used might also contribute). When the clocks are normalized, the difference is marginal.

HardOCP's power consumption figures are at odds with Hilbert's, too.


Reviews are all over the place on power consumption currently; I frankly do not know what to believe...

Casecutter, post: 3300853, member: 94772:
Brent goes through the clock-for-clock exercise to disprove any difference from using the extra memory... at the lower speed, at 1440p. We knew that answer, Brent. I was hoping to see power figures. I suppose that to truly find any ASIC quality differences they'd need an 8 GB 290X (hopefully from the same AIB); that would've shown whether there was any proof in AMD's pudding of enhanced efficiency (or of the TDP dropping from 290 W to the now-claimed 275 W).

Edit: OK, just looking at W1zzard's numbers, there's no way they can claim that lower TDP...
I am lost after seeing that; the reviews out there all show different things regarding power. I am starting to wonder if samples vary that much, or if there is something up with his specific card.