Friday, September 12th 2014

AMD Readies Radeon R9 390X to Take on GeForce GTX 980

It turns out that the big OEM design win liquid-cooling solutions maker Asetek was bragging about is the Radeon R9 390X, and that the "undisclosed OEM" is AMD. Pictures of a cooler shroud are doing the rounds on Chinese tech forums, revealing something similar in design to the Radeon R9 295X2, only designed for a single GPU. The shroud has its fan intake pushed to where it normally sits on single-GPU cards, with cutouts for the PCIe power connectors and a central one through which the liquid-cooling tubes pass.

One can also take a peek at the base-plate of the cooler, which will cool the VRM and memory under the fan's air-flow. The cooler design reveals that AMD wants its reference-design cards to sound quieter "at any cost," even if that means liquid-cooling solutions, which can be messy in multi-card CrossFire setups and in systems that already use liquid cooling for the CPU; air-cooled cards with meatier heatsinks will be left to AIB partners. Other specs of the R9 390X are unknown, as is its launch date. It could be based on a member of the "Pirate Islands" family of GPUs, of which the new "Tonga" GPU driving the R9 285 is a part. A possible codename for AMD's big chip from this family is "Fiji."
Source: VideoCardz

112 Comments on AMD Readies Radeon R9 390X to Take on GeForce GTX 980

#51
buildzoid
the54thvoid: Gosh, lots of angst in here.....

Everyone seems to be missing the point. Why is Maxwell good? More efficiency than Kepler. Why is this good? Because for the entire industry, efficiency is good. For the end user, efficiency is good.

This post is about a water loop design for a card from the outset. That sets alarm bells ringing in this house. The 295x2 was AMD's Lance of Ungodly Awesomeness. It truly slayed Nvidia's most powerful beast and AMD deserve all the credit they can get for making green look dumb as fuck.

HOWEVER, if their next GPU flagship (single chip) requires a water loop as the base construct.... WTF? Every single chip vendor, be it Qualcomm, Nvidia, Intel, AMD's APUs, etc., is driving toward efficiency. It seems unreasonable to slap a water block on a card that didn't need it, so you have to assume that if this is for AMD's next single chip, it's already in the wrong place.

I'm all for a 390X to compete with 980 (or 980Ti, depending on release dates) but not if AMD need to cool it with a water solution.

I'm going to fly against all of you people and call this out as FUD. In the current ecosystem, any move to higher power consumption on a new architecture is a negative move. And if it's not got higher consumption, why slap a water block on it?

Nah, something smells here. I'll believe it only when I see it.
Well, Nvidia's 980 looks like a power-efficient sidegrade compared to the 780 Ti. AMD probably decided that it's a great chance to make a card that is much more powerful, even if it pulls a ton of power. If they make a 3328-3840 SP GPU, it will be much faster than anything Nvidia is making; the only downside is that it'll probably pull 300+ W.
Posted on Reply
#52
the54thvoid
Intoxicated Moderator
Sony Xperia S: I am against what you say and for what AMD want to do. Because ultimately, higher power consumption equates to higher performance than lower consumption would give. ;)

I am positive towards aggressive performance jumps. ;)
Worst case for the 3 GB buffer: a stock 780 Ti versus the 290X Lightning. The 780 Ti has a meagre 3% lead....



On W1zzard's average-power chart, the lower-performing 290X uses 47 W more power.



More power does not equate to more performance. And even on the same arch, more power can lead to poorer returns of scale on performance due to design tolerances.
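
To put rough numbers on the two charts above (a back-of-the-envelope sketch: only the 3% lead and the 47 W gap come from the charts; the 780 Ti's ~230 W average draw is an assumed baseline):

```python
# Quick perf-per-watt comparison. Assumption: the stock 780 Ti averages
# ~230 W in gaming; the chart-sourced figures are the 47 W gap and the
# 3% performance lead.
power_780ti = 230.0
power_290x = power_780ti + 47.0   # 290X Lightning draws 47 W more

perf_780ti = 1.03                 # normalized: 780 Ti leads by 3%
perf_290x = 1.00

ppw_780ti = perf_780ti / power_780ti
ppw_290x = perf_290x / power_290x

print(f"780 Ti: {ppw_780ti:.5f} perf/W")
print(f"290X:   {ppw_290x:.5f} perf/W")
print(f"290X efficiency deficit: {1 - ppw_290x / ppw_780ti:.0%}")
# ~3% less performance for ~20% more power: roughly 20% worse perf/W
# in this worst-case comparison.
```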

It is simply foolish and naive in today's tech world to move your design regime to a higher-power-draw chip, especially if you think it's going to need water cooling from the outset.

The one rational caveat is that AMD have scored a cost-effective cooling solution and simply want a product that is whisper quiet and runs very cool, i.e. it doesn't actually need to be water cooled. That would be good, except if you already own a custom loop.

The second caveat (of technology lore) is that it is so much faster than what they have now they figure the power draw is acceptable. But given what we've seen the past few years, that's unlikely (from either camp).

No, simply, no - if AMD are pushing the power limits skywards, that's a dumb move when everyone is demanding lower power. And let's not forget, there are legislators out there right now cracking down on excessive domestic power consumption.....
Posted on Reply
#53
Lionheart
Hitman_Actual: Garbage
I agree... Your comment is garbage!
Posted on Reply
#54
GhostRyder
the54thvoid: Gosh, lots of angst in here.....

Everyone seems to be missing the point. Why is Maxwell good? More efficiency than Kepler. Why is this good? Because for the entire industry, efficiency is good. For the end user, efficiency is good.

This post is about a water loop design for a card from the outset. That sets alarm bells ringing in this house. The 295x2 was AMD's Lance of Ungodly Awesomeness. It truly slayed Nvidia's most powerful beast and AMD deserve all the credit they can get for making green look dumb as fuck.

HOWEVER, if their next GPU flagship (single chip) requires a water loop as the base construct.... WTF? Every single chip vendor, be it Qualcomm, Nvidia, Intel, AMD's APUs, etc., is driving toward efficiency. It seems unreasonable to slap a water block on a card that didn't need it, so you have to assume that if this is for AMD's next single chip, it's already in the wrong place.

I'm all for a 390X to compete with 980 (or 980Ti, depending on release dates) but not if AMD need to cool it with a water solution.

I'm going to fly against all of you people and call this out as FUD. In the current ecosystem, any move to higher power consumption on a new architecture is a negative move. And if it's not got higher consumption, why slap a water block on it?

Nah, something smells here. I'll believe it only when I see it.
Just because it has a water cooler does not mean it was absolutely necessary.

Take, for example, the recent announcement of 6+8-pin connectors on the 970 and 980, when people thought they would not need that much power. But then, do they even really need that? Maybe, maybe not; it may just be there to give extra headroom to the people buying it, or to relieve any potential stress, similar to the GTX 750 Ti and how it can have the connector or not depending on overclocks.

Similarly, the R9 295X2's water cooler turned out to be great but not completely necessary, as we have seen with the R9 290X Dual Core Devil 13.

Just because the reference design has something like a water cooler does not mean it was absolutely necessary. I think people need to look at this more as something completely different and cool, rather than "ZOMG ITZ GOING TO USE MOAR POWER".

I think it's more that AMD heard people complaining about reference coolers and said, OK, let's show them what we can do.
Posted on Reply
#55
the54thvoid
Intoxicated Moderator
GhostRyder: ...Just because the reference design has something like a water cooler does not mean it was absolutely necessary. I think people need to look at this more as something completely different and cool, rather than "ZOMG ITZ GOING TO USE MOAR POWER".

I think it's more that AMD heard people complaining about reference coolers and said, OK, let's show them what we can do.
Read the last few lines of my post :p

The bit where I say, unless they don't actually need it.....
Posted on Reply
#56
GhostRyder
the54thvoid: Read the last few lines of my post :p

The bit where I say, unless they don't actually need it.....
That post came after I had already started writing/finishing up the post I was working on, so I missed it, lol :P. But yeah, I doubt we are going to run into a card that requires something like three 8-pin connectors or anything crazy, on the reference design at least (and a single GPU).
Posted on Reply
#57
the54thvoid
Intoxicated Moderator
GhostRyder: That post came after I had already started writing/finishing up the post I was working on, so I missed it, lol :p. But yeah, I doubt we are going to run into a card that requires something like three 8-pin connectors or anything crazy, on the reference design at least (and a single GPU).
What buildzoid said might not be far from the truth. There is a chance it does have high-ish power consumption, because AMD have seen that the 750 Ti Maxwell was good on efficiency but not exactly fast (did it not roughly equate in speed to a 660?). With that, they figure the 980 might be that sideways upgrade.

The way I see it coming from NV is that they'll maybe try and tout the 980 SLI pitch for 4K gaming with low power consumption. Frankly, that's what I'm looking for in my next GPU - 4K performance that doesn't eat power like a Silicon Valley troll.

But.... GM210 will probably be a very fast card. NV might not focus too much on its power draw. Let's face it, as HumanSmoke pointed out - the Asetek news article does state a 2015 release. Next year will be the war of the big parts, and perhaps 390X versus 980 Ti could well be a fiery, ungodly eco-disaster!
Posted on Reply
#58
HumanSmoke
the54thvoid: More power does not equate to more performance. And even on the same arch, more power can lead to poorer returns of scale on performance due to design tolerances.
Very easily seen with overclocking. I'm not too sure that some people are aware of how transistor leakage scales with input power. A case of diminishing returns, as you've noted.
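
A minimal sketch of those diminishing returns, assuming dynamic power scales roughly as C·V²·f and using an invented voltage-vs-frequency curve (all numbers illustrative):

```python
# Minimal sketch of diminishing overclocking returns, assuming dynamic
# power ~ C * V^2 * f and that each frequency step needs a voltage bump
# to stay stable. All baseline figures below are assumptions.
base_f = 1000.0   # MHz
base_v = 1.15     # volts
base_p = 250.0    # watts at stock

for oc in (0.00, 0.05, 0.10, 0.15):          # 0-15% core overclock
    f = base_f * (1 + oc)
    v = base_v * (1 + 0.8 * oc)              # assumed voltage required
    p = base_p * (v / base_v) ** 2 * (f / base_f)
    print(f"+{oc:.0%} clocks -> {p:.0f} W ({p / base_p - 1:+.0%} power)")
# +15% clocks costs ~+44% power here; leakage (not modeled) makes the
# real-world curve steeper still.
```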
the54thvoid: The one rational caveat is that AMD have scored a cost-effective cooling solution and simply want a product that is whisper quiet and runs very cool, i.e. it doesn't actually need to be water cooled.
Quite possibly. A good vapour chamber air cooler isn't overly cheap to produce, and if Asetek cut AMD a sweetheart deal that involves not just their next single GPU flagship but the series after that and dual-GPU parts, they'd likely cut their asking price for a long-term association.
the54thvoid: The second caveat (of technology lore) is that it is so much faster than what they have now they figure the power draw is acceptable. But given what we've seen the past few years, that's unlikely (from either camp).
The third caveat is that die shrinking and smaller process nodes tend to generate high localized temps and make the silicon somewhat more sensitive to voltage/transistor leakage/clock speed limitation/cooling. The AIO solution may be intended to forestall any heat dissipation issues with 16nmFF/16nmFF+ (and 14nm if they go that direction). Taking the cooling out of the equation would give the chip design team more leeway in how much they pack into the GPUs.
the54thvoid: No, simply, no - if AMD are pushing the power limits skywards, that's a dumb move when everyone is demanding lower power. And let's not forget, there are legislators out there right now cracking down on excessive domestic power consumption.....
Unfortunately, the R9 295X2 kind of flies in the face of that reasoning (however logical it seems). The card obliterated the PCI-SIG spec and contravened the electrical specification for PSU cabling. I wouldn't be surprised to see both AMD and Nvidia continue to butt up against the PCI-SIG spec at the high end and concentrate their perf/watt metrics on the lower-tier (volume) cards.
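
For reference, the in-spec numbers (slot and connector limits are from the PCI-SIG spec; the 295X2's draw figure is approximate):

```python
# In-spec board power vs. the R9 295X2's real-world draw (approximate).
SLOT_W = 75          # PCIe x16 slot limit per the PCI-SIG spec
EIGHT_PIN_W = 150    # per 8-pin connector, per spec

budget = SLOT_W + 2 * EIGHT_PIN_W   # the 295X2 has two 8-pin inputs
measured = 500                      # W, approximate peak gaming draw

print(f"Spec budget: {budget} W")                       # 375 W
print(f"Approx draw: {measured} W, {measured - budget} W over spec")
# Each 8-pin cable ends up carrying well beyond its 150 W rating,
# which is the cabling-spec contravention mentioned above.
```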
Posted on Reply
#59
GhostRyder
the54thvoid: What buildzoid said might not be far from the truth. There is a chance it does have high-ish power consumption, because AMD have seen that the 750 Ti Maxwell was good on efficiency but not exactly fast (did it not roughly equate in speed to a 660?). With that, they figure the 980 might be that sideways upgrade.

The way I see it coming from NV is that they'll maybe try and tout the 980 SLI pitch for 4K gaming with low power consumption. Frankly, that's what I'm looking for in my next GPU - 4K performance that doesn't eat power like a Silicon Valley troll.
Well, the 750 Ti seems to equate to around a 650 Ti Boost (with a little more than it) while consuming less power.

Well, Nvidia also realized pretty fast that 3 GB on the 780 series was not enough for 4K, hence the push for the Titan/Titan Black/Titan Z, advertised more at 4K gamers, including OEM builds. This round, the 970 should be enough, based on its benches and extra RAM, to perform well in a 4K game in at least SLI (or it should, based on where the others already fall).

Either way, I do not think any of us are running into an eco-disaster with the next series of cards. There are going to be some limits that even AMD would not want to cross when it comes to power usage, because crossing them would bring more risks.
Posted on Reply
#60
Casecutter
A mock-up SLA print that's painted with a rattle can, for who knows what? AND the crowd goes… Full Speculation Mode!

I'll throw in my version of guesswork… Asetek has actually developed a fan/pump/closed-loop waterblock/radiator system that's going to be entirely contained within the envelope of that shroud.
Posted on Reply
#61
HumanSmoke
Casecutter: A mock-up SLA print that's painted with a rattle can, for who knows what? AND the crowd goes… Full Speculation Mode!
Well, isn't that the nature of a tech site? I seem to remember people basing much more speculation on much less at-hand information in the past.
Casecutter: I'll throw in my version of guesswork… Asetek has actually developed a fan/pump/closed-loop waterblock/radiator system that's going to be entirely contained within the envelope of that shroud.
Good luck with that speculation. The radiator fin density would need to be very finely pitched to fit within the confines of the ATX specification for add-in cards (especially when you take into account pump placement in relation to the radial fan), and it would also need a prodigious fan rotation rate. Both seem at odds with the concept of a "quiet" solution, assuming it is feasible in the first place. If the mock-up is indicative of the final product, then the mid-shroud cut-out tends to point to an external tubing/rad option. If the mock-up isn't indicative, then all anyone has to go on is the contract price for one or more SKUs sometime next year, which may, or may not, be from AMD (basically Asetek's press release). If you're referring to Asetek's interposer design, it doesn't include the radiator.
Posted on Reply
#62
HTC
Maybe they're just opting for a water solution because they have VERY low expectations of their air coolers, noise-wise. Pretty much ALL of the reviews thrashed the R9 290X and R9 290 because of sound issues, stating the cards were good BUT the noise held them back (as well as a shoddy stock cooler).

In the previous generation, they held themselves back with that poor-excuse-of-a-stock-cooler: maybe this time around they've actually learned from their mistake?

Ofc, it could be the card DOES need this water solution and is STILL noisy ... but one can hope they make it much quieter this time around ...


Personally, I don't give a rat's ass who "wins" the performance war, as long as the difference to the competitor is small enough that they can battle it out price-wise ...
Posted on Reply
#63
fullinfusion
Vanguard Beta Tester
Just hurry up, AMD, I has 2 slots to fill, plus winter is coming :fear:
I need a room heater, and I'm not being a smart ass... What a way to heat a room!

And these -40°C Canadian winters, OMG are they great for overclocking :eek:
Posted on Reply
#64
GhostRyder
fullinfusion: Just hurry up, AMD, I has 2 slots to fill, plus winter is coming :fear:
I need a room heater, and I'm not being a smart ass... What a way to heat a room!

And these -40°C Canadian winters, OMG are they great for overclocking :eek:

Sorry had to make the joke when I saw that comment ;)

I agree, my room heater is waiting for the cold winter nights. I love not having to run the heater much in the winter (I am a Texas man so excuse me :)).

I wish we had more details surrounding this card.
Posted on Reply
#65
LeonVolcove
Should I wait for this card, or should I buy a Radeon R9 290 instead?
Posted on Reply
#66
overpass
That cooler is def' inspired by the 295X2 behemoth. A refreshing change it was, an AIO CLC LCS. Here's hoping Asetek makes it more configurable, so that CrossFire setups can combine the two rads (with the redundant holes replaced by caps) into one elegant single-loop solution! A liquid-cooling bridge included in place of the CrossFire bridge! It can be done, and it will be abso-freakinlutely awesome! :clap: And the tubing should be transparent, with red fluid clearly showing, with nano-particles that reflect UV light; and if possible, invent a fluid that changes color with temperature. That would be so cool.
Posted on Reply
#67
RejZoR
I guess I'll be buying Radeon again after all. I'll have to wait for some benchmarks, but the lack of interest from NVIDIA's side in actually boosting performance for a next gen that was supposedly so friggin' awesome it skipped a model number also means AMD won't be going mad with performance either (because there won't be any need to). Meaning we'll be friggin' stagnating in the GPU segment for yet another year or two. Yay.
Posted on Reply
#68
fullinfusion
Vanguard Beta Tester
GhostRyder
Sorry had to make the joke when I saw that comment ;)

I agree, my room heater is waiting for the cold winter nights. I love not having to run the heater much in the winter (I am a Texas man so excuse me :)).
But a degree is a degree lower, so it for sure helps lol

I wish we had more details surrounding this card.
Texas in the winter is like a mild summer day here... Man some ppl lol ;)
Posted on Reply
#69
bubbleawsome
I used to live in Texas. One 500 W PC kept half the house heated well. :P My room was ~75 °F all winter.
Posted on Reply
#70
fullinfusion
Vanguard Beta Tester
bubbleawsome: I used to live in Texas. One 500 W PC kept half the house heated well. :p My room was ~75 °F all winter.
Texas is like Florida minus the big-ass spiders lol.... @cadaveca you still have that picture from when you were out in your shed benching at like -30°C or better, to show this lad what winter is about? lol
Posted on Reply
#71
Sony Xperia S
the54thvoid: the lower-performing 290X uses 47 W more power
Honestly, I don't give a fuck about those +47 W.

Do you know why?

Because the time I spend using the card at those consumption levels is so little that it makes a very negligible difference in my electricity bill. I am not greedy about several dozen cents, or even a few euros.
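
To put a rough number on that (a back-of-the-envelope sketch; the hours per day and the electricity rate are both assumptions):

```python
# Back-of-the-envelope yearly cost of the extra 47 W under load.
# Assumptions: 3 hours of gaming a day, EUR 0.25 per kWh.
extra_w = 47
hours_per_day = 3
eur_per_kwh = 0.25

kwh_per_year = extra_w / 1000 * hours_per_day * 365   # ~51 kWh
cost = kwh_per_year * eur_per_kwh

print(f"~{kwh_per_year:.0f} kWh/year -> ~EUR {cost:.2f}/year")
# On the order of ten euros a year, i.e. noise on most power bills.
```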

And yes, you are right, there is a sweet spot where performance/power consumption is optimal, and beyond that you spend more power for a smaller performance increase.

But do you really give a fuck, or is it just for the sake of argument? :D
Posted on Reply
#72
the54thvoid
Intoxicated Moderator
Sony Xperia S: Honestly, I don't give a fuck about those +47 W.

Do you know why?

Because the time I spend using the card at those consumption levels is so little that it makes a very negligible difference in my electricity bill. I am not greedy about several dozen cents, or even a few euros.

And yes, you are right, there is a sweet spot where performance/power consumption is optimal, and beyond that you spend more power for a smaller performance increase.

But do you really give a fuck, or is it just for the sake of argument? :D
1) I owned a Titan and now a Classified 780Ti - the expense to me isn't relevant.
2) It's the overall ethical stand I'm taking - it's better that the tech industry moves to lower-power devices, for all our sakes.
3) I'm not arguing about anything, just stating facts and my opinions.
4) I'm happy you don't give a fuck, have a balloon and some candy.
Posted on Reply
#73
1d10t
When people start arguing efficiency vs. heat, there's no end to this. Take a look at the CPU side: Intel got "more efficient" and "way faster", yet people blindly forget that Intel chips operate at nearly twice the temperatures AMD's had. I mean, come on, an Intel CPU's idle temperature is an AMD CPU's load temperature.
The same matters apply here. If AMD decides to slap closed-loop watercooling on its next single, dual, or whatever GPU, it clearly indicates that legacy fan cooling is no longer a viable option. Yes, it will be hot, but not hot as hell like before. It seems AMD's FIRST move of embedding closed watercooling on its high-end GPUs will continue in the year ahead... and will likely be followed by nVidia sooner or later :p
I don't mind more power hunger or excessive heat, I just need the ability to do MSAA, HBAO+, SSAO, multi-monitor at reasonable prices :roll:
Posted on Reply
#74
HumanSmoke
1d10t: I don't mind more power hunger or excessive heat
Just as well, since it's here to stay. Both AMD and Nvidia need GPGPU to survive - HSA, cloud computing/VDI, HPC, workstation. Nvidia built an empire from professional graphics, and while bang-for-buck might be OK for the consumer market (although in truth all it has done is allow AMD to retain parity), the pro markets are predicated upon high-core-count, high-bandwidth parts. The pro markets also tend to pay the bills better than most other segments... 5% of graphics unit sales account for 30% of discrete board revenue, and that percentage grows as the low end of the market gets gobbled up by IGPs. With OpenCL finally starting to see some kind of adoption, AMD don't really have any reason not to bifurcate their product line as Nvidia has done: a monolithic GPU for pro and high-end consumer (where the large die and R&D are largely paid off by high-ASP pro parts), and stripped-down "good enough" mainstream/low-end parts, area-ruled for maximum yield and a prioritized feature set.
Posted on Reply
#75
Sony Xperia S
the54thvoid: It's the overall ethical stand I'm taking - it's better that the tech industry moves to lower-power devices, for all our sakes.
Why do you think it's better? Because the current trend says so, or do you have another point about caring for the planet?

For the sake of it, it would be better if there were no devices at all that need electricity from the grid. :D

Keep in mind that lower instantaneous power consumption does not always mean lower absolute power consumption in the end, simply because when you do the final calculation, it might turn out that you actually burnt more power by using your device for a longer time. :D
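
A quick sketch of that point (illustrative numbers only): energy is power × time, so a slower, lower-power device can still burn more in total:

```python
# Lower draw does not guarantee lower total energy: energy = power * time.
# Illustrative numbers only: the slower card takes 70% longer per job.
fast = {"watts": 250, "hours": 1.0}
slow = {"watts": 165, "hours": 1.7}

for name, card in (("fast", fast), ("slow", slow)):
    print(f"{name} card: {card['watts'] * card['hours']:.0f} Wh")
# fast card: 250 Wh; slow card: ~280 Wh -- more total energy despite
# the lower instantaneous draw.
```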
Posted on Reply