Sunday, February 28th 2016

AMD Radeon Fury X2 Reference Air Cooled?

AMD, which has been planning to launch its upcoming dual-GPU "Fiji" graphics card sometime this year, may have demoed a production version of the card inside the Tiki high-end gaming desktop from launch partner Falcon Northwest, presented as a "VR developer box." AMD's Roy Taylor, in a recent tweet, captioned a picture of this dev box as "the world's best DirectX 12 VR developer box," leading the press to speculate that it's running the company's dual-GPU "Fiji" card.

A close look at AMD's VR dev box, through its windowed graphics card compartment, reveals an air-cooled AMD reference graphics card, which VideoCardz' trigonometry pins as shorter than a Radeon R9 390X reference board. It could be a reference R9 380X, but a reference dual-GPU "Fiji" PCB is roughly the same length, and an R9 380X wouldn't earn the title of "world's best" from a senior AMD exec while faster AMD cards, such as the R9 Fury, exist. The fact that the full-spec "Fiji" silicon copes well with the rather simple air cooler of the R9 Nano fans even more speculation that a dual-GPU "Fiji" board could make do with a powerful air-channel cooler.
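For context, length estimates like the one VideoCardz performed come down to simple proportionality: measure something of known size in the photo, derive a millimetres-per-pixel ratio, and scale the card's apparent length by it. Below is a minimal sketch of that arithmetic in Python, using entirely hypothetical pixel counts and an approximately 120 mm full-height slot bracket as the assumed reference; it illustrates the method, not VideoCardz' actual measurements.

# Estimate a card's physical length from a photo by scaling against a
# reference object of known size in the same image. All pixel counts are
# hypothetical; the ~120 mm bracket height is an assumed reference.
def estimate_length(ref_length_mm: float, ref_pixels: float, card_pixels: float) -> float:
    """Scale the card's pixel length by the mm-per-pixel ratio of the reference."""
    mm_per_pixel = ref_length_mm / ref_pixels
    return card_pixels * mm_per_pixel

card_mm = estimate_length(ref_length_mm=120.0, ref_pixels=240.0, card_pixels=550.0)
print(f"Estimated card length: {card_mm:.0f} mm")  # 275 mm with these made-up numbers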
Sources: VideoCardz, TweakTown

39 Comments on AMD Radeon Fury X2 Reference Air Cooled?

#26
nickbaldwin86
64K: Nvidia releases dual-GPU cards on a pretty regular basis. 9800 GX2, GTX 295, GTX 590, GTX 690 and Titan Z. Frankly I'm a bit surprised that they haven't released a dual Titan X, but they may not believe it will sell well, especially if they already have a Pascal Titan lined up to release this summer. They may do it anyway. I have no idea how well the Titan Z sold, but there is only one member here that I know of who bought one at $3,000 :eek: I think they are going for around $1,500 new right now.
They have skipped the 790 and 990 so far... I honestly don't count the Titan Z because of the price tag, and I know they didn't sell well. They were a tight niche item; I can't imagine spending $1,000 on a GPU, so $3,000 is shocking!

Or we could just say Nvidia doesn't release 2x GPU cards to the general consumer like AMD does. I have personally seen more 295 cards than I have any other NV x2 GPU variant. So yes, cost plays a huge factor.
#27
GhostRyder
nickbaldwin86: pfft, I am sure they have it engineered, they just need to sell it... but yeah, it was honestly more of a joke than a real question because it will likely never happen. Nvidia rarely releases 2x GPU cards... and I really never thought any of them were worth it.
Well, I would not really say that, because there have not been that many generations since dual GPUs became a thing. Really only a few generations of desktop GPUs were missed that I recall (GTX 4XX series and GTX 9XX). They have probably stopped mostly because it is such a niche product line and because designing coolers for that much power is a bit difficult.
64K: Nvidia releases dual-GPU cards on a pretty regular basis. 9800 GX2, GTX 295, GTX 590, GTX 690 and Titan Z. Frankly I'm a bit surprised that they haven't released a dual Titan X, but they may not believe it will sell well, especially if they already have a Pascal Titan lined up to release this summer. They may do it anyway. I have no idea how well the Titan Z sold, but there is only one member here that I know of who bought one at $3,000 :eek: I think they are going for around $1,500 new right now.
Well, the Titan Z didn't do well mostly because it made little to no sense in most scenarios. It was so niche and costly that the only scenarios I could think of for using it were ones where you had only one PCIe x16 slot on the system, or two slots and wanted to run quad-GPU. The problem was that for the price you could buy 3 Titan Blacks or 4 780 Tis (and still have some change to build part of the rest of the system), which is probably why it's a very rarely seen/niche product.
the54thvoid: Given the ass-backwards naming scheme:

Titan - single GPU
Titan Z - dual Titan (underclocked)
Titan X - single GPU

I suppose the only letter left is 'Y'.

And that would sum up everyone's thoughts.

Titan 'Why?'
I would think of it more as Titan 'Y not' :P
#28
BiggieShady
GhostRyder: It was so niche and costly that the only scenarios I could think of for using it were ones where you had only one PCIe x16 slot on the system, or two slots and wanted to run quad-GPU.
I'd say there is only one scenario for buying a Titan Z - you are a scientist running double-precision simulations for a living, you find Tesla too expensive, so you splurge $3K on the "cheap" Titan Z.
#29
Keullo-e
S.T.A.R.S.
Kanan: If it's a hybrid, it would run at a maximum of 1000 or 1050 MHz, but then again, only 2x 8-pin plugs? That's little compared to a Fury X, which has only one Fiji GPU and also uses 2x 8-pin plugs, which makes it somewhat unlikely again.
Aaaaand? The 295X2 also had 2x 8-pin and draws ~600 W maximum.
#30
cdawall
where the hell are my stars
It could just be two Fury Nanos as far as TDP programming goes, which would be easy to air cool...
#32
Kanan
Tech Enthusiast & Gamer
9700 Pro: Aaaaand? The 295X2 also had 2x 8-pin and draws ~600 W maximum.
Yep, but we all know what a failure THAT was. It needs special PSUs to run; even a high-end PSU could burn, or basically explode, because of it. I think PCGH had an Enermax 1000 W PSU give up the ghost because of the 295X2 and its 2x 8-pin cables drawing an over-specification 400-600 W. So what I thought was, when they do a full-fledged dual-GPU card again (no throttling), they need more than just 2x 8-pin. But we will see, if they ever release the card. ;) And if it's indeed a "Nano X2", then 2x 8-pin would be enough. I think the latter would be the smarter choice; it's more than fast enough and a lot more efficient than a Fury X2 or 2x Fury X. 2x 175 W (350) vs. 2x 275 W (550) TDP. Speed only 10-30% slower (tops).

Btw, the reason the Titan Z sucked was also that it was too slow: they didn't have triple fans on it or a hybrid cooler like AMD had with the 295X2; they went for a single-fan, triple-slot design, which was basically a big mistake as it slowed both GPUs down considerably (because the card was too hot for the GPUs to run at high clocks). And don't forget SLI scales somewhat poorly compared to Crossfire, which didn't help the card either. So who buys a Titan Z for $3K when he can have a 295X2 for $1,200 (or was it $1,500?) that is also faster and water-cooled? Yeah, I think BiggieShady nailed it.

The reason why Nvidia isn't doing a Titan X2 is simple: 1) Fury X CF is faster if they go with a simple cooler again, because those cards run at max speed and a Titan X2 would not. 2) Crossfire scales better than SLI. 3) They earn more money from customers buying 2x Titan X or 2x 980 Ti instead of just one card priced at $1,000 or $1,500-3,000 (depending on whether it's a 980 Ti X2 or a Titan X2). But basically the first reason is the big problem: they don't want a comparison where they lose again, like they lost to the 295X2 with their overly expensive Titan Z. 2x 980 Ti/Titan X is doing fine though.
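As a rough sanity check on the connector argument running through this thread: the PCIe spec budgets about 75 W from the slot plus 150 W per 8-pin connector, so a 2x 8-pin board is nominally good for roughly 375 W. A quick sketch of that arithmetic in Python, using the TDP figures quoted above (175 W per Nano-class GPU, 275 W per Fury X):

# In-spec power budget for a PCIe card: 75 W from the slot, 150 W per 8-pin connector.
SLOT_W = 75
EIGHT_PIN_W = 150

def board_budget(n_eight_pin: int) -> int:
    """Nominal power budget for a card with the given number of 8-pin connectors."""
    return SLOT_W + n_eight_pin * EIGHT_PIN_W

budget = board_budget(2)   # 375 W for a 2x 8-pin board
nano_x2 = 2 * 175          # ~350 W: two Nano-class GPUs fit within the budget
fury_x2 = 2 * 275          # ~550 W: two full Fury X GPUs would have to draw over spec
print(budget, nano_x2 <= budget, fury_x2 <= budget)  # 375 True False

Which is the gist of the argument: a Nano-style dual card fits the 2x 8-pin budget, while a full Fury X2 would have to pull over spec the way the 295X2 did.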
#33
PP Mguire
Kanan: Yep, but we all know what a failure THAT was. It needs special PSUs to run; even a high-end PSU could burn, or basically explode, because of it. I think PCGH had an Enermax 1000 W PSU give up the ghost because of the 295X2 and its 2x 8-pin cables drawing an over-specification 400-600 W. So what I thought was, when they do a full-fledged dual-GPU card again (no throttling), they need more than just 2x 8-pin. But we will see, if they ever release the card. ;) And if it's indeed a "Nano X2", then 2x 8-pin would be enough. I think the latter would be the smarter choice; it's more than fast enough and a lot more efficient than a Fury X2 or 2x Fury X. 2x 175 W (350) vs. 2x 275 W (550) TDP. Speed only 10-30% slower (tops).

Btw, the reason the Titan Z sucked was also that it was too slow: they didn't have triple fans on it or a hybrid cooler like AMD had with the 295X2; they went for a single-fan, triple-slot design, which was basically a big mistake as it slowed both GPUs down considerably (because the card was too hot for the GPUs to run at high clocks). And don't forget SLI scales somewhat poorly compared to Crossfire, which didn't help the card either. So who buys a Titan Z for $3K when he can have a 295X2 for $1,200 (or was it $1,500?) that is also faster and water-cooled? Yeah, I think BiggieShady nailed it.

The reason why Nvidia isn't doing a Titan X2 is simple: 1) Fury X CF is faster if they go with a simple cooler again, because those cards run at max speed and a Titan X2 would not. 2) Crossfire scales better than SLI. 3) They earn more money from customers buying 2x Titan X or 2x 980 Ti instead of just one card priced at $1,000 or $1,500-3,000 (depending on whether it's a 980 Ti X2 or a Titan X2). But basically the first reason is the big problem: they don't want a comparison where they lose again, like they lost to the 295X2 with their overly expensive Titan Z. 2x 980 Ti/Titan X is doing fine though.
I ran a 295x2 on my 1000W TT Toughpower with 0 issues. I'd be willing to bet their PSU was ready to give up the ghost.

Nvidia isn't doing a dual GPU card because they don't need it.
#34
Casecutter
This is a card developed solely for use in VR machines; it is not designed for 3D gaming performance.

Two Nano-style thermal chamber coolers fed by a blower fan that moves air/pressure better than the blade fan of the relatively quiet Nano. Probably the best-binned chips at fairly sedate 700-750 MHz clocks; I doubt that fan would need to run all that fast. Heck, I'll be surprised if it has two 8-pins.

It's all about VR...
#35
PP Mguire
And VR is currently being pushed by the gaming community.
#36
Kanan
Tech Enthusiast & Gamer
PP Mguire: I ran a 295x2 on my 1000W TT Toughpower with 0 issues. I'd be willing to bet their PSU was ready to give up the ghost.

Nvidia isn't doing a dual GPU card because they don't need it.
Still, going over-specification wasn't a wise decision. Many users had problems with the 295X2; it was an extreme graphics card, and as far as I theorize, AMD won't do it again. Information gathered from news about the Fury X2 has made it relatively clear that it will be like a Nano X2, so about 350-400 W TDP, unlike the 295X2 with its consumption of 450-600 W. Meaning it will be a relatively efficient dual-GPU card in the best tradition of the HD 5970/6990/7990.
#37
PP Mguire
Kanan: Still, going over-specification wasn't a wise decision. Many users had problems with the 295X2; it was an extreme graphics card, and as far as I theorize, AMD won't do it again. Information gathered from news about the Fury X2 has made it relatively clear that it will be like a Nano X2, so about 350-400 W TDP, unlike the 295X2 with its consumption of 450-600 W. Meaning it will be a relatively efficient dual-GPU card in the best tradition of the HD 5970/6990/7990.
200 W worth of OC'd CPU and 600 W tops for a stock 295x2 doesn't reach my PSU's continuous wattage, even if we count the 646 W max from the TPU review. I ran 3 Titan X's with my overclocked 3960X on that 1000 W TT and, pulling 1245 W from the wall, it still ran every benchmark I put it through. That's why I said it seems like their unit was on the brink of dying anyway. Any good PSU worth its salt has some overhead, so 850 W tops on a 1000 W PSU shouldn't have killed it. Sure, I've read the reports on the cards and the juice sucking. My good bud didn't have any issues running it on his 1000 W G2 either, and he's the one using the card currently (granted, he moved to a 1300 W eVGA).
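For what it's worth, the headroom argument here is simple arithmetic: a PSU's rating refers to its continuous DC output, while a wall meter also captures the unit's own conversion losses, so a wall reading always overstates the DC load. A rough sketch in Python, with the ~90% efficiency figure being an illustrative assumption rather than a measured value:

# Rough PSU-headroom arithmetic: sum of component DC loads vs. rated output,
# and what a wall-meter reading implies about the actual DC load.
def dc_load_from_wall(wall_watts: float, efficiency: float = 0.90) -> float:
    """A wall reading includes conversion losses; DC load is roughly wall draw * efficiency."""
    return wall_watts * efficiency

psu_rating = 1000               # rated continuous DC output in watts
gpu_load, cpu_load = 600, 200   # ballpark DC figures from the post above
print(gpu_load + cpu_load <= psu_rating)  # True: ~800 W DC on a 1000 W unit
print(dc_load_from_wall(1245))            # ~1120 W DC implied by the 1245 W wall reading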
#38
Kanan
Tech Enthusiast & Gamer
PP Mguire: 200 W worth of OC'd CPU and 600 W tops for a stock 295x2 doesn't reach my PSU's continuous wattage, even if we count the 646 W max from the TPU review. I ran 3 Titan X's with my overclocked 3960X on that 1000 W TT and, pulling 1245 W from the wall, it still ran every benchmark I put it through. That's why I said it seems like their unit was on the brink of dying anyway. Any good PSU worth its salt has some overhead, so 850 W tops on a 1000 W PSU shouldn't have killed it. Sure, I've read the reports on the cards and the juice sucking. My good bud didn't have any issues running it on his 1000 W G2 either, and he's the one using the card currently (granted, he moved to a 1300 W eVGA).
That's okay, but it doesn't change my opinion. Also, I didn't say 650 W; I said 450-600 W, which is pretty realistic for a 295X2 (I don't care much about maximum consumption, it's maybe a few seconds and in a few games only). The 200 W CPU is your problem; I'm only talking GPUs here. The 295X2 is an "okay" card with problems here and there, that's how I see it. The Nano X2 could be much better. Nvidia never did extreme cards, and that was a wise decision on their part.
#39
PP Mguire
Kanan: That's okay, but it doesn't change my opinion. Also, I didn't say 650 W; I said 450-600 W, which is pretty realistic for a 295X2 (I don't care much about maximum consumption, it's maybe a few seconds and in a few games only). The 200 W CPU is your problem; I'm only talking GPUs here. The 295X2 is an "okay" card with problems here and there, that's how I see it. The Nano X2 could be much better.
I'm only refuting the bit above about needing "special" PSUs to run. The CPU figure is there to show exactly how much wattage is being pulled from the unit itself. All it needs is a good PSU to run; you don't buy a juice-sucking monster and put it on a tiny power plant. I do believe it should have had 3x 8-pins though, which would have been epic, but alas, it still runs fine the way it is.

A dual Nano card would indeed be better.