
AMD Radeon R9 Nano CrossFire

TechPowerUp, you guys seriously have to take Project CARS, Wolfenstein: The New Order and World of Warcraft out of your test suite. These three games really damage the real performance index of the AMD cards. Just take a look at all the other games and notice that the performance the Nano, Fury and Fury X cards present does not agree with the final performance summary, and that is just because of those three aforementioned games.
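To see how much a handful of titles can move a plain average, here is a quick sketch with invented numbers (not TechPowerUp's actual data or summary method, just an illustration of the arithmetic):

Code:
# Hypothetical relative-performance scores (in %, higher is better).
# These numbers are invented for illustration; they are not TPU's data.
scores = {
    "Battlefield 4": 98, "Crysis 3": 97, "GTA V": 95, "The Witcher 3": 99,
    "Project CARS": 62, "Wolfenstein: The New Order": 70, "World of Warcraft": 68,
}
outliers = {"Project CARS", "Wolfenstein: The New Order", "World of Warcraft"}

def average(values):
    values = list(values)
    return sum(values) / len(values)

with_all = average(scores.values())
without = average(v for g, v in scores.items() if g not in outliers)
print(f"summary with all games:      {with_all:.1f}%")   # ~84%
print(f"summary without those three: {without:.1f}%")    # ~97%

With a plain average, three weak titles out of seven pull the summary down by well over ten points, which is the effect being complained about here.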

No, AMD let their cards' performance be damaged by not having the best DX11 hardware and/or driver solutions.
But rejoice and flap thy red cape: future games that choose to go down certain feature paths in DX12 will seriously level the playing field. The new Deus Ex title is AMD-sponsored and comes out in February. If it uses heavy asynchronous shading, it'll be very good for Tonga-, Hawaii- and Fiji-based cards.
But then, should I ask for that to be taken out of future benchmarks for being unfair to Nvidia?
 
I think those games should stay there.

Looking at those 2-3 games and comparing the results with the other 20+ games reveals the developers who will happily take someone's money and, in my opinion, screw their own customers by giving them an inferior product while asking full price. We should NOT protect those developers by hiding the benchmark results of their games. No. Those games should be there for everyone to see. Then put those developers on a blacklist and NEVER pay full price for their games. Wait until those games come down to 1/4 of the original price and only then consider buying them.

Just my opinion of course: leave those games there. Let everyone know whose games NOT to buy. Developers who happily screw AMD owners today will, in a monopoly tomorrow, happily screw every owner of "last gen" hardware in favor of the "next gen" hardware, to force them to upgrade.
 
Thanks W1zzard, another great job. Keep up the good work!

That's another thing we should stop complaining about: it's not the GPU maker's fault if a game does not run well on its GPU; it's the game itself that was poorly built. The GPU and its driver (and its GCN architecture) were already there for the game developers to make good use of. Hardware companies should stop trying to make shitty games run well if they weren't meant to in the first place.

So, if you can't just remove games that don't run well on AMD hardware, then to balance things out it would be fair to include games that run well on AMD hardware but not on Nvidia hardware.
Oy...

Just quoting this in hopes he reads his own post to see how asinine this suggestion actually is on so many levels.

@john, dirt rally runs terribly on amd hardware for some reason... it's an amd game too iirc.
 
Great review, but I think I'll go with a second 390, considering two 390s cost about the same as a Nano. I'll hold off on first-generation technology and let all the early adopters enjoy the high price tag, so people like me can live vicariously through the review but still go with the cheaper option. Good job @W1zzard. You make me wish I had more money for computer components. :p
 
@john, dirt rally runs terribly on amd hardware for some reason... it's an amd game too iirc.
Dirt Rally Performance Review - GeForce GTX 970 Versus Radeon R9 390 - Page 4 of 4 - Legit Reviews
Codemasters: Dirt Rally Performance Benchmark
It doesn't look bad.
[Chart: Codemasters Dirt Rally benchmarks at 1920x1080]

Even if we go back to before the introduction of the 300 series
Dirt Rally: Ersteindruck des geistigen "Colin McRae 2015" mit Benchmarks von 20 Grafikkarten (first impressions of the spiritual "Colin McRae 2015", with benchmarks of 20 graphics cards)
the results are what someone would expect: not every card from one manufacturer is faster than every card from the other manufacturer.
 
"Far Cry 4 is an interesting test. The R9 Nano CrossFire not only doesn't scale at 4K, but also sees a performance drop. In the same test, the dual-GPU R9 295X2 scales just fine. It goes to show that AMD still needs to refine drivers for the "Fiji" GPU"

AMD doesn't need to do anything. They have fixed poor performance in drivers 15.7.1 and 15.8. You see poor performance here because you guys used the 15.7 drivers, for whatever reason.
 
AMD doesn't need to do anything. They have fixed poor performance in drivers 15.7.1 and 15.8. You see poor performance here because you guys used the 15.7 drivers, for whatever reason.
Help me out please: which driver did I use for the Nano?
 
That's another thing we should stop complaining about: it's not the GPU maker's fault if a game does not run well on its GPU; it's the game itself that was poorly built. The GPU and its driver were already there for the game developers to make good use of. Hardware companies should stop trying to make shitty games run well if they weren't meant to in the first place.

So, if you can't just remove games that don't run well on AMD hardware, then to balance things out it would be fair to include games that run well on AMD hardware but not on Nvidia hardware.
No, AMD let their cards' performance be damaged by not having the best DX11 hardware and/or driver solutions.
But rejoice and flap thy red cape: future games that choose to go down certain feature paths in DX12 will seriously level the playing field. The new Deus Ex title is AMD-sponsored and comes out in February. If it uses heavy asynchronous shading, it'll be very good for Tonga-, Hawaii- and Fiji-based cards.
But then, should I ask for that to be taken out of future benchmarks for being unfair to Nvidia?

No, you don't, because it will be more balanced.
You see, what I'm trying to say is that Nvidia has a history of making games run badly on AMD cards, and probably that is what's going on with those three games.
Also, GameWorks is a terrible thing for the gaming community...
 
From the review, "AMD R9 Nano: 15.201.1102"

Latest 15.8 driver is 15.201.1151 :). Far Cry 4 performance has been fixed.
Hmm .. I used the latest driver from AMD's FTP, recommended for R9 Nano reviews, marked as "September 1" build. As far as I know there is no newer driver for R9 Nano.

Your .1151 driver is from August 23, so it's older... amd-catalyst-15.8beta-64bit-win10-win8.1-win7-aug23.exe
 
From the review, "AMD R9 Nano: 15.201.1102"

Latest 15.8 driver is 15.201.1151 :). Far Cry 4 performance has been fixed.

Evidently not fixed. This is always going to be a problem with dual GPU cards.
 
I think only one thing: in 2015 it is no longer acceptable to publish reviews with just a number and that's it; it's absurd. A chart with the trend of fps over the whole duration of the test should be published, as sketched below.
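For what it's worth, such a chart only needs per-frame render times. A minimal sketch of the idea (the frame times below are invented, and this is not tied to any particular capture tool):

Code:
# Turn a log of per-frame render times (in ms) into the data for an
# fps-over-time chart, plus the figures a single average hides.
# Invented input: a steady ~60 fps run with a stutter in the middle.
frame_times_ms = [16.7] * 500 + [40.0] * 20 + [16.7] * 500

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

n = max(1, len(frame_times_ms) // 100)       # the 1% slowest frames
worst = sorted(frame_times_ms)[-n:]
low_1pct_fps = 1000.0 * n / sum(worst)

# Frames counted per elapsed second: the "trend" a chart would plot.
trend, elapsed, frames = [], 0.0, 0
for t in frame_times_ms:
    elapsed += t
    frames += 1
    if elapsed >= 1000.0:
        trend.append(frames)
        elapsed -= 1000.0
        frames = 0

print(f"average fps: {avg_fps:.1f}, 1% low fps: {low_1pct_fps:.1f}")
print("fps per second of the run:", trend)

Here the average reads ~58 fps while the 1% lows sit at 25 fps, exactly the kind of stutter a single summary number hides and a trend chart would show.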
 
Maybe I missed it, but in your conclusion you neglected the power usage as a benefit of the Nanos in CrossFire vs. Fury/Fury X. Seems that this performance at ~350 W would be a definite plus.
 
A few cents? Architectural changes are more expensive than a few cents; get your information right, please. Is it not obvious to you that they would have done it if it had cost a few cents?

Architectural changes? You do know how active DisplayPort to HDMI converters work, right? Those little dongles contain a tiny chip that works by processing the DP video signal and converting it to an HDMI video signal: a very small piece of silicon no bigger than a few square millimeters. And how much do those cost companies like AMD? Yeah, you guessed right: cents. There is no need to change anything in the actual architecture of the GPU to add a tiny video converter to the board to correctly support HDMI 2.0, but they decided to transfer that burden to the people who want to use this card in a home theater situation, plugged into one of the thousands of existing 4K TVs. So it seems like you should get your information right.

Also, I've been looking for this fabled DisplayPort to HDMI 2.0 converter you say people have mentioned a "zillion times". It turns out such converters are not even available as of September 2015, as seen in numerous forum threads filled with people looking for this sought-after piece of hardware:

http://www.avsforum.com/forum/35-ca...-1-2-hdmi-2-0-adapter-there-manufacturer.html

http://www.tomshardware.com/forum/id-2258794/powering-hdmi-devices-mini-gpu.html

http://hardforum.com/showthread.php?t=1853226

There are some cheap adapters you can find online claiming to convert a DP signal to HDMI 2.0, but the reviews for these adapters are filled with angry customers warning other people to stay away from them, as they are HDMI 1.4 at most and thus capable of only 30 Hz at 4K.

One such adapter claims to be able to drive 4K at 60 Hz, but it doesn't even list official HDMI 2.0 support, and some reviewers report this adapter uses a trick similar to what Nvidia did to enable 60 Hz on early Maxwell cards, by downsampling the video signal to 8-bit:

http://www.amazon.com/dp/B00ZA067MA/?tag=tec06d-20

So, if you know where to find this adapter you speak of, would you kindly share a link to the product page, so the hundreds of people who are looking for it on hardware forums can purchase it?
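For what it's worth, the 30 Hz ceiling those reviews describe, and the subsampling "trick" above, are consistent with simple bandwidth arithmetic. A rough sketch, counting active pixels only and ignoring blanking overhead (so the real margins are even tighter than shown):

Code:
# Rough uncompressed video data rates for 4K, active pixels only.
# Usable payloads assume 8b/10b encoding: HDMI 1.4 ~10.2 Gbps raw
# -> ~8.16 Gbps usable; HDMI 2.0 ~18 Gbps raw -> ~14.4 Gbps usable.
def gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

uhd60_444 = gbps(3840, 2160, 60, 24)   # 8-bit 4:4:4 -> ~11.9 Gbps
uhd30_444 = gbps(3840, 2160, 30, 24)   # 8-bit 4:4:4 -> ~6.0 Gbps
uhd60_420 = gbps(3840, 2160, 60, 12)   # 8-bit 4:2:0 -> ~6.0 Gbps

print(f"4K60 4:4:4 needs ~{uhd60_444:.1f} Gbps: over HDMI 1.4's ~8.16")
print(f"4K30 4:4:4 needs ~{uhd30_444:.1f} Gbps: fits HDMI 1.4 (hence 30 Hz)")
print(f"4K60 4:2:0 needs ~{uhd60_420:.1f} Gbps: fits, hence the 60 Hz 'trick'")

So an HDMI 1.4-class adapter simply doesn't have the payload for 4K60 at full chroma, which is why the "60 Hz" ones resort to 4:2:0-style subsampling.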
 
no need to change anything in the actual architecture of the GPU to add a tiny video converter to the board to correctly support HDMI 2.0
Probably they didn't want to add complexity to their board, or a third-party chip, or even increase the size of the board by a "few square millimeters", considering we are not talking about a big graphics card. In fact, in the case of the Nano, a "few square millimeters" is a really big deal considering how they market the card.
So their only option would probably have been to upgrade the chip to support HDMI 2.0. They should have done that; they didn't. I don't know how much more it would have cost them in time and money to implement HDMI 2.0 support. I mean, OK, they just took two Tonga chips and glued them together, changed the memory subsystem to support HBM, and that's it? Not enough money/engineers/time to add HDMI 2.0 support? Did they have to change a significant part of the chip's architecture to support HDMI 2.0?
 
They need to make an XL-ITX board for crossfire Nanos with matching power supply. XD


If you're going to blow $650-1300 on graphics cards, there's a good chance you can afford a DisplayPort TV/monitor too. HDMI 2.0 can't do 4K without cutting corners; DisplayPort can. I think that's the message AMD is trying to send by excluding HDMI 2.0.

There has to be a technical reason why DisplayPort to HDMI 2.0 converters don't exist. I wish I knew it.

Edit: It sounds like DP->HDMI 2.0 converters should be coming by the end of the year.
 
With no HDMI 2.0 it is 4K hell.

AMD is going to release an adapter that solves the HDMI 2.0 issue; I would have thought it would be connected through the DP connectors.
 
AMD is going to release an adapter that solves the HDMI 2.0 issue; I would have thought it would be connected through the DP connectors.

Using an adapter on a $600-plus GPU? No thank you. If I'm spending good money, I want my tech to be high tech.
 
I doubt that. Active DisplayPort converters are not cheap. AMD can't afford to be handing them out and they really have no reason to start selling them either (other companies like Startech, Monoprice, and Belkin will get all over that).
 
Using an adapter on a $600-plus GPU? No thank you. If I'm spending good money, I want my tech to be high tech.

I don't disagree with you; I was just saying that there will be.
 
Using an adapter on a $600-plus GPU? No thank you. If I'm spending good money, I want my tech to be high tech.

If it was "high tech". Would you not prefer DisplayPort ?

HDMI FAQ
4K@60Hz 10-bit 4:2:0

DisplayPort FAQ

4K@60Hz 10-bit 4:4:4
DisplayPort FAQ said:
DisplayPort 1.2a systems today can support 4K displays at 60Hz refresh and full 30-bit 4:4:4 color (non-chroma subsampled).

You can always channel your diatribe towards the non-inclusion, by both GPU vendors, of DisplayPort 1.3, which was finalized last year. At least you'd be advocating for a superior standard, not going backward for convenience.
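Those FAQ figures line up with the same back-of-the-envelope math as earlier in the thread: assuming 8b/10b-encoded payload rates and counting active pixels only (real signaling overhead makes HDMI 2.0 even tighter), a quick sketch:

Code:
# Usable payload after 8b/10b encoding (simplifying assumption):
HDMI20_PAYLOAD = 18.0 * 0.8    # ~14.4 Gbps
DP12_PAYLOAD = 21.6 * 0.8      # ~17.28 Gbps (4 lanes x 5.4 Gbps HBR2)

pixels_per_second = 3840 * 2160 * 60
need_444 = pixels_per_second * 30 / 1e9   # 10 bits x 3 full-res channels
need_420 = pixels_per_second * 15 / 1e9   # 4:2:0 halves the chroma data

print(f"4K60 10-bit 4:4:4: ~{need_444:.1f} Gbps "
      f"(over HDMI 2.0's ~{HDMI20_PAYLOAD:.1f}, under DP 1.2's ~{DP12_PAYLOAD:.2f})")
print(f"4K60 10-bit 4:2:0: ~{need_420:.1f} Gbps (fits both)")

4K60 at 10-bit 4:4:4 needs roughly 14.9 Gbps of pixel data, which is why HDMI 2.0 has to drop to 4:2:0 while DisplayPort 1.2 does not.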
 
If it was "high tech". Would you not prefer DisplayPort ?

HDMI FAQ
4K@60Hz 10-bit 4:2:0

DisplayPort FAQ

4K@60Hz 10-bit 4:4:4


You can always channel your diatribe towards the non-inclusion, by both GPU vendors, of DisplayPort 1.3, which was finalized last year. At least you'd be advocating for a superior standard, not going backward for convenience.
Absolutely! I don't know why people keep complaining about it. DisplayPort is a must for 4K monitors...
 
If it was "high tech". Would you not prefer DisplayPort ?

HDMI FAQ
4K@60Hz 10-bit 4:2:0

DisplayPort FAQ

4K@60Hz 10-bit 4:4:4


You can always channel your diatribe towards the non-inclusion, by both GPU vendors, of DisplayPort 1.3, which was finalized last year. At least you'd be advocating for a superior standard, not going backward for convenience.
Forget the fact that the actual connector for HDMI is crap with respect to build quality and longevity. DP is a much more rigid design than HDMI. A clip to hold the connector in, and an L-shaped (keyed) internal connector? You would think that after using screws with DVI and VGA they would realize that there is a need to hold cables in place and for the connector to be rigid; HDMI fails in that respect. The only benefit of HDMI is convenience, because it has weaseled its way onto just about every device you can find.

As I understand it, though, 4:4:4 can look a lot nicer than 4:2:0, although I've only heard and read about it once.
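For anyone curious what 4:2:0 actually throws away, here is a toy sketch (an illustration of the idea only, not any real encoder): luma stays at full resolution while each 2x2 block of pixels shares one pair of averaged chroma samples, which is why fine color edges, like red text on a gray background, can smear.

Code:
# Toy 4:2:0 subsampling on a tiny 4x4 image in YCbCr.
# Y (brightness) stays full resolution; Cb/Cr are averaged per 2x2 block.
Y  = [[16, 80, 160, 235]] * 4     # full-res luma, kept as-is
Cb = [[90, 90, 200, 200]] * 4     # full-res chroma planes
Cr = [[60, 140, 60, 140]] * 4     # alternating colors: fine chroma detail

def subsample_420(plane):
    """Average each 2x2 block, keeping one sample per block."""
    return [
        [(plane[r][c] + plane[r][c+1] + plane[r+1][c] + plane[r+1][c+1]) // 4
         for c in range(0, len(plane[0]), 2)]
        for r in range(0, len(plane), 2)
    ]

cb420, cr420 = subsample_420(Cb), subsample_420(Cr)
full = 3 * 16          # 16 pixels x 3 planes = 48 samples in 4:4:4
sub = 16 + 2 * 4       # 16 luma + 4 Cb + 4 Cr = 24 samples in 4:2:0
print(f"samples: 4:4:4 = {full}, 4:2:0 = {sub} (half the data)")
print("Cr after 4:2:0:", cr420)   # the alternating 60/140 detail averages to 100

Halving the data is exactly why the bandwidth-constrained HDMI modes in the FAQ above fall back to 4:2:0, and why it can look visibly worse on sharp color transitions.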
 