Tuesday, June 16th 2015

Radeon Fury X Outperforms GeForce GTX Titan X, Fury to GTX 980 Ti: 3DMark Bench

AMD's upcoming $650 Radeon R9 Fury X could have what it takes to beat NVIDIA's $999 GeForce GTX Titan X, while the $550 Radeon Fury (non-X) performs close to the $650 GeForce GTX 980 Ti, according to leaked 3DMark 11 and 3DMark (2013) benches by Korean tech publication ITCM.co.kr. The benches see the R9 Fury X score higher than the GTX Titan X in all three tests, while the R9 Fury is almost as fast as the GTX 980 Ti. The cards maintain their winning streak over NVIDIA even in memory-intensive tests such as 3DMark Fire Strike Ultra (4K), but buckle at 5K. These two cards, which are bound for the market within the next 30 days, were tested in the same graphs alongside the R9 390X, which is not too far behind the GTX 980. The R9 Nano, however, isn't being circulated among industry partners yet; it could still launch in Summer 2015.
Source: ITCM (Korea)

100 Comments on Radeon Fury X Outperforms GeForce GTX Titan X, Fury to GTX 980 Ti: 3DMark Bench

#51
buggalugs
AMD did a good job to get this performance out of 28nm, and we get new technology with HBM.

It's not over yet; there should be some interesting non-reference overclocked versions like the ASUS DCUII or MSI Lightning.

There was a rumour about versions of Fury with GDDR5 to allow for higher VRAM, but either way there should be lots of interesting options based on this GPU. Anyway, 4GB is enough for the vast majority of gamers.

It's going to be a long time before Pascal arrives next year.
Posted on Reply
#52
ZoneDymo
HumanSmokeDoes it? I wasn't aware the card had been reviewed yet?
The only benchmarks I've seen that have some validity are from people with cards in hand. Performance looks pretty similar.
Of course, if DG Lee is actually benching the cards, I'd assume he must have had the results on screen at some stage - why not post screencaps if that is the case?
I'm talking about the slides shown in this news post; the title reads "Radeon Fury X Outperforms GeForce GTX Titan X, Fury to GTX 980 Ti: 3DMark Bench",

and we see in the first slide that the Fury X does indeed outperform the Titan X by... well, about 30 points, which can hardly be called outperforming.

A bigger jump can be seen at the bottom of the slide: the 390X scores about 1,500 points higher than the 290X.
If that is just the clocks, it makes the "outperforming" of the Titan X by the Fury X even less relevant; a 5 MHz bump should easily level the two out.
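
To put that 30-point gap in perspective, here is a minimal back-of-the-envelope sketch. The overall score and core clock are illustrative assumptions (the leaked slides don't state them); only the ~30-point gap comes from the slide itself.

```python
# Rough sanity check of the "a tiny clock bump levels them out" argument.
# The score and clock below are assumptions for illustration; only the
# ~30-point gap is taken from the leaked slide.

gap_points = 30            # Fury X lead over the Titan X in the first slide
assumed_score = 17_000     # hypothetical overall 3DMark graphics score
assumed_core_mhz = 1_050   # hypothetical GPU core clock

gap_fraction = gap_points / assumed_score
# If the score scaled linearly with core clock (a simplification),
# the clock bump needed to close the gap would be:
mhz_to_close = gap_fraction * assumed_core_mhz

print(f"Gap: {gap_fraction:.2%} -> ~{mhz_to_close:.1f} MHz at {assumed_core_mhz} MHz")
# Gap: 0.18% -> ~1.9 MHz, i.e. well within the 5 MHz bump mentioned above
```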
Posted on Reply
#53
RCoon
ZoneDymoI'm talking about the slides shown in this news post; the title reads "Radeon Fury X Outperforms GeForce GTX Titan X, Fury to GTX 980 Ti: 3DMark Bench",

and we see in the first slide that the Fury X does indeed outperform the Titan X by... well, about 30 points, which can hardly be called outperforming.

A bigger jump can be seen at the bottom of the slide: the 390X scores about 1,500 points higher than the 290X.
If that is just the clocks, it makes the "outperforming" of the Titan X by the Fury X even less relevant; a 5 MHz bump should easily level the two out.
Those slides are made-up benchmarks using educated guesses; they are not actual benchmarks run on the real hardware. It's entirely guesstimation.
Posted on Reply
#54
ZoneDymo
RCoonThose slides are made-up benchmarks using educated guesses; they are not actual benchmarks run on the real hardware. It's entirely guesstimation.
"according to leaked 3DMark 11 and 3DMark (2013) benches by Korean tech publication ITCM.co.kr."

Maybe not the most credible source, but it goes a bit far to just call them "made up", right?
Posted on Reply
#55
Vayra86
ZoneDymo"according to leaked 3DMark 11 and 3DMark (2013) benches by Korean tech publication ITCM.co.kr."

Maybe not the most credible source, but it goes a bit far to just call them "made up", right?
Come on, it's being done every single day. Ignore this stuff and wait for real-world performance; that's much safer and much closer to the actual facts. We don't even have bench specifics, clocks, etc., all of which you would expect if these benchmarks were real and not a wild guess.

In my opinion, AMD's slideshow falls into the exact same category.
Posted on Reply
#56
rruff
naxeemActually, no.

1. You do need this card at 1080p and 1440p, because even then some games won't be playable at 60 or 144 FPS. As long as you don't have 60 FPS as a minimum, you're not experiencing perfect fluidity; you should average at least 60 FPS for smooth gameplay. Then we have 144 Hz monitors at 1080p and 1440p that provide a great experience and need power.
So yes, you NEED this card.

2. Unfortunately, yes, 4GB might not be enough for some 4K games, but for others it will be. You don't need AA at 4K anyway, so most games will work fine.

3. However, those games that need more, like GTA V at 5K, don't stop at 6GB anyway; you need more than that, and the 980 Ti can't help you. If you play at 5K, yes, for some games only the Titan X works, but in most cases even at 4K the Fury X will be fine.
You are actually agreeing with him, and he makes a good point. In some games, even now at 1440p, 4GB of VRAM limits the eye candy. That's running at 72 FPS locked, and he doesn't say what the card is, but it's not one of the latest. Sure, in most games it won't be an issue, and maybe it isn't a big deal to tone down the settings a little. But for some people it is a little disappointing to have a new $600 card with this limitation.
Posted on Reply
#57
haswrong
xenocideIn DX12 it would, but I wouldn't hold your breath for an abundance of DX12 games. Developers probably won't release games that fully support DX12 until 2018.
That's OK. I'm not primarily interested in games; I'm interested in the functionality (I was reacting to the drop in performance in the 5K/8K scenarios of Fire Strike, probably due to the 4GB memory buffer filling up). And if I'm able to use both GPUs from a programming point of view, without any special enabling profiles from the AMD driver team, I'd be most satisfied. Buying a dual-GPU card only to be able to use one of them would be quite an inefficient endeavour.
Posted on Reply
#58
GhostRyder
Still waiting for reviews from across the web instead of leaks. I want to see them from everyone, because then we can get a good idea of where the real performance is!

Either way, if this holds true, then it's a great day!
Posted on Reply
#59
haswrong
EboI think some of our members have been spreading FUD no matter what.

AMD have launched a new lineup, based on completely new tech with new memory and a bandwidth Nvidia can only dream of.

One member in particular has been very negative right from the start. He wants maximum performance for under 400 euros, which is his max... well, that's NOT going to happen, and even within his price range he is still unhappy, while his machine has a 3-year-old setup. I just don't understand it at all.

If you want the newest and best, there's a price to pay, and still AMD delivers even this time around, at a lower price than Nvidia as of now.
Looking at how some have been bashing AMD for not delivering stable drivers? Just look at Nvidia: their last 3 drivers haven't been stable and caused a lot of problems for a lot of people, until the last one, or so my friend tells me.

I have made my mind up: I'm going for the Fury X or the Fury. It all comes down to which hits the shelves first, and I really don't care about the price. It can be 500, 600, 700 euros, I don't care; all I know is that I want it.
Look, there is a distant, or not too distant, possibility that AMD rented a technology from Nvidia (who may have developed it themselves or rented it elsewhere) to produce the improved price/performance ratio, just to keep the Nvidia-AMD tandem going. We may be witnessing a perfect cartel without realizing it; you simply can't tell. So 400 bucks for a top-notch card is as possible as 650 bucks, but each under different economic conditions. Do AMD deserve to have their product bought? I don't think there's a natural deserve system in this world. If you think they deserve it, then buy it at their price and donate on top of that. There are people in this world for whom it's not easy to spend 400 on one brand-new component. Do they deserve to only buy used or old parts? Your call...
Posted on Reply
#60
moproblems99
haswrongLook, there is a distant, or not too distant, possibility that AMD rented a technology from Nvidia (who may have developed it themselves or rented it elsewhere) to produce the improved price/performance ratio, just to keep the Nvidia-AMD tandem going. We may be witnessing a perfect cartel without realizing it; you simply can't tell. So 400 bucks for a top-notch card is as possible as 650 bucks, but each under different economic conditions. Do AMD deserve to have their product bought? I don't think there's a natural deserve system in this world. If you think they deserve it, then buy it at their price and donate on top of that. There are people in this world for whom it's not easy to spend 400 on one brand-new component. Do they deserve to only buy used or old parts? Your call...
Yes, because they are not owed a $600 graphics card for $400 either. You either have the money or you don't. I don't have $600, so I won't buy it. Will I complain that it is $600 and not $400? No.
Posted on Reply
#61
kaligon
Damn... you internet "experts" are funny as hell! I still can't figure out how nobody around here realizes you're comparing GDDR5 technology to HBM technology, which is something brand new and much more efficient in terms of data-transfer speed. We don't even know how far its performance extends, nor what it can do with those 4GB. Stop misinforming people, because some might even believe this bullcrap. Read up on how HBM works and stop making stupid comparisons. The only thing you can count on is the upcoming benchmarks; aside from that, nothing should be taken into consideration.
Posted on Reply
#62
rruff
kaligonDamn... you internet "experts" are funny as hell! I still can't figure out how nobody around here realizes you're comparing GDDR5 technology to HBM technology, which is something brand new and much more efficient in terms of data-transfer speed. We don't even know how far its performance extends, nor what it can do with those 4GB.
I'm not talking about speed here, but rather capacity. 4GB of HBM is still 4GB of memory, and when a game needs more than 4GB to store the information it needs, you will have issues.
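
For a rough sense of the capacity side of this argument, here is a minimal back-of-the-envelope sketch. The per-buffer sizes are plain arithmetic; the assumption of roughly six full-resolution buffers per frame is illustrative, not a measurement of any specific title.

```python
# Back-of-the-envelope VRAM arithmetic: capacity is about bytes, not
# bandwidth. The buffer count below is an illustrative assumption.

def render_target_mib(width, height, bytes_per_pixel=4):
    """Size of a single RGBA8 render target, in MiB."""
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {"1440p": (2560, 1440),
                     "4K": (3840, 2160),
                     "5K": (5120, 2880),
                     "8K": (7680, 4320)}.items():
    # Assume ~6 full-resolution buffers (back buffer, depth, G-buffer, ...)
    buffers = 6 * render_target_mib(w, h)
    print(f"{name}: ~{buffers:,.0f} MiB in render targets alone")

# Even at 8K that's only ~0.74 GiB of render targets; the bulk of the
# 4 GiB budget goes to textures and geometry, which don't shrink when
# the memory gets faster. Once the working set exceeds 4 GiB, data
# spills over the PCIe bus regardless of how fast the HBM is.
```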
Posted on Reply
#63
HABO
kaligonDamn... you internet "experts" are funny as hell! I still can't figure out how nobody around here realizes you're comparing GDDR5 technology to HBM technology, which is something brand new and much more efficient in terms of data-transfer speed. We don't even know how far its performance extends, nor what it can do with those 4GB. Stop misinforming people, because some might even believe this bullcrap. Read up on how HBM works and stop making stupid comparisons. The only thing you can count on is the upcoming benchmarks; aside from that, nothing should be taken into consideration.
Rofl, you are the internet expert... When your cerebral capacity is small, it's irrelevant how fast you can speak; you're still dumb, and that's the problem... you know what I mean... That's why there is a significant performance drop in these "fake" performance tests at 8K resolution...
Posted on Reply
#64
Vayra86
Let's just all call each other dumb and be happy about it.

Makes life a lot more enjoyable :clap:
Posted on Reply
#65
Haytch
Wow, so much anger going on over guesstimated benchmarks from an unreliable source.
Allow me to lay down the facts to help resolve the arguing.
1. The more RAM, the better, regardless of the game, the resolution, etc.
2. The faster the RAM, the better!
3. The faster the GPU, the better.
I am sure you get my point.

If you have concerns regarding power draw, or consider these 'highest-end available' consumer products overkill, then perhaps you should not be considering the purchase of the 'highest-end available' model and should go for something aimed at your budget/requirements.

Personally, I play at 4K, except when I am using my racing simulator, which results in a 10440 x 2160 resolution.
Next year I plan to get myself another 3 4K screens and mount them, which would result in a 10440 x 4320 resolution.
Why would I do this? Because I can!
Is this reasonable thinking? That depends on whether 17 PSI of turbo boost is reasonable in my car with my Milltek exhaust, clutch upgrade, RS500 brake pads, slotted & vented front/rear discs, big-assed intercooler, Dreamscience cold air intake, modded swaybars, Quaife LSD and so on.

I am proud of AMD R&D. HBM is a huge leap and will improve with time. I can really see HBM taking cards to 64GB and maybe beyond. GPU speed might be a limiting factor, but that's another matter; it is more a case of whether we can or not.
I applaud AMD for being able to continue to survive and, not only that, bring us high-end quality products that make the opposition cry and even bitch. Nvidia has a much larger market share, a much larger R&D fund, and is worth a much larger sum, and yet AMD is either on top, even, or slightly under; not the defeated little company in the corner of the market that Nvidia wants them to be.

Don't get me wrong, I love my Titans and I am not some AMD fanboy. In fact, I think both companies are playing us, and I hate that about both of them, but I can't do anything about it. I can only give credit where it is due.

The last time I was this proud of either AMD or Nvidia was when Nvidia released the 8800 GTX.
Posted on Reply
#66
Jax2Elite
Let's not forget this is AMD's next generation, and it is only just, and I will repeat "just", ahead of what is now a nearly 12-month-old Nvidia architecture. If we look at Maxwell I (GTX 750 Ti), that is actually 16 months old.

I am super happy AMD is finally releasing something that is not going to set my house on fire, and well done to them. Unfortunately, the performance gap is certainly not big enough to justify suffering drivers that are as bad as Creative Sound Blaster's, with a UI design on par with Windows 95.

Performance is only one part of a much bigger picture.
Posted on Reply
#67
Haytch
Jax2EliteLet's not forget this is AMD's next generation, and it is only just, and I will repeat "just", ahead of what is now a nearly 12-month-old Nvidia architecture. If we look at Maxwell I (GTX 750 Ti), that is actually 16 months old.

I am super happy AMD is finally releasing something that is not going to set my house on fire, and well done to them. Unfortunately, the performance gap is certainly not big enough to justify suffering drivers that are as bad as Creative Sound Blaster's, with a UI design on par with Windows 95.

Performance is only one part of a much bigger picture.
That's a damn good first post. Welcome to TechPowerUp!
I don't have a problem with the heat coming from my AMD GPUs, because they seem to have awesome factory fans capable of cooling the cards appropriately. The problem is, they sound like an aircraft landed on your desktop!
Posted on Reply
#68
Nate00
ironcerealboxO.o
o.O
O.o
o.O

If this is accurate (still too early to tell until more benchmarks are released), then Nvidia might look like this:
LMAO, that is funny. Good one.
Posted on Reply
#69
moproblems99
HaytchIs this reasonable thinking? That depends on whether 17 PSI of turbo boost is reasonable in my car with my Milltek exhaust, clutch upgrade, RS500 brake pads, slotted & vented front/rear discs, big-assed intercooler, Dreamscience cold air intake, modded swaybars, Quaife LSD and so on.
What kind of car do you have?
Posted on Reply
#70
erocker
Keep on topic folks. I won't ask again.

Thank you.
Posted on Reply
#71
RealNeil
EboI think some of our members have been spreading FUD no matter what.

AMD have launched a new lineup, based on completely new tech with new memory and a bandwidth Nvidia can only dream of.

One member in particular has been very negative right from the start. He wants maximum performance for under 400 euros, which is his max... well, that's NOT going to happen, and even within his price range he is still unhappy, while his machine has a 3-year-old setup. I just don't understand it at all.

If you want the newest and best, there's a price to pay, and still AMD delivers even this time around, at a lower price than Nvidia as of now.
Looking at how some have been bashing AMD for not delivering stable drivers? Just look at Nvidia: their last 3 drivers haven't been stable and caused a lot of problems for a lot of people, until the last one, or so my friend tells me.

I have made my mind up: I'm going for the Fury X or the Fury. It all comes down to which hits the shelves first, and I really don't care about the price. It can be 500, 600, 700 euros, I don't care; all I know is that I want it.
The Fury looks good to me too. How, and why, they're getting such good numbers out of Fury intrigues me to no end. Is 4GB of memory somehow relevant again?
samljerThe card is not relevant for its given performance....

AMD FUKED up this card by putting only 4GB on it.

I really wanted this card too, I waited so long; now I'm going to end up with the 980 Ti.


Take my advice, guys... 4GB is not enough for this GPU,
and if you're gaming at 1080p... then you don't need this GPU anyway.
My wife still uses a GTX 680 that pushes 1080p at 200+ FPS in most titles,
with the new stuff still hitting 60-65 without breaking a sweat.
Maybe a totally redesigned memory interface and redesigned memory have combined to enhance the performance of that measly 4GB?
Is it not possible that what we know as fact could be changing because of it? Could 4GB of a brand-new design change the way we game on our PCs?
Posted on Reply
#72
Flazza
I am fairly certain that, even on capacity alone, you cannot compare 4GB of GDDR5 with 4GB of HBM, simply because HBM, being clocked at a much lower speed, doesn't suffer all the retransmitted bus errors you normally get with GDDR5.

Someone will hopefully correct me if I am wrong; I am just going from what I remember of an old white paper on some of the technologies built into GDDR5 (error detection and retransmission).
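
For anyone curious what that white-paper mechanism looks like, here is a minimal sketch of burst-level error detection of the kind being described: a CRC is computed over each data burst on both ends of the bus, and a mismatch triggers a retransmit. The polynomial and burst size below are illustrative assumptions, not taken from the GDDR5 specification.

```python
# Illustrative sketch of burst-level error detection with retransmit,
# in the spirit of GDDR5's EDC feature. Polynomial and burst size are
# illustrative assumptions, not the GDDR5 spec.

def crc8(data: bytes, poly: int = 0x07) -> int:
    """Bitwise CRC-8 over `data` with the given polynomial."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def transfer_burst(burst: bytes, corrupt: bool = False) -> bool:
    """Simulate one burst on the bus; return True if the receiver accepts it."""
    sent_crc = crc8(burst)
    # Optionally flip one bit in flight to model a bus error.
    received = bytes([burst[0] ^ 0x01]) + burst[1:] if corrupt else burst
    return crc8(received) == sent_crc  # mismatch -> controller retransmits

print(transfer_burst(b"\xde\xad\xbe\xef\x00\x11\x22\x33"))                # True
print(transfer_burst(b"\xde\xad\xbe\xef\x00\x11\x22\x33", corrupt=True))  # False
```

Note that a retransmit costs extra bus cycles (bandwidth), not storage, which is why the capacity point made above still stands.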
Posted on Reply
#73
GAR
FAKE, FAKE, AND FAKE. Why report BS? Just wait one more week.
Posted on Reply
#74
Flazza
GARFAKE, FAKE, AND FAKE. Why report BS? Just wait one more week.
Very true.

Human nature, I suppose; when little information is presented, we tend to fill in the blanks ourselves.

Not that I will be buying any cards any time soon... I think my dual R9 290s will last a fair bit longer under DX12.
Posted on Reply
#75
buildzoid
ZoneDymoNobody going to mention that the R9 390X seems to perform quite a bit better than the R9 290X?
A 5% core clock increase and a 20% memory clock increase giving a ~10% overall increase looks reasonable, and not unexpected, IMO.
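
As a rough illustration of why those numbers are plausible, here is a minimal sketch of a weighted clock-scaling estimate. The core-bound/memory-bound split is an illustrative assumption, not a measured figure; it just shows that the ~10% overall gain falls out of the two clock bumps under a reasonable weighting.

```python
# Crude performance-scaling estimate: treat the workload as partly
# core-bound and partly memory-bound, then scale each part by its
# clock increase. The 2/3 vs 1/3 weighting is an assumption.

core_gain = 0.05     # +5% core clock (290X -> 390X, per the discussion)
mem_gain = 0.20      # +20% memory clock
core_weight = 2 / 3  # assumed fraction of performance limited by the core
mem_weight = 1 / 3   # assumed fraction limited by memory bandwidth

overall = core_weight * core_gain + mem_weight * mem_gain
print(f"Estimated overall gain: {overall:.0%}")  # 10% with these weights
```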
Posted on Reply