Friday, June 19th 2015

Radeon R9 Fury X Faster Than GeForce GTX 980 Ti at 4K: AMD

AMD, in its press documents, claimed that its upcoming flagship single-GPU graphics card, the Radeon R9 Fury X, will be faster than NVIDIA's recently launched GeForce GTX 980 Ti at 4K Ultra HD resolution. This puts to rest speculation that its 4 GB of video memory hampers performance against its competitor's 6 GB. According to the graph below, extracted from AMD's press material, the R9 Fury X will be faster than the GTX 980 Ti even in the most memory-intensive games at 4K, including Far Cry 4, The Witcher 3: Wild Hunt, Crysis 3, Assassin's Creed: Unity, and Battlefield 4, with bigger gains shown in other games. In every game tested, the R9 Fury X offers frame rates of at least 35 fps. The Radeon R9 Fury X will launch at $649.99 (the same price as the GTX 980 Ti) next week, with market availability within the following three weeks.
Source: The Tech Report

102 Comments on Radeon R9 Fury X Faster Than GeForce GTX 980 Ti at 4K: AMD

#51
HM_Actua1
Face palm... apples to oranges... Fiji to Maxwell.

Wait until Pascal drops. Have your tissues ready.
#52
TheGuruStud
Using V-Sync with FreeSync? :wtf:

Use a frame limiter. It's even built into a lot of the newer games that I have.
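For context, a frame limiter simply caps how often the game presents a new frame so the frame rate stays inside the monitor's variable-refresh window instead of hitting the V-Sync ceiling. A minimal sketch of the idea (illustrative only, not taken from any particular game or driver; update_and_render is a hypothetical placeholder) could look like this:

```cpp
// Minimal frame-limiter sketch: cap the presentation rate just below the
// panel's maximum refresh so adaptive sync stays engaged without V-Sync.
#include <chrono>
#include <thread>

void run_frame_limited(double target_fps)
{
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / target_fps));

    auto next_deadline = clock::now() + frame_budget;
    for (;;)
    {
        // update_and_render();  // game logic and draw calls go here (placeholder)

        std::this_thread::sleep_until(next_deadline);  // wait out the remaining frame budget
        next_deadline += frame_budget;
    }
}
```

A commonly suggested setting is a cap a few fps below the display's maximum refresh (for example, ~141 fps on a 144 Hz panel) so frame delivery never leaves the variable-refresh range.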
#53
the54thvoid
Intoxicated Moderator
I can see the TPU benchmark forums starting to get interesting again. It's been Nvidia on top for ages now. This might be good for some red/green jolly japes.

This Hexus review shows the Zotac AMP Overclock (not the AMP Extreme version, mind).

Given the AMD slide puts the 980 Ti at about 33 fps (not far off Hexus's 31-36), the overclocked 980 Ti shows a decent lead at 40-44 fps (>20% faster than stock). Exciting times ahead - I hope the Fury X is as capable at overclocking as Maxwell. Makes my purchasing decisions harder though. :laugh:

#54
v12dock
Block Caption of Rainey Street
Hitman_Actual: Face palm... apples to oranges... Fiji to Maxwell.

Wait until Pascal drops. Have your tissues ready.
2H 2016. That also gives AMD time to refine HBM, move to 16 nm FinFET, and make architectural improvements.
#55
moproblems99
GreiverBlade: Why should they compete with a card that is not even a gaming card (but that people buy as a gaming card), is only 4% faster (averaged) at 2160p than a 980 Ti, and costs $450 more?
Didn't the Titan X get neutered for compute? Which makes it a $1000+ 12GB gaming card.
#56
xorbe
btarunr: Over 2.5 million 4K monitors have been sold to end-users so far.
Living room TVs, or computer screens hooked up to gaming PCs?
#57
64K
the54thvoid: They can do it if they want, which is a bit annoying. A full Titan core, 8-pin power, higher TDP, AIB coolers and higher stock clocks would make a card 10-20% faster than a 980 Ti.
I'll hang on for Fury, but given the leaked AMD benches, I see a Classy 980 Ti on water being my next card, unless Fury has decent OC headroom (which AMD implies it has).
A very good day for enthusiasts. Not so for those that buy rebrands.
An OC Classified 980 Ti on water cooling will be a beast indeed!
#58
btarunr
Editor & Senior Moderator
xorbe: Living room TVs, or computer screens hooked up to gaming PCs?
I believe I used the word "monitor," so that excludes televisions.
#59
N3M3515
HumanSmoke: I'm hoping it is faster than the 980 Ti and Titan X. It might be the ONLY way to get a fully enabled GM200 with balls-to-the-wall voltage control and high clocks at a reasonable price.
Wasn't the Titan X a fully enabled GM200?
#60
TheGuruStud
N3M3515: Wasn't the Titan X a fully enabled GM200?
He means if Fury is better than Titan X, then it could force NVIDIA to release a higher-clocked, OC-friendly version of Titan X: the Titan XXX :laugh:
#61
the54thvoid
Intoxicated Moderator
TheGuruStud: He means if Fury is better than Titan X, then it could force NVIDIA to release a higher-clocked, OC-friendly version of Titan X: the Titan XXX :laugh:
You laugh but you know what - they bloody might well just do something as 'guff' as call it 'Triple X'.

But then Fury can do the exact same thing. Battle of the hardcore Pr0n.
#62
xfia
www.maximumpc.com/ces-2015-amd-demonstrates-freesync-technology-and-nano-pc-video/
The scientist here shows that you should have your settings high enough to be in the 45-60 fps range. I don't know where that slide came from, but it's not something they really like to say about how it works. As mentioned, a lot of games sync frames and have pretty good dynamic frame rate control, and they are releasing driver-based dynamic frame rate control soon enough.
I think the difference in fps we see from AMD may be down to the experience with Catalyst.
Why not settings like this if a game is easily running well over your refresh rate?

Or why not settings like this in a more balanced scenario?

What if an APU needs a little boost in performance?
#63
arbiter
bobbavet: Why did the product slide produced earlier by AMD state 54 fps for Crysis, and yet this graph shows around 45 fps?
That is pretty suspect, how they can claim one fps figure one day and then 20% less the next. AMD's marketing and tech sides haven't really been on the same page about anything for a while.
m6tzg6r: So it's faster in 4K? Well, the 13 people who game in 4K are probably happy to hear that.
sakai4eva: Well, for someone like me who doesn't game in 4K but would like to in the future, getting a card that is capable of doing that now makes it possible to do incremental upgrades that don't break the bank.
It does seem from the settings page that the settings were tuned to keep VRAM usage under 4 GB.
xfia: Maybe not so much with a beast 512-bit bus, 8 GB VRAM and compression.
The compression was a feature of GCN 1.2; the 390X is GCN 1.1, while the 380 is GCN 1.2. Even then, benchmarks of the 390X show only marginal performance gains from a higher memory clock, 5 to 8% for what is a 20% memory boost. The 380 is only 256-bit.
mirakul: I dare say that Sleeping Dogs used more DirectX 11 tech than some GameWorks titles from Ubi$oft. It's a beautiful and fun game.
Sleeping Dogs was a complete CRAP game. The graphics were crap, and the controls were worse than GTA IV's; it made GTA IV look like a well-running game.
Kaynaru: Hmmm, comparing similar models, I'd say a G-Sync screen is about $200 more than a FreeSync one.
xfia: FreeSync has complete frame rate control for 4K, and up to 90-something fps (Hz) at 1440p, while syncing the refresh rate down to something like 9 fps (Hz).
FreeSync is practically perfect... it's G-Sync with the 1 frame of latency.
I'll group your two posts together since both are talking about the same stuff. G-Sync does cost more because it's the difference between two techs: one that was worked on for years to perfect, and one that was thrown together in a month to compete. FreeSync is perfect? Yeah, sure, if you don't mind ghosting, or tearing when fps drops under 40. Before you try to blame the ghosting on the panel, it's not the panel's fault. NVIDIA took the time and effort to test all types and models of panels to see which ones work well doing VRR and which ones suck, and made a list of ones that are good to use for G-Sync. G-Sync only has a small fps loss, around 1% on Kepler cards, since they had to do part of the work in drivers because Kepler cards lacked hardware for it. But that still makes for a ton more NVIDIA cards that support G-Sync than AMD has for FreeSync. It's been confirmed that the R7/R9 370(X) cards don't even support FreeSync, which is pretty sad. I would call that pretty unacceptable.
#64
HumanSmoke
the54thvoid: I see a Classy 980 Ti on water being my next card, unless Fury has decent OC headroom (which AMD implies it has).
Word has it that AMD won't allow the memory to be overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
From the AMD press deck: Fury X overclocked by 100 MHz (a 9.5% overclock).


Seems in line with other current GPUs, but the 9.5% overclock margin isn't that impressive.
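For reference, the 9.5% figure lines up with the Fury X's announced 1050 MHz core clock. A trivial sanity check of that arithmetic (assuming the 1050 MHz reference clock, which is not stated in the slide itself):

```cpp
// Sanity check of the quoted overclock margin: +100 MHz on a 1050 MHz
// reference core clock (the Fury X's announced spec) is roughly 9.5%.
#include <cstdio>

int main()
{
    const double stock_mhz = 1050.0;  // announced Fury X core clock (assumption)
    const double oc_mhz    = 1150.0;  // +100 MHz, as in AMD's press deck
    std::printf("Overclock margin: %.1f%%\n",
                (oc_mhz - stock_mhz) / stock_mhz * 100.0);  // prints 9.5%
    return 0;
}
```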
#66
Ferrum Master
HumanSmoke: Word has it that AMD won't allow the memory to be overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
From the AMD press deck: Fury X overclocked by 100 MHz (a 9.5% overclock).


Seems in line with other current GPUs, but the 9.5% overclock margin isn't that impressive.
Do you know what the CPU and resolution were on that bench? If 4K... you know... it is good... 1080p ain't... and most importantly, I believe this card needs a hell of a highly clocked CPU... you know it is an AMD.

ADD.

And overclocking memory... for more bandwidth... on HBM? Naah... I would have locked it too.
#67
arbiter
TheGuruStud: Arbiter's trolling skills are insane.
Sad how one can speak truths but be called a troll.
#68
xfia
I won't argue that Adaptive-Sync allows for a wide array of OEM customization.
So currently G-Sync does show some advantage, and NVIDIA does pull ahead of Adaptive-Sync at this point, but for how long?
#69
the54thvoid
Intoxicated Moderator
HumanSmoke: Word has it that AMD won't allow the memory to be overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
From the AMD press deck: Fury X overclocked by 100 MHz (a 9.5% overclock).


Seems in line with other current GPUs, but the 9.5% overclock margin isn't that impressive.
Ferrum Master: Do you know what the CPU and resolution were on that bench? If 4K... you know... it is good... 1080p ain't... and most importantly, I believe this card needs a hell of a highly clocked CPU... you know it is an AMD.

ADD.

And overclocking memory... for more bandwidth... on HBM? Naah... I would have locked it too.
The CPU is a 5960X. I read the link to the source. If the source info is true, the overclock is quite feeble. A 980 Ti can go 20% over stock in performance...

Still, awaiting Wednesday.

@W1zzard - when you bench (or rather, when you publish what you have benched), can you do an apples-to-apples, balls-to-the-wall overclock comparison on an intensive game, Fury X versus 980 Ti, both at max OC? Neutral, non-GameWorks, and ultra everything so VRAM usage is high. This would be good to see.
#70
Ferrum Master
the54thvoid: A 980 Ti can go 20% over stock in performance...
Yes, but we also saw those 20% gains on machines with quite juicy clocks.
#71
N3M3515
HumanSmoke: Word has it that AMD won't allow the memory to be overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
From the AMD press deck: Fury X overclocked by 100 MHz (a 9.5% overclock).


Seems in line with other current GPUs, but the 9.5% overclock margin isn't that impressive.
Bad news for AMD then: the Gigabyte 980 Ti G1 Gaming is already 15% better than a stock 980 Ti, and has room for 14% more according to W1zz's review.
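As a side note, those two figures compound rather than add; a tiny check of the arithmetic (illustrative only, using the percentages quoted above):

```cpp
// The quoted gains compound: 15% out of the box plus a further 14% of OC
// headroom works out to roughly 31% over a stock 980 Ti, not 29%.
#include <cstdio>

int main()
{
    const double factory_gain = 0.15;  // G1 Gaming vs. stock 980 Ti, as quoted
    const double oc_headroom  = 0.14;  // additional manual OC headroom, as quoted
    const double total = (1.0 + factory_gain) * (1.0 + oc_headroom) - 1.0;
    std::printf("Combined gain over stock: %.1f%%\n", total * 100.0);  // ~31.1%
    return 0;
}
```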
#72
GreiverBlade
moproblems99: Didn't the Titan X get neutered for compute? Which makes it a $1000+ 12 GB gaming card.
Well... nobody (except some enthusiasts with more money than usual) considers a $1000 single-GPU board for gaming... NVIDIA played dirty with the neutered-compute maneuver :D
For me the Titan X is not a gaming card.
Hitman_Actual: Face palm... apples to oranges... Fiji to Maxwell.

Wait until Pascal drops. Have your tissues ready.
Funny one... Fiji is Maxwell's contender... not the two-year-old one that still holds its own and populates the 3XX line...

The next gen after Fiji is Pascal's contender.
#73
Bansaku
bobbavet: Why did the product slide produced earlier by AMD state 54 fps for Crysis, and yet this graph shows around 45 fps?
Dyslexia? :)
#74
GhostRyder
arbiter: I'll group your two posts together since both are talking about the same stuff. G-Sync does cost more because it's the difference between two techs: one that was worked on for years to perfect, and one that was thrown together in a month to compete. FreeSync is perfect? Yeah, sure, if you don't mind ghosting, or tearing when fps drops under 40. Before you try to blame the ghosting on the panel, it's not the panel's fault. NVIDIA took the time and effort to test all types and models of panels to see which ones work well doing VRR and which ones suck, and made a list of ones that are good to use for G-Sync. G-Sync only has a small fps loss, around 1% on Kepler cards, since they had to do part of the work in drivers because Kepler cards lacked hardware for it. But that still makes for a ton more NVIDIA cards that support G-Sync than AMD has for FreeSync. It's been confirmed that the R7/R9 370(X) cards don't even support FreeSync, which is pretty sad. I would call that pretty unacceptable.
Ok, first of all you need to actually research FreeSync and G-Sync before speaking about issues, instead of making up issues to try and make one sound significantly more superior to the other... G-Sync has had many complaints about flickering, ghosting, etc. as well, so don't act like G-Sync is this perfect entity. Also, you're really complaining that the R7 370 does not support FreeSync? While I don't like that it doesn't, how many people do you see running off to buy a 1440p 144 Hz monitor with FreeSync and then grabbing an R7 370? It would be the same as seeing someone grab a GTX 750 Ti (or 760) and do the same thing...
Hitman_Actual: Face palm... apples to oranges... Fiji to Maxwell.

Wait until Pascal drops. Have your tissues ready.
Please explain how this is apples to oranges? These are this generation's contenders.
TheGuruStud: He means if Fury is better than Titan X, then it could force NVIDIA to release a higher-clocked, OC-friendly version of Titan X: the Titan XXX :laugh:
IF they do that, I'll post it here and now: I will purchase 3 of them (call it the XXX Titan).

I want to see the overclocking performance of the card. That is what will matter in the end, and whether there are any aftermarket variants for better overclocking (Lightning).