
Editorial AMD Didn't Get the R9 Fury X Wrong, but NVIDIA Got its GTX 980 Ti Right

The framerate target on AMD cards should bring down power consumption. Has this even been used so far? The tech makes sense though. No need to draw stuff at 250fps if 144fps is enough for your screen. Plus it can eliminate tearing without using V-Sync...
 
The Fury X is not a 1080p card; its performance doesn't scale well as you decrease resolution, unlike the 980 Ti. I know the argument can be "why would you buy a 980 Ti or Fury X for 1080p?", but honestly I would much rather have a minimum 60 fps @ 1080p in all my games than 35 fps at 4K.

This.

Also, 4K gaming is still too early (IMHO) and I am guessing most PC gamers play their games on a 21-27" desktop monitor, 2 feet from their face, instead of sitting on a couch 10 feet from the TV. 4K makes no sense unless you're playing games on a 50"+ screen.
 
Both cards performed well regardless of what the benches say... Those sticking with 1080p will find these two monsters more than enough to push new games with respectable quality & fps over the magical 60 fps, though at 1440p & higher resolutions you'll need to tone down certain games to get a nice balance of smooth gameplay and decent fps. Those who expect too much might find them disappointing. But if you ask me which card is the most power-efficient for today's demands & I have around $700 to spend, I would pick the 980 Ti. The Fury X is a good pick, but there are still some things AMD needs to iron out on what is otherwise a really good card for your money.
 
Competition is good and brings us all better products, and the Fury X is currently coughing along with incomplete drivers. Wait a month and it will definitely get more out of the GPU.
I wouldn't brag about the GTX 980 Ti because it doesn't use fully enabled chips (it's built from the GM200 dies filtered out of Titan X and Quadro M6000 production). The Titan X is competitive, but it's made for DirectX 11 and is an excessively expensive showpiece. Sorry nVidia, you're so far behind in technology here and ahead on price.:shadedshu:
I want to see a dual-GPU Fury X card option on the market at a normal price as soon as possible.:)
 
This.

Also, 4K gaming is still too early (IMHO) and I am guessing most PC gamers play their games on a 21-27" desktop monitor, 2 feet from their face, instead of sitting on a couch 10 feet from the TV. 4K makes no sense unless you're playing games on a 50"+ screen.

I once played Natural Selection 2 on my former 42 inch FullHD LCD TV at 1m distance. The sensation was amazing; it felt like I was standing in the game, since all I could see was the in-game world. But 1080p was really a problem there because of the screen size. Gotta try my Philips 42PUS7809 4K LCD once just for the lulz to see how that feels under the same conditions. Should be significantly better...
 
Well, and then there are DSR, VSR and other SSAA implementations for Full HD users who value eye candy...
 
This was a good read & I would have to agree with you on pretty much everything. I'm still baffled about which card to go with, Fury X or GTX 980 Ti. Then again, I don't play that many demanding games, but my itch for new shiny OP hardware overtakes me :D I'm gonna have to be patient & see how the regular Fury & Nano perform.
 
Exactly my point... ;)

8800 Ultra was $830
8800GTX was $600-$650...

... that was 9 years ago.


So here's what inflation has done:

$650 in 2006 is equal to about $766 today.
$830 in 2006 is equal to about $979 today.

This is according to the Bureau of Labor Statistics.


So... not all that bad. :)
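For anyone curious how those figures are derived, here is a minimal sketch of a CPI-ratio adjustment. The CPI-U annual averages below are my own rough approximations, not numbers taken from the post above:

```python
# Rough sketch of a CPI-based inflation adjustment.
# The annual-average CPI-U values are assumptions for illustration only.
CPI_2006 = 201.6
CPI_2015 = 237.0

def to_2015_dollars(dollars_2006: float) -> float:
    """Scale 2006 dollars by the ratio of the two CPI index values."""
    return dollars_2006 * (CPI_2015 / CPI_2006)

for price in (650, 830):
    print(f"${price} in 2006 is roughly ${to_2015_dollars(price):.0f} in 2015")
# Prints roughly $764 and $976, in line with the $766 / $979 figures above.
```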
 
Quite an interesting read. Thank you, btarunr.


Interesting article. It's nice to see a real look at the AMD Nvidia competition lately.

Now, I hate to be that guy, but there are a lot of comma errors in here that are really distracting (things you do in some of your other posts too).

"The timing of NVIDIA's GeForce GTX 980 Ti launch, had us giving finishing touches..." The comma here is unnecessary. It breaks up the subject and the predicate, which commas shouldn't do. This comes up frequently. Such as here: "The Radeon R9 290 series launch from Fall-2013, stirred up the high-end graphics market in a big way." Again, no comma here, or here; "$300 gets you not much different from what it did, a month ago."

Again, I really enjoyed the article, so I'm sorry to point this stuff out like this, but I've seen it pop up in other articles you've written and thought it might help.

Another linguistics enthusiast! Do my eyes deceive me?
 
This.

Also, 4K gaming is still too early (IMHO) and I am guessing most PC gamers play their games on a 21-27" desktop monitor, 2 feet from their face, instead of sitting on a couch 10 feet from the TV. 4K makes no sense unless you're playing games on a 50"+ screen.

As a member of the 8'' from a 65'' nutjobs... I disagree. We're a growing segment of the population. This is evident not only from the Nano (mini-ITX), but from the excitement surrounding the Fury form factor in general. It is much more attuned to the performance required for such a setup while staying within a reasonable (microATX) form factor... something long since coveted. People want this, and the industry knows it; look at the scrambling to make 'Steambox' accessible and quantifiable by levels of expected performance, or even things like AMD's Quantum, which (barring the buffer size for future high-resolution/badly optimized titles, outside the possibilities of DX12) should have the brute force for 4k30 and/or Oculus/Vive at 90 Hz. The big players know exactly where the target lies across the spectrum, and nothing perfectly satiates it at this juncture. That will change next gen.

The point I have long (going on three years) contended is that 28nm was never going to get us to any sustainable level of 4K or 1080p120; it was always going to be about a good 1440p experience (and it has pissed me off to no end that these solutions are being sold touting 4K). It's a combination of many factors: bandwidth, buffer size, the power requirements of units/clocks needed to make it feasible to scale throughout the life of this console generation... etc. We were always destined to get painfully close, although I very much thought both companies would land closer to what AMD achieved (with Hawaii/Fiji) than to what nvidia pulled off with GM204/GM200 (overclocked; absolute performance). There is certainly an argument to be made for those products as we wait for the next generation, but the fact remains they don't quite reach the next threshold per segment.

That all said, the next generation is where things get real. 4k30 or 60 sustainable on a single card without the fuss of multiple cards. Large buffers. Small card sizes; the performance of GPUs lining up closer to the RAM configurations in consoles, scaled across resolutions (even if it shouldn't have to be that way) PER segment... power and price.

For geeks like me, I am so completely excited by the amount of bandwidth/floating-point throughput that these cards should bring. Why, beyond gaming? madVR.

I'm (within a realistic budget) an image snob... I want my porn media to look good. The idea of what these cards should be able to achieve (remember, multi-card doesn't work for real-time image processing) is tremendously exciting, and should be leaps and bounds better than what is currently possible with most consumer cards. Piled on top of video blocks that actually decode highly compressed (read: realistically streamable) VP8/9 (plastic) and H.264/H.265 (noisy) codecs, the idea remains in my head that scaling/processing many different aspects of video, even with FRC, should be possible while all looking good.

It's all about the WHOLE package, and this generation just didn't get there compared to what they want us to believe. It's not AMD's nor nvidia's fault; it's a process limitation. If this month of launches has me grateful for anything, it's simply because we're that much closer to 14/16nm. I've said it before, but I'll say it again: it doesn't matter what kind of user you are; high or low res, high or low framerate, simple IoT or hardcore gamer/video snob; 2016 is going to be one hell of a year. The convergence of technology that should be possible is going to be nothing short of amazing.
 
But Nvidia is trying to throw AMD out of the mid-to-high-end market before AMD comes out with Zen. They did it with the 970, and they did it again with the 980 Ti.
Zen is a CPU, not a GPU, so Nvidia wouldn't be doing anything in that market; that is Intel's territory.
There's still the Fury card to be released and this could offer pretty similar performance to the Fury X at a cheaper price.

I've seen some reports that it will be a cut-down version, around 10% slower. Besides that, how well the card does using an air cooler will be the telling point.
 
It does seem Nvidia have dictated the pricing this round; they thought about how they could still make a healthy profit, rain on AMD's parade, and simultaneously put the boot in on any potential margins.

AMD have released a card at the same price point which ultimately isn't any faster than a stock reference 980 Ti, using a huge chip with cutting-edge HBM and an AIO cooler, while offering less VRAM in the process. Maxwell is still more efficient with GDDR5, and it has the benefits of HBM power savings still to come, let alone the new process node.

If the rumours of an $800+ price tag were even remotely true before the 980 Ti dropped, then it's Nvidia that has done consumers a favour; whether that really is at AMD's expense, of course, remains to be seen.
 
A pleasure to read; it nails all the points.

thanks @btarunr :toast:

:lovetpu:
 
I think there were two big problems that led to the disappointing Fury X results: AMD's hype and AMD's ego.

They flat-out overhyped this card, to the point of directly lying about it. They put out so much hyped, false information that there was no way it could be anything but disappointing.

Also, their ego made the price of this card higher than it should be, and the card disappoints at this price.

Their marketing killed them. If they had said from the beginning that they were putting out a card that performs almost exactly the same as the 980 Ti at 4K, and their launch price had been $50 cheaper when they released the Fury X, it would have been a lot better received. A product's first impression sticks with it, and if you put a bad taste in people's mouths at launch, it is hard to turn things around. And the Fury X's launch put a bad taste in a lot of people's mouths. Even with its flaws at lower resolutions (which I believe will be worked out with driver updates), the 4K performance is there. But instead they flat out said the Fury X outperformed the 980 Ti at 4K, and they also said typical gaming temperatures were under 50°C, which isn't true either. The marketing was bad; the card is solid though.
 
Zen is a CPU, not a GPU, so Nvidia wouldn't be doing anything in that market; that is Intel's territory.
Technically Zen is both a CPU (Summit Ridge - the next processors to bear the FX name) and an APU (Bristol Ridge - with an Arctic Islands GPU). There's also a low-power Basilisk series to combat Intel's Atom architecture.
I think Summit Ridge arrives first - since AMD desperately needs to re-establish a server presence - but the introduction date has been pushed out from Q1 2016 to Q3 2016, probably the reason that Intel pushed out the 10nm Cannonlake architecture and decided to introduce another 14nm series (Kaby Lake) in its place.
 
The framerate target on AMD cards should bring down power consumption. Has this even been used so far? The tech makes sense though. No need to draw stuff at 250fps if 144fps is enough for your screen. Plus it can eliminate tearing without using V-Sync...

You know what the most retarded thing is about AMD's frame limiter?

It only works up to 95 Hz... (brain fart)²
 
Zen is a Cpu not GPU so nvidia wouldn't be doing a thing in that market that is intel's territory

You are missing the point. A good Zen means more sales and more income for AMD, which will translate into more R&D for GPUs as well. But that will come much later. What will be almost instant from a good Zen core is much better-performing APUs. Today, AMD's APUs have a very nice iGPU that has to go along with a very bad CPU. That means the iGPU in today's APUs underperforms. Think, for example, what the iGPU in the A10-7850K could do, not with Piledriver cores, but with, say, Ivy Bridge cores. Add to that DDR4 and probably, later, HBM. And that's without talking about Intel's iGPUs: more EUs and more cache there in future models.

That's why Nvidia needs to become a monopoly in discrete GPUs, through hardware and mostly through proprietary tech (software or hardware). That's why it tries to create problems for AMD through aggressive pricing, first with the 970 and now with the 980 Ti.
 
After reading the reviews around the internet, all I ended up with is the question: what could Nvidia do with HBM?

It's not that I'm disappointed by what AMD showed, quite the opposite. It's just that the hype was too high and I wanted AMD to kick some green arse again.
I can't say if AMD is back in the game yet... It will take a generation or two to really see if they can stay afloat against the competition.

I'm really eager to see what will come at the CPU side of AMD.
Better core performance at same TDP that will match Intel offerings?! I'll take it now!
 
You know what the most retarded thing is about AMD's frame limiter?

It only works up to 95 Hz... (brain fart)²

Really? What the hell is the point of it then!? 144 Hz monitors aren't exactly exotic anymore (if only I could afford one)...
 
For peeps at 1440p upwards, it's a good situation. Both cards are so equally placed that prices might be liable to fluctuate. Given the (stock) performance can be called a dead heat, AMD can sell the Fury X as the more 'innovative' piece of tech: cooler, quieter and damn sexy looking. Perhaps Nvidia will let the 980 Ti price drop and put tremendous pressure on AMD - or will their cartel resurrect itself?

In a mildly modding way, Nvidia have the edge for now with a far more flexible overclocking card (core and memory), but either card would work in my rig. I'm waiting on blocks regardless, so I will watch pricing and driver maturity - though I doubt the optimism of some people will be realised with dramatic improvements. I think they released the Fury X about as fast as it will get, given the reviews so far. Any changes may need to be BIOS revisions.

I'm still holding out for a 980 Ti 'Metal' (mythical unicorn card) that is a full GM200 core with 6GB and an unlocked TDP. Hell, if AMD do work wonders making the Fury X even better, I dare say Nvidia would do this for the reps.
 
Both cards performed well regardless of what the benches say... Those sticking with 1080p will find these two monsters more than enough to push new games with respectable quality & fps over the magical 60 fps, though at 1440p & higher resolutions you'll need to tone down certain games to get a nice balance of smooth gameplay and decent fps. Those who expect too much might find them disappointing. But if you ask me which card is the most power-efficient for today's demands & I have around $700 to spend, I would pick the 980 Ti. The Fury X is a good pick, but there are still some things AMD needs to iron out on what is otherwise a really good card for your money.

Too expensive for this performance, both of them, sorry. At this price point we should be able to get a steady 60 fps minimum at 4K with maximum quality settings. Right now these cards are just good 1080p cards... and you shouldn't pay more than $400 for a good 1080p card.
 
Exactly my point... ;)

8800 Ultra was $830
8800GTX was $600-$650...

... that was 9 years ago.
and a continuation of the price hikes set by the 7800 GTX 512MB and 7900 GX2.

It's funny: back when ATI/NVidia were accused of price fixing, cards topped out at $500. After an anti-competition lawsuit, suddenly $500 seems cheap.
 
The framerate target on AMD cards should bring down power consumption. Has this even been used so far? The tech makes sense though. No need to draw stuff at 250fps if 144fps is enough for your screen. Plus it can eliminate tearing without using V-Sync...

I take it you have never played a true FPS game at a competitive level. Trust me when I say that you can feel the difference between 300+ FPS and 120-144 FPS when, for example, playing high-level CS:GO at a LAN. I know that "in theory" you cannot see more, but the game will still be showing you newer images, thereby increasing your immersion. One thing to consider, though, is the limiting factor of online games caused by tick rates.
 
Let's do some math together, and start seeing if our BS detectors go off.

Inflation rates, in a "healthy" economy, are somewhere between 2 and 5 percent, depending upon which model you prefer. Less than that and the economy is stagnating; more than that and buying power is severely limited.
Inflation compounds, so it functions as a power equation: the dollar value gets multiplied by (1 + the rate) raised to the power of the number of years you're tracking, i.e. adjusted dollars = original dollars * (1 + inflation rate)^years (sketched below).
A little searching, and it seems like the GeForce 7800 GTX 512 came out in 2005 at a $650 MSRP.

To those with a calculator, that means its cost in adjusted dollars, assuming a 3% inflation rate, is $873.60. These cards are currently sitting in the $600-$700 range, depending upon the seller.
That's right, kiddies: a decade of 3% inflation means a 34% greater cost in adjusted dollars.
If we go back to 2000, the inflation increase amounts to 56%. In round numbers, a $500 card then would cost more than $750 today.
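As a quick sanity check on that formula, here's a minimal sketch using the flat 3% rate assumed above and the launch prices quoted in this thread:

```python
# adjusted = original * (1 + inflation_rate) ** years  (the "power equation" above)
def adjusted_dollars(original: float, inflation_rate: float, years: int) -> float:
    return original * (1 + inflation_rate) ** years

# $650 card from 2005, ten years of 3% inflation:
print(adjusted_dollars(650, 0.03, 10))   # ~873.5, the $873.60 figure above
# $500 card from 2000, fifteen years of 3% inflation:
print(adjusted_dollars(500, 0.03, 15))   # ~779.0, i.e. "more than $750"
```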


So, BS detector time.
1) Is the current price out of line? Demonstrably not.
2) Are either Nvidia or AMD charging too much? If it sells in a free economy, obviously not.
3) Who wins? This is a mixed bag. I feel that the performance doesn't justify an upgrade. If you go into this cold, the consumer wins by getting a better card for a lower (adjusted) price. Neither AMD nor Nvidia win, because there's no reason for the majority of users to upgrade. The real winner here is TSMC. They managed to keep fabricating on the same technology for 5 years. That's insane. If I were Nvidia or AMD right now, I'd be making sure the TSMC boss could taste my shoe leather, despite it having entered from the opposite end of the digestive tract.
 
The framerate target on AMD cards should bring down power consumption. Has this even been used so far? The tech makes sense though. No need to draw stuff at 250fps if 144fps is enough for your screen. Plus it can eliminate tearing without using V-Sync...
Not true at all: you don't remove tearing if you cap; you only remove it if you sync.

People should have been capping/syncing for years now - I know I am. Why should any hardware uselessly add more fps (excluding competitive play)? Or worse, why should anything at any time run at 3,000 fps in a menu!?

Anyway, you don't 'bring down power consumption' if many sites are testing at 4K with fps in the 40s; you only bring it down if you were originally going past 60 (or whatever refresh rate your monitor has) for sustained periods.
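To make the cap-versus-sync distinction concrete, here is a minimal, hypothetical frame-limiter loop (a sketch, not AMD's actual FRTC implementation): it simply sleeps until the next frame deadline, which saves power whenever you would otherwise run past the target, but it never aligns buffer swaps with the display's refresh, so it cannot eliminate tearing the way V-Sync or G-Sync/FreeSync can.

```python
import time

TARGET_FPS = 95                  # e.g. the cap ceiling mentioned earlier in the thread
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame():
    """Placeholder for the real render/game work."""
    pass

deadline = time.perf_counter()
for _ in range(600):             # a few seconds' worth of frames for the demo
    render_frame()
    deadline += FRAME_TIME
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        # Idle instead of drawing extra frames -- this is where the power saving comes from.
        time.sleep(remaining)
    else:
        # Already running slower than the cap: nothing to save, just resync the deadline.
        deadline = time.perf_counter()
```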

I've had pretty much all the high-end cards that have come out in the past 19 years, plenty of them from ATI and some from Nvidia. I have used the Riva TNT 16MB, the TNT2, and 3dfx Voodoo 1 and Voodoo2 12MB in SLI; I've used the 5870 and 7970, both of which I believe are among ATI's best cards; I've had the legendary 8800GTX and the not-so-great 9800GTX; I've had 4850s in CrossFire, and even the 4870X2. I love both companies, and competition is only good for the consumer - remember that, people.
proper enthusiast right here :respect:
 