Tuesday, December 20th 2011

AMD's Own HD 7970 Performance Expectations?

Ahead of every major GPU launch, both NVIDIA and AMD hand reviewers a document known as a Reviewer's Guide, which provides guidelines (suggestions, not instructions) to ensure new GPUs are given fair testing. In such documents, the two often also state their own performance expectations for the GPUs they're launching, comparing the new GPUs either to previous-generation GPUs from their own brand or to the competitor's. A performance comparison between the upcoming Radeon HD 7970 and NVIDIA's GeForce GTX 580, apparently part of such a document, leaked to the internet and was re-posted by 3DCenter.org. The first picture below is a blurry screenshot of a graph in which the two GPUs are compared across a variety of tests at a resolution of 2560 x 1600. A Tweakers.net community member recreated that graph in Excel using the same data (second picture below).

A couple of things here are worth noting. Reviewer's guide performance numbers are almost always exaggerated, so that if reviewers get performance results lower than "normal," they find it abnormal and re-test. It's an established practice both GPU vendors follow. Next, AMD Radeon GPUs are traditionally strong at 2560 x 1600; for that matter, the performance gap between even the Radeon HD 6970 and the GeForce GTX 580 narrows a bit at that resolution.

Source: 3DCenter.org

101 Comments on AMD's Own HD 7970 Performance Expectations?

#1
Super XP
trickson said:
Bullshit! If ATI did not exist, then why the hell is it on the fucking box? When I go to the store I see ATI, NOT AMD/ATI, on the box! So shut up bitch :laugh:
Umm, Rory (AMD's CEO) is bringing back the ATI logo and making a Discrete Graphics Card Division once again. :rolleyes:
Posted on Reply
#2
cadaveca
My name is Dave
Super XP said:
Umm, Rory (AMD's CEO) is bringing back the ATI logo and making a Discrete Graphics Card Division once again. :rolleyes:
:eek:


WHUT?


:wtf:
Posted on Reply
#3
BazookaJoe
cdawall said:
BOTH COMPANIES DO THAT

Thanks for playing though.
"Per-Game Optimizations" started as a good idea - to boost compatibility with various games - and both nVidia and AMD/ATI use it.

Reading can be very difficult. Sorry you weren't a winner this time.

Thank YOU for playing.
Posted on Reply
#4
Tenxu24
This graph may be true or may be false. Be water, my friends!

It all depends on benchmarks once some computer expert has the AMD 7970. I personally believe it to be true for three reasons: first, the 1000 MHz GPU design; second, the speed of the video memory; and third, the 3 gigabytes of video memory.
Posted on Reply
#5
phanbuey
According to quantum mechanics, the graph can be true and false at the same time.
Posted on Reply
#6
Recus
Tenxu24 said:
1000mhz gpu design
AMD didn't mention that in any slide.

Posted on Reply
#7
Patriot
John Doe said:
Nope. You're spreading fanboyism and garbage. OCP first came out in the 400 series cards, not the 500. The reason they wrote it was to tame the monster that was the 480, for people who didn't know what they were doing. The card was built to withstand those temps. Those who knew what they were doing disabled it.
http://www.techpowerup.com/forums/archive/index.php/t-139617.html
http://www.overclock.net/t/929152/have-you-killed-a-570-no-recent-deaths-buy-some-570s

Nope, my memory is correct, thanks for playing...
Nvidia has OCP; ATI/AMD has PowerTune...

and the 580 is just a fully working 480, btw... :banghead:
Posted on Reply
#8
brandonwh64
Addicted to Bacon and StarCrunches!!!
phanbuey said:
I can see Ben's point - they prolly pushed the release date up last moment. I envision it like this:

AMD VP: "OH S**T! Christmas is on the 25TH!!?!?! When the hell did that happen?, Suzy - Quick get Steve on the phone- I need that damn 7970 OUT NOW!"

Suzy: "But sir, the reviewers don't even have the cards yet"

AMD VP: "DAMMIT SUZY! I don't pay you to THINK! Get Bob from marketing to make one of his famous graphs, then leak it on the internet."
This would be bad ass to have someone make a video of this and do a full reenactment
Posted on Reply
#9
(FIH) The Don
brandonwh64 said:
This would be bad ass to have someone make a video of this and do a full reenactment
Would love to see that too, instead of the same old Hitler/Untergang video clip.
Posted on Reply
#11
phanbuey
The cast would have to be good:
AMD VP = Alec Baldwin
Suzy = Liz Lemon/ Tina Fey
Bob from Marketing = Tracy Morgan - "These cards are PHAT, they is at least 1.3x the speed of the other cards - I made this graph to show the relationship."
Posted on Reply
#12
dir_d
Not gonna lie... looks disappointing for the money. Looks like I might pick up two 7xxx cards based on VLIW4. I don't use the GPGPU features, so the extra money for that performance is not worth it for me.
Posted on Reply
#13
erocker
Senior Moderator
dir_d said:
looks disappointing for the money.
What's disappointing for you? Performance looks to be pretty good, and there has been no confirmed price yet. It looks like AMD has a card that handily beats Nvidia's top single-GPU card... of course they are going to want to milk a bit more money out of it.
Posted on Reply
#14
Crap Daddy
Those are benchmarks made by AMD; they look good, but we'll have to see non-biased benchies at different resolutions to get a clearer picture. At 25x16 it destroys the 580, as it should. It is a high-end card and it will be priced accordingly. The interesting thing is the 7950 compared to the GTX 580: if it performs better by a fair margin and costs less, then NV should lower the price on their top card(s).
Posted on Reply
#15
phanbuey
^^ or if it is unlockable... that would be awesome
Posted on Reply
#16
cadaveca
My name is Dave
erocker said:
It looks like AMD has a card that handily beats Nvidia's top single-GPU card...
The 7970 better damn well beat nVidia's last gen, and by a sizeable amount, too. This isn't the CPU division... AMD's GPUs in all forms compete directly with nVidia's cards, and most are going to expect that these cards are at least the same as a 6990 in terms of performance.

The specifications listed above hint that the 7970 is NOT double the performance of a 6970, so yeah, it's disappointing. With that said, there's very little reason for 6990 users to upgrade, and if the rumoured price of $499 is correct, I see very little reason to purchase the 7970 at all. It's not like current games even push the 6950, never mind the 6970.

Besides, it's fail because I cannot afford $500 for a GPU. :laugh:

:slap:
Posted on Reply
#17
Crap Daddy
All the info we have so far suggests a price higher than $500, and it will certainly retail above MSRP for the first month at least. So it should be much better than 2x 6950 to be worth considering. I personally don't care; no game pushes my card at my resolution, and I don't see one in the foreseeable future (a game that I might be interested in), but it's interesting to follow the developments. Again, though, spending in excess of 500 euro, which is what it will cost over here, just to game is not justifiable. If this equipment helps you with your work, then yes, but I don't know if this GPU is good for anything else. (Maybe somebody will correct me.)
Posted on Reply
#18
phanbuey
Crap Daddy said:
All the info we have so far suggests a price higher than $500, and it will certainly retail above MSRP for the first month at least. So it should be much better than 2x 6950 to be worth considering. I personally don't care; no game pushes my card at my resolution, and I don't see one in the foreseeable future (a game that I might be interested in), but it's interesting to follow the developments. Again, though, spending in excess of 500 euro, which is what it will cost over here, just to game is not justifiable. If this equipment helps you with your work, then yes, but I don't know if this GPU is good for anything else. (Maybe somebody will correct me.)
If the price is that high, then I agree - it will not be worth it - but usually these prices do not stay so high.

And they will need a 79xx product priced aggressively, because when Kepler from Nvidia comes out, if you already have a 79xx, you are less likely to upgrade... thus hurting the competition's sales.

If however, the price is too high... then most people will want to wait for Kepler to either go that route, or to see if the price drops when there is competition in the space.

If I were them, I would release the 7970 at the same price as the 580, then release the 7950 at the same price as the cheapest 6970 and make it unlockable. That would flood the market with 79xx parts and seriously :nutkick: Nvidia.

IMO, I agree that the 7950 will be the product to watch.
Posted on Reply
#19
devguy
Will the 7950 be released/reviewed on the 22nd as well? Or just the 7970?
Posted on Reply
#20
phanbuey
devguy said:
Will the 7950 be released/reviewed on the 22nd as well? Or just the 7970?
Should be both, if history is any indicator.
Posted on Reply
#21
Solaris17
Creator Solaris Utility DVD
This graph looks hardcore faked.

First: why is it blurry? The picture does not indicate that it was taken by a camera. No lines, glass, or splotches.

Second: the lettering looks tampered with. It could be that someone attempted to blur it on purpose, but why in such a way? There are far easier ways to blur a photo. Not to mention the graphs look doubled.

Third: the areas around the lettering. They look brighter white to me. Why? If a photo was blurred, this should not be the case.

Someone said that this graph looks accurate according to rumors, but that's assuming the graph is legitimate. The graph could too easily be fake, and IMO probably is. Therefore, of course it would match the current performance rumors.
Posted on Reply
#22
badtaylorx
Wow, if those DH leaks are spot on (like the BD ones), then Nvidia is gonna have a rough go of it for a spell.

:nutkick:
Posted on Reply
#24
Casecutter
I would say AMD/ATI can justify a $500 MSRP... remember, TSMC bumped the 28 nm wafer cost, and for it both AMD and Nvidia get fewer fully functioning top-shelf chips. Then, if the performance is anything close to what's being presented, that bears on pricing compared to the competition. The last thing is the worst: strong demand and not enough inventory, which by all accounts will be a big motivator for retailers/e-tailers to set pricing above MSRP, because there are enough folks willing to pay the inflated price for a couple of months anyway.

At $500... Nvidia has no reason to worry about lowering the 580's price (its 40 nm cost is fixed); they might adjust a little, and AIBs will offer more custom solutions, but Nvidia has no reason to run around like the building's on fire. They'll hold and play their cards, because they know they'll be in the same situation when Kepler starts showing.
Posted on Reply
#25
phanbuey
Casecutter said:
I would say AMD/ATI can justify a $500 MSRP... remember, TSMC bumped the 28 nm wafer cost, and for it both AMD and Nvidia get fewer fully functioning top-shelf chips. Then, if the performance is anything close to what's being presented, that bears on pricing compared to the competition. The last thing is the worst: strong demand and not enough inventory, which by all accounts will be a big motivator for retailers/e-tailers to set pricing above MSRP, because there are enough folks willing to pay the inflated price for a couple of months anyway.

At $500... Nvidia has no reason to worry about lowering the 580's price (its 40 nm cost is fixed); they might adjust a little, and AIBs will offer more custom solutions, but Nvidia has no reason to run around like the building's on fire. They'll hold and play their cards, because they know they'll be in the same situation when Kepler starts showing.
Yeah, but we also don't know the margins on the cards or the production volumes... even if the 28 nm cost is higher, if the margins have enough give to strategically price the card, they should do so. After all, it won't help to keep a static profit percentage if it means you don't benefit from an early launch.

They'll want to cut into a nice chunk of Nvidia's market share, and, you're right, they won't do it with a $500 card. It would be unfortunate if they couldn't produce enough units to take advantage of what they have, but scarcity would be the only way the price would stay that high.

Posted on Reply