Tuesday, December 20th 2011

AMD's Own HD 7970 Performance Expectations?

Ahead of every major GPU launch, NVIDIA and AMD hand reviewers a document known as a Reviewer's Guide, which lays out guidelines (suggestions, not instructions) meant to ensure the new GPUs get fair testing. In these documents, the two companies often also include their own performance expectations for the GPUs they're launching, comparing the new GPUs either to previous-generation GPUs from their own brand or to the competition's. A performance comparison between the upcoming Radeon HD 7970 and NVIDIA's GeForce GTX 580, probably part of such a document, has apparently leaked to the internet, and 3DCenter.org re-posted it. The first picture below is a blurry screenshot of a graph in which the two GPUs are compared across a variety of tests at a resolution of 2560 x 1600. A Tweakers.net community member recreated that graph in Excel using the same data (second picture below).

A couple of things here are worth noting. Reviewer's guide performance numbers are almost always exaggerated, so that when reviewers get results lower than these 'normal' figures, they treat them as abnormal and re-test. It's an established practice both GPU vendors follow. Next, AMD Radeon GPUs are traditionally strong at 2560 x 1600; for that matter, even the performance gap between the Radeon HD 6970 and the GeForce GTX 580 narrows a bit at that resolution.
Source: 3DCenter.org

101 Comments on AMD's Own HD 7970 Performance Expectations?

#26
trickson
OH, I have such a headache
Just because BD was a bust doesn't mean ATI will be. I have yet to see one ATI card let me down.
Posted on Reply
#27
Unregistered
trickson: I have yet to see one ATI card let me down.
These were all nice and dandy (I didn't care about power consumption) till I put them in CrossFire. Being forced to change application names, only to get kicked out of Steam servers for it, was frustrating... not to mention the heat in CF. And I had a GT too.

#28
trickson
OH, I have such a headache
John Doe: These were all nice and dandy (I didn't care about power consumption) till I put them in CrossFire. Being forced to change application names, only to get kicked out of Steam servers for it, was frustrating... not to mention the heat in CF. And I had a GT too.

www.buraak.com/wp-content/uploads/2007/06/radeon_hd_2900_xt.jpg
What, the flames on it did not give it away?
Posted on Reply
#29
Unregistered
lol yeah, they were built solid though, with digital power delivery and such. I actually really liked them until they started giving issues in CF. ATi Tray Tools was nice for IQ tweaks. You can still DL it from Ray Adams' SkyDrive AFAIK.

I think that was the only "bad" ATi/AMD GPU. nVidia has had more disappointing GPUs than AMD has (*cough 590/570 cough*). AMD was the only one that provided more than sufficient power phases with Volterra VRMs.
#30
pantherx12
John Doe: Yeah, this isn't new. These graphs are always deceiving. If the card was indeed 1.6 times faster, it'd have to get 100 FPS where a 580 gets 40 FPS...
Maths fail, dude. 1.6 times 40 would be 64.

Considering 2 x 40 is only 80, he he :slap:
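
For what it's worth, here's a minimal sanity check of that arithmetic, using only the 40 FPS figure from the quote above:

```python
# Sanity check: what a "1.6 times the GTX 580" claim implies for frame rates.
gtx580_fps = 40                     # baseline figure from the quoted post
claimed_ratio = 1.6                 # HD 7970 allegedly ~160% of the GTX 580
implied_fps = gtx580_fps * claimed_ratio
print(implied_fps)                  # 64.0 -- not 100; 100 FPS would need a 2.5x ratio
```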
Posted on Reply
#31
BazookaJoe
*Rant in 3, 2, 1 ....

Yeah - ya know what? I've been reading articles on AMD/ATI's "EXPECTATIONS" for the last 5 years now and seen nothing but soggy half-arsed failure after soggy half-arsed failure that ONLY EVER achieves reasonable frame rates by REMOVING GRAPHIC PROCESSING from your games - reducing textures, texture processing and filtering against your will and without your permission, at driver level.

Forcing FSAA to be disabled with no way to enable it - and other cheap, underhanded Bull-SCHYYTE.

I have to rename all my game exes so I can at least use AA, BECAUSE THEIR ASS-HAT SOLUTION TO "IMPROVING PERFORMANCE" IS TO JUST FORCE-DISABLE (IN THE DRIVER) ANY QUALITY-IMPROVING SETTINGS, and I'm sick of it - this company has failed - it can't compete and it must stop LYING to its customers, because no matter WHAT result it claims to get, it only gets it BECAUSE it skewed the results, the environment, AND the benchmark itself just to get them, and it will never live up to that in real life without DISABLING half your options before you play any game.
Posted on Reply
#32
theJesus
pantherx12: Maths fail, dude. 1.6 times 40 would be 64.

Considering 2 x 40 is only 80, he he :slap:
Damn, beat me to it :laugh:
Posted on Reply
#33
Unregistered
pantherx12: Maths fail, dude. 1.6 times 40 would be 64.

Considering 2 x 40 is only 80, he he :slap:
No, it's not a math fail. I thought of it as 260% instead of 160%.
#34
pantherx12
John Doe: No, it's not a math fail. I thought of it as 260% instead of 160%.
That is still a maths fail.

At the very least it's a reading comprehension fail.

Either way, you should just admit the fault, perhaps laugh it off, and move on.

:toast:
BazookaJoe: I have to rename all my game exes so I can at least use AA, BECAUSE THEIR ASS-HAT SOLUTION TO "IMPROVING PERFORMANCE" IS TO JUST FORCE-DISABLE (IN THE DRIVER)


... Erm... what?

I've never had this problem EVER

You should probably change all the sliders in Catalyst to quality instead of performance, as this disables any optimisations.
Posted on Reply
#35
BazookaJoe
pantherx12: I've never had this problem EVER
Unless you NEVER play games, panther - that is flat-out impossible.

Skyrim is the best recent example.

After a "Performance" Upgrade from AMD - Fsaa was simply REMOVED at driver level - no matter what you set in the game or your catalyst driver there is NO AA anymore - and the whole gfx goes to schyte with creepy crawly aliased edges allover the screen - but if you simply Alt-f4, find tesv.exe, Rename to quake.exe, and run the game again? BAM all processing now works perfectly again.

It is a very well-known fact that AMD/ATI have been cheating in their driver using EXE-name detection for YEARS - deleting textures and grass in games like Far Cry to reduce load on their struggling hardware, even flat-out cheating with FurMark and manipulating settings to affect the benchmark - as covered in an article published on this TPU forum a few years ago.

(Edit: the stock cooler design was so inferior that if you actually maxed the card, EVEN AT STOCK CLOCKS, it could physically destroy itself.)

YES, the idea of "Per-Game Optimizations" started as a good one - to boost compatibility with various games, and both nVidia and AMD/ATI use it - but the fact is it was there for COMPATIBILITY, not to cover up how lame-ass rubbish your useless gfx cards are by force-disabling people's PERFECTLY COMPATIBLE quality settings just to get a better frame rate.

That is just plain fraud - and in my limited personal opinion, which is not representative of TechPowerUp or its administration, and based on my personal experience with AMD/ATI products, I believe AMD/ATI are frauds - and anyone new to the game should take their "Claims" and "Expectations" with a large tablespoon of salt.
Posted on Reply
#37
pantherx12
BazookaJoe: Unless you NEVER play games, panther - that is flat-out impossible.

Skyrim is the best recent example.

After a "Performance" upgrade from AMD, FSAA was simply REMOVED at driver level - no matter what you set in the game or your Catalyst driver, there is NO AA anymore - and the whole gfx goes to schyte with creepy-crawly aliased edges all over the screen - but if you simply Alt-F4, find tesv.exe, rename it to quake.exe, and run the game again? BAM, all processing now works perfectly again.
Thing is, it works fine for me. You sure you've looked at your Catalyst settings?

More specifically " Catalyst A.I" It may say it's just for textures in the description but setting it to high quality instead of performance/quality seems to disable all optimisations.

And as far as I know, it has always done this.

"Catalyst A.I: Catalyst A.I. allows users to determine the level of 'optimizations' the drivers enable in graphics applications. These optimizations are graphics 'short cuts' which the Catalyst A.I. calculates to attempt to improve the performance of 3D games without any noticeable reduction in image quality. In the past there has been a great deal of controversy about 'hidden optimizations', where both Nvidia and ATI were accused of cutting corners, reducing image quality in subtle ways by reducing image precision for example, simply to get higher scores in synthetic benchmarks like 3DMark. "
Posted on Reply
#39
cdawall
where the hell are my stars
BazookaJoe: Unless you NEVER play games, panther - that is flat-out impossible.

Skyrim is the best recent example.

After a "Performance" upgrade from AMD, FSAA was simply REMOVED at driver level - no matter what you set in the game or your Catalyst driver, there is NO AA anymore - and the whole gfx goes to schyte with creepy-crawly aliased edges all over the screen - but if you simply Alt-F4, find tesv.exe, rename it to quake.exe, and run the game again? BAM, all processing now works perfectly again.

It is a very well-known fact that AMD/ATI have been cheating in their driver using EXE-name detection for YEARS - deleting textures and grass in games like Far Cry to reduce load on their struggling hardware, even flat-out cheating with FurMark and manipulating settings to affect the benchmark - as covered in an article published on this TPU forum a few years ago.

(Edit: the stock cooler design was so inferior that if you actually maxed the card, EVEN AT STOCK CLOCKS, it could physically destroy itself.)

YES, the idea of "Per-Game Optimizations" started as a good one - to boost compatibility with various games, and both nVidia and AMD/ATI use it - but the fact is it was there for COMPATIBILITY, not to cover up how lame-ass rubbish your useless gfx cards are by force-disabling people's PERFECTLY COMPATIBLE quality settings just to get a better frame rate.

That is just plain fraud - and in my limited personal opinion, which is not representative of TechPowerUp or its administration, and based on my personal experience with AMD/ATI products, I believe AMD/ATI are frauds - and anyone new to the game should take their "Claims" and "Expectations" with a large tablespoon of salt.
BOTH COMPANIES DO THAT

Thanks for playing though.
Posted on Reply
#40
dieterd
I don't know about you, but my major concern is the price tag, and from what I've heard it's going to be $500+!! When a new generation comes out, it should NOT be the case that if performance is 150%, then the price is 150% too!!! If it worked like that, the HD 6970 should have cost at its release something like 5x (or more) what the HD 2900 XT cost at its release, but it's roughly 1:1. So when I wait for a new generation, I expect a performance increase and little to no price increase - like Intel yesterday with the Ivy Bridge chart: prices 1:1, performance 1.3:1 vs Sandy. I waited for Bulldozer and guess what - barely any performance increase and a major price increase vs Phenom II, so it's like 1.1:1 performance and 1.3:1 price :(. I'm waiting for the HD 79xx, and the best I can expect is something like 1.5:1 price and 1.5:1 performance - and that "1.5" performance comes from AMD PR charts with "up to" attached, so it will be lower in reality. Of course the "1.5" price could also turn out to be a hike, but I have a bad feeling about this...
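
To make that comparison concrete, here's a rough value calculation with entirely hypothetical prices and performance ratios (none of these figures come from AMD or the leak):

```python
# Generational value check: performance gained per dollar spent.
# All numbers below are hypothetical placeholders for illustration only.
old_price, old_perf = 370.0, 1.0    # e.g. last-gen card as the baseline
new_price, new_perf = 550.0, 1.5    # rumoured-style $500+ tag, "up to" 1.5x performance

price_ratio = new_price / old_price
perf_ratio = new_perf / old_perf
print(f"Price ratio: {price_ratio:.2f}:1, performance ratio: {perf_ratio:.2f}:1")
print(f"Performance per dollar vs last gen: {perf_ratio / price_ratio:.2f}x")
```

With placeholder numbers like those, performance per dollar barely moves generation over generation, which is exactly the concern above.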
Posted on Reply
#41
laszlo
Until I see a review, I can say Mr. Papermaster at AMD has started working
Posted on Reply
#43
theJesus
My opinion of it (and all other GPUs) is that it counts for absolutely nothing until I see W1z's numbers. :)
Posted on Reply
#44
Aceman.au
I wonder what Nvidia will change to compete with this.
Posted on Reply
#46
runevirage
Doesn't the graph just look like someone whipped it up in MS Word? You'd expect an internal or PR slide to have more embellishments.
Posted on Reply
#47
entropy13
On average it's 147% for the HD 7970 (with the GTX 580 as the 100% baseline), which means it's just behind the HD 6990 (152%) - if those figures are accurate.

Now let's assume it's actually lower by 15 percentage points (132%)... that puts it behind a GTX 590 (140%) but still some way ahead of the HD 5970 (124%).
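
If anyone wants to reproduce that kind of average from the recreated chart, here's a minimal sketch - the per-game ratios below are placeholders, not the leaked figures:

```python
# Average relative performance with the GTX 580 normalised to 100%.
# The ratios below are made-up placeholders; substitute values read off
# the recreated Tweakers.net graph.
per_game_ratios = [1.35, 1.50, 1.60, 1.42, 1.48]   # HD 7970 / GTX 580, per title

average = sum(per_game_ratios) / len(per_game_ratios)
print(f"HD 7970 average: {average * 100:.0f}% of a GTX 580")

# Model the usual reviewer-guide inflation by knocking off 15 percentage points.
print(f"Deflated estimate: {average * 100 - 15:.0f}%")
```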
Posted on Reply
#48
Unregistered
Patriot: and... it's Nvidia who coded a profile for FurMark to keep their cards from dying running it...

sigh... :shadedshu
Nope. You're spreading fanboyism and garbage. OCP first came out in the 400 series cards, not the 500. The reason they wrote it was to tame the monster that was the 480, for people who didn't know what they were doing. The card was built to withstand those temps. Those that did know what they were doing disabled it.
#49
cdawall
where the hell are my stars
John Doe: Nope. You're spreading fanboyism and garbage. OCP first came out in the 400 series cards, not the 500. The reason they wrote it was to tame the monster that was the 480, for people who didn't know what they were doing. The card was built to withstand those temps. Those that did know what they were doing disabled it.
So, capt smart, do you know how to fully disable OCP? I'll give you a hint: break out the soldering iron. Nvidia coded their drivers in the same manner AMD is being chastised for doing to theirs. Both companies adjust video quality in specific games, benchmarks, and apps. That's the nature of the beast; it is and always will be a competition between the two.
Posted on Reply
#50
Unregistered
cdawall: So, capt smart, do you know how to fully disable OCP? I'll give you a hint: break out the soldering iron.
I've an IPC 610 certification and have recapped a 1250W unit.