
AMD Catalyst 10.12 Performance Analysis

Panther, pictures represent the settings you used, and we have no way of knowing the real settings used in them. Also, we have to ask why they "forgot" to mention the IQ trick ATI has been pulling lately to get up to a 10% performance increase, which matters to me because it feels like they think of the end user as ignorant. Of course, it's not the first time ATI has cut IQ details here and there to gain performance, but that's another story.

Finally, no, I don't trust results from an ATI fan forum that carries the name of one of ATI's first 3D cards (which was a failure, by the way; remember the black water in Tomb Raider for all those who used one back then), just as I wouldn't trust results from a forum named Riva128. Makes sense, I think, correct?

Speedo, stop cutting out pieces and go to the part where they say that ATI could do better ;)

"As part of the new driver released for the HD 6800, there were some changes to the image quality options available. Gone were the Narrow and Wide Tent MSAA modes, leaving only the standard Xox and excellent, but performance sapping, Edge Detect MSAA modes. Changes to the Catalyst Application
Intelligence (AI) slider appeared as well, with a new Texture Quality slider replacing Cat AI standard or advanced. The new default settings for Texture Quality is Quality, which is not the highest quality mode - there are some optimizations performed, that AMD states shouldn't visibly affect image quality but can increase performance."


That link, you know, the one I posted: if you scroll down, you know, with your mouse wheel.

Perhaps you can't, as you have an old mouse and don't know about that yet.
 
High Quality = no optimisations
Quality = AMD-suggested optimisations
Performance = lots of optimisations

I think you're taking the term "quality" a bit too literally here; it doesn't seem to be an issue for me.
 
I can't force you to accept some things, but since you like looking at pictures, check online for the many screenshots (pointing out the IQ trick) showing Quality on NVIDIA cards next to Quality on ATI cards in games, and judge for yourself.

When I select Quality on either the 5970 or the 580, I expect the same level of IQ; otherwise we can't really compare anything that way.
 
"quality" is the optimised setting for BOTH companies. ( basically that "trick" you keep mentioning will not run when you switch optimisations off, I.E high quality setting in CCC )

You want to compare "high quality", dude :laugh: (or the NV equivalent) for a true apples-to-apples image quality comparison : ] (otherwise it's down to what the company feels does not detract from the user experience, which is basically the company's "opinion", which means nothing to anyone).
 
"there are some optimizations performed, that AMD states shouldn't visibly affect image quality but can increase performance."

That's all that matters for this thread, guys. I already said that the resulting IQ is irrelevant, mainly because it's highly subjective. The optimizations are there, however, and according to the above quote even AMD states that the optimizations are there and that they increase performance. What else is there to say?

The optimizations are there and increase performance. Some people find this completely legit and useful.

Nothing prevents NVIDIA from using the same optimizations and gaining 5-10% performance in some future drivers, that's all.
 
When w1zz does reviews, do you think he switches off things like IQ optimisations?
 
Panther, have you even searched the web for what I am talking about? If you had, you would see that even the High Quality mode is not really high quality. Please do, and then I will be happy to hear what you have to say. As for Wiz, I think he leaves cards at default CCC settings, as do most people reviewing cards.

Benetanegia, if NVIDIA were to do this, we would probably have 10 threads with accusations, flames, and stories about the end of the world and how they are the reason :)

Anyway, I have no time to argue, and that was not my intention in the first place. Just stating my opinion; back to work now :)
 
No, High Quality mode just switches off optimisations; to get the rest of the quality settings up, you have to turn them up yourself.

As for searching, naww, I haven't; you're the one making the statement, so post some links : ]

I wouldn't know where to look.


Oh, I should note I'm talking about the Catalyst A.I. slider. As I said, I don't think anyone but casual gamers actually uses the global IQ settings slider (turning Catalyst A.I. off in old versions of CCC does the same thing as setting it to High Quality now), as the global slider doesn't adjust the Catalyst A.I. setting ;) (it just changes AA, AF, and things like that).

And who cares about them? They can't see the difference.

*By casual I mean people who have a gaming PC but have no idea how it works.


Honestly, when there's an off switch for image quality tricks, I don't think it matters at all, one bit, except in benchmarks (where they should be off 100% of the time).


It's upsetting to see you use the term "argue"; I thought this was a friendly discussion : [


In fact, I don't disagree with you: there are optimisations going on, but you can switch them off, which is the important thing to consider.

So saying NVIDIA or AMD has better image quality is, well, silly really; they're about the same when no optimisations are on. Trust me, I know. I could quickly swap to an NV card now and do some tests just in case my memory is failing me, but I'm fairly certain ;)
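
If anyone actually wants to test that instead of arguing, here's a minimal sketch of an objective comparison, assuming you have two screenshots of the same frame captured at identical in-game settings. The file names are hypothetical, and it needs the Pillow library (pip install Pillow):

```python
from PIL import Image, ImageChops

# Hypothetical captures of the same frame; both must be the same resolution.
a = Image.open("radeon_hq.png").convert("RGB")
b = Image.open("geforce_hq.png").convert("RGB")

# Per-pixel absolute difference between the two renders.
diff = ImageChops.difference(a, b)

# Mean per-channel error: 0 means identical, higher means visibly divergent.
pixels = list(diff.getdata())
mean_error = sum(sum(px) for px in pixels) / (len(pixels) * 3)
print(f"Mean per-channel difference: {mean_error:.2f} out of 255")

# Amplify the difference map so faint filtering differences stand out.
diff.point(lambda v: min(v * 8, 255)).save("diff_amplified.png")
```

A mean difference near zero means the two cards render the frame essentially identically; the amplified map shows where any filtering differences are hiding.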
 
Some of the comments in here make me wonder how long certain people have been using computers. Both companies have used optimizations that have definitely negatively affected image quality over the years, and both companies have used optimizations that haven't changed image quality enough to be noticeable. Just as both companies have taken turns beating the other, it keeps swapping back and forth which company's fans are complaining at whom for what optimizations they have used.

Can't we just get over it and accept that both companies do this stuff, and that at the moment it hardly affects quality yet increases performance? It's really to be expected if you have paid attention to computer graphics over the past decade and a half or so.
 
Most people, if not all, do benchmarks with default settings, meaning the Quality setting for both companies.

When, however, the IQ on ATI is lower than on NVIDIA, even if most people can't notice it, I hardly call that fair.

So when we have seen several hundred benchmarks since the launch of the 6xxx series without people mentioning this, I really think the end user is not getting all the facts, and in the end that is simply wrong. As I said, a 5-10% performance increase is not huge, but deduct that from all the benchmarks of the 6xxx series so far and many things change.
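
To make "many things change" concrete, here's a back-of-the-envelope sketch with made-up FPS numbers (not taken from any actual review):

```python
# Hypothetical average FPS for two cards; the figures are invented purely
# to illustrate how deducting a 5-10% optimization gain can flip a ranking.
results = {"HD 6870": 64.0, "GTX 470": 59.5}

for cut in (0.05, 0.10):
    adjusted = results["HD 6870"] * (1 - cut)
    leader = "HD 6870" if adjusted > results["GTX 470"] else "GTX 470"
    print(f"{cut:.0%} deduction: HD 6870 = {adjusted:.1f} FPS -> {leader} leads")
```

With these numbers the ranking holds at a 5% deduction but flips at 10%, which is exactly why the size of the optimization matters.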

Anyway, I can't continue, too much work. I just wanted to say a couple of things; whether some people decide to accept them or look into them further is really up to them.

Cheers :)
 
I have to ask, have you only started using computers recently? I would assume not, given the Tomb Raider comment you made before, so can I ask: were you posting like this in forums when NVIDIA was doing this with noticeably degraded image quality?

As I said before, both companies have done this many times over the years. Anyone who has been gaming for a while should know this and understand its effects, so this whole exchange of posts is pointless.
 
When NV was doing similar things, ATI was also doing similar things; everyone knew it, everyone mentioned it, no one denied it, and we all knew what to expect.

However, now is now, and since only ATI is doing it, I think we should underline it, especially in light of their new 6xxx series, don't you?

Anyway, like I said, I don't have time to follow threads; I just said what I thought on the driver matter, that is all.
 
Looks like AMD gave the 5870 one last bang with 10.12.

What is that, about a 3-4% performance gain overall? Not bad :toast:
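
For what it's worth, an "overall" figure like that is usually summarized as a geometric mean of the per-game speed-ups between the two driver versions. A minimal sketch, with invented FPS numbers rather than W1zzard's actual data:

```python
import math

# Hypothetical FPS per game on the old and new driver (made-up figures).
old = {"Crysis": 40.0, "Metro 2033": 30.0, "BC2": 70.0}
new = {"Crysis": 41.5, "Metro 2033": 31.2, "BC2": 72.0}

# Geometric mean of per-game ratios avoids letting high-FPS games dominate.
ratios = [new[g] / old[g] for g in old]
overall = math.prod(ratios) ** (1 / len(ratios)) - 1
print(f"Overall gain: {overall:.1%}")  # ~3.5% with these numbers
```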
 
And as I said earlier, NVIDIA are doing the same shit.

It's a pointless argument. Who cares if it's not 100% of what they can achieve in quality, so long as the end user can't tell the difference?
 
Well, if they did it in their most recent drivers (from a few days ago), I see nothing wrong with that; it's just countering what ATI has been doing for almost 2 months now, if not longer.

Of course, from what I see in that other thread, you are taking a whole different approach, saying that NV did this to make their cards seem faster in light of the new 6xxx series... Seriously??? Obviously you are rooting for ATI all the way, so yes, this is pointless, I agree.
 
Cool, wish you did this more often. It must take some time installing all those drivers over and over.

I am hoping that you did not just reuse old benches for the older cards, given game patches and fixes...

MASSIVE thanks.... Hope to see this done again :toast:
 
Nice review, Wizz. I installed the 10.12s and will try them out.
 
It all boils down to which company can allow us users to jack up image quality to the MAX while at the same time maintaining great performance. Today I see this happening with Radeon cards.

In the past that was not possible, I believe. The more you increased IQ settings, the more it sucked your performance away (except for games such as Half-Life 2 and Far Cry).
 
I see a much better implementation of AA and AF (at least when it comes to performance cost) with the new GTX 580 at high resolutions than I have seen with any other card on the market, up until today that is.

In any case, AA/AF will always hit performance, until of course we reach a time when max AA / max AF is the default and another quality setting takes their place.
 
For my 5970, 10.12 WHQL is the first useful driver with nice performance since 10.5a, even though there's an annoying bug where the system sometimes hangs for a few seconds when the monitor comes back from sleep mode or when you press Alt+F4.
 
Whew, I thought this thread was about AMD driver improvements?? And by the way, both companies use some sort of optimization, and at least AMD admits it. If you don't like it at all, you can just change the option from Quality to High Quality.
 
The one thing I would like to see ATI fix is the speed limit on the GPU. They lock you at 960 MHz, and this sucks! NVIDIA doesn't lock you to any specific clock speed on your GPU, but ATI sure does. I have seen this for the longest time and it really pisses me off. They fix everything and nothing, all the time!
 
I assume you are talking about the max selectable clocks in the Overdrive tab of the Catalyst Control Center. If so, you should just use different software like MSI Afterburner to set your clocks... assuming they would be stable that high.
 
90% of users will never exceed the clocks listed in CCC Overdrive; for those who will, apps like Afterburner, Trixx, and BIOS editing tools are the superior method of getting the clocks they want and can achieve.
 