
Nvidia Shaders vs ATI/AMD Shaders: How do they compare?

This is just out of curiosity.

Let's, for example, take the 8800gt/9800gt and the HD 4830.

They are both in the same performance league, yet the 8800gt has 112 shader units, and the 4830 has 640.

I was wondering, they have to be different somehow... because a lower shader count on Nvidia cards matches a much higher shader count on ATI's.

Are they architecturally different? If so, how?
 
They've both gone down the same general route since their first DX10 iterations, but they count those shaders differently. ATI uses a ratio of simple to complex shaders, while NV goes a different route with fewer, more complex shaders, which gives it a lower shader count. Both get the same job done with similar performance, but it's cool that they're not both using the same R&D in their GPU and shader designs. There's a ton of info on this topic though man, search through the forums or even google "R600 vs G80" and you'll find a ton of good information.

:toast:
 
Kursah is right, ATi uses a combination of complex and simple shaders, while nVidia only uses complex ones. It also helps that nVidia's shaders are clocked roughly 2.6x as high as ATi's (1500MHz vs. 575MHz in the case of the 8800GT vs. the HD4830).
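To put rough numbers on that clock difference, here's a quick back-of-the-envelope peak-FLOPS comparison. This is just a sketch: the flops-per-clock figures (3 for G92's MAD+MUL, 2 for RV770's MAD) are the usual theoretical-peak assumptions, so these are paper numbers, not real-world performance.

```python
# Theoretical peak shader throughput for the two cards in this thread.
# Shader counts and clocks are the published specs; flops-per-clock is
# the standard marketing assumption for each architecture.

def peak_gflops(shaders, clock_mhz, flops_per_clock):
    """Peak GFLOPS = shaders * clock (MHz) * flops issued per clock / 1000."""
    return shaders * clock_mhz * flops_per_clock / 1000

print(peak_gflops(112, 1500, 3))  # 8800GT: 112 SPs at 1500MHz -> 504.0 GFLOPS
print(peak_gflops(640, 575, 2))   # HD4830: 640 SPs at 575MHz  -> 736.0 GFLOPS
```

So on paper the HD4830 actually has more raw shader throughput, which is exactly why the utilization question below matters.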
 
I'm not technically savvy (in the how-things-work sense), but still an enthusiast (technology/science is not my field, unfortunately, because I always sucked at math. REALLY! SUCKED!)

I couldn't even get 5th grade math straight :|

Well, back on topic: if ATI uses a mix of complex/simple shaders (I'm assuming simple shaders handle less complex tasks), let's say that out of the 640 shaders in the 4830, 500 of those are simple shaders and the remaining 140 are complex shaders.

By the complex shader number alone, ATI would win. But then there's still the simple shaders... 500 of them. I'm guessing every bit of power helps? It's like having 140 tanks in a battle, plus 500 tricycles with submachine guns attached. They'd be weak, but that many of them would do some damage.

Hope you get this stupid analogy :P

Or am I getting this totally wrong?

So shouldn't ATI win? I'm not being a fanboy here... I own an nvidia card ;)
 
ATI's shaders are clocked at the same speed as the core, whereas NV's aren't bound by that restriction
 
I'd say nVidia has the more powerful shaders, but ATI makes up for that by using both complex and simple shaders.

Neither really "wins", I guess; shaders are shaders.

The architectures on nVidia and ATI cards are not the same, so it's not as simple as saying ATI has this many and nVidia has this many.

As a matter of fact, the 2900/HD38XX only have 64 shader units, and the HD48XX only have 160 (the HD4830 has 128), but each unit can issue up to 5 instructions per clock: 4 simple ALUs plus 1 fatter ALU that also handles the complex/transcendental instructions. Multiply by 5 and you get the advertised 320/800/640 "shader" counts.
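Here's a toy sketch of why those 5-wide units don't always count for a full 5: the effective issue rate depends on how many independent ops the compiler can pack into each unit per clock. All numbers below are illustrative, and keep in mind the two designs run at very different clocks, so raw cycle counts aren't directly comparable between them.

```python
# Toy model: cycles needed to issue a batch of independent simple ops
# on a scalar design (one op per unit per clock, like the 8800GT's
# stream processors) vs. a 5-wide VLIW design (like the HD4830's
# units), at different average packing rates. Illustrative only.
import math

def scalar_cycles(num_ops, num_units):
    """One op per unit per clock."""
    return math.ceil(num_ops / num_units)

def vliw_cycles(num_ops, num_units, lanes=5, packed=5):
    """Each unit has `lanes` ALUs, but the compiler only manages to
    fill `packed` of them per clock on average."""
    packed = min(packed, lanes)
    return math.ceil(num_ops / (num_units * packed))

ops = 64_000  # made-up batch of independent shader ops
print(scalar_cycles(ops, 112))          # 8800GT-style, 112 scalar units: 572
print(vliw_cycles(ops, 128, packed=5))  # HD4830, perfect packing (640 lanes): 100
print(vliw_cycles(ops, 128, packed=3))  # more realistic partial packing: 167
```

With perfect packing the VLIW design laps the scalar one per clock, but as packing drops toward 1 op per unit, those 640 "shaders" start behaving more like 128.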
 
Anyone know why the ATI shaders are bound to the core clock?
 