
8800gts vs 2900 pro all tests made by me

Discussion in 'Graphics Cards' started by cefurkan, Oct 21, 2007.

  1. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    16,674 (5.25/day)
    Thanks Received:
    2,506
    Location:
    Worcestershire, UK
    Okey dokey, we'll call it a day! But we do disagree on one point: 3DMark06 is not particularly shader intensive, and the 2900XT actually beats even a GTX (since Cat 7.8) in a number of real-world gaming scenarios.... but that's another story! Just as a matter of interest, with no additional shader clocking, just leaving it in "sync", a core speed of 783MHz auto-clocks the shaders to 1761MHz, which is pretty significant in itself!
     
  2. Lekamies New Member

    Joined:
    Dec 16, 2004
    Messages:
    152 (0.04/day)
    Thanks Received:
    0
    Warm water is enough cooling to break that barrier.
    Here is a pic after a 3DMark03 run, including RivaTuner's hardware monitor.
     
  3. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.67/day)
    Thanks Received:
    184
    I have heard and read a lot of people claiming big performance boosts from shader overclocking alone, so I will stick to my view.

    The fact that the new Cat 7.8 makes the 2900 perform better only corroborates my point: shader power is what performance is asking for. And if there's one thing the Radeons have, it's shader power. Better said, they have as much as 2x the theoretical peak shader power of their Nvidia counterparts.
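
    As a rough sanity check of that "as much as 2x" figure, here is a minimal Python sketch. The SP counts and clocks below (320 SPs at 742 MHz for the HD 2900 XT, 96 SPs at a 1188 MHz shader clock for the 8800 GTS) are commonly published stock figures assumed here for illustration, not numbers taken from this thread, and each SP is counted as one MADD (2 FLOPs) per clock.

    Code:
    # Rough estimate of theoretical peak shader throughput, counting one
    # MADD (2 FLOPs) per SP per clock. Clock and SP figures are assumptions.
    def peak_gflops(stream_processors, shader_mhz, flops_per_clock=2):
        return stream_processors * flops_per_clock * shader_mhz / 1000.0

    hd2900xt = peak_gflops(320, 742)    # ~475 GFLOPS
    gts640   = peak_gflops(96, 1188)    # ~228 GFLOPS
    print(f"HD 2900 XT: {hd2900xt:.0f} GFLOPS")
    print(f"8800 GTS:   {gts640:.0f} GFLOPS")
    print(f"ratio:      {hd2900xt / gts640:.2f}x")  # roughly 2x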

    I didn't know how the shaders overclock when you overclock the core; I assumed their clock did rise, but I doubted it was in a linear fashion. Now I know it's not linear. Nevertheless, you have to agree that 1860 (or 1836, for that matter) is higher than 1761, don't you?

    Now, if you want, you can join me in the "more shader power, more performance" club, which BTW is backed up by the 8800GT and many 3DMark records, for example Lekamies' case. :D
     
  4. trt740

    trt740

    Joined:
    May 12, 2006
    Messages:
    10,935 (3.57/day)
    Thanks Received:
    1,113
    I'm not impressed. AT THAT RESOLUTION, NO WONDER!!!! And in a 4 1/2-year-old bench. Let's see 3DMark06. As a matter of fact, why don't you bench it to show me just how wrong I am.
     
    Last edited: Oct 24, 2007
  5. trt740

    trt740

    Joined:
    May 12, 2006
    Messages:
    10,935 (3.57/day)
    Thanks Received:
    1,113
    I'm gonna stop now because what you say is total garbage, good luck. One more thing: I still don't see a 3DMark06 19000 score with full screenshots (CPU-Z, GPU clocks, etc.), or for that matter an 8800 GTS hitting 14000 with a screenshot to back that up, other than the Futuremark ORB link. The ORB by itself means zero. I want to see a completed test screen and all the other required screens, and I bet I won't see them.
     
    Last edited: Oct 24, 2007
  6. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    16,674 (5.25/day)
    Thanks Received:
    2,506
    Location:
    Worcestershire, UK
    I agree completely about the shader power; I think you are misunderstanding what I am trying to say regarding the 1761. The 1761 is not independently overclocked. Unlike the R600 series (where the shader clock is locked and fixed at a lower speed than Nvidia's), on the G80 the shader clock automatically increases as you increase the core clock. So when a user sets his 8800GTS core speed to 863MHz, he already has a shader clock of 1761; he can then go into RivaTuner and overclock the shader clock separately (he has probably hit his core max already at 863MHz) without touching the core. So my point was that his shaders are most likely well beyond 1761 if he knows what he is doing, and achieving that score on an 8800GTS suggests he does.....make sense? So I am saying 1761 is the minimum it can be, and it is likely to be much higher.....but of course we don't know. What we do know, though, is that you would expect the score to be a lot LOWER, because the difference in CPU speed and core clocks is large, and the "faster" card (perhaps not the best terminology :)) is actually the slower one.
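
    To make the clock arithmetic in the post above easier to follow, here is a minimal Python sketch of the implied core-to-shader ratios. The 513/1188 MHz "stock" pair is an assumed typical 8800 GTS figure, not from the thread; the other two pairs are the ones quoted in this thread. It is only meant to illustrate why the 1761 reads as a floor rather than the actual synced value.

    Code:
    # Implied shader/core ratios. Stock 513/1188 MHz is an assumption;
    # the 783/1761 and 863/1761 pairs are the figures quoted in the thread.
    pairs = {
        "assumed stock":    (513, 1188),
        "thread, 783 core": (783, 1761),
        "thread, 863 core": (863, 1761),  # 1761 quoted here as a minimum
    }
    for label, (core_mhz, shader_mhz) in pairs.items():
        print(f"{label:17s} ratio = {shader_mhz / core_mhz:.2f}")

    # A strictly linear ratio from stock would predict a higher shader clock:
    print(f"linear guess at 863 MHz core: {863 * 1188 / 513:.0f} MHz")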
     
    Last edited: Oct 24, 2007
  7. ccleorina

    ccleorina New Member

    Joined:
    Mar 9, 2007
    Messages:
    195 (0.07/day)
    Thanks Received:
    10
    Location:
    Overclocking Hell
    I just want to ask... does the ATI HD2900 have a shader clock??? I can't see one in GPU-Z. :rockout:

    If the shader clock is locked, then what is the real shader clock? :roll:

    Thanks.....
     
  8. yogurt_21

    yogurt_21

    Joined:
    Feb 18, 2006
    Messages:
    4,408 (1.40/day)
    Thanks Received:
    570
    Location:
    AZ
    I think it's tied to the stock core clock, and from what I've heard it doesn't change, even when you OC the core.
     
  9. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (8.32/day)
    Thanks Received:
    3,778
    I wish somebody would make a good BIOS editor for R600. :(
     
  10. cefurkan New Member

    Joined:
    Oct 21, 2007
    Messages:
    206 (0.08/day)
    Thanks Received:
    4
    The shader and GPU clock are the same on ATI.
     
  11. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    16,674 (5.25/day)
    Thanks Received:
    2,506
    Location:
    Worcestershire, UK
    Yes, it is locked and does not increase with core overclocking. The 8800GTS stock shader clock is 1185MHz and will increase with core overclocks; it can also now be increased independently in RivaTuner :D The 2900XT shader clock runs at 800MHz and stays there, which is one of the reasons it has so many SPs, to compensate (although that's not the specific reason).

    Just to bore you, though :eek: Technically, for practical purposes, although the 2900XT has 320 SPs, it actually has 64 groups of 5 shaders. Each group of 5 shaders can only run 1 thread, while each single shader in the 8800 can run 1. This means that the 8800GTX runs 128 threads per clock and the 2900 runs 64. BUT! Each thread is worked on by 5 shaders and can hypothetically have 5 instructions run per thread per clock. This equates to 320 instructions per clock versus the GTX's 128. While it is easy to divide the threads among all 64 groups of shaders, it is very difficult to keep all 5 shaders in each group working. This means that in the best-case scenario (all 5 shaders per group working to the max) the 2900 does 2.5x the work per clock of the 8800GTX, and in the worst-case scenario it does half the work per clock of the 8800GTX. This means that the performance of the 2900XT GREATLY relies on the ability of the driver to distribute the instructions of the game being played. This is why we see poorish performance in some games and spectacular performance in others with the 2900XT. Future drivers can help this greatly.
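
    For anyone skimming, here is the instruction-rate comparison above restated as a tiny Python sketch; it is just the post's own arithmetic, nothing added.

    Code:
    # Per-clock instruction issue, restating the numbers from the post above.
    GROUPS, WIDTH = 64, 5   # HD 2900: 64 VLIW groups of 5 shaders
    GTX_SPS = 128           # 8800 GTX: 128 scalar stream processors

    best_case  = GROUPS * WIDTH  # all 5 slots busy: 320 instructions/clock
    worst_case = GROUPS * 1      # only 1 slot busy:  64 instructions/clock

    print(f"best case:  {best_case} vs {GTX_SPS} = {best_case / GTX_SPS:.1f}x the GTX")
    print(f"worst case: {worst_case} vs {GTX_SPS} = {worst_case / GTX_SPS:.1f}x the GTX")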
     
    Last edited: Oct 25, 2007
  12. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.67/day)
    Thanks Received:
    184
    I understood your point about the shader clocks being locked. And I understand why you assume it is even higher; it makes sense, after all. It happens that I don't, but maybe I'm biased in this respect. I used to frequent a local forum of overclockers whose only goal was to achieve the highest clocks possible, or better said, the highest combined clocks they could reach, 100% stable of course, and they used to say that was common in their circles. The problem is that higher clocks do not always translate into better performance. At some point there is current leakage, sync problems between modules, etc. When this happens nobody knows what could happen, and one possibility is that higher speeds give worse results, even if it is (or seems to be) totally stable. OK, we all know this, and I am talking about CPU overclocking for the most part, but the same applies to GPUs, more so in this case where the core and shaders are async'd. And although you have 3DMark to check the performance as you increase clocks, not everybody benches all speeds; they use only stability tests instead.

    Another thing that I take into account when making my assumptions is this: the whole 8800 line is the same chip at manufacture, and which one becomes which model is chosen afterwards based on yields or demand. This is nothing new, we all know it, but I don't see anyone paying as much attention to it as I think we should. There's a big difference between the models, so much that they could have been called different chips. For example, a chip that was binned as a GTS because of a defective ROP alone is going to overclock a lot better than one that was binned because it couldn't reach GTX speeds as well as Nvidia wanted. This isn't new to anyone either, but I don't see anyone giving it the importance it deserves.

    So my point was:
    First, do we know for sure they are overclocking in a proper manner, so that they get better performance, or are they, on the contrary, aiming for higher clocks only? (OK, I didn't read the whole link, I just looked at the screenies and little more...) There are so many records or record claims out there with higher scores but still lower clocks than those in your link that I know where my two cents are going to stay for now.
    Second, can a specific 8800GTS perform better at 720/1850 (core/shader) than others at 860/1750? From my point of view, it can.

    Wow! That was a long post! I hope it explains my point of view. I will admit that I don't have personal experience with the new cards, either the Nvidia 8 series or the HDs, since I'm out of this business right now, but I do have some knowledge and I read a lot. So yeah, I speak from theory for the most part, and I have to rely on others' experiences, so I have to believe them. But I have learnt something about technology: impossible is nothing (or was that Adidas?)
     
  13. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.67/day)
    Thanks Received:
    184
    About the HD 2900's 5 instructions per clock... I remember that when I saw all the specs and diagrams, my guess was that, despite AMD's claims, the chip most likely wouldn't be able to use both the scalar and vector units effectively at the same time. I can't remember why, though, and from that day until now I have looked at the HD SPs as 4 ops/cycle units.

    I'm too lazy to go back and reread all the material, so I would love your opinion. Please, everybody, share your thoughts on this.
     
  14. bigboi86

    bigboi86 New Member

    Joined:
    Apr 8, 2006
    Messages:
    1,450 (0.47/day)
    Thanks Received:
    35
    Location:
    techPowerUp!
    That doesn't mean that the CPU will cause the graphics card not to overclock as much. They are completely different subsystems.

    I am not wrong.

    This has nothing to do with benchmarking.
     
  15. Lekamies New Member

    Joined:
    Dec 16, 2004
    Messages:
    152 (0.04/day)
    Thanks Received:
    0
    I only had that ORB link to prove my 3DMark06 score.

    So I ran it again at the same clocks and took a screenshot for you.
    Here
     
  16. trt740

    trt740

    Joined:
    May 12, 2006
    Messages:
    10,935 (3.57/day)
    Thanks Received:
    1,113
    :D
    Oh, I already said I believe this; I want to see the GTX doing 19000. Nice bench :respect:
     
  17. ccleorina

    ccleorina New Member

    Joined:
    Mar 9, 2007
    Messages:
    195 (0.07/day)
    Thanks Received:
    10
    Location:
    Overclocking Hell
    Thanks for the nice info.... :rockout:
     
  18. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    16,674 (5.25/day)
    Thanks Received:
    2,506
    Location:
    Worcestershire, UK
    Always glad to help!
     
  19. cefurkan New Member

    Joined:
    Oct 21, 2007
    Messages:
    206 (0.08/day)
    Thanks Received:
    4
    Well, I am bumping this topic for an IQ comparison.

    Both cards are already meaningless anyway, since the 8800GT and 3870 are on the market.
     
  20. Lionheart

    Lionheart

    Joined:
    Apr 30, 2008
    Messages:
    4,056 (1.73/day)
    Thanks Received:
    810
    Location:
    Milky Way Galaxy
    nvidia fanboyism to the max
     
  21. marsey99

    marsey99

    Joined:
    Jul 18, 2007
    Messages:
    1,576 (0.60/day)
    Thanks Received:
    298
    dead thread res to the max :)
     
