
8800GT 3DMark Scores

Discussion in 'Graphics Cards' started by OrbitzXT, Nov 8, 2007.

  1. OrbitzXT

    OrbitzXT New Member

    Joined:
    Mar 22, 2007
    Messages:
    1,969 (0.69/day)
    Thanks Received:
    59
    Location:
    New York City
    I just ran 3DMark05 and 06 in 4 different configurations and thought I'd share my results.

    Q6600 @ 2.40 GHz / 8800GT @ 632 core, 1674 shader, 950 memory
    3DMark 05 - 14,892
    3DMark 06 - 11,122

    Q6600 @ 2.40 GHz / 8800GT @ 702 core, 1782 shader, 950 memory
    3DMark 05 - 14,356
    3DMark 06 - 11,200

    Q6600 @ 3.30 GHz / 8800GT @ 632 core, 1674 shader, 950 memory
    3DMark 05 - 18,094
    3DMark 06 - 12,928

    Q6600 @ 3.30 GHz / 8800GT @ 702 core, 1782 shader, 950 memory
    3DMark 05 - 18,034
    3DMark 06 - 13,489
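
    To make the scaling obvious, here's a quick Python one-off with the scores copied from the runs above; it just prints each run's percentage change against the 2.40 GHz / 632 MHz baseline. Nothing here beyond the numbers already posted.

    Code:
    # Scores copied from the four runs above; baseline is the 2.40 GHz CPU
    # with the lower GPU clocks.
    runs = {
        ("2.40 GHz", "632/1674"): {"05": 14892, "06": 11122},
        ("2.40 GHz", "702/1782"): {"05": 14356, "06": 11200},
        ("3.30 GHz", "632/1674"): {"05": 18094, "06": 12928},
        ("3.30 GHz", "702/1782"): {"05": 18034, "06": 13489},
    }
    base = runs[("2.40 GHz", "632/1674")]

    for (cpu, gpu), scores in runs.items():
        for bench in ("05", "06"):
            delta = 100.0 * (scores[bench] - base[bench]) / base[bench]
            print(f"{cpu} / {gpu}  3DMark{bench}: {scores[bench]:>6} ({delta:+.1f}%)")
    # The CPU overclock moves both benchmarks far more than the GPU overclock
    # does, which points to these runs being largely CPU-bound.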

    These scores are extremely similar to what I got before I replaced my GTX with my XFX GT. I'm tempted to push the memory farther, but I've read various reports that running it at 2 GHz will kill the card. I know everyone says "if you overclock, you shorten the lifespan," but these reports about this specific issue seem different. I read it wasn't actually the memory's fault but something else on the card. I'm going to try and find the article now. It was also being discussed on eVGA's website, but I think it pertains to all GTs.

    Also, does anyone know for sure yet whether the locked shaders can be unlocked?
     
  2. tkpenalty

    tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.31/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    Truly amazing! The answer to the "unlocked shaders" question is no. You cannot unlock them; the chip was manufactured without them, since they remade the G80 on a 65 nm process. What cooling solution are you using on the GPU at the moment? Stock, right? I've heard some people say that blasting the cooler at 100% will make the GPU run 20°C cooler... right?
     
  3. OrbitzXT

    OrbitzXT New Member

    Joined:
    Mar 22, 2007
    Messages:
    1,969 (0.69/day)
    Thanks Received:
    59
    Location:
    New York City
    According to various reports on eVGA's forum, the "Auto" setting for the fan doesn't kick up the RPMs as the card gets hotter, which is why I think a lot of people were reporting that these cards get hot and blamed the single-slot cooler. Right now, with the core at 700 and the memory at its stock 950, I idle at 48°C, and it hasn't gotten above 65°C yet with the fan set to 100%. It is noticeably louder at 100% than either the GTS or GTX (I've owned both before), but it's very tolerable and definitely quieter than the past ATI cards I've owned.
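
    Side note on that "Auto" behavior: a working auto curve should just ramp the duty cycle with core temperature. Here's a hypothetical sketch in Python of what that would look like; the breakpoints are made up for illustration, not what NVIDIA's driver actually uses.

    Code:
    # Hypothetical "auto" fan curve: hold an idle floor, then ramp linearly
    # to 100% duty between two temperature breakpoints. The numbers here
    # (30% floor, ramp from 50 to 80 deg C) are invented for illustration.
    def fan_duty(temp_c, idle_duty=30, ramp_start=50, ramp_end=80):
        """Return fan duty cycle (%) for a given core temperature (deg C)."""
        if temp_c <= ramp_start:
            return idle_duty
        if temp_c >= ramp_end:
            return 100
        frac = (temp_c - ramp_start) / (ramp_end - ramp_start)
        return round(idle_duty + frac * (100 - idle_duty))

    for t in (40, 48, 65, 75, 85):
        print(t, "->", fan_duty(t), "%")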

    When I set the core to 710, it immediately becomes unstable, but I think this is because the shader clock jumped too high along with the core. I notice RivaTuner has a "Link Clocks" setting. Is it now possible to raise the core clock and leave the shader clock at its own stable setting? The card is definitely cool enough to overclock more; it either just needs more voltage or for the core and shader clocks to be set independently of one another.
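
    For what it's worth, here's the arithmetic on why a linked +10 MHz core bump can tip things over, assuming "Link Clocks" keeps the shader moving with the core at a fixed ratio (my understanding of the RivaTuner setting; the exact rule may differ). Quick Python sketch, with the ratio taken from the 702/1782 numbers above:

    Code:
    # Assumed linked-clock behavior: shader = core * fixed ratio.
    # Ratio taken from the 702 core / 1782 shader setting (~2.54).
    ratio = 1782 / 702

    for core in (700, 710):
        print(f"core {core} MHz -> linked shader ~{round(core * ratio)} MHz")
    # A +10 MHz core step is a ~+25 MHz shader step (1777 -> 1802), which can
    # push the shader domain past its stable limit even if the core had headroom.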
     
  4. tkpenalty

    tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.31/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    It isn't that; 710 may be your max due to the core itself, nothing to do with how much heat it generates! More voltage may be needed, but cores also have an architectural limit to how far they can be pushed.

    About the stock cooler: it's not necessarily a terrible design, but NVIDIA or the other manufacturers could really do something about that fan! There is so much wasted space; the fan should be bigger. A 70 or 80 mm fan would give a healthy increase in airflow. Just some gripes about stock cooling solutions... make the fan bigger, manufacturers!
     
  5. {JNT}Raptor

    {JNT}Raptor New Member

    Joined:
    Jul 12, 2005
    Messages:
    733 (0.21/day)
    Thanks Received:
    87
    Location:
    NY
    I'm confused... I've been running my shaders separate from the core in RivaTuner with zero issues. When I took my core up to 780 MHz, I lowered my shaders to between 1700 and 1800 with no issues, and it ran benchmarks well... it didn't handle 3D games as well, though.

    Am I missing something? Because I can clock them separately... nothing seems locked on my end.

    Just curious. :)

    EDIT... Ahhh, I get it now... the extra shaders on the card, not the clocks... sorry about my confusion. LMAO
     
    Last edited: Nov 8, 2007
  6. giorgos th.

    giorgos th. New Member

    Joined:
    Nov 10, 2005
    Messages:
    1,540 (0.46/day)
    Thanks Received:
    117
    Location:
    Athens - Hellas
    Here are my 05 and 06 scores with a Q6600 and an 8800GT.

    3DMark06 = 16,626. Q6600 @ 4050 MHz, 8800GT @ 771/1944/1026.
    3DMark05 = 24,252. Q6600 @ 4050 MHz, 8800GT @ 756/1944/1026.
     
  7. OrbitzXT

    OrbitzXT New Member

    Joined:
    Mar 22, 2007
    Messages:
    1,969 (0.69/day)
    Thanks Received:
    59
    Location:
    New York City
    Giorgos, what are your scores with the GT at stock clocks and your Q6600 @ 4.05 GHz? At both a stock Q6600 and my 3.30 GHz overclock, I get pretty much the same score regardless of what I overclock the GT to.
     
