
Need help ocing my video card

Discussion in 'NVIDIA' started by CrAsHnBuRnXp, Jan 18, 2012.

  1. CrAsHnBuRnXp

    CrAsHnBuRnXp

    Joined:
    Oct 19, 2007
    Messages:
    5,473 (2.18/day)
    Thanks Received:
    639
    So I'm fairly clueless when it comes to GPU overclocking, and I'm taking a stab at it. I haven't really overclocked a card since my 6800GS on AGP, back when you could unlock the pipes from 12 to 16. :rockout:

    Anyway, I'm using FurMark to stress test it. I'm benching at 1920x1200 with no AA for 60000ms, but no matter what clock speeds I choose, I get a score of 470 points, and that just doesn't seem right. Is there something I'm doing wrong with FurMark, or what?

    Thanks guys! :toast:

    Edit: So apparently my score doesn't want to change at 1920x1200, yet if I switch to 1280x1024 and adjust core/memory/shader, the score will vary. :confused:
    Last edited: Jan 18, 2012
  2. TheLaughingMan

    TheLaughingMan

    Joined:
    May 7, 2009
    Messages:
    4,998 (2.57/day)
    Thanks Received:
    1,291
    Location:
    Marietta, GA USA
    That means that at the higher resolution, something else is holding your GPU back: its memory speed, your system memory speed, the PCIe bus, or the CPU could be the bottleneck. There's no easy way to tell.

    The best test is to run FurMark at the resolution you actually use. If 1920x1200 is your PC's resolution, then testing should always be done there. You may also want to look into overclocking other components at this point, IMO.
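As a rough sketch of scripting that advice: the helper below assembles a timed, no-AA FurMark run at the native resolution. The flag names (`/nogui`, `/width`, `/height`, `/msaa`, `/max_time`) are assumptions based on FurMark 1.x's command-line options, and the executable path will vary per install, so check your version's readme before relying on them.

```python
# Sketch: build a FurMark benchmark command line for the native resolution.
# Flag names are assumptions from FurMark 1.x's readme; verify locally.

def furmark_cmd(exe, width, height, duration_ms, msaa=0):
    """Return the argument list for a windowless, timed benchmark run."""
    return [
        exe,
        "/nogui",                    # skip the startup dialog (assumed flag)
        f"/width={width}",
        f"/height={height}",
        f"/msaa={msaa}",             # 0 = no antialiasing, as in the post
        f"/max_time={duration_ms}",  # 60000 ms matches the 60-second run above
    ]

# The 1920x1200, no-AA, 60000 ms run discussed in the thread:
cmd = furmark_cmd("FurMark.exe", 1920, 1200, 60000)
print(" ".join(cmd))
```

The list form can be handed straight to `subprocess.run(cmd)` on the machine that has FurMark installed.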

    Edit: After looking at your system specs, I'd say the GPU core simply can't keep up with the shaders. The 200 series runs a separate clock for the shaders, and overclocking the GPU core doesn't change it. What are you using to overclock your system? What were its stock (core/shader/memory) speeds, and what are they at now?
  3. CrAsHnBuRnXp

    CrAsHnBuRnXp

    Joined:
    Oct 19, 2007
    Messages:
    5,473 (2.18/day)
    Thanks Received:
    639
    The CPU is an i5 2500K at 4.3GHz, and the RAM is at 2133MHz, up from 1600MHz.
  4. TheLaughingMan

    TheLaughingMan

    Joined:
    May 7, 2009
    Messages:
    4,998 (2.57/day)
    Thanks Received:
    1,291
    Location:
    Marietta, GA USA
    Since you missed that change:

    What are you using to overclock your system? What are the stock (core/shader/memory) speeds for your GPU, and what are they at now?
  5. CrAsHnBuRnXp

    CrAsHnBuRnXp

    Joined:
    Oct 19, 2007
    Messages:
    5,473 (2.18/day)
    Thanks Received:
    639
    Stock clocks are 666/1512/1242. I'm using EVGA Precision to overclock.

    What's even the point of upping the shader clock? According to FurMark at 1280x1024, it doesn't seem to do anything for performance.
    Last edited: Jan 18, 2012
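For context on how core and shader clocks relate on this generation: GT200-era cards typically keep the shader domain near a fixed multiple of the core clock. Taking the stock 666/1512 core/shader speeds quoted above, the quick calculation below estimates a matching shader target for a given core overclock. Treating the ratio as fixed is an assumption about this particular card, not something stated in the thread.

```python
# Sketch: estimate a shader clock that preserves the stock shader:core
# ratio of the card in this thread (stock clocks from the post above).
# The fixed-ratio behavior is an assumption, not a confirmed spec.

STOCK_CORE = 666     # MHz, core clock from the post
STOCK_SHADER = 1512  # MHz, shader clock from the post

ratio = STOCK_SHADER / STOCK_CORE  # roughly 2.27 shader MHz per core MHz

def shader_target(core_mhz):
    """Shader clock (MHz) that keeps the stock shader:core ratio."""
    return round(core_mhz * ratio)

print(f"shader:core ratio = {ratio:.2f}")
print(shader_target(700))  # a 700 MHz core would pair with ~1589 MHz shader
```

So raising the core without raising the shader clock drifts away from the stock balance, which is consistent with the shader domain becoming the limiting factor mentioned below.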
  6. TheLaughingMan

    TheLaughingMan

    Joined:
    May 7, 2009
    Messages:
    4,998 (2.57/day)
    Thanks Received:
    1,291
    Location:
    Marietta, GA USA
    It may not have helped at that resolution, but it may be the reason you didn't see any change at 1920x1200, if you didn't raise it at the time. If you increased the other two but not the shader, it could become the limiting factor in your card's performance.

    I'm guessing 1280x1024 is the resolution you intend to game at, so as long as you're getting the performance you want there, don't worry about anything else. If the shader clock doesn't help, leave it alone.
  7. CrAsHnBuRnXp

    CrAsHnBuRnXp

    Joined:
    Oct 19, 2007
    Messages:
    5,473 (2.18/day)
    Thanks Received:
    639
    1280x1024 is definitely not what I intend to game at. It was just a test to see if the score at the end of the benchmark changed at all compared to my native res of 1920x1200.

