Tom's Hardware

Discussion in 'Games' started by hercules71185, Sep 25, 2007.

  1. hercules71185 New Member

    Joined:
    Sep 15, 2007
    Messages:
    83 (0.03/day)
    Thanks Received:
    0
    http://www23.tomshardware.com/graphics_2007.html?modelx=33&model1=707&model2=855&chart=296
    My question is: I have an 8600GT. I play the game at 1280x1024 with HDR, and in my nTune performance panel I add 2x AA and 16x AF. I play the game maxed, meaning everything on, all shadows, all distance, plus 80+ mods to improve visuals alone, including more detailed grass, more detailed leaves, and farther view distance, giving it a lot more quality than stock, to say the least. I get right around 30 FPS in the worst parts of the game; sometimes during loading I drop to 15 for about half a second (I think that's because of my lack of RAM). According to that chart I should hardly be able to play it at all. Does this chart match anyone's cards, or is the chart way off?
    I have an AMD 3600 X2 overclocked to 2.7GHz. Also, checking their PCMark05 benchmarks, I get 5900. That's better than a stock 5600? I don't think so. I just want to know how accurate that site is, or if I'm just getting good ratings. As far as I know my friend has the AMD 6000 at stock (he won't overclock it) and he destroys my overclocked benchmarks, getting 6800. I've also seen people claiming a 7600GT can run the game maxed out at 1600x1200 with an AMD Sempron 3000. But honestly, what FPS are people getting in this game, and is there any kind of benchmark utility for it I can run?
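    (Oblivion has no built-in benchmark mode, so one common approach is to record per-frame times with an overlay tool such as Fraps and summarize them afterwards. A minimal sketch of the summarizing step, assuming you already have a list of frame times in milliseconds; the sample numbers below are made up for illustration:)

    ```python
    def fps_stats(frame_times_ms):
        """Return (average FPS, minimum FPS) from per-frame times in ms."""
        total_seconds = sum(frame_times_ms) / 1000.0
        avg_fps = len(frame_times_ms) / total_seconds
        min_fps = 1000.0 / max(frame_times_ms)  # slowest single frame
        return avg_fps, min_fps

    # Hypothetical sample: four frames, mostly ~30 FPS with one slow frame.
    sample = [33.3, 33.3, 41.0, 28.6]
    avg, low = fps_stats(sample)
    print(f"avg: {avg:.1f} fps, min: {low:.1f} fps")
    ```

    The minimum FPS is usually the number that matters for playability, since the average hides short stutters like the loading drops mentioned above.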
    Last edited: Sep 25, 2007
  2. Darknova

    Darknova

    Joined:
    Nov 8, 2006
    Messages:
    5,037 (1.77/day)
    Thanks Received:
    535
    Location:
    Manchester, United Kingdom
    Firstly, the guy with the 7600GT is a moron. I had an overclocked 7600GT playing Oblivion at full settings @ 1280x1024 with an Athlon X2 4200 overclocked to 2.8GHz, and my minimum was 20 FPS, so 1600x1200 would KILL it. No 7600GT can perform at 1600x1200 unless at minimum settings, and even then it would probably roast itself to death.

    Secondly, every PC is different, so graphs like that should always be taken with a pinch of salt.
  3. Oliver_FF

    Oliver_FF New Member

    Joined:
    Oct 15, 2006
    Messages:
    546 (0.19/day)
    Thanks Received:
    65

    tomshardware.com has been like a pile of steaming poo lately... Those VGA charts were done a while back with some of the original drivers that shipped with the 8000 series, which had pretty poor performance at the time. The charts don't seem to have been updated since, as proved by the lack of ATi cards on there and the time it's taken them to add them.
    hercules71185 says thanks.
  4. hercules71185 New Member

    Joined:
    Sep 15, 2007
    Messages:
    83 (0.03/day)
    Thanks Received:
    0
    Alright, thanks. That seems to explain why the 8600 series looks so bad: old drivers from when the cards first came out. Maybe they are inferior to the 7 series; I've never used one of those. But I haven't had any problems with my card yet, and I can't even run it at full PCIe x16 yet, only x8. Overclocking is pointless since there's really no headroom without losing performance on my card. So at stock speeds it seems to be performing better than I expected, given how bad everyone claimed it was supposed to be.
