
8800GT 3DMark Scores

OrbitzXT

New Member
Joined
Mar 22, 2007
Messages
1,969 (0.32/day)
Location
New York City
System Name AX-01
Processor Intel Core i5-2500K @3.7 GHz
Motherboard ASRock Z68 Extreme3 Gen3
Cooling Zalman 9700
Memory Kingston HyperX T1 Series 8GB DDR3 1600 MHZ
Video Card(s) GTX 590
Storage Intel X25-M
Display(s) 42" Samsung LED HDTV
Case Antec Twelve Hundred
Audio Device(s) HT | OMEGA STRIKER 7.1
Power Supply Kingwin 1000W
Software Windows 7 64-Bit
I just ran 3DMark05 and 06 in 4 different configurations and thought I'd share my results.

Q6600 2.40 GHz/8800GT 632Core,1674Shader,950Memory
3DMark 05 - 14,892
3DMark 06 - 11,122

Q6600 2.40 GHz/8800GT 702Core,1782Shader,950Memory
3DMark 05 - 14,356
3DMark 06 - 11,200

Q6600 3.30 GHz/8800GT 632Core,1674Shader,950Memory
3DMark 05 - 18,094
3DMark 06 - 12,928

Q6600 3.30 GHz/8800GT 702Core,1782Shader,950Memory
3DMark 05 - 18,034
3DMark 06 - 13,489
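A quick way to see how CPU-bound these runs are is to compare the percentage gains between configurations. This is just a sketch using the 3DMark06 scores above; the helper function name is illustrative, not from any tool:

```python
# Percentage gains from the four 3DMark06 runs above.
# Scores are taken directly from the post.

def pct_gain(before, after):
    """Percentage improvement from `before` to `after`, rounded to one decimal."""
    return round((after - before) / before * 100, 1)

scores_06 = {
    ("2.40 GHz", "stock GPU"): 11122,
    ("2.40 GHz", "OC GPU"):    11200,
    ("3.30 GHz", "stock GPU"): 12928,
    ("3.30 GHz", "OC GPU"):    13489,
}

# GPU overclock alone (CPU at 2.40 GHz): tiny gain -> the CPU is the bottleneck
print(pct_gain(scores_06[("2.40 GHz", "stock GPU")],
               scores_06[("2.40 GHz", "OC GPU")]))    # 0.7

# CPU overclock alone (GPU at stock clocks): large gain
print(pct_gain(scores_06[("2.40 GHz", "stock GPU")],
               scores_06[("3.30 GHz", "stock GPU")])) # 16.2

# GPU overclock on top of the faster CPU: the GPU gain finally shows up
print(pct_gain(scores_06[("3.30 GHz", "stock GPU")],
               scores_06[("3.30 GHz", "OC GPU")]))    # 4.3
```

The pattern (0.7% vs 16.2%) is the classic signature of a CPU-limited benchmark at stock clocks.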

These scores are extremely similar to what I got before I replaced my GTX with my XFX GT. I'm tempted to push the memory farther, but I've read various reports that running it at 2 GHz will kill the card. I know everyone says "if you overclock, you shorten the lifespan," but these reports about this specific issue seem different. I read it wasn't actually the memory's fault but something else on the card. I'm going to try to find the article now. It was also being discussed on eVGA's website, but I think it pertains to all GTs.

Also, does anyone know for sure yet if the locked shaders have the possibility of being unlocked?
 
Joined
Sep 26, 2006
Messages
6,959 (1.08/day)
Location
Australia, Sydney
Truly amazing! The answer to the "unlocked shaders" question is no. You cannot unlock them; the chip was manufactured without them, since the G80 design was remade on a 65 nm process (the G92). What cooling solution are you using on the GPU at the moment? Stock, right? I've heard some people say that blasting the cooler at 100% will make the GPU run cooler by 20 °C... right?
 

OrbitzXT

According to various reports on eVGA's forum, the "Auto" setting for the fan doesn't kick up the RPMs as the card gets hotter, which is why I think a lot of people were reporting that these cards get hot and blamed the single-slot cooler. Right now, with the core at 700 and the memory at its stock 950, I idle at 48 °C, and it hasn't gotten above 65 °C yet with the fan set to 100%. It is noticeably louder at 100% than either the GTS or GTX, both of which I've owned before, but it's very tolerable and definitely quieter than the past ATI cards I've owned.
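What an "Auto" fan setting is supposed to do is map temperature to duty cycle. A minimal sketch of such a curve, using linear interpolation between breakpoints — the breakpoints here are made up for illustration and are NOT the 8800GT's actual firmware table:

```python
# Illustrative fan curve: linearly interpolate duty cycle between
# (temperature, duty) breakpoints. The table below is hypothetical,
# not read from any card's BIOS.

BREAKPOINTS = [(40, 30), (65, 60), (80, 100)]  # (temp in °C, duty in %)

def fan_duty(temp_c):
    """Duty cycle (%) for a given GPU temperature, clamped at the ends."""
    if temp_c <= BREAKPOINTS[0][0]:
        return BREAKPOINTS[0][1]
    for (t0, d0), (t1, d1) in zip(BREAKPOINTS, BREAKPOINTS[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding breakpoints
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return BREAKPOINTS[-1][1]

print(fan_duty(48))  # 39.6 -> around the 48 °C idle mentioned above
print(fan_duty(90))  # 100  -> pegged at the top of the curve
```

The complaints about "Auto" amount to the real curve being far too flat: the duty cycle barely rises as the temperature climbs.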

When I set the core to 710, it immediately becomes unstable, but I think this is because the shader clock jumped too high along with the core. I notice RivaTuner has a "Link Clocks" setting. Is it now possible to raise the core clock and leave the shader clock at its own stable setting? The card is definitely cool enough to overclock more; it either just needs more voltage or for the core and shader clocks to be set independently of one another.
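When clocks are linked, the shader clock is derived from the core clock and snapped to the hardware's clock granularity. A rough sketch of that relationship, assuming the 8800GT's nominal 2.5× shader:core ratio (600/1500 MHz stock) and ~54 MHz shader-clock steps — both figures are assumptions for illustration, and real driver behavior may differ:

```python
# Sketch: derive a linked shader clock from the core clock.
# ASSUMPTIONS (not read from the card): 2.5x nominal shader:core ratio,
# 54 MHz shader-clock granularity.

RATIO = 2.5
STEP_MHZ = 54

def linked_shader_clock(core_mhz):
    """Snap core * ratio to the nearest shader-clock step."""
    return round(core_mhz * RATIO / STEP_MHZ) * STEP_MHZ

print(linked_shader_clock(700))  # 1728
print(linked_shader_clock(710))  # 1782 -> a 10 MHz core bump can jump the
                                 #         shader clock a full 54 MHz step
```

This step behavior is one reason a small core increase can suddenly destabilize a linked overclock: the shader domain moves in coarse jumps, not MHz by MHz.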
 
OrbitzXT said: "When I set the core to 710, it immediately becomes unstable... The card is definitely cool enough to overclock more, it either just needs more voltage or for the Core and Shader clocks to be set independent of one another."

It isn't that; 710 may be your max due to the core itself, nothing to do with how much heat it generates! More voltage may be needed, however... cores have an architectural limit to how far they can be pushed.

About the stock cooler: it's not necessarily a terrible design, but Nvidia or other manufacturers could really do something about that fan! I mean, there is so much wasted space; the fan should be bigger, as 70 or 80 mm would give a healthy increase in airflow. Just some of the gripes of stock cooling solutions... make the fan bigger, manufacturers!
 

{JNT}Raptor

New Member
Joined
Jul 12, 2005
Messages
732 (0.11/day)
Location
NY
System Name Ummmm...Mine
Processor I7 920 @ 4.2ghz @ 1.29v's load
Motherboard ASUS P6T Deluxe V2
Cooling Custom 1/2 inch H20
Memory 3x2gb Patriot Sector 7 @2008Mhz 27-9-11-9 1T
Video Card(s) EVGA GTX 580 SC 900/1800/1090
Storage 1-Mushkin 60gb SSD 1-500GB WD Black and 2-1TB 32mb WD Black
Display(s) 25 inch Hanns-G 2ms
Case Custom
Audio Device(s) Turtle Beach Catalina
Power Supply Corsair AX850 Pro Series-Modular
Software All Kinds...and then some.
Benchmark Scores 3dMark 11 P7066 Compare Link- http://3dmark.com/3dm11/251153
I'm confused... I've been running my shaders separate from the core in RivaTuner with zero issues. When I took my core up to 780 MHz, I lowered my shaders to 1700 and 1800 with no issues, and it ran benchmarks well... it didn't handle 3D games as well, though.

Am I missing something? Because I can clock them separately... nothing seems locked on my end.

Just curious. :)




EDIT... Ahhh, I get it now: the extra shaders on the card, not the clocks. Sorry about my confusion. LMAO
 
Joined
Nov 10, 2005
Messages
1,540 (0.23/day)
Location
Athens - Hellas
Processor C2D E7600
Motherboard GA EG41M ES2L
Memory 2GB ADATA 800MHZ
Video Card(s) ASUS HD4350
Storage OCZ VERTEX TURBO 32GB + 2 MORE..
Display(s) SM226BW
Audio Device(s) 7.1 HD / X-540
Software 7X86 ULT
Here are my 05 and 06 scores with a Q6600 and 8800GT:

3DMark06 = 16,626. Q6600 @ 4050 MHz, 8800GT @ 771/1944/1026.
3DMark05 = 24,252. Q6600 @ 4050 MHz, 8800GT @ 756/1944/1026.
 

OrbitzXT

Giorgos, what are your scores with the GT at stock and your Q6600 @ 4.05 GHz? A stock Q6600 and my overclock of 3.30 GHz result in pretty much the same score regardless of what I overclock the GT to.
 