Wednesday, December 5th 2007

NVIDIA GeForce 8800 GT Gets a New Stock Cooler, Quieter and Cooler

For a high-performance card like the GeForce 8800 GT, it seems a single-slot cooling solution is not enough. With reports of the card running over 90°C under load, and complaints about its loud fan, NVIDIA has decided to upgrade the standard 8800 GT with a new cooler. The new cooler is equipped with a larger fan (75x10mm compared to the old 65x10mm) running at a much lower speed (1664 RPM compared to the old 4333 RPM), resulting in a quieter and more effective cooling solution.
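As a back-of-the-envelope illustration (our sketch, not from the source), the classic fan affinity laws can be plugged into the quoted numbers, assuming the 65mm and 75mm frame sizes stand in for impeller diameter:

```python
import math

# Fan affinity laws (rough approximations for geometrically similar fans):
#   airflow Q scales with D^3 * N
#   radiated sound power W scales with roughly D^7 * N^5
# where D = impeller diameter and N = rotational speed.
# Treating the frame sizes from the article as diameters is an assumption.

OLD_D, OLD_N = 65.0, 4333.0   # old fan: 65 mm, 4333 RPM (from the article)
NEW_D, NEW_N = 75.0, 1664.0   # new fan: 75 mm, 1664 RPM (from the article)

airflow_ratio = (NEW_D / OLD_D) ** 3 * (NEW_N / OLD_N)
noise_delta_db = 10 * math.log10((NEW_D / OLD_D) ** 7 * (NEW_N / OLD_N) ** 5)

print(f"estimated airflow ratio (new/old): {airflow_ratio:.2f}")
print(f"estimated noise change: {noise_delta_db:.1f} dB")
```

Under these (crude) assumptions the larger fan moves only about 59% of the old fan's airflow while radiating roughly 16 dB less sound power, which would suggest the temperature improvement owes more to the redesigned heatsink than to extra airflow.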

Source: Expreview
Add your own comment

23 Comments on NVIDIA GeForce 8800 GT Gets a New Stock Cooler, Quieter and Cooler

#1
zekrahminator
McLovin
Those stock temperatures tell me that NVIDIA knows that the 8800GT isn't one of those cards that's built to last...
Posted on Reply
#2
csplayer089
Whoop-de-doo, 2 degrees Celsius lower? Who cares. Oh, and by the way, I love how this happens to those who bought 8800 GTs a few weeks ago (not that I care about the new cooler, just saying this might piss some people off).

Those idle/load temps are taken at what ambient temperature???

I have my PC in the basement and it is like 65 degrees Fahrenheit down there. I'm sure my 8800 GT will be fine even with the "old" cooler.
Posted on Reply
#3
PVTCaboose1337
Graphical Hacker
WOW! That is useless.... MAJOR FAIL!

Edit: I mean... EPIC FAIL!
Posted on Reply
#4
panchoman
Sold my stars!
by: PVTCaboose1337
WOW! That is useless.... MAJOR FAIL!
Agreed, 2 degrees, big whoop. If this is NVIDIA's response to the great cooler on the HD 3K series then this was a major, major failure :slap:
Posted on Reply
#5
JacKz5o
Keep in mind that while the new cooler only lowers temperatures by 2-3°C, it is much quieter, since it spins at a little over 1/3 of the original fan's RPM.

Now... if only NVIDIA had included a bigger fan at the same RPM as the old fan, I'm sure there would have been a bigger difference in temperatures :p
Posted on Reply
#6
devguy
by: JacKz5o
Keep in mind that while the new cooler only lowers temperatures by 2-3°C, it is much quieter, since it spins at a little over 1/3 of the original fan's RPM.

Now... if only NVIDIA had included a bigger fan at the same RPM as the old fan, I'm sure there would have been a bigger difference in temperatures :p
While that may be true, why didn't they think of this earlier? :confused:

I'm sure they measured temps in the labs before shipment. D'you expect me to believe that they didn't say, "Hey, this is hot and noisy, how about putting a bigger fan in?" before establishing the stock cooler?
Posted on Reply
#7
zekrahminator
McLovin
Something about "rushing it out the door before the HD 3k series took precious market share away from NVIDIA" comes to mind :p.
Posted on Reply
#8
Grings
by: JacKz5o
Keep in mind that while the new cooler only lowers temperatures by 2-3°C, it is much quieter, since it spins at a little over 1/3 of the original fan's RPM.

Now... if only NVIDIA had included a bigger fan at the same RPM as the old fan, I'm sure there would have been a bigger difference in temperatures :p
I was about to say that. Set it to 100% and it stays under 70°C, that's pretty good, though they should raise the auto setting's speed to 80% or so, as it's a slow fan anyway.
Posted on Reply
#9
crow1001
by: zekrahminator
Something about "rushing it out the door before the HD 3k series took precious market share away from NVIDIA" comes to mind :p.
Well, considering NVIDIA have had over a year to get their next card out, mainly due to lack of competition, I highly doubt it was rushed in development. Anyone with half a brain would ramp up the GT's fan with RivaTuner instead of having it stay at 29% all the time. Mine hits 50% at 60°C and never goes above 75°C; noise is not a problem.
Posted on Reply
#10
FatForester
This is pretty dumb, but it's better than nothing. And anyone with half a brain would know to replace the stock cooler in the first place if they're worried about temps and reliability.
Posted on Reply
#12
devguy
D'you think it would be cool for companies like eVGA, XFX, ASUS (as well as AMD partners like VisionTek and Sapphire) to include an optional add-on that fits into the slot next to the video card and exhausts its hot air out of the case, but can be easily removed if you can't afford the extra slot?

Something like the included picture. Sorry it's ugly, but the idea is obvious. I think that would be awesome. It could even be a cheap plastic piece that does nothing but guide the air out the back (maybe even UV reactive for show).
Posted on Reply
#13
OrbitzXT
What on earth are they doing to get these cards to reach those temperatures? My XFX GT, overclocked to a 700MHz core, has never gone above 65°C under full load. I would have been more interested to see a comparison of the noise output of the new fan versus the old one, because the 8800 GT above 70% fan speed is quite noisy. Then again, you can create fan profiles in RivaTuner to set a certain fan speed when the GPU reaches a certain temperature... so it's not a huge issue.
Posted on Reply
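Several commenters describe working around the stock 29% auto fan setting with a custom temperature-to-speed profile in RivaTuner. A minimal sketch of that kind of stepped fan curve (the thresholds here are illustrative, not RivaTuner's actual defaults):

```python
# Hypothetical stepped fan curve of the kind commenters build in RivaTuner.
# Each entry maps a temperature threshold (°C) to a fan duty cycle (%).
FAN_CURVE = [
    (0, 29),    # the stock auto setting mentioned in the thread
    (60, 50),
    (70, 80),
    (80, 100),
]

def fan_duty(temp_c: float) -> int:
    """Return the duty % for the highest threshold at or below temp_c."""
    duty = FAN_CURVE[0][1]
    for threshold, percent in FAN_CURVE:
        if temp_c >= threshold:
            duty = percent
    return duty

print(fan_duty(55))  # 29
print(fan_duty(65))  # 50
print(fan_duty(75))  # 80
```

The thread's reported behaviour ("mine hits 50% at 60°C") matches this shape: the fan stays quiet at idle and only ramps up as the GPU heats under load.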
#14
Tatty_One
Super Moderator
My GT cooler gives me an idle of 32°C and just 44°C at full load :D It was cheap as well!... bonus! Ohhhh, and that's with a modded BIOS running increased voltage over spec.
Posted on Reply
#15
hat
Maximum Overclocker
I wish it was dual-slot. No sense in blowing all that hot air around in the case. :banghead:
Posted on Reply
#16
strick94u
I have hit 74°C under max load, but as soon as I get un-lazy I'm putting an aftermarket cooler on.
Posted on Reply
#17
JacKz5o
My 6800GS hits around 85°C load in a few minutes when gaming... but then again, it is old 130nm technology lol..
Posted on Reply
#18
OnBoard
I'd take one even without a cooler, if I could just buy one :) A lot of different brands' cards should arrive tomorrow, but I won't believe it until I see them in stock.
Posted on Reply
#19
a_ump
by: zekrahminator
Those stock temperatures tell me that NVIDIA knows that the 8800GT isn't one of those cards that's built to last...
I agree, though it might be in a different way. With the 8800 GTS coming out and the benchmarks provided by http://www.tweaktown.com/articles/1234/nvidia_geforce_8800gts_512mb_g92_tested/index.html (though I'm not sure if they are legit), I don't see why they would keep the GT when the GTS is in the same performance area. I think that since the 256MB version of the GT is out, the 512MB GT will disappear and the GTS will be left to own the low-high/high-mid segment while the 256MB owns the high-low/low-mid segment, though the 256MB version is around $209 and up.
Price of the 8800GT 256MB:
http://www.newegg.com/Product/ProductList.aspx?Submit=Property&N=2010380048&PropertyCodeValue=679%3A32704%2C679%3A28976&bop=And&Order=STOCK
Performance of the 8800GT 256MB:
http://en.expreview.com/?p=64 (not sure if these are legit either)

But that's what I think: bye bye 512MB GT, hello 512MB GTS and 256MB GT.
Posted on Reply
#20
trt740
still a piece of shit
Posted on Reply
#21
ShinyG
The addition of a slower-spinning, quieter cooler is always a good thing!
On the other hand, I think that nVidia is trying to distract people from the fact that it is almost impossible to get an 8800 GT these days by putting out "new and improved" versions, like the new cooler or the 1GB version...
The truth is that the 8800 GT is not very profitable for nVidia, so the lack of stock, coupled with the card's huge popularity, is a huge marketing tool for the new nVidia 8800 GTS. They "sell" 8800 GTs, people buy "the next best thing", the 8800 GTS, since the GT is not in stock, and nVidia makes 100 bucks extra! I LOVE corporate thinking... ;)
Posted on Reply
#22
Live OR Die
That's a bit much :laugh: I leave my fan speed on 100% and my temps are 60-70°C, not too bad.
Posted on Reply
#23
mitsirfishi
Those are still pretty dire temps. They should use a 92mm fan on the cooler and run it at about 1200 RPM: keep their single-slot cooling design, but have the fan draw the air in deeper, which would make it better. A 4000 RPM fan would annoy me now :P I know my HD 3870 past 60% annoys me, and so did my FX 5950 Ultra back in the day -.-
Posted on Reply