Thursday, May 14th 2009

Inno3D to Launch the Fastest 9800 GT Yet

Expreview has got hold of details and imagery of an iChill 9800GT, which Inno3D are claiming to be the fastest 9800GT around, positioned as a counter to AMD's recently released HD 4770. Although the core clock of 670 MHz is no faster than many factory-overclocked 9800GTs from other manufacturers, it is the memory where Inno3D have made the biggest advance. The iChill 9800GT's memory is clocked at 2,200 MHz, an increase of 400 MHz, or 22%, over NVIDIA's reference clocks. Not only that, but with the card making use of 0.8 ns DDR3, the memory is theoretically capable of being clocked at 2,500 MHz, leaving plenty of overclocking headroom and the possibility of a 700 MHz, or 39%, increase above stock 9800GT memory clocks.
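For reference, those figures follow directly from the 0.8 ns rating: GDDR3 is double data rate, so the theoretical effective clock works out to 2 / cycle time. Here is a minimal Python sketch of the arithmetic (the helper function is purely illustrative, and the 1,800 MHz stock figure is the reference effective memory clock the article's percentages imply):

```python
# Theoretical effective ("marketing") clock for double-data-rate memory,
# derived from its cycle-time rating in nanoseconds.
def max_effective_clock_mhz(cycle_time_ns: float) -> float:
    base_clock_mhz = 1000.0 / cycle_time_ns  # 0.8 ns -> 1250 MHz real clock
    return 2.0 * base_clock_mhz              # DDR doubles the data rate

ceiling = max_effective_clock_mhz(0.8)       # 2500.0 MHz theoretical ceiling
stock = 1800                                 # stock 9800GT effective memory clock, MHz

print(ceiling)                               # 2500.0
print((2200 - stock) / stock * 100)          # ~22% -> Inno3D's factory bump
print((ceiling - stock) / stock * 100)       # ~39% -> the potential headroom
```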

Memory aside, the card features the usual 112 shader processors with 512 MB of DDR3 on a 256-bit bus, and supports CUDA, DirectX 10, SM4.0, PhysX and PureVideo HD. Inno3D have opted to use the slightly older 65nm G92 core, which means that this card still requires a 6-pin PCI-E power connector.
As usual with Inno3D's iChill range, the cooler is nothing to be sniffed at either: the card comes with a dual-heatpipe FreezerX DHT HP2-858 cooler featuring Direct Heatpipe Touch technology, and the fan is said to ensure a low-noise working environment while delivering adequate airflow.
The iChill 9800GT comes with a 3-year warranty, but we have yet to hear about availability or pricing.

Source: Expreview

25 Comments on Inno3D to Launch the Fastest 9800 GT Yet

#1
hooj
Beastly!
Posted on Reply
#2
Mussels
Freshwater Moderator
my last card was an inno 3D... and the core did 740. they shoulda gone to 700 at least.
Posted on Reply
#3
Rammari
My 8800GT core already did over 700MHz. Wussy clocks in this iChill.
Posted on Reply
#4
BumbleBee
Innovision and Palit pulled out of North America :( don't count on buying this in America or Canada.
Posted on Reply
#5
Oblivion-330
Memory and cooler are nice, don't like the GPU clocks.
Posted on Reply
#6
tkpenalty
...........

HOW MANY FRIGGN G92 DERIVATIVES DO WE NEED?

(then again if it aint broke, dont fix it :laugh:)
Posted on Reply
#7
BumbleBee
Would be nice if it had an HDMI port. I don't mind 65nm; it puts out more heat, but I don't think 55nm supports HybridPower.
Posted on Reply
#8
soryuuha
to counter 4770? :lol:

let's see how much OC headroom this card has left
Posted on Reply
#9
newtekie1
Semi-Retired Folder
With the decent cooler, this card should have some decent overclocking headroom left.

But what interests me is the two SLi fingers...Tri-SLi anyone?

Seems like this card is using the new GTS250 PCB.
Posted on Reply
#10
Mussels
Freshwater Moderator
newtekie1 said: "With the decent cooler, this card should have some decent overclocking headroom left. But what interests me is the two SLi fingers... Tri-SLi anyone?"
nice catch, i missed that
Posted on Reply
#11
iStink
Yeah, I'd like to see tri-SLi.

Can you run dual monitors in SLi yet? Or tri-SLi for that matter? That's the only reason I haven't gone CrossFire/SLi.
Posted on Reply
#12
Roph
Can they let this chip die already?
Posted on Reply
#13
newtekie1
Semi-Retired Folder
iStink said: "Yeah, I'd like to see tri-SLi. Can you run dual monitors in SLi yet? Or tri-SLi for that matter? That's the only reason I haven't gone CrossFire/SLi."
Dual monitors in SLi has been possible for a while now.
Roph said: "Can they let this chip die already?"
Why?
Posted on Reply
#14
Mussels
Freshwater Moderator
newtekie1 said: "Dual monitors in SLi has been possible for a while now. [...] Why?"
the second screen disables once 3D is activated, yes?
Posted on Reply
#15
newtekie1
Semi-Retired Folder
Mussels said: "the second screen disables once 3D is activated, yes?"
It did when the new drivers were first released, I haven't tested it since.

Edit: Just tested; using 185.85, the second monitor is not disabled when running a fullscreen 3D app (3DMark06).
Posted on Reply
#17
Per Hansson
Yawn, my 8800GTS already does 1075 MHz (or 2150 marketing MHz)
And the memory is not even voltmodded, only the GPU
Oh, and mine is 1.5 years old
Some development!
Posted on Reply
#18
a_ump
Yawn maybe; it depends on how much they ask for this. If it's $110, and that memory can clock up to maybe 1,200 MHz (2,400 effective), then it might be worth it. I want to see a review, though, on whether the new memory actually makes a difference to warrant its purchase over an HD 4770.
Posted on Reply
#19
Hayder_Master
I had a Gigabyte 8800GT clocked at 700 MHz, and I overclocked it to 760 using a simple Gigabyte overclocking program called HUD.
Posted on Reply
#20
Cheeseball
Not a Potato
This FreezerX DHT is the same one they use on their normal 9800GTX+/GTS 250s.

Tri-SLi fingers indicate that this is definitely the 55nm G92 chip. All the Palit 9800GTs with the stock dual-heatpipe HSF had the tri-SLi fingers and were all the 55nm revision.

If this does come out at its rated price of ~$90, then this is a done deal.
Posted on Reply
#21
cdawall
where the hell are my stars
Per Hansson said: "Yawn, my 8800GTS already does 1075 MHz (or 2150 marketing MHz). And the memory is not even voltmodded, only the GPU. Oh, and mine is 1.5 years old. Some development!"
Mine did 1100 MHz mem and the core did 840 MHz on the stock cooler. Are you trying to make a point?
Posted on Reply
#22
Per Hansson
cdawall said: "Mine did 1100 MHz mem and the core did 840 MHz on the stock cooler. Are you trying to make a point?"
Yea, that the renamed 8800GTS needs to die already! :D
Posted on Reply
#23
a_ump
Per Hansson said: "Yea, that the renamed 8800GTS needs to die already! :D"
It's 8800GT there, buddy ;), and I'd use G92 instead. I don't see a problem with rebadging; more variety means you're more likely to find exactly what you want.
Posted on Reply
#24
cdawall
where the hell are my stars
Per Hansson said: "Yea, that the renamed 8800GTS needs to die already! :D"
Like was already said, this is an 8800GT; the core has some stuff shut off on it.
Posted on Reply
#25
Wile E
Power User
Going 1100 MHz on the mem won't make much of a difference. One of my 8800GTs clocked to 900+ core (linked) and 1100 mem, and going from 1000 to 1100 MHz even at that core speed made little difference. The core was definitely the bigger factor in clocking.
Posted on Reply