
Axle GeForce GT 220 1 GB

W1zzard

While Axle is not the most well-known graphics card manufacturer, their GeForce GT 220 can hold its own. It comes with a silent Arctic Cooling heatsink, making it the quietest GT 220 we have tested so far - great for HTPCs. It also runs cool, which makes overclocking easy. We saw a 31% overclock on the GPU core clock and 18% on the memory.

 
For a brand new card to be so soundly trashed by a 9600GT is a sad state indeed.

And with that cooler taking up 3 slots (one for the card, one for the heatsink, one for the fan/intake), it's huge as well.
 
Not sure, but wasn't the 8600 the same as the 9600? I.e. it's beaten by two or three generation old cards? :D

nVidia - way to be absurd!
 
No offense, but this card performs like garbage even at the resolution it was designed for (1024x768). I give this two thumbs down. Even the 4670, which is older tech, beats it easily. If someone needed a DX10 board as an IGP replacement, I wouldn't recommend this.
 
Not sure, but wasn't the 8600 the same as the 9600? I.e. it's beaten by two or three generation old cards? :D

nVidia - way to be absurd!

The 8600 and 9600 were not the same card.
 
No offense, but this card performs like garbage even at the resolution it was designed for (1024x768). I give this two thumbs down. Even the 4670, which is older tech, beats it easily. If someone needed a DX10 board as an IGP replacement, I wouldn't recommend this.

Agreed. I defend NV a lot because I want to see good things from them, but they f*cked up, messed up, screwed up everything with the low-end 200-series cards. To a loyalist it's got to sting more than to someone who's willing to go either ATI or NV.
 
Well guys, it's only a 220 after all. I mean, it's the second-lowest card of the 200 series! There is nothing wrong with the performance considering where it stands, and it has some pretty low specs. The problem is the price, as it costs more than mid-range cards from previous generations that can still smack it around! That is crazy. These things should be like $50 tops.

Also, yeah, the 8600GT and 9600GT are not the same thing at all! Double the stream processors, double the memory bus width, and pretty much double the performance! I think you meant that the 8800GT and 9800GT are the same thing, which is true for the most part.
 
My 9500 beat it by a big margin with only a PII 720 and a little overclock :nutkick:
http://img.techpowerup.org/091207/9500gtop.jpg

Why did I read Pentium II at 720 MHz... lol

Anyway, it's nothing new. You should blame the maker of the card for flooding the market. They don't add anything new, and it's basically filler until Fermi arrives in Q1 2010.

The 310 / 210 is also garbage, even though it's only sold on the OEM market.
 
Today I saw something really tragic: there was a customer at a PC store who was going to buy a new graphics card for his son, for about 50~80€, to play games. The store employee told him to buy a GT220 1GB DDR2 for 70€!!!!!! :wtf: I asked the employee why not buy a 9800GT/8800GT (the store has them at 75€) and he said the GT220 is the better deal! DAMN! When the customer asked about a card at 90€, the employee suggested he buy a GT240 for 90€!!!
Then I went berserk... and asked why not buy a GTS250 512 MB at 90€??? And again he said the other one is the better deal... damn worthless noobs. Man, it's 2 times faster, what a stupid buy...
I'm sure he's gonna regret the purchase he made. All these NV GT220/GT240 cards suck for their price.
 
I hate it when such employees try to con people... Countless times I've seen them prey on those poor souls, but heck, what can we do? Most of the time the customers will just tell you to mind your own business anyway.
 
Today I saw something really tragic: there was a customer at a PC store who was going to buy a new graphics card for his son, for about 50~80€, to play games. The store employee told him to buy a GT220 1GB DDR2 for 70€!!!!!! :wtf: I asked the employee why not buy a 9800GT/8800GT (the store has them at 75€) and he said the GT220 is the better deal! DAMN! When the customer asked about a card at 90€, the employee suggested he buy a GT240 for 90€!!!
Then I went berserk... and asked why not buy a GTS250 512 MB at 90€??? And again he said the other one is the better deal... damn worthless noobs. Man, it's 2 times faster, what a stupid buy...
I'm sure he's gonna regret the purchase he made. All these NV GT220/GT240 cards suck for their price.

That's when you pull up the specs of both cards, and performance numbers at 1280x1024-1600x1200 (or widescreen formats) for a slew of games. The simple fact that the card carries a lower model number means it's the inferior one in the lineup.

Agreed. I defend NV a lot because I want to see good things from them, but they f*cked up, messed up, screwed up everything with the low-end 200-series cards. To a loyalist it's got to sting more than to someone who's willing to go either ATI or NV.

I still say you get a better deal with a 4670 with 1GB ram
 
Not sure, but wasn't the 8600 the same as the 9600? I.e. it's beaten by two or three generation old cards? :D

nVidia - way to be absurd!

The 9600GT was a lot better than the 8600 series (the numbers were similar but nothing else)
 
The 9600GT was a lot better than the 8600 series (the numbers were similar but nothing else)

True, that's where NV did something right. Of course my card, as old as it is, still ate up the 8600 series and the 2600 series.
 
Even today a pair of 9600GTs in SLI is a great value (especially if you were lucky enough to get them for $39 when they were on sale).

It's gonna be interesting to see comparisons of the new cards against the old ones and see if there is an improvement.
 
Is the memory bandwidth reading wrong? If not then holy crap!... Houston, we've got a problem...

[attached GPU-Z screenshot: gpuzoc.gif]
 
128 (bus width in bits) * 4 (data transfers per clock, GDDR5) * 2065 (MHz) / 8 (bits per byte) = 132,160 MB/s ≈ 132 GB/s
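In case anyone wants to redo the arithmetic themselves, here is a rough sketch of the same calculation in Python (the 2065 MHz and 900 MHz figures are the ones from the GPU-Z screenshot and the spec sheet quoted above; the function name is just for illustration):

Code:
# Rough sketch of the memory bandwidth arithmetic from the post above.
# bandwidth [GB/s] = bus width [bits] * transfers per clock * memory clock [MHz] / 8 / 1000
def mem_bandwidth_gbs(bus_width_bits, transfers_per_clock, mem_clock_mhz):
    return bus_width_bits * transfers_per_clock * mem_clock_mhz / 8 / 1000

print(mem_bandwidth_gbs(128, 4, 2065))  # GDDR5 at 2065 MHz -> ~132.2 GB/s (the GPU-Z value)
print(mem_bandwidth_gbs(128, 4, 900))   # GDDR5 at 900 MHz  -> ~57.6 GB/s (the spec-sheet value)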
 
Must be some kind of GPU-Z error, it can't be 132 GB/s bandwidth... maybe it's 1700x2 or 850x4 for GDDR5, check it out. The official specs say "900 MHz (3.6 GHz QDR)" = 57 GB/s. Or it's an NVIDIA fake memory boost hack...
 
You are looking at the wrong specs. Your specs are for a 900 MHz card; the GPU-Z screenshot is of a 2065 MHz card. Do the math. And yes, your 2065 MHz is roughly 8 GHz QDR.
 
You are looking at the wrong specs. Your specs are for a 900 MHz card; the GPU-Z screenshot is of a 2065 MHz card. Do the math. And yes, your 2065 MHz is roughly 8 GHz QDR.

Dude, I know you've done thousands of reviews, but this is a GPU-Z error: for the GT240, Axle states that the card runs at 850 MHz GDDR5 (3400 MHz QDR), check it out. Bandwidth should be ~55 GB/s.
NVIDIA here http://www.nvidia.com/object/product_geforce_gt_240_us.html says that bandwidth is 57 GB/s. They made the memory info look like the memory speed is 1700 (850x2) x 2, i.e. like a DDR3 speed. Anyway, there can't be 8000 MHz GDDR5 chips on these trashy cards. NVIDIA specs below:

850 MHz (3,400 MHz effective) GDDR5 = 57 GB/s
1,000 MHz (2,000 MHz effective) DDR3 = 32 GB/s
 
OK, I'm going to say this right now: are you seriously planning on buying the board??? If not, who gives a flying f*** what the memory bandwidth is. You've seen the tests the board has had, and it's not better than a GF 9800. It's a low-end card with DX10.1 capabilities that ATI has had since the Radeon 3000 series.

end rant
 
[attached screenshot: Capture353.jpg]

That's from the NVIDIA reviewer guide.
GDDR3: 1000 MHz * 128 bit / 8 * 2 (DDR) = 32,000 MB/s <-> NVIDIA 32.0 GB/s .. looks right
GDDR5: 1700 MHz * 128 bit / 8 * 4 (QDR) = 108,800 MB/s <-> NVIDIA 54.4 GB/s .. something wrong here?
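To make the discrepancy concrete, here is a small sketch running both interpretations side by side (assumption: the 1,700 MHz GDDR5 figure in the guide is already the DDR-doubled data rate of an 850 MHz chip, so only one more x2 applies):

Code:
# Sketch of the reviewer-guide numbers, computed two ways (all results in MB/s).
bus_bytes = 128 / 8  # 128-bit bus = 16 bytes per transfer

gddr3       = 1000 * bus_bytes * 2  # 32,000 MB/s  -> matches NVIDIA's 32.0 GB/s
gddr5_as_x4 = 1700 * bus_bytes * 4  # 108,800 MB/s -> the "something wrong here?" figure
gddr5_as_x2 = 1700 * bus_bytes * 2  # 54,400 MB/s  -> matches NVIDIA's 54.4 GB/s

print(gddr3, gddr5_as_x4, gddr5_as_x2)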
 
http://img.techpowerup.org/091223/Capture353.jpg

That's from the NVIDIA reviewer guide.
GDDR3: 1000 MHz * 128 bit / 8 * 2 (DDR) = 32,000 MB/s <-> NVIDIA 32.0 GB/s .. looks right
GDDR5: 1700 MHz * 128 bit / 8 * 4 (QDR) = 108,800 MB/s <-> NVIDIA 54.4 GB/s .. something wrong here?

It's a glitch in GPU-Z with the NVIDIA GT240 cards, I think. I've seen it in another review: at stock it says 1700 GDDR5, but it multiplies it x4 when it should be x2. Also in this review that you wrote (check the memory specs)
http://www.techpowerup.com/reviews/MSI/GeForce_GT_240/4.html it says 2000x2. And finally,
here http://www.techpowerup.com/reviews/MSI/GeForce_GT_240/33.html the bandwidth calculation is faulty, at stock and then at OC. GPU-Z should count bandwidth on this card as 1700x2x16 (128-bit) and not as 1700x4x16 (128-bit). We haven't seen any 8 GHz GDDR5 yet, and if we did, it wouldn't be on a GT240. So with the current GPU-Z on the GT240, take the "bandwidth shown / 2".
Also check the MSI card with the MSI Afterburner utility. It shows the memory at 1700 (already multiplied x2) and then GPU-Z multiplies it x4... but it should do x2.
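If that is indeed what's happening, correcting a GPU-Z reading on these cards is just a matter of halving it; a rough sketch under that assumption:

Code:
# Assumption: GPU-Z applies a x4 multiplier to a memory clock that is already
# DDR-doubled (e.g. 1700 MHz shown for an 850 MHz GDDR5 chip on the GT 240).
def corrected_bandwidth_gbs(shown_mem_clock_mhz, bus_width_bits=128):
    return shown_mem_clock_mhz * 2 * bus_width_bits / 8 / 1000

print(corrected_bandwidth_gbs(1700))  # ~54.4 GB/s instead of the 108.8 GB/s GPU-Z shows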
 
The GT 220 is basically the GT 240M. Just got a Lenovo Y550P with a GT 240M and an i7-720QM. Trying to squeeze more out of it already with overclocking. Considering it has more shader cores than previous generations, but is still slower, what do you think is the main bottleneck? Just clocks, or memory speed specifically? I mean compared with the 9500GT and other cards with a 128-bit memory interface.
 