GECUBE lists Radeon HD 2600 XT

Darksaber

Senior Editor & Case Reviewer
Staff member
Joined
Jul 8, 2005
The company website has the Radeon HD 2600 XT listed in detail, while the Radeon HD 2400 series link is present but empty. GeCube opted for a special "X-Turbo2 Silent Fan", so the card may be clocked higher than default. No mention of actual clock speeds is made, but the card features 256 MB GDDR4, built-in HDMI (HD video with 5.1 surround audio) and dual DVI (2x dual-link, HDCP).


 
GDDR4 on a mid-range card eh? While the X2900 uses GDDR3...
 
Looks good, but from what I have read this card will only have a 128-bit memory bus, and it probably won't be widely available until late June. I came across this article; I don't know how accurate it is, but it seems to make some sense to me. It says the 2400 XT will be restricted to a bus of only 64 bits, and even that the 8500 GT will wipe the floor with it... WTF?

http://www.fudzilla.com/index.php?option=com_content&task=view&id=939&Itemid=34
 
GDDR4 on a mid-range card eh? While the X2900 uses GDDR3...

Yeah, I thought that too; this new AMD/ATI card strategy seems a bit haphazard to me.
 
I think they just had issues with the high-end GDDR4; the stuff on this card (and the X1950 XTX) is probably in readier supply.
 
Interesting: the X1950 uses GDDR4, the 2900 XT uses GDDR3, the 2600 uses GDDR4... I spot no pattern.
 
Memory prices have dropped recently; maybe a big enough order secured a good deal on cheap GDDR4, or maybe they are using GDDR4 to keep the card's power consumption down.
 
Well, to be fair, they're probably using GDDR3 on the 2900 to try to bring the price down a bit. Since it has a 512-bit bus, it doesn't need extremely fast RAM to achieve high memory throughput.
Maybe they're trying to get past the 128-bit bus by clocking GDDR4 higher. Also, I agree that they've probably got a steady, cheap supply of GDDR4 from the X1950 XTX, and they're taking advantage of it.
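To put rough numbers on that trade-off: peak memory bandwidth is just bus width times effective memory clock, so a wide bus with slower GDDR3 can still out-run a narrow bus with faster GDDR4. A quick sketch, assuming ~1650 MHz effective GDDR3 on the 2900 XT's 512-bit bus and a rumored ~2200 MHz effective GDDR4 on a 128-bit 2600 XT (the clock figures are assumptions, not confirmed specs):

```python
def bandwidth_gb_s(bus_bits: int, effective_mhz: int) -> float:
    """Peak theoretical bandwidth in GB/s:
    (bus width / 8) bytes per transfer * effective transfer rate."""
    return bus_bits / 8 * effective_mhz / 1000

# HD 2900 XT: 512-bit bus with GDDR3 (assumed ~1650 MHz effective)
print(bandwidth_gb_s(512, 1650))  # 105.6 GB/s

# Hypothetical 128-bit HD 2600 XT with GDDR4 (assumed ~2200 MHz effective)
print(bandwidth_gb_s(128, 2200))  # 35.2 GB/s
```

Even with much faster memory, the narrow bus ends up with roughly a third of the wide card's bandwidth, which is why the 2900 can afford GDDR3.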
 
Quick question: have any of these vendors started using custom PCB designs yet? Besides a different fan sticker, do any of these HD 2900 cards have different specs?
 
Well, it doesn't surprise me that they use GDDR4 on a 128-bit bus. Graphics card companies often put high-frequency memory on narrow buses to sort of "make up" for the lack of width on mid-range cards.

Sort of like the 7600 GT: 1600 MHz GDDR3 on a 128-bit bus. If they stuck slow RAM on that narrow a bus, it would be crap (the 7600 GS, for example).
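Using the figure quoted above for the 7600 GT (1600 MHz effective), and assuming roughly 800 MHz effective for the 7600 GS's slower memory (an illustrative assumption, not a confirmed spec), the same 128-bit bus gives very different results:

```python
def bandwidth_gb_s(bus_bits: int, effective_mhz: int) -> float:
    # Peak bandwidth in GB/s = bytes per transfer * effective transfer rate
    return bus_bits / 8 * effective_mhz / 1000

print(bandwidth_gb_s(128, 1600))  # 7600 GT: 25.6 GB/s
print(bandwidth_gb_s(128, 800))   # 7600 GS (assumed ~800 MHz): 12.8 GB/s
```

Same bus width, double the memory clock, double the bandwidth: that's the "make up for the lack of width" trick in one line of arithmetic.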
 
Doesn't the 2900 XTX use GDDR4 while the 2900 XT uses GDDR3?

I would understand it if that were the case; ATI are making the difference between the XT and XTX physical, so it can't just be a simple overclock that separates them.

Anywho, if these HD 2600 XTs are cheap enough, I'm gonna sell my X1950 Pro and buy two of 'em :D

I find it weird that ATI has 512-bit cards ---> 128-bit ---> 64-bit. I'd think they would scrap 64-bit altogether and go 512 --> 256 --> 128.

Unless they are planning to release 2*50 cards, or rename all their cards with double the bus width :p (They wait for all these cards to sell like mad and then release a new product line :D)
 
I find it weird that ATI has 512-bit cards ---> 128-bit ---> 64-bit. I'd think they would scrap 64-bit altogether and go 512 --> 256 --> 128.

Unless they are planning to release 2*50 cards, or rename all their cards with double the bus width :p (They wait for all these cards to sell like mad and then release a new product line :D)

Hmm, new marketing team. ATI's new motto: Make Money! :twitch: :twitch:
 
If I remember correctly, the X1900 series is a built-up version of the X1600 series, which uses 1 pipeline = 4 shaders or something.
 
If I remember correctly, the X1900 series is a built-up version of the X1600 series, which uses 1 pipeline = 4 shaders or something.
You didn't remember correctly :p
The X1900 is more like a built-up version of the X1800.
And all the X1000s after the X1800 use 3 pixel shaders per pipeline.
 
I don't like dual-slot coolers at all, let alone on mid-range VCs.
 
Well, the X1650 XT uses a 256-bit memory bus, so I don't see why they would regress to 128-bit on the new mid-range cards.
 
It does?!
*No, it doesn't. The memory uses a 128-bit bus. But anyways, the 2600s should definitely use a 256-bit memory bus.
 
Well, the X1650 XT uses a 256-bit memory bus, so I don't see why they would regress to 128-bit on the new mid-range cards.
I haven't figured that out either, Waz. Both companies are guilty of it, to boot.
 
Well, the X1650 XT uses a 256-bit memory bus, so I don't see why they would regress to 128-bit on the new mid-range cards.

I thought it used 128-bit.
 
The GPU uses a 256-bit ring bus that runs two ways, 128 bits each way; the memory bus is 128-bit, just like all the other X16xx series cards.
 
Ring bus: the X1600 series uses a ring bus, but on a slightly smaller scale than the X1900 and HD 2900.
It has a 256-bit bus for one direction only, I believe. I'd have to look it up in some old magazines, but it's half 128-bit, half 256-bit.
EDIT: I skimmed through some responses and didn't see that erocker had already said ring bus, heh.

Maybe you thought it had 256-bit since the X1650 XT is basically an X1950 Pro with only 24 pixel shaders (it's based on the RV570 core, that's the important part).

I don't see how putting a full 256-bit bus on a graphics card can be more expensive than using GDDR4 or even just GDDR3 memory, so all of the mid-range cards should really have it by now. Although, knowing both companies' strategies, maybe they'll soon have a 2900 Pro or a 2900 GTO version for the upper mid-range.
 
I guess it also depends on the bandwidth the chosen GPU can actually push. If the GPU can't use it, they might not want to pay for the extra bandwidth.

That said, perhaps it's the memory controller that's expensive?

Edit: Oh, and IMO, DX10 hardware needs a lot more GPU balls now: with physics and more effects pushed away from the CPU to the GPU, the GPU has to be more and more powerful to keep up, as opposed to the memory.
 