
GECUBE lists Radeon HD 2600 XT

I would not rush to the conclusion that the HD2600XT with 512MB has a 128bit wide memory bus...
 
Generally, the 256MB and 512MB versions have the same bus width, with the exception of, say, the PowerColor EZ series, which used slower DDR1 on some of its 256MB versions compared to DDR2 on others.
 
GDDR4 is different; it comes exclusively as 64MB, 32bit/ic chips.
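Just to put rough numbers on that, here's a quick back-of-the-envelope sketch in Python (using the 64MB/32bit-per-ic figures above; the variable names are just illustrative):

# A hypothetical 512MB card built from GDDR4 (64MB, 32bit per ic)
total_mb = 512
mb_per_ic = 64                       # GDDR4 chips are 64MB each
bits_per_ic = 32                     # and 32 bits wide each

chips = total_mb // mb_per_ic        # -> 8 chips
bus_width = chips * bits_per_ic      # -> 256 bits
print(chips, "chips,", bus_width, "bit bus")

So a 512MB GDDR4 card would naturally land on a 256-bit bus, which is why you can't assume 128bit from the memory size alone.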
 
Ah, I see, so it depends on the number of chips.
 
Doesn't the 2900XTX use GDDR4 while the 2900XT uses GDDR3?

I would understand it if that was the case; ATi are making the difference between the XT and XTX physical, so it can't just be a simple overclock that makes the difference.

Anywho, if these HD 2600XTs are cheap enough, I'm gonna sell my X1950 Pro and buy two of 'em :D

I find it weird that ATi have 512bit cards ---> 128bit ---> 64bit. I'd think they would scrap 64bit altogether and go 512 ---> 256 ---> 128.

Unless they are planning to release 2*50 cards, or rename all their cards with double the bus width :p (They wait for all these cards to sell like mad and then release a new product line :D)

You're right, but I was of the belief that the 2900XTX was binned and will not see a retail release.
 
Even if it isn't being released, someone will likely make a 2900XT with GDDR4; it's pin compatible with GDDR3 modules and can use the same memory controller anyhow, so there wouldn't be much difficulty in squeezing on some GDDR4.
 
True, but no one would buy it... it was cancelled because the GDDR4 XTX was no faster than the GDDR3 XT - the core was the limit, not the RAM speed.
 
Mussels,
Yes, bus width depends on the "width" of the chip and the number of chips used.

DDR1 and DDR2 = 8 and 16bit/ic
gDDR1 and 2 = 16bit/ic
gDDR3 and 4 = 32bit/ic
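If it helps, here's that rule as a tiny Python sketch (the bit/ic values are just the ones from the list above, and the function name is made up for illustration):

# Bits per ic for each memory type, per the list above
BITS_PER_IC = {
    "DDR1": 16, "DDR2": 16,    # plain DDR1/DDR2 can also be 8bit/ic
    "GDDR1": 16, "GDDR2": 16,
    "GDDR3": 32, "GDDR4": 32,
}

def bus_width(mem_type, chip_count):
    # Total bus width = bits per chip x number of chips
    return BITS_PER_IC[mem_type] * chip_count

print(bus_width("GDDR3", 16))   # 16 chips x 32bit -> 512bit
print(bus_width("GDDR4", 8))    # 8 chips x 32bit -> 256bit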
 
handy info that i will forget within a week. :D
 

Damn, you're good..... I have forgotten it already :eek: I am old tho, so I don't function very well these days. :p
 
I don't see how putting a full 256-bit bus on a graphics card can be more expensive than putting on GDDR4 or even just GDDR3 memory, so all of the mid-range cards should really have it by now. Although, knowing both companies' strategies, maybe they'll soon have a 2900 Pro or a 2900 GTO version for the upper mid-range.

Exactly.

If they were to release a 2600XT with 256-bit, then what purpose would the 2900 PRO/GT serve?
It's not about how much it costs the actual company... I mean, there wouldn't be that much of a price difference in producing an X2900XT and an X300 :p.


Not sure about GDDR4, but doesn't GDDR3 have 64bit/ic as well?
 
Pinchy,
No, 32bit/ic is the maximum.
 
All information about the Radeon HD 2600 & 2400 (PCI Express and AGP versions) posted by Sapphire (a privileged partner of ATI):

[Sapphire's HD 2600 / 2400 spec sheet images]

Thanks to Matbe and, of course, to Sapphire!
 

Is it just me, or does anyone else find it weird that the 2600XT has higher clock speeds than the 2900XTX? :wtf:

Also, if those pics are valid....Woot for AGP DX10 :D
 
Pinchy,
No, GPUReview is wrong, as it also is with the HD2900XT: their chart says it has 8x64bit chips, when in reality it has 16x32bit ones.

The 512MB X800XL is truly a horrible abomination.
It has not 8 but 16 32MB, 32bit/ic gDDR3 chips, with half of the chips' I/O pins crippled in the card's BIOS, so that the active bus is still 256bit despite the number of chips.
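Roughly, in the same sketch style (assuming half of each chip's I/O pins are disabled, as described above):

# X800XL 512MB, per the post above: 16 gDDR3 chips at 32bit each
chips = 16
bits_per_ic = 32
raw_width = chips * bits_per_ic        # -> 512bit of physical I/O

active_fraction = 0.5                  # half the pins crippled in the BIOS
active_width = int(raw_width * active_fraction)   # -> 256bit active bus
print(raw_width, "bit physical,", active_width, "bit active")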
 
Why not?
The X1600XT has a higher core clock speed than my X1950Pro anyway.
 
Yeah...but the 2900 is meant to be the flagship of ati cards...it should have the highest clock speeds :p
 
Also, the HD 2600 XT (65nm) has higher clock speeds than the HD 2900 XT (80nm).

We must wait until 1st July, probably, to see if all of this is correct...

Why would you think that's odd? I believe they have different cores or derivatives; the 8600GTS has higher clock speeds than both the 8800GTS and GTX... again, because they have different cores... G80... G84... G86, etc.
 
Looks to be an awesome-performing midrange card. Maybe worth going xfire with these?
 