
AMD R600 to use GDDR4 only

zekrahminator

While there were initially predictions of low GDDR4 yields and high costs, AMD has announced that the R600 series will use GDDR4 memory exclusively. The R600 series will also support Stream computing (possibly even a second version of it).

 
Pure ownage :P Though I don't know how much better the RAM would be... but I'm sure it's pure ownage! :rockout:

When I really need one, I'll pursue acquiring one of these bad boys.
 
Isn't this the same memory the X1950 uses?
 
In one way, more performance; in the other, more €/$.
 
An X1950 is nothing but an X1900 with GDDR4 RAM, nothing more, nothing less.
 
Faster RAM always means faster performance, though, lol. And when we overclock, RAM that can reach incredibly high speeds can really benefit performance.
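For what it's worth, the "faster RAM" gain is mostly about peak memory bandwidth, which is simple arithmetic: bus width times effective (double-data-rate) clock. A minimal sketch, using the publicly listed X1950 XTX figures (256-bit bus, 1.0 GHz GDDR4); the helper function itself is just an illustration:

```python
# Rough memory-bandwidth arithmetic behind "faster RAM = faster performance".
# X1950 XTX figures (256-bit bus, 1000 MHz GDDR4) are from public spec sheets.

def memory_bandwidth_gbps(bus_width_bits: int, clock_mhz: float, data_rate: int = 2) -> float:
    """Peak theoretical bandwidth in GB/s."""
    effective_mhz = clock_mhz * data_rate      # GDDR transfers twice per clock
    bytes_per_transfer = bus_width_bits / 8    # bus width in bytes
    return effective_mhz * 1e6 * bytes_per_transfer / 1e9

# X1950 XTX: 256-bit bus at 1000 MHz GDDR4 -> 64.0 GB/s peak
print(memory_bandwidth_gbps(256, 1000))  # 64.0
```

Overclocking the memory raises the clock term linearly, which is why high-speed GDDR4 was attractive despite the cost.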
 
Isn't this the same memory the X1950 uses?

It's the same memory the X1950 XTX/CrossFire uses, but they can always use lower-latency GDDR4 for faster speeds.
 
And the big question is: what about this card's power consumption? They always seem to list every other spec openly, but never the power consumption of graphics cards these days.


Surprising that there's not a single mention of ATI.

RIP ATI brand
 
While there were initially predictions of low GDDR4 yields and high costs, AMD has announced that the R600 series will use GDDR4 memory exclusively. The R600 series will also support Stream computing (possibly even a second version of it).

Source: The Inquirer



Yeah, high costs because something is FISHY, and the government will get to the bottom of it. Maybe this means cheaper video cards for us:

AMD, NVidia Subpoenaed in Justice Dept. Investigation

http://www.betanews.com/article/AMD_NVidia_Subpoenaed_in_Justice_Dept_Investigation/1164999684
 
I hope that by using GDDR4, AMD's card will draw less power than the 8800 GTX. I plan on buying a DX10 card next year, but I don't want to build my own nuclear plant to run it :P
 
I hope that by using GDDR4, AMD's card will draw less power than the 8800 GTX. I plan on buying a DX10 card next year, but I don't want to build my own nuclear plant to run it :P

Oh, that's okay, you won't need to :). From what I can tell, the R600 series has an external power connection :roll:. And each card is expected to draw between 180 and 250 watts of power... The VGA industry really needs more competition :shadedshu.
 
Oh that's okay, you won't need to :). From what I can tell, the R600 series has an external power connection :roll:. And each card is expected to use between 180 and 250 watts of power...The VGA industry really needs more competition :shadedshu .

Or in this case, just don't buy the crap.

That's competition enough for them.

Too bad it isn't gonna happen... consumers dictate the market.

If consumers will buy it, why bother?
 
Both companies have already said they aren't focusing at all on heat or power this generation; they're just concerned with getting the things out the door. It won't be until the next series of cards that we see designs made with heat and power in mind. That's why I'm not buying into the first generation of DX10 cards; instead, I'm waiting until DX10 is more established and there are cards out there that don't require me to add a second air conditioner and an electrical substation to my house.
 
Sorry, in my dig through old news, I found this and had to laugh at how we've been using "only" GDDR3 HD 2900s for... two months now?
 