Friday, December 1st 2006

AMD R600 to use GDDR4 only

While there were initially predictions about low GDDR4 yields and high costs, AMD has announced that it will use only GDDR4 memory in the R600 series. The R600 series will also be compatible with Stream computing (maybe even a second version).

Source: The Inquirer

16 Comments on AMD R600 to use GDDR4 only

#1
Chewy
Pure ownage :P Though I don't know how much better the RAM would be... but I'm sure it's pure ownage! :rockout:

When I really need one, I will pursue acquiring one of these bad boys.
#2
AthlonX2
HyperVtX™
Isn't This The Same Memory The X1950 Uses?
#4
Canuto
In one way, more performance; in the other, more €/$.
#5
EastCoasthandle
An X1950 is nothing but an X1900 with GDDR4 RAM, nothing more, nothing less.
#6
InfDamarvel
Faster RAM always = faster performance though, lol, and when we overclock, having RAM that can reach incredibly high speeds can really benefit performance.
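The bandwidth argument above can be sketched with some quick arithmetic: peak memory bandwidth is effective clock times bus width. The clock speeds and 256-bit bus below are illustrative assumptions, not confirmed R600 specs.

```python
# Rough illustration: peak bandwidth = effective clock (Hz) x bus width (bytes).
# All figures here are hypothetical examples, not confirmed R600 specs.
def bandwidth_gb_s(effective_mhz: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s for a given effective clock and bus width."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

gddr3 = bandwidth_gb_s(1600, 256)  # e.g. 1.6 GHz effective GDDR3 on a 256-bit bus
gddr4 = bandwidth_gb_s(2000, 256)  # e.g. 2.0 GHz effective GDDR4 on the same bus
print(gddr3, gddr4)  # higher effective clock -> proportionally more bandwidth
```

This is why a memory-type swap alone (as with the X1900 to X1950XTX) can lift performance in bandwidth-limited scenarios.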
#7
newtekie1
Semi-Retired Folder
by: AthlonX2
Isn't This The Same Memory The X1950 Uses?
The X1950XTX uses GDDR4, the X1950XT and Pro both use GDDR3.
#8
Azn Tr14dZ
by: AthlonX2
Isn't This The Same Memory The X1950 Uses?
It's the same memory that the X1950XTX/CrossFire uses, but they can always use lower-latency GDDR4 for faster speeds.
#9
jocksteeluk
And the big question is: what about the energy consumption of this card? They always seem to list every other stat openly, but these days they leave energy consumption off graphics card spec sheets.


Surprisingly, not a single mention of ATI.

RIP ATI brand
#11
XeoNoX
by: zekrahminator
While there were initially predictions about low GDDR4 yields and high costs, AMD has announced that it will use only GDDR4 memory in the R600 series. The R600 series will also be compatible with Stream computing (maybe even a second version).

Source: The Inquirer
Yeah, high costs because something is FISHY, and the government will get to the bottom of it. Maybe this means cheaper video cards for us.

AMD, NVidia Subpoenaed in Justice Dept. Investigation

http://www.betanews.com/article/AMD_NVidia_Subpoenaed_in_Justice_Dept_Investigation/1164999684
#12
ShinyG
I hope that by using GDDR4, AMD's card will draw less power than the 8800 GTX. I plan on buying a DX10 card next year, but I don't want to build my own nuclear plant to run it :P
#13
zekrahminator
McLovin
by: ShinyG
I hope that by using GDDR4, AMD's card will draw less power than the 8800 GTX. I plan on buying a DX10 card next year, but I don't want to build my own nuclear plant to run it :P
Oh that's okay, you won't need to :). From what I can tell, the R600 series has an external power connection :roll:. And each card is expected to use between 180 and 250 watts of power... The VGA industry really needs more competition :shadedshu.
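For scale, the 180–250 W range above translates into a real electricity bill. The hours-per-day and per-kWh rate below are assumptions for illustration only, not figures from the article.

```python
# Back-of-the-envelope electricity cost for a card drawing 180-250 W under load.
# The 4 hours/day of gaming and $0.10/kWh rate are assumed example values.
def monthly_cost_usd(watts: float, hours_per_day: float = 4.0,
                     rate_per_kwh: float = 0.10) -> float:
    """Approximate monthly cost in USD for the given sustained power draw."""
    kwh_per_month = watts / 1000 * hours_per_day * 30  # energy used in a month
    return kwh_per_month * rate_per_kwh

low, high = monthly_cost_usd(180), monthly_cost_usd(250)
print(f"${low:.2f} - ${high:.2f} per month")
```

Modest per card, but the draw also dictates PSU sizing and case cooling, which is where the "nuclear plant" jokes come from.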
#14
Dippyskoodlez
by: zekrahminator
Oh that's okay, you won't need to :). From what I can tell, the R600 series has an external power connection :roll:. And each card is expected to use between 180 and 250 watts of power... The VGA industry really needs more competition :shadedshu.
Or in this case, just don't buy the crap.

That's competition enough for them.

Too bad it isn't gonna happen... Consumers dictate the market.

If consumers will buy it, why bother?
#15
newtekie1
Semi-Retired Folder
Both companies have already said they aren't focusing at all on heat or power this generation; they are just concerned with getting the things out the door. It won't be until the next series of cards that we see designs made with heat and power in mind, which is why I am not buying into the first generation of DX10 cards. Instead, I am waiting until DX10 is more established and there are cards out there that don't require me to add a second air conditioner and an electrical substation to my house.
#16
zekrahminator
McLovin
Sorry, in my dig through old news, I found this and had to laugh at how we've been using "only" GDDR3 HD 2900s for... two months now?