Thursday, May 22nd 2008

Qimonda Wins AMD as Partner for Launch of New Graphics Standard GDDR5

Qimonda AG, a leading manufacturer of memory products, today announced that the company has won AMD as launch partner for the new graphics standard GDDR5. Qimonda has already started mass production and volume shipping of 512Mbit GDDR5 components rated at 4.0 Gbps to AMD, a leading global provider of innovative processing solutions in the computing, graphics and consumer electronics markets.

GDDR5 is set to become the next predominant graphics DRAM standard, offering a tremendous improvement in memory bandwidth and a multitude of advanced power-saving features. It targets a variety of applications, starting with high-performance desktop graphics cards, followed by notebook graphics. Introduction in game consoles and other graphics-intensive applications is planned later.
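To put the bandwidth claim in rough numbers (a back-of-the-envelope sketch; the 256-bit bus width and the 2.0 Gbps GDDR3 comparison rate are assumptions, not figures from the press release):

```python
# Back-of-the-envelope GDDR bandwidth estimate (illustrative only).
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width * per-pin data rate / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# GDDR5 at the 4.0 Gbps cited in the announcement, on an assumed 256-bit bus:
print(memory_bandwidth_gbs(256, 4.0))  # 128.0 GB/s
# The same bus with 2.0 Gbps GDDR3, for comparison:
print(memory_bandwidth_gbs(256, 2.0))  # 64.0 GB/s
```

On these assumptions, GDDR5's higher per-pin rate doubles peak bandwidth without widening the memory bus.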

"We are very proud to supply AMD with GDDR5 volume shipments only six months after the first product samples were delivered," said Robert Feurle, Vice President Business Unit Specialty DRAM of Qimonda AG. "This is a further milestone in our successful GDDR5 roadmap and underlines our predominant position as innovator and leader in the graphics DRAM market."

"Qimonda's strong GDDR5 roadmap convinced us to choose them as a primary technology partner for our GDDR5 GPU launch," said Joe Macri, Sr. Director, Circuit Technologies, AMD. "Both the early availability of first samples and volume shipments added great value to the development and launch of our upcoming high-performance GPU."

More information on Qimonda's GDDR5 products is available at: www.qimonda.com/graphics-ram/gddr5/index.html
Source: Qimonda

7 Comments on Qimonda Wins AMD as Partner for Launch of New Graphics Standard GDDR5

#1
Wile E
Power User
All I'm still concerned about is how this is gonna translate into performance. Hopefully HD4xxx can make use of all the bandwidth.

Oh, and in before the fanboy war. lol.
#2
jbunch07
Wile E said: "All I'm still concerned about is how this is gonna translate into performance. Hopefully HD4xxx can make use of all the bandwidth. Oh, and in before the fanboy war. lol."

Shouldn't be too much of a problem... after all, I'm sure there's a reason they decided to go from GDDR3 to GDDR5.
#3
FR@NK
Wile E said: "All I'm still concerned about is how this is gonna translate into performance. Hopefully HD4xxx can make use of all the bandwidth."
It will translate into fewer memory chips and less complex/smaller PCB for the same memory performance as GDDR3/4.
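FR@NK's point can be sketched with some simple arithmetic (a hypothetical illustration; the 32-bit-per-chip interface, the 64 GB/s target, and the per-pin rates are assumptions, not figures from the thread):

```python
import math

# Sketch of the fewer-chips argument: for a fixed target bandwidth,
# faster per-pin data rates need a narrower bus, and therefore fewer
# 32-bit DRAM chips and a simpler PCB. All figures are illustrative.
def chips_needed(target_gbs: float, data_rate_gbps: float, io_per_chip: int = 32) -> int:
    bits_needed = target_gbs * 8 / data_rate_gbps   # total bus width in bits
    return math.ceil(bits_needed / io_per_chip)     # chips to provide that width

target = 64.0  # GB/s, hypothetical target
print(chips_needed(target, 2.0))  # GDDR3-class 2.0 Gbps -> 8 chips (256-bit bus)
print(chips_needed(target, 4.0))  # GDDR5 at 4.0 Gbps   -> 4 chips (128-bit bus)
```

Halving the chip count also halves the memory traces that have to be routed, which is where the smaller, less complex PCB comes from.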
#4
jbunch07
FR@NK said: "It will translate into fewer memory chips and less complex/smaller PCB for the same memory performance as GDDR3/4."

Which should make the PCB cheaper to manufacture... I think you said that in the other thread though...
#5
WarEagleAU
Bird of Prey
They went from GDDR4 to GDDR5. Nvidia is still using GDDR3, I wonder if there is something to that.
#6
imperialreign
WarEagleAU said: "They went from GDDR4 to GDDR5. Nvidia is still using GDDR3, I wonder if there is something to that."

Could be, but has nVidia ever released a GDDR4 card? :confused:

ATI likes to stay on the forefront of new graphics technology... but I wonder if pricing/availability of GDDR4 is why they plan to go with 3 and 5 in the new series...
#7
Rebo&Zooty
WarEagleAU said: "They went from GDDR4 to GDDR5. Nvidia is still using GDDR3, I wonder if there is something to that."

Cost. Nvidia wants to keep their costs as low as possible so they can keep their profit margins as high as possible.

It's also very possible they bought huge amounts of GDDR3 from various RAM makers at huge discounts, so they got this uber-cheap GDDR3 and they want to use it, because the less they spend on building a card, the more they make per card sold.

Look at the 8800GT: the cooler's a piece of crap, and the first run used what a few sites have shown to be substandard components, picked, I'm sure, to raise the profit margin.

In the end all Nvidia cares about is reaping as much $ from our wallets as possible, not about advancing gfx tech or tech in general.

I know that's business, but at least ATI/AMD tend to try and advance things; even if it doesn't work out like they expected, at least they tried... Nvidia = brute force, and has been for years: if they're slower, they add more pipes and up the clocks.

Blah, both companies frustrate me to be honest, Nvidia just frustrates me more right now. I'm waiting on my 8800GT to get back from RMA, because it died from heat thanks to the shoddy stock cooler... what a lovely design... a cheap cooler on a hot-as-hell chip, where the cooler takes the heat from the GPU and shares it with the VRM/FETs, the RAM, and everything else on the card... and then they run the fan at 20%... great design if you want the cards to die so people are FORCED to buy new ones, or so companies like BFG/XFX that offer lifetime warranties are forced to buy more cards to replace the fried ones... again, what a crock...

I'm ranting again, but if you had to RMA four 8800GTs because they were either DOA or cooked themselves, wouldn't you be a bit annoyed?

My card was hitting 97-98C playing WoW of all things, and that's with its fan forced to 100% using RivaTuner. Oh, and the 8800GT fan at 100% is like a small vacuum cleaner... high-pitched and annoying... (reminds me of my ex... :roll: )

But honestly, I hope this works out well. I may sell the replacement card off and get a 4800-series card if things look decent; I'm getting tired of RMAing it... I've owned many, many video cards in my life, and never have I had to RMA a card more than twice in its warranty period. The one that went twice was an X800XT PE from Asus: the first time was not long after they came out, due to a bad component on the board; the second time was two years later, when the stock cooler's fan went out. They said to send it in (at their cost) and swapped the card for a newer version of the same card... worked for me. The card's on a shelf now, waiting to find a new home...