Wednesday, May 21st 2008

AMD Confirms GDDR5 for ATI Radeon 4 Series Video Cards

AMD today announced the first commercial implementation of Graphics Double Data Rate, version 5 (GDDR5) memory in its forthcoming next generation of ATI Radeon graphics card products. The high-speed, high-bandwidth GDDR5 technology is expected to become the new memory standard in the industry, and that performance and bandwidth are key enablers of The Ultimate Visual Experience, unlocking new GPU capabilities. AMD is working with a number of leading memory providers, including Samsung, Hynix and Qimonda, to bring GDDR5 to market.

Today's GPU performance is limited by the rate at which data can be moved on and off the graphics chip, which in turn is limited by the memory interface width and die size. The higher data rates supported by GDDR5 - up to 5x that of GDDR3 and 4x that of GDDR4 - enable more bandwidth over a narrower memory interface, which can translate into superior performance delivered from smaller, more cost-effective chips. AMD's senior engineers worked closely with industry standards body JEDEC in developing the new memory technology and defining the GDDR5 spec.
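As a rough sketch of the relationship described above, peak memory bandwidth is simply the per-pin data rate multiplied by the interface width; the data rates and bus widths below are hypothetical illustrations, not AMD product specifications:

def memory_bandwidth_gbs(data_rate_gbps_per_pin, bus_width_bits):
    # Peak theoretical bandwidth in GB/s: per-pin data rate (Gbps) times
    # interface width (bits), divided by 8 bits per byte.
    return data_rate_gbps_per_pin * bus_width_bits / 8

# Hypothetical figures, for illustration only:
print(memory_bandwidth_gbs(2.0, 256))  # 64.0 GB/s  (GDDR3-class rate, 256-bit bus)
print(memory_bandwidth_gbs(4.0, 256))  # 128.0 GB/s (GDDR5-class rate, same bus width)
print(memory_bandwidth_gbs(4.0, 128))  # 64.0 GB/s  (GDDR5-class rate, half the bus width)

The last line is the scenario the release alludes to: a higher per-pin data rate can deliver the same bandwidth over a narrower, cheaper memory interface.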

"The days of monolithic mega-chips are gone. Being first to market with GDDR in our next-generation architecture, AMD is able to deliver incredible performance using more cost-effective GPUs," said Rick Bergman, Senior Vice President and General Manager, Graphics Product Group, AMD. "AMD believes that GDDR5 is the optimal way to drive performance gains while being mindful of power consumption. We're excited about the potential GDDR5 brings to the table for innovative game development and even more exciting game play."

The introduction of GDDR5-based GPU offerings continues AMD's tradition of technology leadership in graphics. Most recently, AMD has been the first to bring a unified shader architecture to market, the first to support Microsoft DirectX 10.1 gaming, the first to move to smaller process nodes such as 55nm, the first with integrated HDMI with audio, and the first with double-precision floating point calculation support.

AMD expects that PC graphics will benefit from the increase in memory bandwidth for a variety of intensive applications. PC gamers will have the potential to play at high resolutions and image quality settings, with superb overall gaming performance. PC applications will have the potential to benefit from fast load times, with superior responsiveness and multi-tasking.

"Qimonda has worked closely with AMD to ensure that GDDR5 is available in volume to best support AMD's next-generation graphics products," said Thomas Seifert, Chief Operating Officer of Qimonda AG. "Qimonda's ability to quickly ramp production is a further milestone in our successful GDDR5 roadmap and underlines our predominant position as innovator and leader in the graphics DRAM market."

GDDR5 for Stream Processing
In addition to the potential for improved gaming and PC application performance, GDDR5 also holds a number of benefits for stream processing, where GPUs are applied to address complex, massively parallel calculations. Such calculations are prevalent in high-performance computing, financial and academic segments among others. AMD expects that the increased bandwidth of GDDR5 will greatly benefit certain classes of stream computations.

New error detection mechanisms in GDDR5 can also help increase the accuracy of calculations by identifying errors and re-issuing commands to get valid data. This level of reliability is not available with other GDDR-based memory solutions today.
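The announcement gives no protocol details, but the behaviour it describes (detect a bad transfer, then re-issue the command) can be sketched roughly as follows; the CRC algorithm, retry limit, and function names here are illustrative assumptions, not the actual GDDR5 mechanism:

import random
import zlib

def transfer(payload):
    # Simulate a transfer that returns the data plus a CRC computed at the
    # sender; occasionally corrupt the data in flight to mimic a link error.
    crc = zlib.crc32(payload)
    if random.random() < 0.1:
        payload = bytes([payload[0] ^ 0xFF]) + payload[1:]
    return payload, crc

def read_with_retry(payload, max_retries=3):
    # Re-issue the command until the received data matches the transmitted CRC.
    for _ in range(max_retries):
        data, crc = transfer(payload)
        if zlib.crc32(data) == crc:
            return data  # CRC matches: data is valid
        # CRC mismatch: an error was detected, so the command is re-issued
    raise RuntimeError("data still invalid after retries")

print(read_with_retry(b"example burst"))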
Source: AMD

135 Comments on AMD Confirms GDDR5 for ATI Radeon 4 Series Video Cards

#51
Kirby123
well my sli ultras didnt run crysis as good as my ati cards have.... thats what i find funny. the games that say "the way it's meant to be played" by nvidia? why do my ati cards run it better?
Posted on Reply
#52
Valdez
imperialreign: and possibly that refinement is why ATI chose to be the only hardware manufacturer to support it . . .
s3 chrome 430gt supports dx10.1 too :)
Posted on Reply
#53
[I.R.A]_FBi
ningen: You don't play 3DMark so who cares. 9800gx2 vs 3870x2 tests (at least those I've seen) show that the cards are pretty much on par, but... ATI experiences heavy fps drops with AA sometimes. Maybe drivers, maybe nvidia optimizations... whatever. In the end, I'd rather play than indulge in conspiracy theories, just as I'd buy a card that works better for a variety of games instead of clinging to those few that my gpu shines in.

GDDR5 doesn't seem like something to drool over at all, too. I mean, isn't it like with DDR3 RAM? Frequency increases and so do timings, and in the end the performance increase is nothing much really.
Same with graphics cards, we're up to GDDR5 already but the cards will still develop at the usual, moderate pace. 4x series is supposed to be what, up to 50% faster than 3x series? Within the usual cycle, I'd say.
I wouldn't say "nothing". It would be nothing without clockspeeds but the clockspeeds are there.
Posted on Reply
#54
kylew
From the devs' point of view "nvidia, the way you're meant to be paid" :D. I wouldn't be surprised if, without any special NV optimisations through "close working relationships" with devs, NV cards wouldn't be nearly as fast as they are now. I also wouldn't be surprised if they got slapped with an antitrust something, all this about Assassin's Creed and DX10.1 has to have caught someone's interest by now.
Posted on Reply
#55
ningen
[I.R.A]_FBi: I wouldn't say "nothing". It would be nothing without clockspeeds but the clockspeeds are there.
I said "nothing much", not "nothing".
Without the clocks increased, we'd get a performance drop with slower timings, obviously.
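To put rough numbers on that point (hypothetical DDR-style figures, purely for illustration): absolute latency is the timing cycle count divided by the clock, so looser timings only break even if the clock rises along with them.

def cas_latency_ns(cas_cycles, io_clock_mhz):
    # Absolute CAS latency in nanoseconds: cycle count divided by clock frequency.
    return cas_cycles / io_clock_mhz * 1000

# Hypothetical figures, for illustration only:
print(cas_latency_ns(5, 400.0))  # ~12.5 ns (DDR2-800-style: CL5 at 400 MHz)
print(cas_latency_ns(9, 667.0))  # ~13.5 ns (DDR3-1333-style: CL9 at 667 MHz)
print(cas_latency_ns(9, 400.0))  # ~22.5 ns (the same CL9 timings without the clock increase)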
Posted on Reply
#56
kylew
Valdez: s3 chrome 430gt supports dx10.1 too :)
I think NV is so against DX10.1 because they can't implement it themselves, which would kinda tie in with rumors that NV complained to MS about the specs of DX10 before it was finalised, so they were changed. I think what DX10.1 is, is what DX10 was originally meant to be. It just seems too much of a coincidence, NV appear to be very against DX10.1 for reasons unknown outside of rumors. All this crap about DX10.1 and what NV appears to have done has put me totally off giving NV any of my money, they can go away, to stay in the corner... ahem... :D anyway.
Posted on Reply
#57
Kirby123
cards are built for specific games. thats how they work :)
Posted on Reply
#58
Unregistered
Kirby123: well my sli ultras didnt run crysis as good as my ati cards have.... thats what i find funny. the games that say "the way it's meant to be played" by nvidia? why do my ati cards run it better?
saying it with words is one thing, but proving it with an in-game benchmark is another.

to say ATI cards play crysis better than NVIDIA is a very subjective claim. u got to explain further with screens, or a benchmark. its difficult to believe u when all the benchmarks prove otherwise.
Posted on Edit | Reply
#59
EastCoasthandle
kylew: I think NV is so against DX10.1 because they can't implement it themselves, which would kinda tie in with rumors that NV complained to MS about the specs of DX10 before it was finalised, so they were changed. I think what DX10.1 is, is what DX10 was originally meant to be. It just seems too much of a coincidence, NV appear to be very against DX10.1 for reasons unknown outside of rumors. All this crap about DX10.1 and what NV appears to have done has put me totally off giving NV any of my money, they can go away, to stay in the corner... ahem... :D anyway.
From my understanding this is what was said to be the case. Read here


Also read this about DX11 being nix'd (take it with a grain of salt). But if true, see a pattern?
Posted on Reply
#60
Kirby123
i say it is my ocing im using on my x2, since it is for both cores, 918/1053 stable on air, with my 3.6ghz E8400 on air. once i get my new heatsink i can go up to 3.8 or 4.1ghz. i have to buy 3dmark 06 to get real pics of the benchmark -.- im too poor to even get that. all money went to computers and bills XD
Posted on Reply
#61
flashstar
Maybe it's just me, but I cannot get over 35 fps on UT3 at max settings at 1680x1050 with my 2900pro overclocked to 870/927. I've tried to figure out what's happening because my 2900pro is clearly much faster than a 8800gts 640 and slightly faster than an 8800gt at stock speeds. Even with my old 7800gtx, I could get 30 fps at similar settings and that card had 1/3 the raw power of my 2900pro. I too believe that Nvidia has had control of the market for too long and has been pulling some strings behind the scenes. I'll give these new drivers a shot and let everyone know the results.
Posted on Reply
#62
ShadowFold
Cybrnook2002: Man, cool and all, but its getting hard to keep up. Specs are getting so dated sooo fast, what happened to the "6 months and you're good" rule.
Nothing :) You can still game with a 8800GTS G80/ULTRA. They are just updating cores cause they have the tech so they're like "why not?"
Posted on Reply
#63
imperialreign
TBH, I've never had anything against nVidia for their TWIMTBP campaign - truthfully, I thought it was a brilliant marketing maneuver . . .

but now, years later, it hasn't led to an increase in competition . . . it's led to an increase in one-sidedness.

But, the only thing, IMO, that TWIMTBP really accounts for - aside from the fact that a game is written to be optimized for green camp hardware - is the major performance lead nVidia cards have over ATI when a new title is released. It leaves ATI having to make up that ground with CAT releases, and IMO, they reclaim that ground quite respectably after a few months. Sure, the game will continue to run better on nVidia hardware, but months after a major release, ATI cards are at least back on par, or slightly behind nVidia.



Somewhat back on topic: I'm really glad to hear that the new HD4Ks will be utilizing GDDR5, and hopefully it will work out great to their advantage performance wise . . . but, I also find it somewhat disturbing how ATI tries to stay at the top of technology support, while nVidia seems to ignore any industry-wide technological advancements. ATI is the first to support GDDR3, GDDR4, GDDR5, HDMI, PCIE 2.0, DX10.1, etc, etc . . .

Curious . . . did nVidia ever release a GDDR4 video card?
Posted on Reply
#64
Unregistered
imperialreign: TBH, I've never had anything against nVidia for their TWIMTBP campaign - truthfully, I thought it was a brilliant marketing maneuver . . .

but now, years later, it hasn't led to an increase in competition . . . it's led to an increase in one-sidedness.

But, the only thing, IMO, that TWIMTBP really accounts for - aside from the fact that a game is written to be optimized for green camp hardware - is the major performance lead nVidia cards have over ATI when a new title is released. It leaves ATI having to make up that ground with CAT releases, and IMO, they reclaim that ground quite respectably after a few months. Sure, the game will continue to run better on nVidia hardware, but months after a major release, ATI cards are at least back on par, or slightly behind nVidia.



Somewhat back on topic: I'm really glad to hear that the new HD4Ks will be utilizing GDDR5, and hopefully it will work out great to their advantage performance wise . . . but, I also find it somewhat disturbing how ATI tries to stay at the top of technology support, while nVidia seems to ignore any industry-wide technological advancements. ATI is the first to support GDDR3, GDDR4, GDDR5, HDMI, PCIE 2.0, DX10.1, etc, etc . . .

Curious . . . did nVidia ever release a GDDR4 video card?
ya never. thats an interesting point u make. ati has always taken the initiative with technology.

another interesting point u make is that games on ati hardware catch up with nvidia after a few months. u have any link to a benchmark? i'm interested in seeing this.
Posted on Edit | Reply
#65
Kirby123
flashstar: Maybe it's just me, but I cannot get over 35 fps on UT3 at max settings at 1680x1050 with my 2900pro overclocked to 870/927. I've tried to figure out what's happening because my 2900pro is clearly much faster than a 8800gts 640 and slightly faster than an 8800gt at stock speeds. Even with my old 7800gtx, I could get 30 fps at similar settings and that card had 1/3 the raw power of my 2900pro. I too believe that Nvidia has had control of the market for too long and has been pulling some strings behind the scenes. I'll give these new drivers a shot and let everyone know the results.
i dont see why you're getting so low with a 2900???? im confused myself, you've got at least a 2.8 dual core? and your mem should be able to get faster than that
Posted on Reply
#66
Kirby123
im confused, it says in your computer specs you have a 2900 xt and you say you have a pro
Posted on Reply
#67
Kursah
This kind of reminds me of the touted GDDR4 introduction on the X1950XTX, which was cool and a "braggable" feature for a few. I hope the GDDR5 introduction is much improved over the GDDR4 introduction...also the hope of better OC-ability arises in my mind...the GDDR4 on my XTX struggled to get beyond 1080...whereas the GDDR3 on my current 9600GT hit 1100 w/o issue and has gone higher.

Of course it's totally cool that AMD/ATI is still pushing memory innovation in the graphics arena, they definitely got something here, especially if the performance netted from GDDR5 proves to be a strong value and factor in the card series' performance. I will sit back with my 9600GT and watch for now, and as always I hope that AMD/ATI are successful, same with NV...if neither succeed, we lose.

:toast:
Posted on Reply
#68
Dangle
The prob with NVIDIA is they do crap like watering down graphics to give their cards higher framerates. Don't believe me? PM me and I'll find a link for you. ATI FTW!
Posted on Reply
#69
Kursah
Dangle: The prob with NVIDIA is they do crap like watering down graphics to give their cards higher framerates. Don't believe me? PM me and I'll find a link for you. ATI FTW!
That's strategy for ya...honest or not, in a cut-throat, bottom-line, very fast paced industry...I've seen the good/bad/ugly of both sides...personally I care about what gets me the best performance for my budget and requirements...in this last purchase, the 9600GT won, but quite a few times before that, I had the ATIs.

For those of you that care about the politics in this industry, then you can support the company that makes you feel good...I support what makes my games play smoothly, look pretty and is easy on the wallet, to some that may seem wrong, but for me that's just how it is.

:toast:
Posted on Reply
#70
btarunr
Editor & Senior Moderator
While pre-release benches suggest a single HD4870 will be about 20~25% faster than a 9800 GX2, others believe it's not going to be more than 25% faster than a 9800 GTX.

I'm not expecting much out of the HD4870.
Posted on Reply
#71
XooM
EastCoasthandle: Well think of it this way:
You have a water cooling setup and want to decide on the size of the tubing. You can go with 3/4" inner diameter tubing, but you run the risk of a slower flow rate due to the size of the pump's barb only being 1/2" and its power output (more/less). Or you can get a tube with an inner diameter of 7/16" (which is slightly smaller than 1/2"), which should maximize your flow rate.
This doesn't make sense. Water velocity through tubing is completely meaningless. Water volume through tubing is what really matters, and 3/4" has lower laminar resistance than 7/16". However, the difference in volumetric throughput between the two is minimal enough that 7/16" is preferred due to simplicity in tubing runs.
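For anyone curious about the arithmetic behind that, here is a minimal sketch comparing the laminar resistance of the two tube sizes via the Hagen-Poiseuille relation; it covers straight tube only and deliberately ignores the blocks, radiator, and pump curve that dominate a real loop:

import math

def poiseuille_resistance(inner_diameter_m, length_m, viscosity_pa_s=1.0e-3):
    # Laminar (Hagen-Poiseuille) flow resistance of a straight tube:
    # delta_P / Q = 8 * mu * L / (pi * r^4)
    r = inner_diameter_m / 2
    return 8 * viscosity_pa_s * length_m / (math.pi * r ** 4)

INCH = 0.0254
r_34 = poiseuille_resistance(0.75 * INCH, 1.0)     # 3/4" ID, 1 m of tube
r_716 = poiseuille_resistance(0.4375 * INCH, 1.0)  # 7/16" ID, 1 m of tube
print(r_716 / r_34)  # ~8.6: the narrower tube has far higher resistance per metre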
Posted on Reply
#72
EastCoasthandle
XooM: This doesn't make sense. Water velocity through tubing is completely meaningless. Water volume through tubing is what really matters, and 3/4" has lower laminar resistance than 7/16". However, the difference in volumetric throughput between the two is minimal enough that 7/16" is preferred due to simplicity in tubing runs.
This is completely and utterly ridiculous.
#1 Work on how you reply to people.
#2 I suggest you check your folks' water hose in the yard. Just turn the faucet on a little bit and use the nozzle on the end to release the water. The water volume at the faucet won't be the same at the nozzle ;)
#3 If you want to continue this PM me, no need to derail the thread!
Posted on Reply
#73
erocker
*
Lol, I am seriously laughing that I go to the end of this thread and find a discussion on tubing for water cooling!:roll: This thread has indeed gone way off topic! Let's try to at least keep it in the realm of video cards.
Posted on Reply
#74
jbunch07
erocker: Lol, I am seriously laughing that I go to the end of this thread and find a discussion on tubing for water cooling!:roll: This thread has indeed gone way off topic! Let's try to at least keep it in the realm of video cards.
haha i was just wondering the same thing...:ohwell:

back on topic:
lets hope the price of gddr5 doesn't hurt Ati too much...:)
they need to be putting some money into marketing, ATi's marketing is close to non-existent at the moment.
Posted on Reply
#75
imperialreign
wolf2009: ya never. thats an interesting point u make. ati has always taken the initiative with technology.

another interesting point u make is that games on ati hardware catch up with nvidia after a few months. u have any link to a benchmark? i'm interested in seeing this.
I'll try and dig up the review I read a while back, if I can find it again - it's kinda hard to find legit reviews like that seeing as how not many sites run back through testing when new driver releases are put out . . .

we can kind of see it, though, in our e-peen "post your gameX benchmark score here" threads

but, again, I'll try and dig up what I remember seeing, and I'll post it back up here in this thread once I find it . . .
Posted on Reply