
GDDR5 Memory - Under the Hood

Discussion in 'General Hardware' started by HTC, May 28, 2008.

  1. HTC

    HTC

    Joined:
    Apr 1, 2008
    Messages:
    2,237 (0.97/day)
    Thanks Received:
    302
    Source: ExtremeTech
    grazzhoppa and FR@NK say thanks.
  2. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,236 (11.36/day)
    Thanks Received:
    13,582
    Location:
    Hyderabad, India
    But what's the point when only one company uses it, and that too on its 'top-of-the-line' product (HD4870), while another company used GDDR3 across 5 generations of products?
  3. spearman914

    spearman914 New Member

    Joined:
    Apr 14, 2008
    Messages:
    3,339 (1.45/day)
    Thanks Received:
    502
    Location:
    Brooklyn, New York 11223
    Yeah, I know. But really, there's no real difference in gaming between GDDR3, 4, and 5 yet.
  4. magibeg

    magibeg

    Joined:
    May 9, 2006
    Messages:
    2,000 (0.67/day)
    Thanks Received:
    203
    Just have to weigh the costs between things. I guess one of the big questions would be whether it's cheaper to go GDDR5 with a 256-bit bus or GDDR3 with a 512-bit bus.
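
    For anyone who wants to put rough numbers on that trade-off: peak theoretical bandwidth is just bus width times effective data rate. A minimal Python sketch, using illustrative clocks rather than official specs:

    ```python
    # Peak theoretical bandwidth (GB/s) = bus width (bits) / 8 * data rate (GT/s)
    def bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
        """Peak theoretical memory bandwidth in GB/s."""
        return bus_width_bits / 8 * data_rate_gt_s

    # GDDR5 transfers 4 bits per pin per command clock, so a 900 MHz clock
    # moves data at 3.6 GT/s; GDDR3 is double-pumped, so 1100 MHz gives 2.2 GT/s.
    # Both clock figures here are assumptions for illustration.
    print(bandwidth_gb_s(256, 3.6))  # 115.2 GB/s -- GDDR5 on a 256-bit bus
    print(bandwidth_gb_s(512, 2.2))  # 140.8 GB/s -- GDDR3 on a 512-bit bus
    ```

    So a 512-bit GDDR3 board can still edge out 256-bit GDDR5 on raw bandwidth; the real question is whether the fatter bus and extra PCB layers cost more than the pricier memory.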
  5. HTC

    HTC

    Joined:
    Apr 1, 2008
    Messages:
    2,237 (0.97/day)
    Thanks Received:
    302
    Have you dudes checked the page? There's more, you know!
  6. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (2.92/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe
    Have you read the link?
  7. Megasty New Member

    Joined:
    Mar 18, 2008
    Messages:
    1,263 (0.54/day)
    Thanks Received:
    82
    Location:
    The Kingdom of Au
    Too true, the increased speed doesn't hurt either :p
  8. Silverel

    Silverel New Member

    Joined:
    Nov 14, 2007
    Messages:
    1,769 (0.72/day)
    Thanks Received:
    233
    Location:
    Detroit, MI
    That was a pretty good read. In the end they figure GDDR5 to be as cost effective as GDDR3, so why not just replace the whole lot of the stuff? It'd be overkill for weaker cards, but they'd probably get better pricing if they went exclusive GDDR5 with Samsung, Qimonda, and Hynix.

    Every card I've owned has always benefited more from a higher memory clock than core. I'm down with a 4870 if they got the GDDR5, but I'd rather stick with a 4850 if they get the same stuff.
  9. Darknova

    Darknova

    Joined:
    Nov 8, 2006
    Messages:
    5,037 (1.79/day)
    Thanks Received:
    535
    Location:
    Manchester, United Kingdom
    Says who? GDDR5 has never been seen in a real card.

    And GDDR4 is better than GDDR3, it's just offset by the crappy bus widths nvidia and ATi use.

    Also, it won't cost more. Think about it: GDDR5 costs slightly more than GDDR4, OK, but GPUs are getting smaller, so more of them fit on a wafer and each one costs less (rough dies-per-wafer math below). In the end, counting everything overall, the graphics card will be cheaper to make.

    And to btarunr: because ATi looks to the future (and generally fails lol), whereas nvidia is still stuck with its brute-force method (bigger, badder GPUs).
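
    To put the "more per wafer" point in concrete terms, here's a hedged sketch of the standard dies-per-wafer approximation. The die areas are rough figures that were circulating at the time, not confirmed specs:

    ```python
    import math

    # Classic dies-per-wafer approximation (ignores defects and yield):
    # the second term accounts for partial dies lost around the wafer edge.
    def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
        d, s = wafer_diameter_mm, die_area_mm2
        return int(math.pi * (d / 2) ** 2 / s - math.pi * d / math.sqrt(2 * s))

    # Assumed die sizes: GT200 at 65 nm reported around 576 mm^2,
    # RV770 at 55 nm reported around 256 mm^2.
    print(dies_per_wafer(300, 576))  # ~94 candidate dies per 300 mm wafer
    print(dies_per_wafer(300, 256))  # ~234 candidate dies per 300 mm wafer
    ```

    More than twice as many candidate dies per wafer is exactly where the "smaller GPU, cheaper card" argument comes from, before you even count yield.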
  10. Scheich

    Joined:
    Dec 20, 2005
    Messages:
    245 (0.08/day)
    Thanks Received:
    20
    There was some news where someone stated that it's more profitable to produce flash memory for wannabe SSDs instead of GDDR5, so this "shortage" might last for quite some time :mad:
  11. Silverel

    Silverel New Member

    Joined:
    Nov 14, 2007
    Messages:
    1,769 (0.72/day)
    Thanks Received:
    233
    Location:
    Detroit, MI
    What shortage?

    Switching to GDDR5 would mean smaller chips, fewer PCB layers, more efficient bus widths, and fewer "shortages".
  12. HTC

    HTC

    Joined:
    Apr 1, 2008
    Messages:
    2,237 (0.97/day)
    Thanks Received:
    302
    Yeah: compare the die sizes of both nVidia's and ATI's next gen cards :twitch:
  13. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,236 (11.36/day)
    Thanks Received:
    13,582
    Location:
    Hyderabad, India
    I'm just looking at the near future of GDDR4/5 memory: only a tiny minority of cards actually use them, and those come from ATI, which commands less than half of the discrete graphics market share. Since NV makes powerful GPUs that end up faring better than the competition, they needn't use better memory, so they end up using GDDR3 that's dirt cheap these days....profit. ATI, meanwhile, uses GDDR4/5 more to build up aspirational value for its products. They need performance increments to come from wherever they can manage, so stronger/more expensive memory gets used. And it's expensive because companies like Qimonda are pushed into making memory that's produced on a small scale, at lower profit.
  14. Darknova

    Darknova

    Joined:
    Nov 8, 2006
    Messages:
    5,037 (1.79/day)
    Thanks Received:
    535
    Location:
    Manchester, United Kingdom
    Ok, but did you read the rest of the article? Yes, the memory itself is more expensive, but it allows for less complex PCB designs (lower cost), die shrinks (lower cost), and the experience gained producing dies at 55 nm (lower cost).

    So all in all, there won't be that much of a price hike, if any.

    Not only that, but GDDR5 is going to be a bigger performance jump than going from GDDR3 to GDDR4. Just because GDDR3 is dirt cheap doesn't make it better to use on your next-generation GPUs.

    And you're wrong: you NEED to pair a strong GPU with stronger memory. With GPUs getting more and more powerful, they need bigger bandwidths, and GDDR5 is the next logical step (quick back-of-envelope below).
    It's just like in a PC: if you start bottlenecking the GPU, it doesn't matter how powerful you make it, because the GDDR3 will be holding it back.
    WarEagleAU says thanks.
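
    To make the bottleneck argument concrete, here's a hedged back-of-envelope in Python. Every figure is an assumption for illustration, not a measurement of any real card:

    ```python
    # Does memory bandwidth keep up with the GPU's pixel throughput?
    # All numbers below are assumed, purely to illustrate the bottleneck idea.

    fill_rate_gpix_s = 12.0   # GPU pixel throughput in gigapixels/s (assumed)
    bytes_per_pixel = 16      # color read + write + Z traffic per pixel (assumed)

    demand_gb_s = fill_rate_gpix_s * bytes_per_pixel
    supply_gb_s = 512 / 8 * 2.2   # 512-bit GDDR3 at 2.2 GT/s (assumed)

    print(f"GPU demands  {demand_gb_s:.0f} GB/s")    # 192 GB/s
    print(f"memory feeds {supply_gb_s:.1f} GB/s")    # 140.8 GB/s -- the GPU stalls
    ```

    If the demand side grows each generation while the supply side is capped by GDDR3 clocks, the only GDDR3 escape hatch is an ever-wider (and ever more expensive) bus.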
  15. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.22/day)
    Thanks Received:
    30
    Exactly, DN. Memory price will be higher, but the PCB price offsets that.

    Also, the more complex the PCB, the more likely you are to have failures in the PCB itself due to production flaws.

    The more complex something is, the more likely it is to fail. This has always been true.

    Now, as to a strong GPU not needing strong RAM.........I can't say what I'm thinking without getting infracted, so I'll put it a different way.

    Only a fanboi would say that a strong GPU doesn't need good/strong RAM, and in this case I see a lot of nvidiots saying that kind of crap because nvidia is sticking with GDDR3. Honestly, the reason they're sticking with GDDR3 is because IT'S CHEAP and they have large contracts that make it even cheaper. Not because it's the best tech for the job, not because it gives the best performance, but because they want to make as much per card as they can. Look at their card prices, they're always TOO DAMN HIGH per card. I have an 8800GT 512 (it's being replaced now...); it was an "ok" buy at the time, but the price most people were paying for them was 320 bucks, and that's insane........

    OK, the 9600s and 8800GT/9600GSOs are decently priced, BUT they're still high for what you're getting, in my view....the 3870 would be a better buy at that price range and far more future-proof.

    Blah, I don't want to argue. I'm tired, it's late, and I need some rest........

    Read the article, and understand that lower power and higher clocks/bandwidth mean you don't need to make an insanely complex card that costs a ton to build; you can build a cheaper card (PCB) and get the same or better performance.

    Also note that 3 makers are already onboard, and more would follow suit too.......can't wait to see this stuff in action.
  16. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,236 (11.36/day)
    Thanks Received:
    13,582
    Location:
    Hyderabad, India
    Well, that's what NVidia chooses not to do. They're making the GT200 use GDDR3, but part of the reason is also that the GPU itself is very expensive ($125 per die, $150 per package), so that's $150 for the GPU alone. More in this contentious article. So NVidia is using GDDR3 more for economic reasons, and if this is the scheme of things, they'll keep away from GDDR4/5 for quite some time, even though both are already JEDEC-standard technologies.
  17. Darknova

    Darknova

    Joined:
    Nov 8, 2006
    Messages:
    5,037 (1.79/day)
    Thanks Received:
    535
    Location:
    Manchester, United Kingdom
    Have you ever wondered WHY nvidia is making such an expensive GPU? As I've said before, it's just a brute-force method to make a more powerful GPU. Because they won't stop, scrap what they have, and create a really efficient GPU architecture (like ATi did), they don't stand a cat in hell's chance against ATi this coming year.

    Considering how powerful the 4870 is meant to be, I honestly don't see anyone with any knowledge of GPUs going for nvidia with that hefty a price tag...
    WarEagleAU says thanks.
  18. candle_86 New Member

    Joined:
    Dec 28, 2006
    Messages:
    3,916 (1.41/day)
    Thanks Received:
    233
    On paper the 2900XT should have crushed all comers; instead it barely put up a fight against the 8800GTS 640. AMD can look great on paper, but give me some proof they can compete. As for the 3870 being future-proof, I beg to differ. It has 64 groups of 5 shaders, and only one unit in each group can do complex shader work; the others handle the simple integer and floating-point ops. In the real world this means a big chunk of those shader units will rarely, if ever, be used, thanks to AMD's failure to supply a compiler for their cards. Let it look as good as you want, but if AMD can't supply a code compiler so code works right on their design, they are still screwed.
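
    What's being described here is the VLIW packing problem. A toy Python model of it, with the per-cycle ILP figures made up purely for illustration (the 64x5 arrangement matches the RV670's 320 stream processors):

    ```python
    # Toy model of filling 5-wide VLIW shader groups each cycle.
    # RV670 has 320 stream processors arranged as 64 five-wide groups;
    # the compiler must find up to 5 independent ops per group per cycle.

    GROUPS, WIDTH = 64, 5

    def effective_alus(avg_independent_ops: float) -> float:
        """ALUs doing useful work when the compiler can only co-issue this many ops."""
        return GROUPS * min(avg_independent_ops, WIDTH)

    print(effective_alus(5.0))  # 320.0 -- perfectly packed, vector-heavy code
    print(effective_alus(2.0))  # 128.0 -- mostly scalar, dependent shader math
    ```

    Whether real shaders average closer to 2 or 5 co-issued ops per cycle is exactly the compiler question being argued here.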
  19. wiak

    wiak

    Joined:
    Sep 5, 2004
    Messages:
    1,743 (0.48/day)
    Thanks Received:
    198
    Location:
    Norway
    What's the point of Intel @ DDR3? :p
    Adopting new technologies is good.
    Even the memory companies agree and will make GDDR5 a standard, so what's the problem?
  20. candle_86 New Member

    Joined:
    Dec 28, 2006
    Messages:
    3,916 (1.41/day)
    Thanks Received:
    233
    The problem is GDDR5 will suffer like GDDR4 did when it was new: insane latency. Also, Nvidia started work on the GT200 right after the G80 shipped; at the time GDDR4 wasn't viable and GDDR5 was unheard of. Do you expect Nvidia to stop working on their next gen just to include a new memory controller?
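
    Worth noting how the latency numbers actually behave: latency counted in clock cycles rises with data rate, but absolute latency is cycles divided by clock. A quick sketch with assumed (not datasheet) timings:

    ```python
    # Absolute latency (ns) = latency in cycles / memory clock (MHz) * 1000.
    # Both timing figures below are assumptions for illustration only.

    def latency_ns(cas_cycles: int, mem_clock_mhz: float) -> float:
        """CAS latency converted from cycles to nanoseconds."""
        return cas_cycles / mem_clock_mhz * 1000

    print(latency_ns(10, 1000))  # 10.0 ns -- slower clock, fewer cycles
    print(latency_ns(18, 1800))  # 10.0 ns -- faster clock absorbs the extra cycles
    ```

    So "insane latency" in cycles doesn't automatically mean insane latency in nanoseconds; it depends on whether the clock gains keep pace with the added cycles.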
  21. wiak

    wiak

    Joined:
    Sep 5, 2004
    Messages:
    1,743 (0.48/day)
    Thanks Received:
    198
    Location:
    Norway
    Insane what?
    It's not ATi's problem; nvidia is lacking a proper memory controller ^^
    kylew and Rebo&Zooty say thanks.
  22. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,236 (11.36/day)
    Thanks Received:
    13,582
    Location:
    Hyderabad, India
    ATI started work on the R700 architecture about the same time they released the HD 2900XT. Granted, GDDR5 was unheard of then, but later the RV770 did end up with a GDDR5 controller, didn't it? Goes to show that irrespective of when a company starts work on an architecture, something as modular as a memory controller can be added to it even weeks before the designs are handed over to the fabs for an ES and, eventually, mass production.

    So, "when NV started work on the GT200" is a lame excuse.
    WarEagleAU, kylew and Rebo&Zooty say thanks.
  23. candle_86 New Member

    Joined:
    Dec 28, 2006
    Messages:
    3,916 (1.41/day)
    Thanks Received:
    233
    ATI made GDDR5, bta, didn't you get the memo? They have most likely been working on it just as long. I approve of what Nvidia is doing: using known tech with a wider bus is just as effective, and there's less chance of massive latency issues like there will be with GDDR5. I prefer tried and true. This will be the 2nd time AMD has tried something new with their graphics cards, and it will be the 2nd time they fail. I was dead right about the 2900XT failing, I said it would before it even went public, and I'll be right about this.
  24. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,236 (11.36/day)
    Thanks Received:
    13,582
    Location:
    Hyderabad, India
    GDDR5 is a JEDEC standard; irrespective of who makes it, any licensed company can use it. The HD4870 is more of something that will beat the 9800 GTX and come close to the GTX 260. It's inexpensive, cool, efficient. Don't try to equate the HD4870 to the GTX 280, or you'll end up comparing a sub-400-dollar card to something that's 600+ dollars. The better comparison would be the HD4870 X2, which is supposed to be cheaper than the GTX 280 and has win written all over it.
    WarEagleAU, Darknova and Rebo&Zooty say thanks.
  25. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.22/day)
    Thanks Received:
    30
    Yes, bta, exactly, but you gotta remember candle has an irrational hate for amd/ati. Logic: he fails it....

    Though I'm shocked to see you say the 4870/4870x2 has win written all over it.....did you forget your nvidia pillz today?
    kylew says thanks.
