
AMD Confirms GDDR5 for ATI Radeon 4 Series Video Cards

Discussion in 'News' started by malware, May 21, 2008.

  1. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (2.69/day)
    Thanks Received:
    909
    Location:
    Sector ZZ₉ Plural Z Alpha

    I'll try to dig up the review I read a while back, if I can find it again - it's kind of hard to find legit reviews like that, since not many sites re-run their testing when new driver releases come out . . .

    we can kind of see it, though, in our e-peen "post your gameX benchmark score here" threads

    but, again, I'll try and dig up what I remember seeing, and I'll post it back up here in this thread once I find it . . .
     
  2. FR@NK

    FR@NK

    Joined:
    Apr 30, 2006
    Messages:
    572 (0.19/day)
    Thanks Received:
    91
    Maybe because the 3/4" tubing has more surface area on the inner wall of the tube, which causes more friction on the coolant.....lol GDDR5 will be able to use a smaller 256-bit interface yet still have more bandwidth than the 512-bit 2900XT. I have no idea what this has to do with a garden hose though.
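To put rough numbers on the narrower-bus point - a minimal sketch, with era-appropriate transfer rates assumed for illustration (not exact card specs):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# The GT/s figures below are illustrative ballpark values, not exact specs.

def bandwidth_gbs(bus_bits, gigatransfers_per_s):
    """Peak bandwidth in GB/s for a given bus width and data rate."""
    return (bus_bits / 8) * gigatransfers_per_s

# 512-bit bus with ~1.65 GT/s memory (2900 XT-class)
wide = bandwidth_gbs(512, 1.65)    # ~105.6 GB/s

# 256-bit bus with ~3.6 GT/s GDDR5 (quad data rate)
narrow = bandwidth_gbs(256, 3.6)   # ~115.2 GB/s

print(f"512-bit: {wide:.1f} GB/s, 256-bit GDDR5: {narrow:.1f} GB/s")
```

Half the data pins, yet more peak bandwidth - which is the whole pitch.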
     
  3. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
    oh, don't make me link every farking review out there showing the 7950GX2 for the POS it was, you're such an nvidiot..........

    first, the GX2 vs the 1950x2: the GX2 loses not just in perf but in support. the GX2 is trash - nvidia made it to keep top numbers in a few games till the 8800 came out, that's it, then they fully dumped its support. sure, the drivers work, but quad SLI? and even the SLI perf of the GX2 vs true SLI was worse, which is sad since it's basically 2 cards talking directly.

    as to the x1900, it STOMPED the 7900/7950, cards that ON PAPER should have been stronger - 24 pipes vs 16, for example, was what people were using to "prove" that the nvidia cards WOULD kill the x1900 range of cards.

    i would make another massively long post, but you would just ignore it like all fanbois do, or resort to insults.
     
  4. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    Imagine 512 connections/wires coming from the bus to everywhere they need to go for the output. That's a lot of wires, and voltage control. With GDDR5, you have the ability to push the same or a little more info faster than a 512-bit bus without all those wires - in this case, just 256. Also, GDDR5 "reads" the length of each connection, allowing for the correct voltage through each wire/line. This is important: it's more stable, keeping frequencies within proper thresholds, and it also eliminates the cost of doing it the more expensive way. Hope that helps
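A quick sketch of the wiring arithmetic behind that - assuming a 32-bit data interface per memory chip, which is typical for GDDR parts of this era:

```python
# Each GDDR chip of this generation exposes a 32-bit data interface, so the
# number of chips (and groups of data traces) scales with total bus width.

CHIP_WIDTH = 32  # bits per memory chip (typical for GDDR3/4/5)

def chips_for_bus(bus_bits, chip_width=CHIP_WIDTH):
    """Number of memory chips needed to fill out the bus."""
    return bus_bits // chip_width

def data_traces(bus_bits):
    # one PCB trace per data bit, before counting address/command/power lines
    return bus_bits

for bus in (256, 512):
    print(f"{bus}-bit bus: {chips_for_bus(bus)} chips, "
          f"{data_traces(bus)} data traces")
```

Doubling the bus doubles the chips and data traces to route, which is where the PCB cost and complexity comes from.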
     
  5. EastCoasthandle

    EastCoasthandle New Member

    Joined:
    Apr 21, 2005
    Messages:
    6,889 (2.00/day)
    Thanks Received:
    1,505
    Thanks for the info :toast:
     
  6. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    YW. This should dramatically cut down the costs of the pcbs, and still provide great performance
     
  7. jbunch07

    jbunch07 New Member

    Joined:
    Feb 22, 2008
    Messages:
    5,261 (2.19/day)
    Thanks Received:
    614
    Location:
    Chattanooga,TN
    that is, if the cost of the gddr5 doesn't cripple them... :(
     
  8. EastCoasthandle

    EastCoasthandle New Member

    Joined:
    Apr 21, 2005
    Messages:
    6,889 (2.00/day)
    Thanks Received:
    1,505
    Agreed...I still wonder what kind of performance is to be had with a 512-bit bus. I hope we find out with the X2 :D
     
  9. HTC

    HTC

    Joined:
    Apr 1, 2008
    Messages:
    2,239 (0.95/day)
    Thanks Received:
    303
    And, in theory, reduce the heat it creates too!
     
  10. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
    how long till the nvidia fanboi says that ati should have gone 512-bit and should have more pipes/rops?

    funny, since the x1900/1950xt/xtx cards had 16 pipes/rops vs the 7900's 24, and the 7900 got pwned........

    meh, i'm sick of the "ati sucks because *add bullshit FUD here*" or the "nvidia sucks because *add bullshit FUD here*"

    they both have their flaws and their good points.

    the one thing i almost always see out of ati since the 8500 has been INNOVATION. it hasn't always worked out the way they intended - the 2900/3800 are the prime example. the main issue was that ati designed the r600/670 cores for dx10, not dx9, so they followed what microsoft wanted to do with dx10+: remove dedicated AA hardware and use the shaders to do the AA and other work. of course this led to a problem - dx9 support was an afterthought, and as such gave worse performance when you turned AA on.

    ati thought, like many other companies, that vista would take off and be a huge hit, just like xp was when it came out, and with vista being a big hit, dx10+ games would have come out en masse. but vista fell on its face, and ati still had this pure dx10 chip already in the pipe, so they ran with it KNOWING it would have its issues/quirks in dx9 games.

    nvidia, on the other hand, effectively took the opposite approach with the g80/92 cores: they built a dx9 part with dx10 support as an afterthought. in this case it was a good move, because without vista being a giant hit, game developers had no reason to make true, pure dx10 games.

    nvidia didn't go dx10.1 because it would have taken some redesign work on the g92, and they wanted to keep their investment in it as low as possible to keep the profit margin as high as possible. it's why they lowered the bus width and complexity of the pcb, it's why they didn't add dx10.1 support, and it's why the 8800gt's reference cooler is the utter piece of shit it is (i have one, i can say for 100% certain the reference cooler's a hunk of shit!!!!)

    now i could go on and on about each company; point is, they have both screwed up.

    biggest screwups for each:

    ATI: 2900 (r600) not having a dedicated AA unit for dx9 and older games.

    nVidia: geforce 5/FX line - horrible dx9 support that game developers ended up having to avoid because it ran so badly, forcing any FX owner to run all his/her games in dx8 mode. also the 5800 design was bad: high-end ram with a small bus and an ungodly loud fan does not a good card make.


    that's how i see it. at least ATI never put out a card touted as being capable of something that, in practice, it couldn't do even passably well......
     
    EastCoasthandle says thanks.
  11. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
    doubt it will have any real impact from the card makers' end - they buy HUGE quantities of chips, getting a price that's FAR lower than the premium we consumers pay for that same ram.

    I had an article before my last hdd meltdown; it showed the actual cost per memory chip for videocards, ddr vs ddr2 vs ddr3 vs ddr4

    ddr4 was more expensive, but that was mostly due not to it being new but to it being in short supply at the time. still, the price you paid to get it on a card was extremely exaggerated - of course it's "new", so they charge extra for it.

    the cost of 2 vs 3, again, wasn't that large, same with ddr vs ddr2. again, we are talking about companies that buy hundreds of thousands if not millions of memory chips at a time from their suppliers. those suppliers want to stay on the good side of their customers so they keep making a profit, so they give them far better prices than they would ever admit to an outside party.

    also, the more you buy, the lower the per-unit cost is, same as with most things. go check supermediastore - if you buy 600 blanks the price is quite a bit lower than buying 50 or 100 at a time ;)
     
  12. jbunch07

    jbunch07 New Member

    Joined:
    Feb 22, 2008
    Messages:
    5,261 (2.19/day)
    Thanks Received:
    614
    Location:
    Chattanooga,TN
    yea, this is true!
    the vendors get ram at a nice price because they place such large orders!
     
  13. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
    i would love to see somebody try that new qmoda...whatever ram that's higher density per chip. would be interesting to see a videocard that had 2gb of high-bandwidth ram......or hell, use it for onboard video (ohhh that could rock - 4 chips for 512bit (or something like that) would make onboard a hell of a lot better....
     
  14. candle_86 New Member

    Joined:
    Dec 28, 2006
    Messages:
    3,916 (1.39/day)
    Thanks Received:
    233

    The AA on the shaders is a stupid, bad idea; MS doesn't even understand hardware, that's the problem. Nvidia is not going to do DX10.1 because it requires shader-based AA, which is total junk and worthless. Sure, the AA might look better, but a 50% drop in FPS isn't worth it - I'll take dedicated hardware AA any day. What MS needs to do is discuss these ideas, not just sit around and think them up. Remember, if MS cuts Nvidia out of DX totally, OpenGL will make a massive comeback. MS has no choice but to do what Nvidia tells it to do, for that reason alone. Several problems exist with shader AA; if you can't see that, I'm sorry.

    As for innovation, I beg to differ - what has ATI actually done? Shader AA was the worst idea I've heard of. 5 groups of 64 shaders, but only one unit can do complex shader math - another bad idea. That's why ATI cards perform like 64-shader cards most of the time, and if they're lucky, 128-shader cards.

    GDDR5 is marketing hype; the latency alone kills it. New ram types are never as good as the older ones on release. Look at the GDDR3 5700 Ultra vs the regular 5700 Ultra: same performance, because of latency. Go ahead, give us 3000MHz ram with a 200ms response time - it won't be any better than 2000MHz ram with an 80ms response time (these are just random numbers, but it's the same reason people don't upgrade to DDR3). All hype from AMD and absolutely nothing to even care about. This time they might have a single-core solution that can tie the 8800 Ultra.
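The frequency-vs-latency trade-off being argued here can be made concrete. A hedged sketch with illustrative CAS/clock pairings (real module timings vary; these are not GDDR5 specs):

```python
# Absolute access latency in ns = latency cycles / command clock.
# For DDR-type memory, the data rate in MT/s is twice the command clock.

def latency_ns(cas_cycles, data_rate_mts):
    """Approximate CAS latency in nanoseconds."""
    clock_mhz = data_rate_mts / 2  # DDR: two transfers per clock cycle
    return cas_cycles / clock_mhz * 1000

# Illustrative pairings: newer, faster RAM tends to ship with higher CAS.
old = latency_ns(cas_cycles=5, data_rate_mts=800)    # ~12.5 ns
new = latency_ns(cas_cycles=9, data_rate_mts=1600)   # ~11.25 ns

print(f"800 MT/s CL5: {old:.2f} ns, 1600 MT/s CL9: {new:.2f} ns")
```

So a higher clock with a higher CAS count can come out roughly even in absolute latency - raw MT/s alone doesn't settle the argument either way.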
     
  15. jbunch07

    jbunch07 New Member

    Joined:
    Feb 22, 2008
    Messages:
    5,261 (2.19/day)
    Thanks Received:
    614
    Location:
    Chattanooga,TN
    hmmm 2gb of video ram would only be good for extremely high resolutions...
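Back-of-the-envelope render-target math supports that - a rough sketch counting only one color and one depth buffer at 4 bytes per pixel each (textures and extra buffering excluded, so real usage runs higher):

```python
def render_target_mb(width, height, msaa=1, bytes_per_px=4):
    """Approx MB for one color + one depth target at a given MSAA level."""
    color = width * height * bytes_per_px * msaa  # color samples
    depth = width * height * 4 * msaa             # depth/stencil samples
    return (color + depth) / 2**20

print(f"1280x1024, no AA:  {render_target_mb(1280, 1024):.0f} MB")
print(f"2560x1600, 8x AA: {render_target_mb(2560, 1600, msaa=8):.0f} MB")
```

At a common resolution the targets barely dent 512 MB; it's the big-screen, high-AA case (plus textures on top) where 2 GB starts to matter.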
     
  16. Dangle New Member

    Joined:
    Dec 13, 2007
    Messages:
    497 (0.20/day)
    Thanks Received:
    15
    Location:
    Reno
    Why are there so many furious Nvidia fans in here?
     
  17. candle_86 New Member

    Joined:
    Dec 28, 2006
    Messages:
    3,916 (1.39/day)
    Thanks Received:
    233
    we're trying to save you from a stupid purchase
     
  18. jbunch07

    jbunch07 New Member

    Joined:
    Feb 22, 2008
    Messages:
    5,261 (2.19/day)
    Thanks Received:
    614
    Location:
    Chattanooga,TN
    come on now candle...what has ati ever done to you?

    seriously, there is no reason to hate ati that much!
     
  19. candle_86 New Member

    Joined:
    Dec 28, 2006
    Messages:
    3,916 (1.39/day)
    Thanks Received:
    233
    oh yes there is. plus, y'all are like family and this is an intervention - i have to save y'all from yourselves. IF you buy AMD products you will hate yourself for doing so; historically, Nvidia has always been faster at the same price point
     
  20. jbunch07

    jbunch07 New Member

    Joined:
    Feb 22, 2008
    Messages:
    5,261 (2.19/day)
    Thanks Received:
    614
    Location:
    Chattanooga,TN
    look at my system specs <----
    look at my face ----> :D

    i'm very happy with Amd/ati

    my previous rig was nvidia and i was happy with that as well
    but hey, i'm not complaining...you have every right to say what you want.
     
    Rebo&Zooty says thanks.
  21. FR@NK

    FR@NK

    Joined:
    Apr 30, 2006
    Messages:
    572 (0.19/day)
    Thanks Received:
    91
    Most of us here are smart enough to know that the ATI cards we use are slower than nvidia's cards.
     
  22. jbunch07

    jbunch07 New Member

    Joined:
    Feb 22, 2008
    Messages:
    5,261 (2.19/day)
    Thanks Received:
    614
    Location:
    Chattanooga,TN
    true...besides, you don't see me going into an nvidia news thread and bashing them...

    no one likes a buzzkill!
     
  23. erocker

    erocker Super Moderator Staff Member

    Joined:
    Jul 19, 2006
    Messages:
    39,651 (13.30/day)
    Thanks Received:
    14,042
    No one here needs saving, keep these types of comments to yourself. I believe you have already been warned on this subject before.
     
  24. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
    humm, maybe u need to check the assassins creed reviews - seems shader-based aa isn't a bad idea if done natively by the game. the 9800gtx and 3870x2 were toe to toe, less than 1fps difference between them. course you're a fanboi, wouldn't expect you to know that.

    as to ms doing what another company tells it - wrong. ms could block opengl support if they wanted, and guess what, nobody could stop them. everybody has to do what ms says, because the only other choice is to fall back into a niche market like matrox has done.

    as to your 5700 example, that doesn't mean shit - the 5700 was a piece of crap. it was the best of the fx line, but that's not saying much......specially when a 9550se can outperform it LOL
    this is dx10.1: 3870x2 vs 9800gtx under sp1 (dx10.1 is enabled with sp1)
    [image: benchmark chart, 3870X2 vs 9800GTX]

    funny - shader-based aa vs dedicated AA, and the perf difference is around 1fps

    so your "shader based AA is a stupid idea" line is a load of fanboi bullshit (as expected from you)

    the idea's fine if you're talking about native dx10/10.1 games, but today's games are mostly dx9 games with some dx10 shaders added (crysis, for example)

    as this shows, there's ZERO reason shader-based aa needs to be any slower in native code; it's only slower on older games. hence, as i said, they should have had a hardware AA unit for dx9 and older games and used shader-based AA for dx10.x games - problem solved.
     
  25. jaydeejohn New Member

    Joined:
    Sep 26, 2006
    Messages:
    127 (0.04/day)
    Thanks Received:
    8
    You won't be seeing those huge latencies with this memory. I don't like to argue - just give it time and see what happens. The 4870 is going to be a killer card, and rumors have it at 25% over the 9800GTX. nVidia's solution, the G280, should be faster, but it'll draw too much energy to do an X2, thus allowing the 4870 X2 to compete with it at the very top as a single-card solution. Hopefully we will see that MCM on the 4870 X2
     
