
GeForce 9800 GX2 might get 2GB of memory

Discussion in 'Graphics Cards' started by AphexDreamer, Feb 6, 2008.

  1. AphexDreamer

    AphexDreamer

    Joined:
    Jun 17, 2007
    Messages:
    7,197 (2.62/day)
    Thanks Received:
    937
    Location:
    C:\Program Files (x86)\Aphexdreamer\
    Could be manufacturer dependent

    Although we've already seen a lot of different pictures of the GeForce 9800 GX2, Fudzilla is the first site to have a screenshot from the manual of one of the partner cards, as you can see below. This in itself might not be the most exciting thing in the world, as it just confirms the previous pictures, but what made us take notice was that each of the PCBs seems to feature 1GB of memory on this specific card.

    Now we can't be 100 percent sure that this is the case, as it could simply refer to the total amount of memory on the card, but comparing other manuals from the same manufacturer seems to imply that this card might very well come with 2GB of graphics memory.

    As you'll notice from our post yesterday, this board doesn't have the optical S/PDIF input seen on some of the card pictures that have been posted elsewhere. It's also strange that each card seems to feature a fan connector, as it looks as if the card only has a single cooling fan.

    Apart from that, there isn't much here that we didn't already know, but at least it shows that these cards can't be too far away. Now we just need some real performance figures and everyone will be happy.

    http://www.fudzilla.com/index.php?option=com_content&task=view&id=5563&Itemid=34

    Check the link for a picture.
     
  2. mandelore

    mandelore New Member

    Joined:
    Aug 12, 2006
    Messages:
    3,251 (1.06/day)
    Thanks Received:
    152
    Location:
    UK-small Village in a Valley Near Newcastle
    Well... since they don't bother designing an all-in-one PCB layout, slapping two 1GB cards together would be easy.

    I'd be impressed if they followed ATI and made it all on one PCB...
     
  3. devguy

    devguy

    Joined:
    Feb 17, 2007
    Messages:
    1,239 (0.43/day)
    Thanks Received:
    171
    Location:
    SoCal
    Sounds silly. In most cases, the 8800GT isn't fast enough to benefit from an extra 512MB of RAM, and it certainly doesn't bring the same performance increase as going from 256MB to 512MB.
     
  4. gOJDO New Member

    Joined:
    Feb 5, 2008
    Messages:
    120 (0.05/day)
    Thanks Received:
    17
    I agree. Although it's possible, it's silly. The extra 2x512MB of frame buffer will give zero performance boost. There was a comparison between an 8800GTS G92 512MB and 1024MB, and both cards performed the same.
    I hope nVidia are going to use faster video RAM, because the cards are bottlenecked by video RAM bandwidth. For example, GDDR4 @ 1200MHz could improve performance dramatically, and it would dissipate less heat and consume less energy.
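
    For a rough sense of the numbers, a back-of-the-envelope sketch in Python; the 256-bit bus width, the baseline GDDR3 clock, and reading "1200MHz" as a 2400MT/s effective data rate are my assumptions, not from any spec:

        def peak_bandwidth_gbs(data_rate_mts, bus_width_bits):
            # Peak theoretical bandwidth: transfers per second x bytes per transfer, in GB/s.
            return data_rate_mts * (bus_width_bits / 8) / 1000

        # Assumed GDDR3 at 1000MHz (2000MT/s effective) on a 256-bit bus:
        print(peak_bandwidth_gbs(2000, 256))  # 64.0 GB/s
        # GDDR4 @ 1200MHz (2400MT/s effective) on the same bus:
        print(peak_bandwidth_gbs(2400, 256))  # 76.8 GB/s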
     
  5. DaMulta

    DaMulta My stars went supernova

    Joined:
    Aug 3, 2006
    Messages:
    16,132 (5.26/day)
    Thanks Received:
    1,457
    Location:
    Oklahoma T-Town
    But if you kept the card for 5 years, it could see the light of day.

    My cousin has a 9600SE 256MB card. I gave him a 9600XT 128MB to see if it would help him play some of the newer games (yeah, I know, I sold him an FX-62 and a 2900XT for 200 after that).

    The SE played better because it had more memory. When the cards first came out, the XT was the faster card, because not all games used all the memory. Today it's different, and it's not as if games are going to slow down and use less memory :laugh:.
     
  6. EastCoasthandle

    EastCoasthandle New Member

    Joined:
    Apr 21, 2005
    Messages:
    6,889 (1.95/day)
    Thanks Received:
    1,506
    2 gigs is a waste of time, IMO. All it does is create latency that isn't needed. How many games will need 2 gigs?
     
  7. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (2.59/day)
    Thanks Received:
    909
    Location:
    Sector ZZ₉ Plural Z Alpha
    But, unless I'm wrong, usage of GDDR4 would mean they'd have to pay ATI royalties for the technology - if ATI is even willing to cough it up.

    Either way, I think nVidia are taking the "con the buyer" route with this. How many buyers would truly realize the extra memory is rather pointless from a performance standpoint, and just buy it because it offers more memory than the competition?

    TBH, though, I wonder which company will jump on GDDR5 first . . .
     
  8. AphexDreamer

    AphexDreamer

    Joined:
    Jun 17, 2007
    Messages:
    7,197 (2.62/day)
    Thanks Received:
    937
    Location:
    C:\Program Files (x86)\Aphexdreamer\
    Yeah, I figured it would be GDDR5 if it truly is 2GB of RAM, 'cause GDDR3 ain't sayin' much.
     
  9. DaMulta

    DaMulta My stars went supernova

    Joined:
    Aug 3, 2006
    Messages:
    16,132 (5.26/day)
    Thanks Received:
    1,457
    Location:
    Oklahoma T-Town
    I don't know yet, BUT remember: two cards with a gig each doesn't mean 2 gigs of memory.

    It's just SLI, with both cards having a gig of RAM.
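
    A minimal sketch of why, assuming alternate-frame-rendering SLI, where each GPU keeps its own mirrored copy of textures and buffers (the figures are illustrative):

        def vram_mb(per_gpu_mb, num_gpus):
            marketed = per_gpu_mb * num_gpus  # what the box can claim
            usable = per_gpu_mb               # data is mirrored, so games effectively see one copy
            return marketed, usable

        print(vram_mb(1024, 2))  # (2048, 1024): sold as 2GB, effectively 1GB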
     
  10. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (2.59/day)
    Thanks Received:
    909
    Location:
    Sector ZZ₉ Plural Z Alpha

    But as far as the whole "package" is concerned - it's very easy to market as 2GB


    (and print elsewhere on the box in fine print 2x1GB)
     
  11. R_1

    Joined:
    Dec 1, 2006
    Messages:
    449 (0.15/day)
    Thanks Received:
    39
    Two GPUs on one PCB - that is what I'm looking for, rather than 2x1024MB of video RAM. Nvidia can do it!
     
  12. CH@NO

    CH@NO New Member

    Joined:
    May 24, 2007
    Messages:
    901 (0.32/day)
    Thanks Received:
    59
    Location:
    Toluca, Mexico
    Man, 2 gigs of RAM is simply too much.... I have no doubt that the card will trade blows with ATi's X2 and maybe win, but THE question is HOW MUCH will it cost????

    Almost all of us choose our VGAs by the performance/price rule. If the card costs too much, then maybe ATi's solutions will be a better choice.
     
  13. mandelore

    mandelore New Member

    Joined:
    Aug 12, 2006
    Messages:
    3,251 (1.06/day)
    Thanks Received:
    152
    Location:
    UK-small Village in a Valley Near Newcastle
    Well.. no, I don't think they can, or why else would all their dual-GPU cards of late be just 2 cards welded together???

    They appear to lack the innovation to go back and redesign a single PCB with everything incorporated, like ATI. Which is a shame, because if they actually could be bothered, it would probably be good.

    BUT, judging by the SIZE of their cards in comparison to ATI's, they lack the engineering ability to do it. If ATI's dual-GPU design is the same size as NV's best single-GPU design, that's saying a lot, while NV's dual-GPU (dual-PCB) design is 100% larger than a single-GPU design...
     
  14. AphexDreamer

    AphexDreamer

    Joined:
    Jun 17, 2007
    Messages:
    7,197 (2.62/day)
    Thanks Received:
    937
    Location:
    C:\Program Files (x86)\Aphexdreamer\
    Dude, it's like they're selling you two Ultras that just take up one slot.... That's why they're going to put warning labels on it saying not to touch the GPU because it can get too hot, and, oh, what was the other one... Oh yeah, not to place it near another piece of hardware in your computer lol.

    Take a look: http://forums.techpowerup.com/showthread.php?p=647307#post647307
     
  15. candle_86 New Member

    Joined:
    Dec 28, 2006
    Messages:
    3,916 (1.34/day)
    Thanks Received:
    233
    10% larger, 40% more power - works out for me. And why would they bother? Both cards take 2 slots no matter what, so what's your problem?
     
  16. largon

    largon

    Joined:
    May 6, 2005
    Messages:
    2,782 (0.79/day)
    Thanks Received:
    433
    Location:
    Tre, Suomi Finland
    2×1GB can be done with the GX2; nothing else is required but Qimonda GDDR3 chips and a revised BIOS.
    Huh?
    You don't know what your own thread is about? :wtf:
     
  17. dolf New Member

    Joined:
    Mar 16, 2005
    Messages:
    2,058 (0.58/day)
    Thanks Received:
    30
    There is always a reason to put 2GB of RAM on a video card :cool: instead of 1GB: it's the only way to justify 100USD more and a final price of, for example, 700USD ;). Who is talking about performance and necessity here :laugh:?
     
  18. KainXS

    KainXS

    Joined:
    Sep 25, 2007
    Messages:
    5,603 (2.12/day)
    Thanks Received:
    502
    SCREW 2GB - give me a frigging PhysX processor, optimized by nVidia for use in all games, directly on the card.
     
  19. tigger

    tigger I'm the only one

    Joined:
    Mar 20, 2006
    Messages:
    10,183 (3.18/day)
    Thanks Received:
    1,399
    All it's gonna do is make it more expensive, for no gain in oomph.
     
  20. InnocentCriminal

    InnocentCriminal Resident Grammar Amender

    Joined:
    Feb 21, 2005
    Messages:
    6,484 (1.80/day)
    Thanks Received:
    847
    It'd only make sense if each GPU had a 512-bit memory bus instead of the 256-bit we see on the cards now. Then people with stupidly high resolutions would be laughing.

    I doubt they'll increase the bus, but they should. As it stands, it's basically just a marketing ploy.
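
    To put rough numbers on the bus-width point (a sketch; the 2000MT/s effective data rate is an assumed figure, held equal for both buses):

        def peak_bandwidth_gbs(data_rate_mts, bus_width_bits):
            # Transfers per second x bytes per transfer, in GB/s.
            return data_rate_mts * (bus_width_bits / 8) / 1000

        print(peak_bandwidth_gbs(2000, 256))  # 64.0 GB/s
        print(peak_bandwidth_gbs(2000, 512))  # 128.0 GB/s: double the bus, double the bandwidth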
     
  21. trog100 New Member

    Joined:
    Dec 18, 2005
    Messages:
    4,420 (1.34/day)
    Thanks Received:
    237
    being as it's only 1 gig effective, 2 total is about right.. especially if more than one x2 gets paired up, as they will..

    plus there is the "selling" point.. ati need to do the same and probably will..

    let's not forget these types of cards are about very high resolutions and AA.. no point in having 'em otherwise..

    the green team will make it sound as good as they can and jack the price up high.. ati on the other hand will keep going for price competition.. their first x2 needed to be relatively good value..

    trog
     
  22. dolf New Member

    Joined:
    Mar 16, 2005
    Messages:
    2,058 (0.58/day)
    Thanks Received:
    30
    I agree...
    In that situation 2GB would probably attain significance, but once again, the 512-bit memory bus (already implemented in the 2900 series) is a very nice but also very expensive feature. It requires a more complicated and longer PCB, a GPU large enough to allow the electrical connections to the memory chips, and many other things that lead to a higher price for the GPU itself.
    Comparisons of the 2900 with a 512-bit memory bus (512MB vs 1GB), and of the 2900 with a 512-bit vs a 256-bit memory bus, show that today's GPUs aren't powerful enough to fill the existing 1GB buffer. What about 2GB?
    The next generation of GPUs from both companies will probably be 2-3 or more times faster, and then a 2GB memory buffer may well be a must. Time will answer that question... hopefully soon.
     
  23. mandelore

    mandelore New Member

    Joined:
    Aug 12, 2006
    Messages:
    3,251 (1.06/day)
    Thanks Received:
    152
    Location:
    UK-small Village in a Valley Near Newcastle
    Incorrect.

    The ATI 3870 X2 can be water-cooled in a single slot; a dual-PCB design cannot.

    I'd imagine the power draw of 1 PCB may be less than that of 2 PCBs joined together.

    The dual-PCB approach lacks any amount of innovation, style or effort.

    It cannot be said to "truly" be a single card, since in fact it's not: 2 PCBs = 2 cards.

    1 PCB = 1 card.

    Edit: this is just my opinion on dual-PCB cards. Of course I'm not saying anything bad about the performance etc., just that ATI should be given immense credit for the total redesign of their X2 card, versus Nvidia literally slapping together 2 cards.
     
    Last edited: Feb 7, 2008
  24. tigger

    tigger I'm the only one

    Joined:
    Mar 20, 2006
    Messages:
    10,183 (3.18/day)
    Thanks Received:
    1,399
    I agree, 2 PCBs is not a single card, it is obviously 2 - the same mashup they did with the 7950 GX2. When will they learn?

    The ATI design is much more innovative, even if it may not be faster than the Nvidia option.
     
  25. gOJDO New Member

    Joined:
    Feb 5, 2008
    Messages:
    120 (0.05/day)
    Thanks Received:
    17
    ATi has nothing to do with GDDR4 technology, and nVidia doesn't have to pay anything to use it.
    http://www.samsung.com/us/business/semiconductor/newsView.do?news_id=749.0
     
