
How Much Graphics Memory do You Really Need?

Discussion in 'News' started by zekrahminator, Nov 9, 2007.

  1. zekrahminator

    zekrahminator McLovin

    Joined:
    Jan 29, 2006
    Messages:
    9,113 (2.67/day)
    Thanks Received:
    321
    Location:
    My house.
    As monitors get bigger and run at higher resolutions, and video games demand ridiculous amounts of graphics memory to run at respectable settings, both AMD and NVIDIA have shoved more and more graphics memory into their cards. However, how much is enough? The folks at YouGamers did some serious tests and discovered some interesting facts about VRAM. While AMD and NVIDIA both want you to think that humongous amounts of VRAM will magically make your games run at 1920x1200, YouGamers discovered that quantity is not what really matters. If you want to run the most stressful games at the highest resolutions possible, you will see much more benefit from faster graphics card memory, or simply a faster graphics card. You can read the full investigative article here.

    Source: Nordic Hardware
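    A quick back-of-envelope calculation (all numbers purely hypothetical, just in the ballpark of 2007-era cards) illustrates the article's point that memory speed matters more than capacity: the time spent simply moving texture data each frame scales with bandwidth, not with how much VRAM sits idle.

    ```python
    # Hypothetical figures for illustration only: time to move one frame's
    # worth of texture data at different local memory bandwidths.
    def transfer_ms(megabytes, gb_per_s):
        """Milliseconds to move `megabytes` of data at `gb_per_s` bandwidth."""
        return megabytes / 1024 / gb_per_s * 1000

    frame_data_mb = 300   # texture traffic per frame, purely illustrative
    slow = transfer_ms(frame_data_mb, 20)    # slower memory subsystem
    fast = transfer_ms(frame_data_mb, 100)   # faster memory subsystem
    print(slow, fast)
    ```

    The slower card spends five times longer just shuffling data, regardless of how many megabytes of VRAM either card carries.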
     
    OnBoard and error_f0rce say thanks.
  2. Aeon19 New Member

    Joined:
    Apr 3, 2007
    Messages:
    67 (0.02/day)
    Thanks Received:
    2
    I thought that was obvious
     
  3. Black Panther

    Black Panther Senior Moderator™ Staff Member

    Joined:
    May 30, 2007
    Messages:
    8,637 (2.96/day)
    Thanks Received:
    1,960
    Well, it might not be obvious to everyone. Don't forget we have members who still think toothpaste is good for use as a TIM... ;)

    Interesting article btw. Never knew that "Vista doesn't differentiate video RAM from system RAM - it's all the same, as far as the operating system and games are concerned."
     
    Last edited: Nov 9, 2007
  4. mdm-adph

    mdm-adph New Member

    Joined:
    Mar 28, 2007
    Messages:
    2,478 (0.83/day)
    Thanks Received:
    340
    Location:
    Your house.
    You're... kidding me... :twitch:
     
  5. newconroer

    newconroer

    Joined:
    Jun 20, 2007
    Messages:
    3,349 (1.16/day)
    Thanks Received:
    399
    Then why does Crysis stomp the 320 GTS and not the 640?


    Article quote :

    "When the graphics processor wants to use them, it copies them across into its RAM, deleting other stuff to make room. Cue a spot of stuttering or slow down in the frame rate; this is because it takes quite a bit longer to swap textures around than just accessing them in the onboard (or to give it the correct name, local) RAM."


    So yes, not having enough IS relevant. It seems like the article is saying that the faster the RAM on the GPU, the more capable it is of purging unused data and replacing it with active data. But until the hardware is capable of doing so...
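    The swapping behaviour the quoted passage describes can be sketched as a toy LRU cache standing in for local video RAM (every number here is hypothetical): a frame whose textures are already resident is cheap, while a frame that overflows capacity pays a copy cost for every miss, which is exactly the stutter.

    ```python
    from collections import OrderedDict

    # Hypothetical costs for illustration: a hit in local (onboard) RAM is
    # cheap, a miss forces an evict + copy from system RAM over the bus.
    VRAM_CAPACITY = 4        # how many textures fit in local RAM
    HIT_COST_MS = 0.1        # access time when the texture is already local
    SWAP_COST_MS = 5.0       # time to evict and copy a texture across the bus

    vram = OrderedDict()     # LRU cache standing in for local video memory

    def access_texture(tex_id):
        """Return the simulated cost (ms) of using one texture this frame."""
        if tex_id in vram:
            vram.move_to_end(tex_id)      # mark as most recently used
            return HIT_COST_MS
        if len(vram) >= VRAM_CAPACITY:
            vram.popitem(last=False)      # evict least-recently-used texture
        vram[tex_id] = True               # copy texture in from system RAM
        return SWAP_COST_MS

    smooth_frame = sum(access_texture(t) for t in [0, 1, 2, 3])  # cold: misses
    smooth_frame = sum(access_texture(t) for t in [0, 1, 2, 3])  # warm: hits
    stutter_frame = sum(access_texture(t) for t in [0, 1, 2, 3, 4, 5])  # overflow
    print(smooth_frame, stutter_frame)
    ```

    The warm frame is nearly free, while the frame that touches more textures than fit in VRAM pays two full swaps, producing the frame-time spike the article calls stuttering.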
     
  6. Weer New Member

    Joined:
    Aug 15, 2007
    Messages:
    1,417 (0.50/day)
    Thanks Received:
    94
    Location:
    New York / Israel
    VRAM is only important if it's a bottleneck.

    Obviously if you have 2GB of VRAM, you're wasting a lot of power.

    And for the same reason a 512MB 8400GS is a horrible idea.
     
  7. DaMulta

    DaMulta My stars went supernova

    Joined:
    Aug 3, 2006
    Messages:
    16,135 (5.01/day)
    Thanks Received:
    1,459
    Location:
    Oklahoma T-Town
    The more the better!!!!! (Depends on what you are doing)

    Let's load all of the textures into our video cards!!!!
     
  8. jydie New Member

    Joined:
    Feb 2, 2006
    Messages:
    209 (0.06/day)
    Thanks Received:
    3
    I would have to confirm that from the various testing I have done over the past 3-4 years. Using DDR2 vs DDR3 in video cards is often the most significant difference between certain models... like the NVIDIA 8600GS and 8600GT, or the 2600 Pro and 2600 XT. They often bump up the core clock speed too, but the memory speed is what seemed to make the biggest difference. I have never noticed any significant improvement in a same-model card with 256MB of memory compared to one with 128MB.

    Besides clock speed, the other obvious thing to look for is the memory interface... 64-bit and lower should be avoided if you want to do any type of gaming, 128-bit is normally used in mid-range cards, and 256-bit or higher is used in the higher-end cards.
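    The point about bus width can be made concrete: peak memory bandwidth is roughly (bus width in bytes) times (effective memory clock). The clock figures below are illustrative, merely in the ballpark of 2007-era cards.

    ```python
    # Peak bandwidth = (bus width in bits / 8) * effective (DDR) clock.
    # Clock values below are hypothetical, ballpark-of-the-era numbers.
    def bandwidth_gb_s(bus_bits, effective_mhz):
        """Peak GB/s for a given bus width and effective memory clock (MHz)."""
        return bus_bits / 8 * effective_mhz * 1e6 / 1e9

    print(bandwidth_gb_s(64, 800))     # low-end card
    print(bandwidth_gb_s(128, 1400))   # mid-range card
    print(bandwidth_gb_s(256, 2000))   # high-end card
    ```

    Doubling the bus width doubles peak bandwidth at the same clock, which is why a 64-bit card chokes in games no matter how much memory is soldered onto it.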
     
  9. Sasqui

    Sasqui

    Joined:
    Dec 6, 2005
    Messages:
    7,961 (2.30/day)
    Thanks Received:
    1,652
    Location:
    Manchester, NH
    It totally depends. While 1GB is probably overkill right now, you can see that many of the games in that link use well over 512MB at 1600x1200. Once you exceed your video RAM, the data swaps to system memory (much slower), and you get dropped frames, lag, etc.

    Too bad they didn't include 1920x1200 in the charts - I'm sure usage would go up to the 800MB range in some games.
     
  10. hv43082

    Joined:
    Sep 15, 2006
    Messages:
    1,467 (0.46/day)
    Thanks Received:
    95
    Location:
    Orange County, CA and Miami, FL
    So what about gaming at 2560x1600? More RAM definitely matters at that resolution, right? Sorry, too lazy to read the entire article; just pulled a super late night study session for this morning's exam... zzzz...
     
  11. jocksteeluk

    jocksteeluk New Member

    Joined:
    Jan 23, 2006
    Messages:
    1,457 (0.43/day)
    Thanks Received:
    46
    Location:
    The 13th room on the 13th floor of the 13th buildi
    If PC game developers put as much time into optimizing for existing hardware as console developers do, 128MB of VRAM and a 2GHz CPU would no doubt last years rather than months as the PC format's top-spec system. But the fact is, companies like NVIDIA, AMD and Intel rely on the waste in the PC games industry to continually sell more products and keep the income flowing.
     
  12. musek

    musek

    Joined:
    Nov 4, 2007
    Messages:
    367 (0.13/day)
    Thanks Received:
    57
    Location:
    Europe -> Poland...
    Well... another 'you-must-buy-new-hardware' article. :/ If, for example, Call of Duty 4 is such a VRAM eater (always ~400MB), then how was I able to play the demo (the same one they tested) on a Radeon 9800 with 128MB VRAM without any problem (at reasonable settings, of course: 1024x768)? My card (and my whole system, since I have 1GB of system memory) should just struggle and tell me 'No wai!'.

    I must agree with one thing though - 256MB of VRAM is nowadays the absolute minimum (IMO not worth buying less if you wish to play newer titles). But hey, do we really need an article to tell us that?


    Namaste,
    musek


    PS. Sorry for my English. :D
     
  13. OnBoard

    OnBoard New Member

    Joined:
    Sep 16, 2006
    Messages:
    3,044 (0.96/day)
    Thanks Received:
    379
    Location:
    Finland
    Because you didn't have all the settings on HIGH :) Just dropping the texture level lowers the overall memory requirement too. It would be nice if they'd included medium settings for those new games, as most people will probably use them to get playable frame rates.

    edit: And cheers for the article - seems it's time to go 512MB like I was planning to. 256MB has been fine for now; Crysis was the first game that absolutely died when I enabled AA. I got 3fps and a friend got 1fps (with a bit higher resolution and the Pro version card) :p.
     
    Last edited: Nov 9, 2007
  14. trog100 New Member

    Joined:
    Dec 18, 2005
    Messages:
    4,420 (1.28/day)
    Thanks Received:
    237
    the important thing is how much grunt the card has got.. sticking large amounts of memory on cards that can't handle the resolutions and settings that need it is the con..

    the amount of memory on a card is a selling point.. mostly they come with more than they can use.. the article is misleading.. if the card ain't got the grunt no amount of extra memory is gonna help it..

    trog
     
  15. FreedomEclipse

    FreedomEclipse ~Technological Technocrat~

    Joined:
    Apr 20, 2007
    Messages:
    14,507 (4.90/day)
    Thanks Received:
    2,696
    Lucky for me, I still have an X1800XT 512MB :p
     
  16. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    17,367 (5.08/day)
    Thanks Received:
    3,095
    Location:
    Worcestershire, UK
    Agree completely, and tests show that even with no AA/AF, once you EXCEED a resolution of 1280x1024 you are exceeding 256MB of GDDR, which results in some system RAM swapping... AKA jittering. To what degree, and therefore how much system RAM swapping takes place thereafter, depends on the resolution and the detail level. Even at 1280x1024 in Oblivion with 8x AA and 8x AF, at times during the game the card will require more than 256MB.

    I have a really nice set of tests somewhere that I have posted here before that really sums the process up well... I'll have to try digging it out... just on the off chance that some of you managed to stay awake until the end of my post and are interested :laugh:
     
  17. musek

    musek

    Joined:
    Nov 4, 2007
    Messages:
    367 (0.13/day)
    Thanks Received:
    57
    Location:
    Europe -> Poland...
    Still, on a friend's rig (256MB VRAM) the CoD4 demo is flying with everything maxed out.
     
  18. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    17,367 (5.08/day)
    Thanks Received:
    3,095
    Location:
    Worcestershire, UK
    Resolution?
     
  19. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,847 (2.53/day)
    Thanks Received:
    1,538
    I remember 128MB cards being questioned, whether there was a need for such extravagance and extra expense. I need spell check, I know it.
     
  20. jimmylao New Member

    Joined:
    Dec 10, 2006
    Messages:
    16 (0.01/day)
    Thanks Received:
    0
    Mmmm, it's just the technology trend. It should be common sense that if you're going to run new games at 1600x1200 or better with everything maxed out plus AA+AF, you're going to need a high-end card, which equates to faster-clocked RAM, GPU, and CPU. And like everyone else said, if you're going to run at that resolution, you should have enough RAM and VRAM so swapping doesn't occur... however, as the earlier posts said, the people who think Colgate toothpaste is OK as a TIM are usually newbies who are either just learning about computers or hardcore gamers trying to understand what they need to build a faster comp.

    All in all, it's a good post for someone who doesn't understand comps too much but loves running their games at max settings and wonders why they lag. :rockout:
     
  21. musek

    musek

    Joined:
    Nov 4, 2007
    Messages:
    367 (0.13/day)
    Thanks Received:
    57
    Location:
    Europe -> Poland...
    @Tatty_One - 22'' widescreen, so 1680x1050, on an X850XT PE & Pentium D.

    @jimmylao - I agree. But I still think that if someone is using toothpaste as a TIM, he won't be here reading articles like this (and that's sad).
     
  22. musek

    musek

    Joined:
    Nov 4, 2007
    Messages:
    367 (0.13/day)
    Thanks Received:
    57
    Location:
    Europe -> Poland...
    Well... maybe - just look at my current specs to see that on my rig I'm barely walking, mostly crawling. :p


    PS. No AA was set.
     
  23. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (2.45/day)
    Thanks Received:
    909
    Location:
    Sector ZZ₉ Plural Z Alpha
    And that's where the rest of the system comes into play - especially system RAM and bus speeds... which matter no matter what resolution or AA/AF you're running.

    Maybe it's just me, but you have to take the system as a whole into account when looking at video performance. For example, we all know how decently powerful an X1950 is, but throw an X1950 in with a P4 and you get sub-par performance; it doesn't matter what the clock speeds of the 1950, P4 or DRAM are... a little odd that the article barely touches on this.
     
  24. trog100 New Member

    Joined:
    Dec 18, 2005
    Messages:
    4,420 (1.28/day)
    Thanks Received:
    237
    some games load all the stuff/textures and whatever in at the start of each level.. when they do this u get a long loading wait then it all runs smooth.. quake 4 works this way for example..

    some like oblivion load it in bit by bit as u go along.. the game slows down (stutters) each time the card needs more textures.. it takes time for the card to be loaded with textures from the system memory.. the bigger the card's memory the longer the slowdown.. he he..

    not much can be done about this annoying slowdown every so often.. except play the game at lower resolutions and settings.. the old load-it-all-at-level-start approach worked.. but levels are so huge now it can't be done..

    the bottom line is the more the card's memory the longer the stutter as it gets filled.. it all runs nice between stutters is about all u can say at high settings and resolutions with games like oblivion.. he he

    trog
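    The trade-off trog describes (pay all the loading up front vs. stream it in and stutter) can be sketched with a toy cost model; every number below is hypothetical:

    ```python
    # Hypothetical cost model: preloading pays all texture-copy time at level
    # start, streaming spreads the same copies across the level as stutters.
    COPY_MS_PER_TEXTURE = 5.0
    TEXTURES_IN_LEVEL = 200
    TEXTURES_THAT_FIT = 50    # the level no longer fits in VRAM, so preloading
                              # everything is impossible past this point

    def preload_time(n_textures):
        """Old style: one long wait at level start, then no mid-game copying."""
        return n_textures * COPY_MS_PER_TEXTURE

    def streaming_stalls(n_textures, vram_slots):
        """Streaming: only the working set loads up front; every texture beyond
        VRAM capacity is copied in mid-game, each copy a potential stutter."""
        upfront = min(n_textures, vram_slots)
        midgame = max(0, n_textures - vram_slots)
        return upfront * COPY_MS_PER_TEXTURE, midgame  # (load ms, stutter count)

    print(preload_time(TEXTURES_IN_LEVEL))                         # one big wait
    print(streaming_stalls(TEXTURES_IN_LEVEL, TEXTURES_THAT_FIT))  # short wait,
                                                                   # many stalls
    ```

    Streaming trades one long, predictable load screen for many small mid-game pauses, which is why huge open-world levels like Oblivion's stutter where Quake 4's did not.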
     
  25. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (2.45/day)
    Thanks Received:
    909
    Location:
    Sector ZZ₉ Plural Z Alpha
    I honestly preferred the old method of loading all textures prior to the beginning of a level; I don't mind waiting a few extra minutes of load time if it means the sys won't have to swap out textures 5+ times in a single map . . .

    but you're right, newer games use such huge maps and such intricate textures that load times would be stoopid long . . . the only other fix would be to break a map up into smaller areas, but that would mean "loading zones," and that's not something I associate with PC games - only consoles.
     
