Discussion in 'NVIDIA' started by erocker, Oct 21, 2010.
I just reread it.. I see.. AA at 1920x1200 is indeed a burden for 768 MB imo, just normal I guess
Heya e, test this out for me:
Overclock your vmem a little more on the cards from what you have it at and try again. Also, what board are you running SLI on?
Overclocking the RAM makes little or no difference. Nothing noticeable anyways. I'm using a Crosshair IV, though I'm sure it's not the fact that I'm using the "SLi hack" to get it working as the same thing happens with a single card. I'm certain now that 768 MB is not enough here.
Yeah. Unfortunately you can't monitor VRAM usage with GPU-Z on Nvidia cards afaik (if I'm wrong here let me know). Very useful to be able to do so obviously.
Gotcha. I'm just curious because I honestly don't know: on the CH4, are the 16x slots tied to the SB850 or the CPU's NB?
Some here seem to think that you get 2 x 768MB in SLi... you certainly don't, erocker. I just did some checking (with my neighbour, who has the game and 1GB GTX 460s in SLi) and on other forums, and it would appear that at certain peak points in the game, at 1920 x 1200 with 4x AA on high detail, the game can use up to 82% of a 1GB card's memory. By my math that means up to 820MB can be in use at times... that is probably your answer, mate.
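The arithmetic above checks out against the 768MB card. A minimal sketch of the math, assuming 1 GB is counted as 1000 MB (which is what the quoted 820 MB figure implies); the function name is just for illustration:

```python
# Peak-usage math from the post above. Assumes 1 GB is counted
# as 1000 MB, as the quoted 820 MB figure implies.
def peak_vram_mb(card_mb, peak_fraction):
    """VRAM the game is reported to reach on a card of card_mb."""
    return card_mb * peak_fraction

peak = peak_vram_mb(1000, 0.82)
print(peak)          # 820.0 MB at the reported peak
print(peak > 768)    # True: exceeds a 768 MB GTX 460
```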
Yes, I know that about SLi and Crossfire. The rest of your post does help confirm that 768mb isn't enough.
I wasn't pointing at you when I said some seem to still think that the memory doubles up, though. It remains a common enough misconception. All you can really do is lower the detail levels and drop AA a little; the stuff I read suggests that at those peak times the difference between 2xAA and 4xAA can mean up to an extra 35MB of memory usage or something like that.
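That ~35MB figure is consistent with simple back-of-the-envelope MSAA math: going from 2x to 4x AA at 1920x1200 adds two samples per pixel. A hedged sketch, assuming 4 bytes per color sample and 4 bytes per depth-stencil sample (typical formats, not figures published for this game):

```python
# Rough extra MSAA memory at 1920x1200 going from 2x to 4x AA.
# Assumes 4-byte color samples (RGBA8) and 4-byte depth-stencil
# samples (D24S8) -- common defaults, not game-specific numbers.
width, height = 1920, 1200
bytes_per_sample = 4 + 4        # color sample + depth-stencil sample
extra_samples = 4 - 2           # 2x AA -> 4x AA adds 2 samples/pixel

extra_bytes = width * height * bytes_per_sample * extra_samples
print(round(extra_bytes / 1e6, 1))  # ~36.9 MB, close to the quoted ~35MB
```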
Yes you can.
I think that's at least partially because it seems logical that it would double, along the lines of other types of computer-related memory. Also, I have never really seen a good technical explanation of why this is the case. Anyone got one? I'm not insinuating that one doesn't exist or that it's just a BS business decision, but at any rate it's certainly not widespread info and common knowledge.
Awesome! Maybe that was with prior cards or something; could have sworn I saw that from W1z himself at some point. I have tried with my 460 but it didn't seem to be working right... will try further.
SLi and Crossfire both act in the same way with memory. The amount of VRAM is not doubled because each GPU needs to be fed the data in VRAM directly. Since each GPU is working on rendering the same frame*, both GPUs need high-speed access to the same data, and the only thing fast enough to provide it is the direct link between the memory and the GPU. It would be too slow to have GPU1 accessing data from GPU2's memory; PCI-E and SLi/Crossfire bridges are not fast enough to handle this. So all the data must be duplicated in the VRAM on each card.
*There is the case that Alternate Frame Rendering mode might be used; even then, the two frames are still so close together that the data in VRAM must be virtually identical.
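In other words, because each card keeps a full mirrored copy of the working set, usable VRAM in SLi/Crossfire is the minimum across the cards, not the sum. A trivial sketch of that point (the function name is made up for illustration):

```python
# Each GPU holds its own full copy of the working set, so the
# usable frame-buffer pool is capped by the smallest card,
# not the total across cards.
def usable_vram_mb(cards_mb):
    return min(cards_mb)

print(usable_vram_mb([768, 768]))   # 768, not 1536
```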
It is there, and it works. The older versions of GPU-Z didn't have this feature, and I don't believe it works with some of the older nVidia cards, but it definitely works with the newer 400 series cards.
I'd use this just to make sure the VRAM is the cause. Set it to display the max memory used (by clicking on the number) and fire up Dirt 2. When you are done playing, check GPU-Z to see how much it peaked at.
It's pretty simple really: each graphics card uses its own frame buffer memory to render a 3D application. The operating system will report the frame buffer size found on a single graphics board, and therefore only that one amount of memory can be used at a time.
Maybe some of it could be attributed to you using an SLI hack on a non-SLI board? Not trying to troll you, rocker, just offering another option.
We've already covered this. Others using SLi boards and these cards have the same issue and it happens with a single card as well.
erocker....Dirt2 with what settings please?
I want to run it to see what sort of memory usage I get
Also I saw it mentioned to try different rendering modes for SLI....I have been known to use Alt. Rendering 2 a lot.
Dirt 2 maxed out, from 4x AA up to 16xQ AA. I'll try the alternate rendering mode in a bit. Like I said though, it happens with a single card as well.
ahhh yeah right, you did mention that
GPU-z shows 713MB used in Utah.
Gonna turn off V-Sync now, she held at 60 FPS following cars the whole time.
OK, without V-Sync I was at 165 FPS max and 155 FPS minimum, which brings me to the next question: the 16x AF, is that set in the NVCP? I don't see an AF setting in Dirt 2.
Using MSI afterburner (or Rivatuner) you can monitor a plethora of data on your GPU's usage whilst in game (Framerate, GPU usage, Fan Speed, Memory Usage, Clocks, Voltage). This data can also be logged for future reference / comparisons.
I have this running all of the time when I am gaming.
Just ran Concrete Jungle with Bokeh and Water Physics off. Framerate seemed at least 10 FPS higher, possibly 15 FPS higher. This game runs better in ATI mode, so to speak, with those off.
RivaTuner won't work with the GTX 4xx series because its programmer went on to Afterburner. Afterburner is the de facto program for GPU stats now. Mine is always running too, and I use it to OC and crank up the fans on my GTX 470s.