I noticed this while playing Oblivion, and posted about it on a couple of Oblivion forums thinking it was a problem with my settings or ini file. Apparently Nvidia has a problem affecting the 8xxx series where data written into the VRAM never gets cleared, so the VRAM eventually floods and the GPU has to fall back on system RAM.

Obviously this might not be felt in games that aren't as graphics-intensive as Oblivion, or if you play at medium settings, because you wouldn't notice an fps drop from 75 to 35 unless you're running Fraps or some other frame counter. Also, if you don't play for long stretches (less than 20 minutes at a time) you wouldn't notice the problem, because you'd quit before being affected. And if your 8-series card has a lot of VRAM, like 768MB, it might take longer to get 'full' than a 256MB card would.

My experience: I start playing Oblivion at high settings (not ultra), with shadows on grass off and no shadow filtering, and in the worst areas (i.e. grassy forests) I get ~43 fps. Very pretty and very smooth. But add 20 minutes of gameplay and I'd get 9 fps... 5 fps... practically unplayable!!! Quit and restart the game (alt-tab just flickers the screen for me and doesn't minimise, and ctrl-alt-del minimises but freezes the game), and in the same scene where I was getting 5 fps I'd get 45 fps once more... 20 minutes later, rinse and repeat...

Believe me, I've tried all the drivers, from the ones that came with my card, to 158.22, to the latest beta released on 17th August... It's just plain shitty to have a card capable of giving great performance and eye candy for 15-20 minutes and then having to quit and restart! It just ruins any game experience! Nvidia should never have put the 8xxx series on the market if their drivers can't support today's games.
If I wanted to play at medium settings so the VRAM bug wouldn't be noticeable, I'd have spent less money and bought a lower-end card. If only I had the funds, I'd bin my 8600GTS and buy ATI. And that's saying something, considering I've been an Nvidia girl all my life!