My rig is stable at 1.35 V vcore. Some months ago, however, I had reduced it to 1.325 V and forgotten about it. Then Settlers VI started crashing every couple of hours, which reminded me of the vcore; I put it back up to 1.35 V and that fixed the crashes.

What baffles me is this: with the CPU vcore at 1.325 V my 8800GT was running at 83 degrees, but with the vcore at 1.35 V it runs at 74 degrees.

Both readings are from running Settlers VI, documented by ATITool logs, using the same in-game graphics settings, the same resolution, even the same in-game map, the same CPU clock of 3.0 GHz, the same stock graphics clock, and the same room temperature. Yet something must have changed to make my 8800GT run a whole 9 degrees cooler.

Does it make sense that when the CPU vcore is too low for stability, other components like the graphics card generate more heat? That's the only way I can explain it.
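For what it's worth, this is how I compared the two runs rather than eyeballing single readings. The sketch below is just illustrative: I'm assuming a simple "timestamp, temperature" line format for the exported log (ATITool's actual format may differ), and `avg_temp` is my own helper, not part of any tool.

```python
# Average the GPU temperature over a logged run and compare two runs.
# ASSUMPTION: each log line looks like "HH:MM:SS, temperature" -- adjust
# the parsing if your ATITool log is laid out differently.
from statistics import mean

def avg_temp(log_lines):
    """Return the mean of the temperature column from 'timestamp, temp' lines."""
    temps = []
    for line in log_lines:
        parts = line.strip().split(",")
        if len(parts) == 2:
            try:
                temps.append(float(parts[1]))
            except ValueError:
                pass  # skip headers or malformed lines
    return mean(temps)

# Toy data standing in for the two logged runs:
run_low_vcore  = ["12:00:01, 82", "12:00:02, 83", "12:00:03, 84"]  # 1.325 V run
run_high_vcore = ["12:00:01, 73", "12:00:02, 74", "12:00:03, 75"]  # 1.35 V run

print(avg_temp(run_low_vcore) - avg_temp(run_high_vcore))  # temperature delta
```

Averaging over the whole session matters here, since single spot readings can swing a couple of degrees on their own.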