Discussion in 'General Software' started by Athlon2K15, Dec 6, 2010.
Just to avoid possible confusion about some of the memory clocks in the table: mine and AthlonX2's, for example, show 1900, which is the reading from MSI Afterburner; 1900 in Afterburner = 950 in GPU-Z. And I see there are more of us. We should all have the same readings; Asylum's and Deathmore's stats, for example, show the GPU-Z reading.
I started it this morning as I was getting ready for work; with lowered clocks I hit 4609. I will beat you all later, though.
You can try, I guess.
Sorry... I'm now a little bit confused.
Which clocks should I post now? Core/Memory from GPU-Z? Core/Memory from MSI Afterburner? Core/Shader?
If you use GPU-Z, multiply your card's memory clock by two, because it only shows one of the two clocks in DDR.
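To spell out the conversion, here is a minimal sketch in Python; the helper names are made up, but the factors are the standard ones for DDR-style reporting and GDDR5's quad data rate:

```python
# Clock conversions discussed in this thread. GPU-Z reports the real
# (command) clock; MSI Afterburner shows the DDR figure (x2); GDDR5
# actually transfers data at 4x the real clock (quad data rate).

def afterburner_mhz(gpuz_mhz: float) -> float:
    """DDR-style reading, as Afterburner shows it: real clock x 2."""
    return gpuz_mhz * 2

def effective_gddr5_mhz(gpuz_mhz: float) -> float:
    """Effective GDDR5 data rate: real clock x 4."""
    return gpuz_mhz * 4

print(afterburner_mhz(950))       # 1900, matching the Afterburner reading above
print(effective_gddr5_mhz(1150))  # 4600, the figure quoted later in the thread
```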
OK... then please correct my values in the table to 845/1900, as posted here:
NEW - 3DMark 11 Compilation
Where the hell is SneekyPeet? I want a showdown with him and AthlonX2 with our 470s.
I adjusted everyone's clocks so they're all synchronized to the same standard.
Peet has two 470s, and I don't think he is interested in pulling one. My score of 5022 is the best I'm going to get unless I do some heavy volt mods.
LagunaX | XFX BE HD 6870 940/1150 | 4298 | i3 540 @ 4.6 GHz
XFX Black Edition HD 6870 940/1150:
Single GTX465 Run:
What happened to the RAM numbers on the graph? My RAM doesn't run at 2300 MHz; it was set for 1200 MHz. If you are going to mark it down properly (QDR), it should be 4800 MHz. I say just leave it at the base clock of 1200 MHz.
Mega Rig Fail
GDDR5 doesn't work that way, so it makes no sense. It should be either 1150 MHz or 4600 MHz.
Can't please everyone.
I think the drivers are failing; he's just getting the benefit of one 480. :shadedshu
Well, it was right the first time. Whoever said that GDDR5 works at dual data rate and should be marked that way should have kept their mouth shut. Just sayin', bro.
No, but you could do some proper math... 1275 x 2 = 2550, not 2350. The real QDR speed is 5100 MHz.
It's very simple... if people cannot post GPU-Z shots, then toss their scores.
Use the speeds from GPU-Z as the "actual" memory speeds, please. Otherwise the data is seriously flawed, and kinda useless.
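For what it's worth, here is a minimal sketch of how submitted clocks could be normalized to the GPU-Z base value. The helper is hypothetical; it assumes a submission is either the base clock, a DDR (x2) figure, or an effective QDR (x4) figure, and the stock clocks in the examples are approximate:

```python
# Hypothetical helper: normalize a submitted memory clock to the GPU-Z
# base value by dividing out whichever multiple (1x, 2x, or 4x) lands
# nearest the card's known stock clock.

def normalize_to_gpuz(submitted_mhz: float, stock_mhz: float) -> float:
    candidates = [submitted_mhz / m for m in (1, 2, 4)]
    return min(candidates, key=lambda base: abs(base - stock_mhz))

# Examples from this thread: a GTX 470 (stock memory ~837 MHz) reported
# as 1900 in Afterburner, and an HD 6870 (stock ~1050 MHz) reported at
# its 4600 effective rate.
print(normalize_to_gpuz(1900, 837))   # -> 950.0
print(normalize_to_gpuz(4600, 1050))  # -> 1150.0
```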
I also have something you could do; it involves a shaft and balls.