• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

How to enable additional shaders on Radeon HD 6950

Status
Not open for further replies.
My XFX 6950 is running 900/1375 @ 1.175 V. Anything higher and it crashes on me. Not right away, though; it passes benches, but after an hour or an hour and a half of straight gaming it'll freeze on me (the driver stops responding and I need to reboot). Maybe I'll try reinstalling the drivers...

But it's good to see someone can hit high clocks on low voltage. This will most likely be my last XFX card.

It's not that bad... :-)
 

And those first reference-design cards were not modified by the board partners (XFX, ASUS, etc.): just a sticker on the fan, a more or less tweaked BIOS, and that's it.

Whether you get a good or a bad card comes down to pure luck.
 
I can get 895 MHz/1375 MHz at 1.1 V stable on my 6950. Well chuffed.
 
I did the 6950 unlocked-shaders mod (using the provided PHP script) on my XFX 6950. The 6970 BIOS seemed to cause me problems with the latest 11.1a drivers, but all seems well again. I have been using MSI Afterburner, and boy is that program nice; CCC should provide that fan functionality. I have the default fan profile set, which seems more aggressive at ramping up the fan speed, and my core set to 880. The card is fast.
 
GPU-Z always shows the VCC level at 1.1 V, whatever power level I choose in the CCC power setting. The BIOS core voltage is 1.1 V, but after I set the power to +20%, shouldn't it be 1.32 V?
 

PowerTune increases wattage, not voltage. If you're running a 6950 BIOS, GPU-Z always shows 1.1 V. The +20% lets the GPU raise its TDP cap before it cuts power and throttles.
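To put numbers on that: PowerTune moves the power cap, and at a fixed core voltage the extra headroom can only show up as current. A quick sketch (the 200 W board-power figure is an assumption for illustration, not read from any card):

```python
# Sketch of what the CCC +20% PowerTune slider changes: the power
# cap, not the voltage. The 200 W figure is an illustrative
# assumption, not a measured value.
TDP_W = 200.0      # assumed HD 6950 board power limit
slider = 0.20      # PowerTune setting, ranges -20% .. +20%

power_cap = TDP_W * (1 + slider)
print(round(power_cap, 1))   # 240.0 -> watts the card may now draw

# With the core voltage fixed, the extra wattage is all current:
V_core = 1.1                 # volts, as GPU-Z reports on a 6950 BIOS
I_max = power_cap / V_core
print(round(I_max, 1))       # 218.2 -> amp budget across the VRM
```

Nothing here raises the 1.1 V the sensors report; the slider only moves the point at which the card starts throttling.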
 
GPU-Z does not show the real voltage. If I change the voltage in Afterburner to 1.2 V, GPU-Z still shows 1.1 V.
 

Thanks for the answer. Then it can increase TDP only by increasing the current, since, as far as I remember, W = V × A. Right? That should not be a problem for the core, because it is actually the same core as in a 6970. Next question: what about the current supplied to the memory? Is that also increased? If so, that may cause a problem, since many of our cards do not have the same type of memory.
 

I have a similar problem, but that may be due to different voltage registers being used on different cards. Some use VID3, some use VID4, which is why I cannot use Afterburner. I may try an MSI BIOS to see if it can be resolved that way, but, to be honest, I don't dare do it :)
 

I've asked several times, and all I can find out is that the 6970 uses different voltages and timings, let alone different memory chips. The 6950's chips are not rated for the 6970's higher memory clocks, but many people are running their memory higher than that anyway. I've overclocked but kept the memory lower, because it seemed to be the culprit behind my unexplained in-game crashes.

And you don't need an MSI card or an MSI BIOS to use Afterburner. My XFX runs it fine, and it uses VID3 as far as I can tell. But if you want, flash a new BIOS to the card; it'll work, and you can always flash back, even if it's crashing and failing.
 

You confirm what I already suspected, and this is not good news for me. I am aware these questions have been asked from the beginning, but no satisfactory answer has been provided yet, so I gave it just another shot :)
Nevertheless, though I had more than enough money for the top card, I chose this one as my next upgrade. There is a strange satisfaction in getting similar performance from a lower-grade card and then bragging about it everywhere ;)

Update: Just check the performance comparison charts. Even without overclocking and unlocking, it is so close to the 6970 that I think our card is the best for the money.
 
Default clock 890 MHz? Using a different ASUS BIOS, then.

Doesn't ASUS ship its stock clock 10 MHz higher than normal?

CPU-Z along with the 3DMark 11 score along with GPU-Z, or it's BS :wtf:

I don't think it's BS. Gaul has posted lots of these, and they are always amazing, and usually with CPU-Z.

Gaul, are you still at 5 GHz on the CPU? What voltage did you need for that clock? You've come down from 1000/1500...
 
Either the GPUs they put in their cards are from a better yield, or they are trying to get a somewhat unfair advantage over their competitors. If not, why don't others do the same thing? There must be an AMD requirement preventing nonstandard configurations; perhaps ASUS is too big to be directed by AMD, or there is a completely different story behind it.

Back to the subject: I found something interesting when I compared the MSI and Sapphire BIOSes before trying the MSI BIOS on my Sapphire card. They are identical.
Try it:
fc MSI.HD6950.2048.101123.bin Sapphire.HD6950.2048.101123.bin /b
My original BIOS is Sapphire.HD6950.2048.101123.bin, so it should be the stock AMD BIOS.
I am at a loss as to why Afterburner voltage tuning doesn't work on my card. Not that I want to use a higher voltage on a daily basis; unlocking the extra shaders and maxing out the regular CCC overclocking options are more than enough for me.
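For anyone without the Windows `fc` utility handy, the same byte-for-byte check can be sketched in a few lines of Python (the file names here are placeholders; swap in your own dumps):

```python
# Byte-for-byte comparison of two BIOS dumps, equivalent in spirit
# to `fc file1 file2 /b`. The file names below are placeholders
# standing in for the vendor BIOSes mentioned in the post.
from pathlib import Path

def bios_identical(a: str, b: str) -> bool:
    """True when both files contain exactly the same bytes."""
    return Path(a).read_bytes() == Path(b).read_bytes()

# Demo with two dummy dumps instead of real vendor files:
Path("MSI.bin").write_bytes(b"\x55\xaa" * 8)
Path("Sapphire.bin").write_bytes(b"\x55\xaa" * 8)
print(bios_identical("MSI.bin", "Sapphire.bin"))  # True
```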
 

I took your Sapphire BIOS from the collection on this site, flashed it, and it works fine (though I had to reinstall CCC). I used it as it came, then OC'd to 900/1350 @ 1.15 V in Afterburner, which is what I usually use, and it worked fine. It ran all the way through the Heaven benchmark, so it's pretty stable, though I haven't done any long stress tests.

Make sure you've enabled voltage control and that it's not grayed out. It's in the settings on the first tab: click "Unlock Voltage Control" and then it's available. Sorry to hear you're having trouble.
 
I did exactly as you said. Voltage controls were enabled, and I set the voltage to a fairly safe 1.175 V. Though Afterburner reported the voltage as 1.175 V, GPU-Z showed a constant 1.1 V. It was possible GPU-Z was reporting it wrong, so I ignored it and started a benchmark. I observed serious core-speed throttling and a drop in FPS.

Next I tried a higher voltage, 1.2 V. Same result. I went back to the CCC control panel and saw that power was still at +20%. Not trusting it, I first set it to something different and then back to +20%. The throttling stopped and the FPS recovered. Afterburner still reported 1.2 V, and GPU-Z 1.1 V. Then I changed the Afterburner voltage to something different and back to 1.2 V, pressing Apply each time. The throttling started again and the FPS dropped. Clearly something is wrong with Afterburner; it probably conflicts with the CCC power setting. I mean, even if it successfully sets the voltage, it probably lowers the current limit, effectively reducing the resulting TDP. I don't think it can set the voltage either, since I trust GPU-Z more, and it is the latest version by now.

Afterburner is still beta, so it may be better to wait for the release version; the success rate with the current version is clearly well below 100%. I also tried TriXX, but that was a worse experience for me. I also tried RBE, but the increase in voltage was not worth the increase in performance, at least not enough for me to take the risk. Nevertheless, I will give it another shot with the next version of Afterburner.
 
When I run the CMD, I get to the last step of flashing the card, and it gives me an "unlock.bin not found" error...

Any help with this would be appreciated...

unlock.png


binfile.png
 
Your unlock.bin has that stupid VLC player icon; I'd suggest you uninstall VLC media player and try again...

It's making that .bin unreadable (wrong format).

It should look like this...

this.png


Also add your GPU-Z-saved Cayman .bin BIOS to this folder in case you ever need to flash back. It just makes things way easier :p
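Before uninstalling anything: on Windows a media-player icon on "unlock.bin" can also just mean a hidden double extension (Explorer hides known extensions by default), so the file's real name may not be what it appears. A quick way to see the actual names and sizes, sketched in Python (directory and file names are made up for the demo):

```python
# List the *real* names of unlock* files, since Explorer may hide
# a trailing extension. Directory and file names here are made up.
import os
import tempfile

def find_unlock_files(directory):
    """Return (actual_name, size_in_bytes) for every unlock* file."""
    hits = []
    for name in sorted(os.listdir(directory)):
        if name.lower().startswith("unlock"):
            hits.append((name, os.path.getsize(os.path.join(directory, name))))
    return hits

# Demo: a correct dump next to one with a hidden double extension.
d = tempfile.mkdtemp()
with open(os.path.join(d, "unlock.bin"), "wb") as f:
    f.write(b"\x00" * 16)
with open(os.path.join(d, "unlock.bin.xspf"), "wb") as f:
    f.write(b"<playlist/>")
for name, size in find_unlock_files(d):
    print(name, size)
```

If the flasher says "unlock.bin not found" but the file is visibly there, a wrong real name like "unlock.bin.xspf" is one of the simpler explanations to rule out.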
 

Again...I'm not disagreeing, but my unlock.bin has the VLC icon and it works fine. Maybe it's different on his system.
 
Well, since he's the one having the problem, I bet that's it... I ran into something like that last year with a file that VLC changed; I removed the program for the moment and it worked like a charm. But hey, with these things it's anybody's guess :banghead:

Plus, it always helps when people fill in their system specs!

I've been seeing a lot of single posts from one-timers looking for answers... WTF?
 

I was about to edit my post and say it's definitely worth a shot. I don't even know how I got it like that, the VLC icon that is. But when there's a problem like what he's experiencing, it's worth trying any kind of option.
 
You probably got it like that when you installed VLC and left it at the default (aka recommended) settings, quickly going through the installation without checking the file-type associations.
 
File-type association shouldn't affect how other applications interpret a file, but that doesn't mean there are no exceptions, so there's no harm in trying. I know what crazy things I try once I've exhausted all the obvious solutions to a problem, and sometimes the fix really does turn out to be crazy.
 