We used Battlefield 3 at 4K resolution to verify performance gains. All testing was done at the stock memory clock of 500 MHz, stock fan speeds, and the power limit raised to +50% to avoid throttling at higher voltages.
| Component | Specification |
|---|---|
| Processor: | Intel Core i7-4770K @ 4.2 GHz (Haswell, 8192 KB Cache) |
| Motherboard: | ASUS Maximus VI Hero |
| Memory: | 16 GB DDR3 @ 1600 MHz 9-9-9-24 |
| Hard Disk: | WD Caviar Blue WD10EZEX 1 TB |
| Power Supply: | Antec HCP-1200 1200 W |
| Software: | Windows 7 64-bit Service Pack 1 |
| Drivers: | Catalyst 15.7 WHQL |
| Display: | Dell UP2414Q 24" 3840x2160 |
First, I tested at stock to establish a baseline, and at undervolted settings for additional data points. Next, I increased the voltage in steps of 24 mV (the voltage controller's minimum step size is 6 mV). For each setting, I determined the maximum BF3-stable clock and, once stable, recorded the performance.
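The sweep above can be sketched as a short script. This is only an illustration of the test schedule, not the actual tool used; the function name `voltage_offsets` and the upper bound are assumptions based on the offsets discussed in this article.

```python
# Hypothetical sketch of the voltage-offset schedule used in testing.
# The voltage controller's minimum step is 6 mV; the sweep moves in
# 24 mV increments (four controller steps) from stock up to +168 mV.
CONTROLLER_STEP_MV = 6
SWEEP_STEP_MV = 24

def voltage_offsets(max_offset_mv=168):
    """Return the tested voltage offsets, in mV, relative to stock."""
    # Sanity check: the sweep step must be a multiple of the
    # controller's minimum step size.
    assert SWEEP_STEP_MV % CONTROLLER_STEP_MV == 0
    return list(range(0, max_offset_mv + 1, SWEEP_STEP_MV))

print(voltage_offsets())  # [0, 24, 48, 72, 96, 120, 144, 168]
```

At each offset, the maximum BF3-stable clock is then found by raising the GPU clock until artifacts or crashes appear.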
As you can see, Fiji's maximum stable clock scales nearly linearly with voltage, and performance follows at roughly half the rate of the clock increase.
Near +96 mV, the power limiter, at its default setting, starts to kick in intermittently during games, which is why we raised it to +50% for all these tests.
Once we reach +144 mV, which results in a scorching 1.35 V on the GPU, the maximum stable frequency reaches its peak. At this point, the VRMs are running at temperatures above 95°C even though they are cooled by the watercooling loop via a nearby copper pipe. That much heat on the VRMs is definitely not good for long-term use. I would say a safe permanent voltage increase on an unmodded card is around 40 mV or so.
Going beyond +144 mV to +168 mV makes the card massively unstable, dropping the maximum stable clock nearly back to stock-voltage levels.
I also added memory overclocking to the results above. My sample's memory doesn't overclock much. I was able to go from the default 500 MHz to 560 MHz, beyond which artifacts start appearing. Increasing GPU voltage has no effect on memory overclocking potential, of course.
Overclocking the memory from 500 MHz to 560 MHz adds a constant 1.6 FPS regardless of GPU clock, which is as expected.
In this graph, I'm showing full-system power draw during testing. For this test, clock speeds were fixed at 1100 MHz for better comparison. As you can see, power ramps up very quickly, much faster than maximum clock or performance. From stock to +144 mV, power draw increases by 27%, while overclocking potential only went up by 5% and real-life performance increased by only 3%.
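A quick back-of-the-envelope calculation shows what these deltas mean for efficiency. The helper function below is purely illustrative; it only uses the +27% power and +3% performance figures measured above.

```python
# Rough performance-per-watt estimate from the measured deltas:
# +27% full-system power draw and +3% real-life performance at +144 mV.
def relative_efficiency(perf_gain, power_gain):
    """Frames-per-watt at the overclock relative to stock (stock = 1.0)."""
    return (1 + perf_gain) / (1 + power_gain)

eff = relative_efficiency(0.03, 0.27)
print(round(eff, 2))  # 0.81
```

In other words, at +144 mV the card delivers roughly 19% fewer frames per watt than at stock.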
In all these tests, GPU temperature barely moves thanks to the watercooling block. Going from 67°C at stock voltage to 71°C at +144 mV isn't worth mentioning. Heat output definitely increases, though. The watercooler just soaks up all the heat that will ultimately be dumped into your room.
Looking at the numbers, I'm not sure if a 150W power draw increase for a mere 3 FPS increase is worth it for most gamers. Smaller voltage bumps to get a specific clock frequency 100% stable are alright, though.