# Use GPU-Z to attain wattage?

#### Soup

##### New Member
Given that Watts = Amps x Volts, couldn't I use the info provided by GPU-Z to obtain my GPU's actual wattage?
The only problem I see is that I don't know which V or A to use... see below.

The only reasonable wattages I got came from:
• 12V * VDDC Current In = ~209W
• VDDC * VDDC Current = ~161W
• MVDDC * VDDC Current = ~239W
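For reference, the arithmetic behind these candidates is just P = V × I applied to different sensor pairs. A minimal sketch (the sensor values below are hypothetical placeholders, not readings from any particular card):

```python
# Sketch: compute candidate power figures from GPU-Z-style sensor
# readings using P = V * I. All values here are hypothetical.

readings = {
    "12V": 12.0,            # PSU rail voltage (V)
    "VDDC": 1.08,           # GPU core voltage (V)
    "VDDC Current": 149.0,  # GPU core current (A)
}

# GPU core power: core voltage times core current
gpu_core_w = readings["VDDC"] * readings["VDDC Current"]
print(f"VDDC * VDDC Current = {gpu_core_w:.2f} W")
```

Which V/A pair is meaningful depends on which rail the current sensor actually measures; multiplying a voltage from one rail by a current from another gives a number with no physical meaning.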

Any thoughts?

P.S. I hope this is in the correct section. Preemptive sorry if it is not.

#### Athlon2K15

##### HyperVtX™
I believe it's VDDC * VDDC Current.

#### W1zzard

##### Staff member
That calculation would be correct and would give you the GPU-only power consumption (excluding memory and other consumers on the board). There is no way to calculate board power for the whole card.

#### silkstone

That calculation would be correct and would give you the GPU-only power consumption (excluding memory and other consumers on the board). There is no way to calculate board power for the whole card.
How much additional power would the components on the board use, percentage-wise? 20% +/- 5%?

#### W1zzard

##### Staff member
The screenshot above shows 149 A @ 1.08 V GPU = 160.92 W for the GPU only.

Assuming the user was running Furmark, I measured 270 W board power, so roughly 110 W missing. Obviously this can change depending on application and power state.
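The gap between measured board power and the GPU-only figure, spelled out as a sketch (using the 149 A / 1.08 V and 270 W numbers from this exchange):

```python
# Sketch: GPU-only power from the GPU-Z readings vs. measured board power.
gpu_only_w = 1.08 * 149.0        # VDDC * VDDC Current = 160.92 W
board_w = 270.0                  # measured board power under load
other_w = board_w - gpu_only_w   # memory, VRM losses, fan, etc.

print(f"Non-GPU consumers: ~{other_w:.0f} W "
      f"({other_w / gpu_only_w:.0%} on top of core power)")
```

So the non-GPU share is far from a fixed percentage; it depends on the workload and power state.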

#### Soup

##### New Member
I guess I'll just break down and buy a Kill-a-Watt. Thanks for the input, guys/gals! Very useful.

BTW, I was running DiabloMiner.

#### Arctucas

Or use a clamp meter around the +12VDC wires between the PSU and the card while checking the voltage with another voltmeter, then apply P = V × I.

#### W1zzard

##### Staff member
Or use a clamp meter around the +12VDC wires between the PSU and the card while checking the voltage with another voltmeter, then apply P = V × I.
That misses up to 75 W of power drawn through the PCIe slot.
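Because a card can also draw power through the slot itself, a clamp-meter reading on the external cables only gives a lower bound on total card power. A minimal sketch with hypothetical clamp readings (the 75 W slot limit is the PCIe spec ceiling: roughly 66 W on 12 V plus ~10 W on 3.3 V):

```python
# Sketch: a clamp meter on the external PCIe power cables misses
# whatever the card pulls through the PCIe slot (up to 75 W per the
# PCIe spec). Clamp readings below are hypothetical.

clamp_current_a = 14.0   # clamped current on the +12 V cables (A)
rail_voltage_v = 12.1    # voltage measured with the voltmeter (V)

cable_w = rail_voltage_v * clamp_current_a
slot_w_max = 75.0        # upper bound on power through the slot

print(f"External cables: {cable_w:.1f} W")
print(f"Total card power: between {cable_w:.1f} "
      f"and {cable_w + slot_w_max:.1f} W")
```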

#### Arctucas

missing up to 75W power through the pcie slot
Is that 75 W used first?

Is it always used, or does it vary?