
Any software that can read nvidia core voltage, Pascal and onwards?

Seems software just reads whatever nvidia passes on, which sometimes isn't the actual voltage. Maybe something could read it over SMBus?
 
Thanks, but those do not measure actual voltage for me. For instance, my 1050 Ti is HW modded to give an extra ~20% voltage, so when requesting 1.0000 V it's nearer 1.2 V, but software still shows 1.0000 V. Thought HWiNFO might have had something, but no. I can and have checked with a meter, but would prefer a SW method if possible.
 
Hi,
Modded how? Shunt mod?
If so you'd have to use a voltmeter, because the mod feeds a false reading back to the system/vBIOS.
 
Link the HW mod you used.

Often a HW mod works by altering the feedback voltage, dumping some of it to ground so that the regulator compensates by adding extra output voltage to offset what was dumped. In such an instance it is obvious why no software could provide an accurate voltage reading, since you are deliberately feeding a false value to the very point the reading is taken from.
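
To put rough numbers on that, here's a quick sketch of the divider math; the reference and resistor values are invented purely for illustration:

```python
# Rough illustration of the FB-mod math (all values made up).
# A buck controller regulates so the FB pin sits at its internal
# reference Vref, giving Vout = Vref * (1 + R_top / R_bot) for a
# simple two-resistor divider. A resistor added in parallel with
# the bottom leg lowers the effective R_bot, so the controller
# raises Vout to keep FB at Vref.

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

V_REF = 0.6        # assumed internal reference, volts
R_TOP = 10_000.0   # assumed upper divider resistor, ohms
R_BOT = 15_000.0   # assumed lower divider resistor, ohms

print(f"stock Vout : {V_REF * (1 + R_TOP / R_BOT):.3f} V")  # 1.000 V

R_MOD = 56_000.0   # one resistor across the bottom leg
vout = V_REF * (1 + R_TOP / parallel(R_BOT, R_MOD))
print(f"modded Vout: {vout:.3f} V")  # ~1.107 V, roughly +10%
```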
 
Hi,
Yeah, the main GPU power reading might show the increase, but you'd have to have known the original amount before the mod.
 
@ty_ger I see what you're saying: no output voltage is read, only the FB voltage. Now I'm wondering if no voltage is measured at all and it just goes by its own reference. Sorry, no link; it's just something I made myself using one resistor. No dumping to ground, but it does of course adjust FB. I could look for a pic, or try to find the datasheet and draw it in (it was from a long time ago), if you want to see.

That still leaves a problem with my 1660 Super, which only ever shows up to the default maximum of 1.068 V or 1.093 V, even though I have gone higher than that without using a HW mod. I think there was something back in the Pascal days, maybe from EVGA or someone else, that read voltage from the VRM, or maybe I'm imagining things, as my memory is not so good.
 
Tried some other SW such as ASUS GPU Tweak III and MSI Afterburner with I2C commands, but no joy.

Found a datasheet for the VRM on the 1660 Super, and it does report the FB voltage as VOUT, as per @ty_ger's post. It has a resolution of 10 mV with a range of 0-2.55 V, while the nvidia-reported voltage appears to use a resolution of 10 µV, so it is not reporting the actual measured voltage. At the maximum default voltage, nvidia reports 1.06875 V.
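
A quick sanity check on those two resolutions (the nearest-step rounding is just my assumption about how each would quantise):

```python
# Sanity check on the two resolutions mentioned above. An 8-bit
# register at 10 mV/LSB (0-2.55 V) cannot represent a value like
# 1.06875 V, while a 10 uV step can, which fits the idea that the
# driver reports its own setpoint rather than the measured rail.

VRM_LSB = 0.010    # 10 mV per step, per the datasheet
NV_LSB  = 0.00001  # 10 uV per step, per the observed reporting

reported = 1.06875  # maximum default voltage as reported by nvidia

vrm_repr = round(reported / VRM_LSB) * VRM_LSB
nv_repr  = round(reported / NV_LSB) * NV_LSB

print(f"nearest VRM telemetry step: {vrm_repr:.5f} V")  # 1.07000 V
print(f"nearest nvidia step       : {nv_repr:.5f} V")   # 1.06875 V
```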

With HWiNFO customized to multiply the GPU core voltage by 1000 we get:
[Image: coregpu3.png]


Small bug there: the minimum shows as rounded up to 1.069 V, maybe not updated after the custom value was set. Seems to be a small bug in GPU-Z too, which shows 1.0680 V.

Martin of HWiNFO reports that nvidia have likely locked down reading the VRM. :( I was hoping to get a secondhand 3000-series GPU for further testing without having to make electrical contact with the rail / sense point.
 
Finally managed to develop some comms software for the 1660 and can now read VOUT. It seems more likely to be reporting VOUT than VSense, but I will have to confirm. Still, this is nice; I now need to look for a secondhand 30/40-series card with similar comms. Did see a 3060m going relatively cheap, but I'm a little concerned about the unknown build. The 1050 Ti is a lost cause with its, ermm, cost-effective VRM, so there are no comms / data to be had.
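
For anyone curious what such comms can look like, here's a minimal sketch using Linux's I2C interface via the smbus2 Python module; it's not the actual tool, and the bus number, address and register are placeholders:

```python
# Minimal sketch of reading an 8-bit VOUT telemetry register over
# I2C/SMBus from Linux using the smbus2 module. NOT the actual tool:
# the bus number, device address and register offset below are
# placeholders, and the 10 mV/LSB scaling is taken from the 1660
# Super VRM datasheet mentioned earlier. Poking wrong addresses on
# a live card is risky.
from smbus2 import SMBus

I2C_BUS   = 1      # placeholder: GPU I2C bus as exposed by the OS
VRM_ADDR  = 0x20   # placeholder: 7-bit controller address
REG_VOUT  = 0x9A   # placeholder: controller's VOUT telemetry register
LSB_VOLTS = 0.010  # 10 mV per LSB, per the datasheet

with SMBus(I2C_BUS) as bus:
    raw = bus.read_byte_data(VRM_ADDR, REG_VOUT)
    print(f"VOUT = {raw * LSB_VOLTS:.2f} V (raw 0x{raw:02X})")
```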


[Image: Vout.png]

A little 1660 Super Heaven run at a 150 W limit and 100% fan, otherwise all stock. Current averaging at the default of 50 µs.
 
I understand the want to get numbers, but unless you're getting to max GPU load, why increase vcore for it?
Ignoring that those chips get binned: the not-so-good ones ending up in lower tiers will need max voltage just to do a little better, and those going into OCed units won't have much "headroom" before other limiters kick in.
Or is it just for "measuring" (as in diagnostics), not related to tweaking?
Just curious.
 
@Waldorf For monitoring.

There's a bug I'm looking into that removes limits, see https://www.techpowerup.com/forums/threads/throttling-gtx-1660-super-oc-uv.320586/post-5223303

As @ThrashZone said, an increase in power draw can be a telltale sign of higher voltage, and while I do see this, the core voltage (VID) never reported above the card's maximum. I had to strip the card down and connect some wires to verify. I want to check whether this bug happens with just this card or also on later ones, and I would prefer not to have to add wires. I plan to do that when/if I can find some affordable secondhand cards.

I passed the suggestion of reading VRM values on to Martin at HWiNFO, so maybe that feature will be looked at and added, but it's not straightforward, so idk.

Edit: Okay, with a ~10% FB mod I have verified that it's Vout being given and not the FB sense voltage, so one can read the actual voltage with software, although it will usually be a few mV higher than the sense voltage under load.
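
The check itself boils down to something like this (numbers are illustrative only):

```python
# The logic of the check above, with illustrative numbers only:
# with a known ~10% FB mod applied, does the software reading
# track the modded rail (so it's measured VOUT) or the requested
# setpoint (so it's only the FB reference)?

requested = 1.000  # voltage requested via the driver, V
mod_gain  = 1.10   # ~10% FB mod applied in hardware
tol       = 0.02   # allow a little for droop under load

reading = 1.093    # example value returned by the VRM register

if abs(reading - requested * mod_gain) <= tol * requested * mod_gain:
    print("tracks the modded rail -> reading is measured VOUT")
elif abs(reading - requested) <= tol * requested:
    print("tracks the setpoint -> reading is FB/reference, not VOUT")
else:
    print("inconclusive")
```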

[Image: Vout2.png]
 
Well, I got hold of a secondhand RTX 3070, and preliminary testing shows some differences, in particular the xBAR clock and others dropping. This results in about a 10% drop in frame rate for identical GPU and memory clocks.

Example with 3DMark Steel Nomad:
[Image: F10a.png]

Note the big drop in the xBAR and video clocks. Back in the earlier Pascal days of cross-flashing FE vBIOSes, the XOC vBIOS gave lower performance clock-for-clock, and it was noticed the video clock was lower. At the time the xBAR clock wasn't measured, but it was probably lower too. The effective GPU clock (E-GPU) drops as well, possibly due to temperature; I need to check further.

Normally I would expect 34-35 FPS here.
[Image: 2055b.jpg]


Note also how 3DMark appears to be using the requested GPU clock (R-GPU) instead of the effective GPU clock. Surprising, considering it's a bespoke benchmark!
[Image: 2055.jpg]


Some more testing needs to be done, and I still need to get hold of and try a 40-series card.
 