
Why is the battery wear level constantly changing?

Hi all, my laptop is a Lenovo IdeaPad 3 with a 35 Wh battery.
I've had it since September 2020, and the only thing I don't understand is how the battery wear level works.
For the first 2 months, the battery wear reading from HWiNFO was always 0%.
After a while it went to about 5%, and since then it has been constantly changing between 2% and 5%.

Since I use it plugged in 90% of the time, I enabled the option in Lenovo Vantage that limits charging to 60% when plugged in, to preserve battery life.
From what I can read in Vantage, my battery has only 10 cycles.

Is it normal?
 
While HWiNFO64 is my favorite hardware information program, I would not pay attention to it for this. Use the battery program that came with your computer.

Also check your laptop manual for "battery calibration". This process "calibrates" the notebook's battery monitoring feature to the battery so the estimates (for wear level and charge/discharge percentages) are more accurate. But understand, they are only estimates, because no program can accurately predict how you will use your computer from this point forward. It can only guess based on past use.
 
Just for your info... there is a software utility called BatteryCare which takes care of cycling your battery automatically.

[Attached image: HWINFO.jpg (HWiNFO battery readout)]
 
Battery calibration is not a thing anymore, and it has nothing to do with the wear level.

As far as I know, "wear level" is simply a measure of how much voltage the battery provides compared to the nominal value. Basically, as time passes, that voltage is supposed to drop; however, it can still vary based on temperature and such.
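For what it's worth, here is a minimal sketch of another convention many monitoring tools use, deriving wear from the pack's own full-charge-capacity estimate rather than from voltage. The numbers and the formula are illustrative assumptions, not necessarily HWiNFO's exact method:

```python
# Hypothetical illustration of a capacity-based wear figure.
# Real values come from the pack's fuel gauge (exposed via ACPI on laptops).

design_capacity_mwh = 35_000        # rated capacity of a 35 Wh pack
full_charge_capacity_mwh = 33_400   # pack's *current estimate* of full capacity

wear_percent = (1 - full_charge_capacity_mwh / design_capacity_mwh) * 100
print(f"wear level: {wear_percent:.1f}%")   # ~4.6%

# Because the full-charge-capacity figure is re-estimated as the pack is
# used, a wear reading that bounces between 2% and 5% usually means the
# estimate is moving, not that the battery is physically recovering.
```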
 
Battery calibration is not a thing anymore, and it has nothing to do with the wear level.
:( Of course it is. Let me Google it for you.

Just remember, the process does not make any changes to the battery itself. It just puts the laptop's battery monitoring feature in sync with the battery, so that when the laptop says the battery is 80% charged, for example, it really is ~80%. And when it's down to ~10%, it (more or less) accurately tells you how much time you have left.

So absolutely, battery calibration is still "a thing". In fact, when you replace the battery, you need to recalibrate (synchronize) the monitoring feature with the new battery.
 
:( Of course it is. Let me Google it for you.

Just remember, the process does not make any changes to the battery itself. It just puts the laptop's battery monitoring feature in sync with the battery, so that when the laptop says the battery is 80% charged, for example, it really is ~80%. And when it's down to ~10%, it (more or less) accurately tells you how much time you have left.

So absolutely, battery calibration is still "a thing". In fact, when you replace the battery, you need to recalibrate (synchronize) the monitoring feature with the new battery.

As usual, you try to be particularly arrogant, and you are wrong. Manufacturers stopped recommending that years ago when they switched away from nickel-based batteries; there is absolutely no point in trying to calibrate a lithium-ion battery. In fact, draining the battery down to zero, charging it to 100%, and keeping it there for a while is exactly what you should not do.

People keep perpetuating this practice, and it's simply no longer necessary. Anyway, I am not going to argue with you, since I know you're never going to admit you're wrong about anything.


Memory effect, also known as battery effect, lazy battery effect, or battery memory, is an effect observed in nickel-cadmium and nickel–metal hydride rechargeable batteries that causes them to hold less charge.[1] It describes the situation in which nickel-cadmium batteries gradually lose their maximum energy capacity if they are repeatedly recharged after being only partially discharged. The battery appears to "remember" the smaller capacity.[2]


Li-ion batteries use an intercalated lithium compound as the material at the positive electrode and typically graphite at the negative electrode. The batteries have a high energy density, no memory effect (other than LFP cells)[14] and low self-discharge.
 
I've had it since September 2020, and the only thing I don't understand is how the battery wear level works.

Fundamentally, battery level works as follows:

1. Chemical engineers form a model, a guess, at how the electrons behave inside the battery pack.

2. Electronic engineers count the number of electrons that enter or exit the battery, a.k.a. coulomb counting (https://www.analog.com/media/en/technical-documentation/technical-articles/P356_EN-SOC.pdf).

3. #1 is combined with #2 in various inaccurate ways. Let's say a battery is expected to be 90% efficient. That means if 1 amp-hour goes into the battery, it is then assumed that 0.9 amp-hours can come out of it. Maybe there's manufacturing variance, so some batteries are 92% efficient and others are 85% efficient. Still, you need a first guess.

4. To "calibrate" #3, engineers typically include a learning / self-tuning algorithm of some kind. Fully discharging the battery all the way to 0% "teaches" the battery pack's learning algorithm. If you actually want an accurate count, you need to calibrate every month or so; it depends on temperature, the age of the battery, the chemistry, etc. If you haven't fully discharged within a month, the self-calibration will be out of sync with the battery's physical changes. (Every time electrons enter or leave the battery, it gets slightly worse at keeping a charge.) And it depends on the speed at which electrons leave as well (higher currents usually wear out the battery faster).

5. Because users rarely discharge to 0%, there's a voltage-estimation fallback. Lithium-ion, IIRC, sits somewhere around 3.7 V to 4.2 V depending on charge, chemistry, and other details. So if you look at the battery and measure 3.6 V, you know the battery is "low" on charge, but you don't really know if it's at 25% or 40%. Still, this partial-discharge information helps the self-calibration algorithm. It's not as good as an actual full-discharge cycle, but it's better than nothing. (A toy version of #2, #3, and #5 is sketched below.)
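To make #2, #3, and #5 concrete, here's a toy sketch; every constant is a made-up assumption, and no vendor's actual algorithm looks exactly like this:

```python
# Toy state-of-charge estimator combining coulomb counting (#2), an assumed
# charge efficiency (#3), and a crude voltage fallback (#5).

RATED_CAPACITY_AH = 3.0      # the chemical model's first guess (#1)
CHARGE_EFFICIENCY = 0.90     # assumption: 1 Ah in -> 0.9 Ah usable (#3)

class ToyFuelGauge:
    def __init__(self):
        self.stored_ah = 0.0

    def count(self, amps, seconds):
        """Coulomb counting (#2): integrate current over time.
        Positive amps = charging, negative = discharging."""
        ah = amps * seconds / 3600.0
        self.stored_ah += ah * CHARGE_EFFICIENCY if ah > 0 else ah
        self.stored_ah = max(0.0, min(self.stored_ah, RATED_CAPACITY_AH))

    def percent(self):
        return 100.0 * self.stored_ah / RATED_CAPACITY_AH

def soc_from_voltage(volts):
    """Voltage fallback (#5): crude linear map from 3.0 V (empty) to
    4.2 V (full). Real discharge curves are nonlinear and chemistry-specific."""
    return 100.0 * max(0.0, min(1.0, (volts - 3.0) / (4.2 - 3.0)))

gauge = ToyFuelGauge()
gauge.count(amps=1.5, seconds=3600)    # charge at 1.5 A for an hour
gauge.count(amps=-0.5, seconds=1800)   # discharge at 0.5 A for 30 minutes
print(f"counted:        {gauge.percent():.0f}%")        # ~37%
print(f"voltage method: {soc_from_voltage(3.6):.0f}%")  # 50%; the two disagree
```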

--------------

TL;DR: Battery level gauges are grossly inaccurate, and no one knows how to make a cheap and effective tracker. What you see is the state of the art right now. If you have a better idea, you can probably become very rich by solving this problem. For now, most improvements are in #1 (chemical engineers come out with new mathematical models combining date, temperature sensors, etc. to guess at the internal battery charge).

For now, the only real way to know how much charge a battery holds is to discharge it entirely and count it up. (Fortunately, computers are very good at counting electrons! V = IR: simple Ohm's law turns the current through a 99%-accurate sense resistor into a voltage, then voltage -> analog-to-digital converter -> a computer sits there and counts in the background. That computer already exists inside your battery pack.) But once you do so, you've already changed the characteristics of the battery (i.e., you've done one more discharge cycle, so it holds less electricity now).
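As a sketch of that measurement chain (the sense-resistor value and ADC parameters are assumptions for illustration):

```python
# Current -> voltage (Ohm's law) -> ADC code -> amps, as described above.

SENSE_OHMS = 0.010        # assumed 10 mOhm shunt with ~1% tolerance
ADC_FULL_SCALE_V = 0.1    # assumed differential ADC range across the shunt
ADC_BITS = 12

def adc_code_to_amps(code):
    """Invert the chain: ADC code back to volts, then I = V / R."""
    volts = code / (2**ADC_BITS - 1) * ADC_FULL_SCALE_V
    return volts / SENSE_OHMS

print(f"{adc_code_to_amps(2048):.2f} A")  # mid-scale reading: ~5.00 A
```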

Funny story: it is said that the tiny computers inside every battery pack, doing nothing but counting electrons, are more powerful than the entirety of the Space Program of 1969.
 
Fully discharging the battery all the way to 0% "teaches" the battery pack's learning algorithm.

Let's assume that's true and take, for instance, a battery that reports 0% charge but actually still holds, let's say, 5% in reality. How can draining it to zero again fix that? When it goes down to "0%", the device will shut down and the battery will still have extra charge, no matter how many times you try, because the charging circuit will always start from the 5% charge that's left.

These charging circuits simply measure the charge/discharge rate and compute the percentage based on that; that's all they do. The battery may still have 0, 1, 2, 3% etc. left no matter how many times you drain it to zero, because those charge/discharge rates are always inaccurate.
 
Let's assume that's true and take, for instance, a battery that reports 0% charge but actually still holds, let's say, 5% in reality. How can draining it to zero again fix that?

Because there's a coulomb counter on every battery pack. So when you drain it to 0% (which "looks" like -5% to the coulomb counter), the coulomb counter goes "oh, there's no such thing as negative percent" and should self-calibrate.

I mean... I guess someone could have programmed a shitty coulomb counter.
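A minimal sketch of the kind of self-correction being described, assuming a pack that re-learns its capacity when it detects the cutoff voltage (purely illustrative; every pack's firmware is different):

```python
class Gauge:
    """Toy fuel gauge that re-learns capacity on a drain-to-empty event."""

    def __init__(self, learned_capacity_ah):
        self.learned_capacity_ah = learned_capacity_ah
        self.remaining_ah = learned_capacity_ah   # start "full"

    def discharge(self, ah):
        self.remaining_ah -= ah                   # coulomb counting

    def on_empty_detected(self):
        """Cell hit its cutoff voltage. If the books say we're below zero
        (the "-5%" case), the learned capacity was too small; grow it by
        the overshoot. If the books say charge remains, shrink it instead."""
        self.learned_capacity_ah += -self.remaining_ah  # works for either sign
        self.remaining_ah = 0.0

g = Gauge(learned_capacity_ah=2.85)   # gauge believes the pack holds 2.85 Ah
g.discharge(3.0)                      # but 3.0 Ah actually came out of it
g.on_empty_detected()
print(g.learned_capacity_ah)          # 3.0: capacity re-learned
```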
 
Because there's a coulomb counter on every battery pack. So when you drain it to 0% (which "looks" like -5% to the coulomb counter), the coulomb counter goes "oh, there's no such thing as negative percent" and should self-calibrate.

The coulomb counter just measures the rate of charge flow (current, in amps); it doesn't measure negative percentages or anything like that. It's the job of the software in the charging circuit to work out what the percentage is, which, like I said, is always inaccurate. And yes, you can get negative current; that's what happens when the battery discharges.

Even if you "calibrate" the battery by draining it to zero, it becomes uncalibrated the moment it starts charging again.
 
The coulomb counter just measures the rate of charge flow (current, in amps)

It also measures the discharging rate.

So that's the thing. If 40 Wh go in, then 30 Wh come out before the battery goes dead, your coulomb counter learns that you have 75% efficiency, then calibrates off that last charge/discharge cycle. Every coulomb counter has different logic inside (for all the different battery packs), so it's hard to generalize exactly how they work. But we know they all count amps (both in and out), watt-hours (both in and out), voltage, and maybe temperature.
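The 75% example as arithmetic; the smoothing step is an assumption about how a gauge might fold a new measurement into its old estimate:

```python
wh_in, wh_out = 40.0, 30.0
measured_efficiency = wh_out / wh_in      # 0.75 from this full cycle

# A gauge probably wouldn't jump straight to the new figure; an exponential
# moving average (an assumed technique here) blends it with the old estimate:
ALPHA = 0.3
old_estimate = 0.90
new_estimate = (1 - ALPHA) * old_estimate + ALPHA * measured_efficiency
print(f"updated efficiency estimate: {new_estimate:.3f}")  # 0.855
```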

Even if you "calibrate" the battery by draining it to zero, it becomes uncalibrated the moment it starts charging again.

By draining to 0%, the coulomb counter actually learns whether the battery holds 30 Wh or 28 Wh. It "sets the zero" appropriately.

If you never reach 0%, the coulomb counter won't have a reference point (or its reference point will be weeks or even months old). The more recent the reference point, the more accurate the coulomb counter gets.

It will never be perfectly accurate, though. It's just an estimate.
 
As usual, you try to be particularly arrogant, and you are wrong.
And as usual, you didn't do your homework or read and understand what posters say before posting criticisms with personal insults instead of simply debating the facts. :(

I NEVER EVER said anything about "memory effect"! Nor did I say calibration has anything to do with memory effect. Calibration has NOTHING to do with memory effect. I said calibration synchronizes the laptop's monitoring feature to the battery. This provides the most accurate reporting of the battery status. Google it.

Plus, I NEVER EVER said anything about charging the battery to 100% or running the battery down to 0%! This is you again misrepresenting what others have said. :(

Even if you "calibrate" the battery by draining it to zero it becomes uncalibrated the moment it starts charging again.
:( That is NOT calibrating a battery! And yes, once you "calibrate" for accuracy, it starts to lose accuracy. It is like defragging a hard drive: as soon as you start to use the drive again, fragmentation starts again. That is EXACTLY why defragging is (or should be) done regularly, just as calibration should be done every 2-3 months. Nobody EVER said calibration is a "once and done forever" event. Again, you are obfuscating the facts. :(

But since you can't bring yourself to admit I might be right here, don't believe me! I don't care! It is not about me being right or you being wrong. It is about the readers here having the facts - the true facts!

See How to Calibrate Your Laptop’s Battery for Accurate Battery Life Estimates. Note that the title says nothing about memory effect but is about reporting battery life estimates. And then note where the author says (my bold underline added),
Its built-in power meter estimates how much juice is available and how much time on battery you have left—but it can sometimes give you incorrect estimates.

You shouldn’t be allowing your laptop’s battery to die completely each time you use it, or even get extremely low. Performing regular top-ups will extend your battery’s life.

However, this sort of behavior can confuse the laptop’s battery meter.

Calibrating the battery won’t give you longer battery life, but it will give you more accurate estimates of how much battery power your device has left.

Don't believe the How-To Geek either? Fine. See Ask Bob Rankin, Do I Need to Calibrate my Laptop Battery? and note where he says,
"Calibration" in this context means ensuring that the software power meter accurately measures the amount of power remaining in the battery.

Calibrating a laptop battery puts the power meter back into sync with the battery's actual "fully charged" state.

Think Bob is wrong too? Fine. How about HP? See the HP Knowledge Base Article, C04700771, HP Notebook PCs - Testing and Calibrating the Battery (Windows) and note where it says,
Calibrating the battery resets the battery gauge to accurately display the charge level in Windows.

Don't believe HP either? Then I guess the whole world is wrong and only Vya Domas is right. :rolleyes:
 
Oh, it should be noted that the #1 source of modeling difficulty is something called "self-discharge".

Batteries "leak" their charge over time, sometimes very dramatically. This is also very prone to manufacturing variance (just luck of the draw). Just as some CPUs can go over 5 GHz (a "perfect chip") while others have difficulty reaching 4.9 GHz, battery manufacturing variance leads to different characteristics.

So let's pretend you're a coulomb counter. The user has charged the laptop to max (40 amp-hours flowed into the battery at an assumed 90% efficiency, so a total of 36 amp-hours exist in the battery now). But then the user left their laptop in a closet for 2 weeks. Assuming a self-discharge of 10%/month, you assume 5% of the amp-hours disappeared (leaving 34.2 amp-hours).

But what if the battery really only lost 2% of its charge, because the temperature was lower than you expected (the laptop was stored at 50°F instead of 72°F)? Or what if the laptop was stored in a car at 100°F for most of the summer, so it really self-discharged 15%? You don't know. The coulomb counter must guess these events. (Or run a microcontroller in the background constantly checking the temperature. Which, by the way, costs power and increases the self-discharge rate, because now you're running a computer, lol, for all of these calculations.)

There's no good solution here. Fortunately, modern microcontrollers use very little energy (microwatts), so "keep a microcontroller on 100% of the time to constantly measure temperature conditions" is a surprisingly effective strategy. But there's no direct measurement of the battery pack: all the microcontroller does is measure temperature and then guess what happens to the battery.
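Working the closet scenario through (all rates are the made-up assumptions from the example above):

```python
amp_hours_in = 40.0
EFFICIENCY = 0.90
stored_ah = amp_hours_in * EFFICIENCY            # 36.0 Ah in the pack

ASSUMED_SELF_DISCHARGE_PER_MONTH = 0.10          # firmware's guess at ~72 F
months_idle = 0.5                                # two weeks in the closet
assumed_loss = ASSUMED_SELF_DISCHARGE_PER_MONTH * months_idle   # 5%
estimated_ah = stored_ah * (1 - assumed_loss)    # 34.2 Ah, the gauge's belief

actual_loss = 0.02                               # it was cool: only 2% leaked
actual_ah = stored_ah * (1 - actual_loss)        # 35.28 Ah really remain

print(f"gauge thinks {estimated_ah:.1f} Ah, reality is {actual_ah:.2f} Ah")
# The ~1 Ah gap is pure modeling error that no sensor directly observes.
```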
 
Calibration has NOTHING to do with memory effect.
I NEVER EVER said anything about "memory effect"!

Calibration has everything to do with it; you think it doesn't because you probably didn't even know where all of this "battery calibration" business originated. A lot of early laptops, mobile phones, and other mobile devices used nickel-based batteries, which suffered from the memory effect, and one way to prevent that was frequent full discharge/charge cycles, a.k.a. "battery calibration". "yOu CaN gOoGlE iT" if you don't believe me, as you love to say so much.

Ever since then, people such as yourself have kept doing that even though it's no longer necessary with newer Li-ion batteries. Almost no manufacturer officially recommends it now, because it's a waste of time and it actually hurts the battery.

I said calibration synchronizes the laptop's monitoring feature to the battery.

Synchronize it with what? What does that even mean? The level of charge is provided by the charging circuit; that's the monitoring feature on a laptop or anything else that has a battery.
Don't believe the How-To Geek either? Fine. See Ask Bob Rankin, Do I Need to Calibrate my Laptop Battery? and note where he says,
Bob Rankin, whoever the hell he is, looks like he has no clue what he's talking about. The charging circuit cannot measure "the percentage of 'fully charged' power remaining"; for the last time, that's not a thing. All it does is measure the charge/discharge rate, that's it.

Not only that, but he then goes on to talk about discharging and fully charging the battery in the context of calibrating it, which is hilarious.
Don't believe HP either?

For the last time, all the documentation you see that talks about calibrating batteries is meant for devices from the era of nickel-based technology.

Can you imagine, with billions of battery-powered devices out there, how stupid it would be if they all had to be manually calibrated by their users?

Then I guess the whole world is wrong and only Vya Domas is right. :rolleyes:

Just a BS statement from someone who scrambles to find obscure, outdated info on something that is no longer true. Are you 14?
 
So everyone else is wrong but you. Got it.
 
Interpret it whichever way you want. Again, you're quite arrogant for thinking that your opinion on this matter is representative of "everyone else" or "the whole world".

The OP asked about the wear level of the battery, which has nothing at all to do with calibrating batteries. I simply pointed that out, but you couldn't contain yourself.
 
For the last time, all the documentation you see that talks about calibrating batteries is meant for devices from the era of nickel-based technology.

In case it isn't clear: everything I've talked about in this topic applies to Li-ion or Li-poly batteries.

NiCd hasn't been popular for almost 20 years now. To still be talking about NiCd in the modern era would be grossly incorrect. No one aside from you, Vya, is talking about the NiCd-era memory effect. I'm talking about modern-day coulomb counters.

Based on my knowledge of Li-ion and Li-Poly chemistries, a page like HP's FAQ (https://support.hp.com/us-en/document/c04700771) applies 100% to Li-ion or Li-Poly. Regular calibration of the battery-microcontroller is necessary to "set the zero" of your coulomb counter.

----------

If you don't set the coulomb counter's zero, then your battery charge meter is just inaccurate. Your computer might shut off at 15%, or it might run down to 0% and then stay at 0% for another hour. It just means your battery pack doesn't know where the "zero point" has gone, so it can't accurately judge the charge remaining. Actually finding the zero charge calibrates the device, but that requires running your computer until it runs out of battery: an operation that today's users rarely perform.
 
Battery calibration is absolutely still a thing, and the memory effect is not any longer; it's ancient history unless you're using NiCad batts, which is unlikely.

Battery calibration simply teaches the battery controller how worn out the battery is. Nothing more, nothing less.

Lithium cells hate being drained to zero. No calibration process is going to do this. All the ThinkPads at my business have a calibration process in the BIOS, and it does NOT drain to zero. If it really drained to zero, goodbye 50% of battery life. When a lithium battery tells you zero, it's probably really at 25%. That's 100% intentional.
 
Lithium cells hate being drained to zero. No calibration process is going to do this. All the ThinkPads at my business have a calibration process in the BIOS, and it does NOT drain to zero.

You're right. I don't know of any automatic drain-to-zero methodology. You have to manually drain to zero, which is bad for the battery. If you manually drain to zero, the coulomb counter will (usually) recalibrate. No guarantees: everyone's battery packs are programmed differently.

HP's documentation states that running "memtest" for hours is a good way to force a drain-to-zero situation. The battery pack then detects the drain-to-zero situation and recalibrates itself automatically. No promises that other battery packs work like this... but I've seen it on more than just HP systems, so it's somewhat common out there.

-----------

Note: coulomb counting is... a bit inaccurate. You're trying to determine the number of coulombs (i.e., how many electrons) in the battery. However, the only thing you can measure is amps (which is coulombs per second). So a coulomb counter measures amps × the number of seconds those amps flow. It's like trying to figure out the location of a car by only knowing its speed. This is extremely inaccurate for many, many reasons, but it's literally the best that today's technology can do, so we have to put up with it.
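In code, the dead-reckoning problem looks like this (sample period and bias figures are assumptions):

```python
SAMPLE_PERIOD_S = 1.0

def count_coulombs(samples_amps):
    """Rectangle-rule integration: amps x seconds = coulombs."""
    return sum(i * SAMPLE_PERIOD_S for i in samples_amps)

print(count_coulombs([2.0, 2.0, 1.5]))   # 5.5 C over three seconds

# Like inferring a car's position from speed alone, any measurement bias
# accumulates forever. A constant 1 mA sense error, integrated over a day:
bias_coulombs = 0.001 * 86_400
print(f"{bias_coulombs:.0f} C of drift per day")   # 86 C from a 1 mA offset
```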
 
I'm talking about modern-day coulomb counters.

And in case I wasn't clear either: whether they are modern or old, all they do is measure charge flow; they won't calibrate anything, it's just a sensor. I don't know why you keep implying otherwise. Even if a battery is successfully brought down to exactly no charge (which actually isn't possible with lithium-ion) and charged back up again, the battery indicator becomes inaccurate as soon as it starts discharging. The coulomb counter may read, for instance, 200 mA at a certain point in time, but the device's power consumption may vary by quite a margin between two sampled values, so it might have gone up to 350 mA for a couple of milliseconds.

But if that spike isn't recorded, it won't be reflected in the computed battery percentage, and hence the percentage becomes inaccurate. This is why I keep saying calibrating batteries is an utter waste of time: it achieves nothing. The percentage you see is always wrong, no matter what.
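That sampling argument can be simulated directly (the load profile and sample rate are made up):

```python
# A 1-second load trace at 1 ms resolution: 200 mA baseline with a 5 ms
# spike to 350 mA that falls between the gauge's 100 ms sample points.
load_ma = [200] * 1000
load_ma[50:55] = [350] * 5

SAMPLE_EVERY_MS = 100

true_mc = sum(ma * 0.001 for ma in load_ma)    # mA x seconds = millicoulombs
sampled = load_ma[::SAMPLE_EVERY_MS]           # what the gauge actually sees
counted_mc = sum(ma * 0.001 * SAMPLE_EVERY_MS for ma in sampled)

print(f"true charge drawn: {true_mc:.2f} mC")     # 200.75 mC
print(f"gauge's estimate:  {counted_mc:.2f} mC")  # 200.00 mC; spike missed
```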

To still be talking about NiCd in the modern era would be grossly incorrect.
I have to because that's the origin of these myths.
 
You're right. I don't know of any automatic drain-to-zero methodology. You have to manually drain to zero, which is bad for the battery. If you manually drain to zero, the coulomb counter will (usually) recalibrate. No guarantees: everyone's battery packs are programmed differently.

HP's documentation states that running "memtest" for hours is a good way to force a drain-to-zero situation. The battery pack then detects the drain-to-zero situation and recalibrates itself automatically. No promises that other battery packs work like this... but I've seen it on more than just HP systems, so it's somewhat common out there.

-----------

Note: coulomb counting is... a bit inaccurate. You're trying to determine the number of coulombs (i.e., how many electrons) in the battery. However, the only thing you can measure is amps (which is coulombs per second). So a coulomb counter measures amps × the number of seconds those amps flow. It's like trying to figure out the location of a car by only knowing its speed. This is extremely inaccurate for many, many reasons, but it's literally the best that today's technology can do, so we have to put up with it.
A lot of cell controllers will even shut off current long before you hit true cell zero. It's hard to get there.

I have to because that's the origin of these myths.
It's 2021. There are people of voting age walking around who don't even know what a NiCad is. It's old news now. It may be the origin of one old myth, but the present reason for calibration is not a myth.
 
And in case I wasn't clear either: whether they are modern or old, all they do is measure charge flow; they won't calibrate anything, it's just a sensor. I don't know why you keep implying otherwise.

Erm... no.

[Attached image: LTC2941 block diagram]



The LTC2941 is one example of a coulomb counter. You can see it's a full-blown microcontroller with I2C support. There are thousands of coulomb counters out there, and they all work differently.

Most of these have some kind of long-term storage to "permanently" count the coulombs. I'm talking like 1 kB of flash or something like that; you don't need much space. As I've stated before, fancy ones will constantly measure temperature conditions and use that to predict the battery's internal self-discharge (and thus need storage, a real-time clock, AND a thermometer).

You can see here that there's a differential amplifier (sense+ vs. sense-), an internal voltage reference, oscillators, and an I2C controller for communications, even on this very simple LTC2941. It's actually a pretty sophisticated process. (A sophisticated, utter-shit process. Garbage-in/garbage-out principle... but it's the best we can do right now.)
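For flavor, here's a hedged sketch of reading the LTC2941's accumulated-charge count over I2C using the smbus2 Python library. The address, register map, and scale factor are from my reading of the datasheet, so verify them before trusting this:

```python
from smbus2 import SMBus

LTC2941_ADDR = 0x64       # 7-bit I2C address (datasheet value)
REG_CHARGE_MSB = 0x02     # accumulated charge, high byte (assumed register map)
REG_CHARGE_LSB = 0x03     # accumulated charge, low byte

def read_accumulated_charge(bus_num=1):
    """Return the chip's raw 16-bit coulomb count."""
    with SMBus(bus_num) as bus:
        msb = bus.read_byte_data(LTC2941_ADDR, REG_CHARGE_MSB)
        lsb = bus.read_byte_data(LTC2941_ADDR, REG_CHARGE_LSB)
    return (msb << 8) | lsb

# Converting the raw count to mAh depends on the sense resistor and the
# prescaler (the datasheet gives q_LSB = 0.085 mAh at 50 mOhm, M = 128).
print(read_accumulated_charge())
```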
 
Again, you're quite arrogant for thinking that your opinion on this matter is representative of "everyone else" or "the whole world".
It is not my opinion. I posted multiple links to several sources. YOU are the one saying they are all wrong.

You ignore R-T-B, who agrees calibration is real and explains how it works. You ignore dragontamer, who points out calibration is real and necessary, and that only you are talking about memory effect and Ni-Cad batteries.

I'm not the one being arrogant here.
 
Lithium cells hate being drained to zero. No callibration process is going to do this. All my thinkpads at my business have a callibration process in bios and it does NOT drain to zero. If it really drained to zero, goodbye 50% battery life. When a lithium battery tells you zero, it's probably really at 25%. That's 100% intentional.

I probably should note...

When I say "drain to zero", I mean "drain to 3.0 V" or "drain to 2.8 V" or whatever the heck the manufacturer thought was a safe cutoff voltage. No one actually goes to 0 V, for the reasons you've mentioned. Still, figuring out how many coulombs/amp-hours it takes to reach 3.0 V or 2.8 V (or whatever safety margin your battery system shuts off at) is why calibration is needed. The number of amp-hours to get there just changes constantly.

Yes: literally safety. Go to 0 V and you might explode the battery pack and start a fire.
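So "0%" is really a voltage threshold; something like this (cutoff value assumed):

```python
CUTOFF_V = 3.0   # assumed safety floor; the pack calls this "empty"

def pack_cuts_power(cell_volts):
    """Whatever charge sits below the cutoff is deliberately unreachable."""
    return cell_volts <= CUTOFF_V

print(pack_cuts_power(3.05))  # False: keep running
print(pack_cuts_power(2.98))  # True: shut off and report 0%
```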
 
Erm... no.

[Attached image: LTC2941 block diagram]


The LTC2941 is one example of a coulomb counter. You can see it's a full-blown microcontroller with I2C support. There are thousands of coulomb counters out there, and they all work differently.

Most of these have some kind of long-term storage to "permanently" count the coulombs. I'm talking like 1 kB of flash or something like that; you don't need much space. As I've stated before, fancy ones will constantly measure temperature conditions and use that to predict the battery's internal self-discharge (and thus need storage, a real-time clock, AND a thermometer).

You can see here that there's a differential amplifier (sense+ vs. sense-), an internal voltage reference, oscillators, and an I2C controller for communications, even on this very simple LTC2941. It's actually a pretty sophisticated process. (A sophisticated, utter-shit process. Garbage-in/garbage-out principle... but it's the best we can do right now.)

Except those are a lot more than just simple “coulomb counters”.

You ignore R-T-B, who agrees calibration is real and explains how it works. You ignore dragontamer, who points out calibration is real and necessary, and that only you are talking about memory effect and Ni-Cad batteries.

And I explained why calibration used to be done and why it's pointless now, quite thoroughly I'd say, in my last comment. You just ignored everything, defaulted to telling me to just google it, and copy-pasted a bunch of stuff you found.

I don't care who agrees with whom; this isn't a popularity contest where you're right if most people agree with you. Do you seriously think that's how it works?

To this moment, nobody has been able to provide a clear explanation of how calibration actually works and why it's still necessary; everyone has just said "it just is". I am sorry if that annoys you, but that doesn't count.
 