Battery power level

Any way to poll this in software, even imprecisely? It’d be great to know if your device is about to die on you or not.

I’ve been having a read at the link below; it might be able to do what you want or head you in the right direction.

I tried the following on mine and got 2.548V from the coin cell and 5.120V when powered via USB. Credit to eckel.tim.

int mv = readVcc();
display.print(mv / 1000, DEC); // print the whole volts
display.print("."); // print the decimal point
if (mv % 1000 < 100) display.print("0"); // zero-pad the millivolts
if (mv % 1000 < 10) display.print("0"); // (so 2048 prints as "2.048", not "2.48")
display.print(mv % 1000, DEC); // print the rest of the voltage in millivolts

long readVcc() {
  long result;
  // Read the 1.1V bandgap reference against AVcc
  ADMUX = _BV(REFS0) | _BV(MUX4) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1);
  delay(2); // Wait for Vref to settle
  ADCSRA |= _BV(ADSC); // Start the conversion
  while (bit_is_set(ADCSRA, ADSC)); // Wait until it completes
  result = ADCL; // Must read ADCL first; this locks ADCH until it's read
  result |= (long)ADCH << 8;
  result = 1126400L / result; // Back-calculate AVcc in mV
  return result;
}

Holy crap, you can compare the bandgap against Vcc. That’s awesome! Now we just need to know the discharge curves for the coin cell and LiPo.

Time to buy some coin cells and run them down… what does this take? CR2032?

"Battery: CR2016"


The problem with this is that the 1.1V bandgap reference, that you’re comparing Vcc to, isn’t very accurate. It can vary from chip to chip. The datasheet says it can be between 1.0V and 1.2V (at Vcc = 2.7V and ambient temperature of 25°C). It also varies slightly with temperature and Vcc changes. This means that the reading likely wouldn’t be accurate enough to properly determine the charge level of the battery.

You could use a more precise and stable external reference, and tie the battery output to a spare analog input pin through a resistor divider, but this would require additional parts and modifications to the design that the creators may be unwilling to make.

One possibility would be to use the bandgap comparison technique that @Nickpatts referenced above, but also reserve an area in the EEPROM to contain calibration values for each individual device. A library function could be provided to read, calculate and return a calibrated value. The issue here is how do we measure and set the calibration parameter(s) in EEPROM? And how do we make sure that sketches don’t overwrite the value(s)?

I don’t think this was ever going to be reliable enough to get a battery %. But it would be nice to know if the battery was getting low and show some sort of warning or save the game, etc… I think we could still do this even if 1.1V is only approximate (and subject to vary by device). It could simply check the value at startup and use that for calibration… If it starts at 2.6V we assume that is “OK” and if it drops to 2.4V that’s bad. The edge case becomes if it starts at “almost dead” at startup then we’ll never get “lower” than that, we’ll just turn off before a warning is issued.

This could be solved by self-calibrating after a fresh recharge of the battery. If your previous reading was 2.5 and your new reading is 2.9, then we set 2.9 as the calibrated HIGH. So then 2.9 would always be considered the “high” and used for the comparison. I’ve already mentioned ideas for EEPROM in a separate thread… but essentially the Arduboy should have some reserved EEPROM in any case… if it booted up and detected a crazy invalid value in EEPROM it would just reset that value with a new calibration.
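That self-calibration idea could look something like the sketch below. This is a hypothetical illustration (the function name, sanity bounds, and the simulated EEPROM variable are mine, not an existing API); on a real unit the stored value would live in the reserved system EEPROM area:

```cpp
#include <cstdint>

// Simulated EEPROM cell holding the calibrated "full battery" reading in mV.
// Erased AVR EEPROM reads back as all 1s (0xFFFF), which is conveniently
// outside any sane battery voltage, so it reads as "uncalibrated".
static uint16_t calibratedHighMv = 0xFFFF;

// Illustrative sanity bounds; anything outside is treated as a
// "crazy invalid value" and replaced with the current reading.
const uint16_t MIN_SANE_MV = 1800;
const uint16_t MAX_SANE_MV = 4500;

// Call at startup with the current readVcc() result. If the stored value
// is invalid, or the battery reads higher than before (fresh recharge or
// new cell), adopt the new reading as the calibrated HIGH.
uint16_t updateCalibration(uint16_t currentMv) {
    bool invalid = calibratedHighMv < MIN_SANE_MV || calibratedHighMv > MAX_SANE_MV;
    if (invalid || currentMv > calibratedHighMv) {
        calibratedHighMv = currentMv; // real code would write EEPROM here
    }
    return calibratedHighMv;
}
```

A lower reading later never lowers the stored HIGH; only a fresh, higher-reading battery (or a corrupted stored value) updates it.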

Of course all this would require some actual testing to see if the voltage can even be used reliably this way - as an indication of impending shutdown.

Edit: And we can publish a sample sketch users can run on a fresh battery and report their findings… perhaps the 1.1V reference doesn’t vary nearly as much as the docs say, or who knows what else we might learn from a larger sampling.


Actually I think this is good news (after more reading). The chip has its own brown-out detection circuit so it can power down if the voltage is too low (before it becomes randomly unreliable). From the docs: “ATmega16U4/ATmega32U4 features an internal bandgap reference. This reference is used for Brown-out Detection, and it can be used as an input to the Analog Comparator or the ADC.”

So from what I’m reading that same 1.1V bandgap reference is how the chip itself determines if the voltage is high enough for it to run stably… so if we’re using that bandgap ourselves to determine the voltage level I think we’re in very good company.

The default brown-out fuse setting is 2.4 volts… after which the CPU will shut itself off until the voltage is increased to a safe level.

The discharge voltage curve for a Li-Po battery (and even the lithium coin cell in the dev unit) is fairly flat, with a very steep knee just before going flat. You need to be able to take pretty precise voltage measurements to know the difference between having 2 hours left or 2 minutes.

Brown out detection doesn’t need a precise voltage. It’s just so the program won’t go crazy, possibly resulting in damaged peripherals, during the fraction of a second that the power is going down below safe limits.

It won’t help if the purpose of battery monitoring is to alert you to apply external power or give you a chance to save your game before the unit ceases to operate.

Yeah, I hear you… but I’d still like to test myself. If the brown-out detection shuts off at 2.4 volts, maybe we could detect < 2.42 volts and show a message. If the bandgap measurement is precise enough to shut off at 2.4, then it would seem plenty precise enough to register 2.42 (or 2.45, or whatever). Do you mean accurate as opposed to precise?

As long as the number is stable in general (per unit), there should be some value in comparing our own reading to the KNOWN brown-out level of 2.4. If we detect 2.45, 2.44, 2.42, 2.41, 2.4 then we know the device is about to power off, regardless of whether the reading is “precise” or not. The real question becomes whether there is a noticeable difference between 2.42 and 2.4 in the discharge cycle to give us a few minutes to shut down without false positives.

Again I’m not talking about turning 2.42 into 10%… just about knowing “You really should save now, power low.”

The brown-out value is not known to be exactly 2.4V. It’s no more precise than the bandgap itself (obviously, since it uses it to determine the value). When set at 2.4 volts, the actual threshold could be anywhere from 2.2V to 2.6V. See the following table from the datasheet:

2.4V is well below the minimum that you want to run the battery down to. Look at the discharge curve in this article. A good value for low battery warning would probably be around 3.4V.

However, the 1.1V bandgap could actually vary from 1.0V to 1.2V from chip to chip. With a program that reads a battery voltage of 3.4V assuming a 1.1V bandgap, the battery would really be at 3.09V if the bandgap were 1.0V, and at 3.71V if it were 1.2V. This isn’t taking into account any further inaccuracies due to temperature variations or ADC non-linearity.

So if the battery is really at 3.09V when we get the warning, there may not be enough charge left in the battery to have time to do something about it. If it’s really at 3.71V, we’re at full charge and the warning is useless.

I never meant it was exact, merely that it’s going to turn the chip off when it hits a consistent x.xV. So if we can detect it getting close to x.xV we can do something about it. Accuracy doesn’t matter: the chip is about to turn off whether it’s really 2.5, 2.4, 2.3, etc. The bandgap still serves as a warning of the brown-out.

2.4V is well below the minimum that you want to run the battery down to.

If we’re going to use a LiPo we need to set the fuse to a much higher voltage to continue to get the brown-out protection. Although the next setting up is 3.4V, which might be too high if you think 3.4V is a good “warning” level rather than a turn-off level.

So if the battery is really at 3.09V when we get the warning, there may not be enough charge left in the battery to have time to do something about it. If it’s really at 3.71V, we’re at full charge and the warning is useless.

True, that’s why I want to see it in the real world. The gameduino for example has battery polling methods, so they must not be entirely useless.

But there’s no way to determine what the absolute value of x.xV is for any given chip, so we can’t use it to determine the voltage of the battery as accurately as we need it. We have no internal reference that is accurate enough to determine the battery voltage, and thus estimate the time remaining, within a margin of error that’s acceptable.

You can’t base things on real world observations. They can change, for the same reasons as discussed in the clock speed discussion (future variations in chip manufacture, etc.). You have to assume that the full range specified by the manufacturer could occur. Otherwise, why wouldn’t they have a tighter specification in the first place?

However, although we don’t know the absolute value of the voltage of the bandgap for an arbitrary chip, it’s probable that whatever that voltage is will be stable enough, over temperature and time, to accomplish the goal, provided each unit is individually calibrated and the calibration is saved and used.

To summarise my thoughts on addressing the problem:

  1. Ideally, attach an external reference to the AREF pin that is precise and stable enough for our purposes. The battery’s positive lead would also have to be connected to one of the analog input pins through a resistor divider.

  2. If we don’t go with the above hardware solution, calibrate each unit and save the calibration in EEPROM as follows:

  • Determine which is more stable over temperature and time: the bandgap or the 2.56V internal reference (the absolute voltage value is not important). If the 2.56V reference is more stable, it would be better to use it instead of the bandgap, but this would again require connecting the battery to an analog input through a resistor divider.

  • If the bandgap is more stable, or we don’t want to add circuitry to connect the battery to an analog input, then use the bandgap.

  • Allocate a byte in the EEPROM for low battery detection calibration, and come up with a method to make sure it won’t be overwritten, such as what is being discussed here (or at least declare some EEPROM space as reserved for system use only).

  • To calibrate a unit you would load a battery calibration sketch. The sketch would have two modes: a calibration mode to determine the proper value and save it, and an adjust mode to allow the user to manually modify this value.

  • Calibration mode would be run with the unit in an environment with a typical ambient temperature, say 22°C (72°F). You would disconnect the battery and power the unit with a stable, clean power supply set at the desired low battery voltage (hopefully the Arduboy would be designed to make this easy to do). Calibration mode would run a loop that simulated the conditions of a typical high-load user program. For instance, display a quickly changing pattern that lights 75% of the pixels while sounding short beeps on the speaker two or three times per second. It would also continuously read (and perhaps average over a short time) and display a value obtained by using the ADC to read the input voltage. When the user feels that the unit has run long enough for the internal temperature and reading to stabilize, a button would be pushed to save the value in EEPROM.

  • A different technique, which makes it easier to set up the unit for calibration, would be to just power the unit with a stable, clean, accurate 5V supply connected to the USB port. Calibration mode would then read the bandgap voltage with AREF set to Vcc. Since we know the reference is exactly 5V, we can determine the actual voltage of the bandgap for this chip. Alternatively, if the Arduboy has been designed to allow using the 2.56V reference to read Vcc via an analog input, we just have to read the (scaled) Vcc using the 2.56V reference. Again, knowing that Vcc is actually exactly 5V, we can determine the actual voltage of the reference based on the reading we get. This reading can then be used to calculate the low voltage calibration parameter to be stored in EEPROM. The problem with this technique is that it may not be as accurate as using a voltage equal to a low battery, because both the bandgap and the 2.56V reference can change slightly when Vcc changes. Also, operating at a higher Vcc will change the internal temperature of the chip, which can also affect the bandgap and reference voltages.

  • Adjust mode would display the stored value and allow the user to tweak it if it was later found to be slightly incorrect, or if more or less time was desired before a low battery warning. The user could also write down the value so it could be manually restored if it accidentally became corrupted.

  • A sketch wishing to implement a low battery warning would call a provided library function fairly frequently, say once every few seconds. The function would read the battery voltage and compare it to the value stored in EEPROM. Every time the function reads a value higher than the stored value, it resets an internal counter kept in a private variable. Each time the function reads a value at or below the stored value, it increments the counter, up to a certain threshold. Once the threshold count has been reached, the function returns true to the sketch, indicating that the battery is low. Otherwise, the function returns false.

  • It would be up to the sketch to display a warning, beep the speaker, or do whatever is desired when low battery is indicated. Or, another library function could be added to display, or otherwise handle, a low battery warning in a standard way that could be used by any sketch.
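The counter-based library function described above might look like this. It's a sketch of the proposal, not an existing library API; the function name, the threshold count of 3, and the parameter names are all illustrative:

```cpp
#include <cstdint>

// Number of consecutive at-or-below-threshold readings required before
// reporting a low battery, so one noisy reading doesn't false-trigger.
const uint8_t LOW_COUNT_THRESHOLD = 3;

// readMv is the latest battery reading (e.g. from readVcc());
// thresholdMv is the calibrated low-battery value stored in EEPROM.
bool batteryIsLow(uint16_t readMv, uint16_t thresholdMv) {
    static uint8_t lowCount = 0;     // the "private variable" counter
    if (readMv > thresholdMv) {
        lowCount = 0;                // any good reading resets the counter
    } else if (lowCount < LOW_COUNT_THRESHOLD) {
        lowCount++;                  // count consecutive low readings
    }
    return lowCount >= LOW_COUNT_THRESHOLD;
}
```

A sketch would call this every few seconds and only show its warning (or autosave) once it returns true.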

The absolute value doesn’t matter. The chip turns itself off based on the calculated value - so we can make guesses on that - if it moves slowly enough.

I’m not against a calibration tool if that helps. Is USB voltage always that precise, though?

My plan is to have 16 bytes available for system use. So far all I’ve got is brightness, sound on/off, and now battery calibration.
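One possible shape for those 16 reserved bytes, purely as a sketch; the field names, sizes, and the magic-marker idea are illustrative, not an actual Arduboy spec:

```cpp
#include <cstdint>
#include <cstddef>

// Hypothetical layout for the 16 reserved system bytes.
struct SystemSettings {
    uint8_t  magic;         // known marker so corruption/overwrites are detectable
    uint8_t  brightness;
    uint8_t  soundOn;       // 0 = muted, 1 = on
    uint8_t  pad;           // explicit padding keeps the layout predictable
    uint16_t batteryCalMv;  // calibrated low-battery threshold in mV
    uint8_t  reserved[10];  // room for future system settings
};

// The whole structure must fit in the reserved area.
static_assert(sizeof(SystemSettings) == 16, "must be exactly 16 bytes");
```

A magic byte (checked at boot) would also cover the earlier point about detecting a "crazy invalid value" and re-calibrating.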

That’s why I said you would connect a stable, clean, accurate 5V supply to the USB port when doing the calibration. You would use a bench power supply, or something similar, that you would be sure was providing exactly 5V (within the tolerance needed to give an acceptable low battery calibration value).

Again, a better, but possibly harder to set up, method would be to power the unit at exactly the desired low battery voltage during calibration.

Can you explain in detail the technique you would use to do this? I’m not sure I’m following what you’re trying to say.

The code shown above. It uses the bandgap to measure voltage, and so does the chip’s own brown-out detection… so the absolute value doesn’t matter. We’d be using the exact same measurement (accurate or not) that the chip uses to decide when to turn itself off, making it the best possible way to guess voltage without a dedicated circuit (as you suggested).

It doesn’t matter if it’s really 2.3 or 2.6… if the chip THINKS it’s 2.41 and it shuts off at 2.4, then that’s all we need to know. Accuracy would be better, but doesn’t really change the equation much.

Summary: If it’s accurate enough for brown-out detection it’s accurate enough for us

So please manually work through the code that you propose to use and tell me what the battery voltage will be for the two cases when the brown out detection actually turns out to be 2.2V or 2.6V, when set to 2.4V.

I"m not sure I see your point. No one has really suggested changing the fuse… So if the chip THINKS it’s 2.41 that’s all that matters. I think you already did the math to show reality could vary a lot in the worst case, but we can’t do anything about that. The chip is going to brown-out shutoff at 2.4v (calculated) regardless of the ACTUAL voltage. So if we know the same calculated voltage the chip does, then we know when the chip plans on turning off - which si the whole point - to know when the chip plans on turning off.

If you really want to join a more real-time discussion you could join the IRC channel.

I’m sorry; maybe my misunderstanding and the reason for my confusion is the $64,000 question:
What actions and procedures do you feel a sketch should perform when it determines that the chip is about to turn off (such as informing the user, saving data, etc.)?