Battery power level

The discharge voltage curve for a Li-Po battery (and even the lithium coin cell in the dev unit) is fairly flat, with a very steep knee just before going flat. You need to be able to take pretty precise voltage measurements to know the difference between having 2 hours left or 2 minutes.

Brown-out detection doesn’t need a precise voltage. It’s just there so the program won’t go crazy, possibly resulting in damaged peripherals, during the fraction of a second that the power is dropping below safe limits.

It won’t help if the purpose of battery monitoring is to alert you to apply external power or give you a chance to save your game before the unit ceases to operate.

Yeah, I hear you… but I’d still like to test it myself. If the brown-out detection shuts off at 2.4 volts, maybe we could detect < 2.42 volts and show a message. If the bandgap measurement is precise enough to shut off at 2.4, then it would seem plenty precise enough to register 2.42 (or 2.45 or whatever). Do you mean accurate as opposed to precise?

As long as the number is stable in general (per unit) there should be some value in comparing our own reading to the KNOWN brown-out level of 2.4V. If we detect 2.45, 2.44, 2.42, 2.41, 2.4 then we know the device is about to power off, regardless of whether the reading is “precise” or not. The real question becomes whether there is a noticeable difference between 2.42 and 2.4 in the discharge cycle to give us a few minutes to shut down without false positives.

Again I’m not talking about turning 2.42 into 10%… just about knowing “You really should save now, power low.”
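For what it’s worth, the usual AVR trick is to read the internal 1.1V bandgap channel with AVcc as the ADC reference and back-calculate Vcc from the count. Here’s a minimal sketch of just the arithmetic (plain C++ so it can be checked on a PC; the register setup is only described in the comment, and the function name is mine):

```cpp
#include <cstdint>

// On the ATmega32U4 you'd select the 1.1V bandgap as the ADC input with
// AVcc as the reference, then take a reading. Since
//   count = 1023 * Vbg / Vcc,
// Vcc can be recovered (assuming the nominal 1.1V bandgap):
//   Vcc(mV) = 1023 * 1100 / count = 1125300 / count
uint16_t vccFromBandgapCount(uint16_t count) {
    if (count == 0) return 0;             // guard against divide by zero
    return (uint16_t)(1125300UL / count);
}
// Note that a LOWER Vcc gives a HIGHER count, so "getting close to
// brown-out" means the count rising toward 1125300 / 2400 ≈ 469.
```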

The brown-out value is not known to be exactly 2.4V. It’s no more precise than the bandgap itself (obviously, since it uses the bandgap to determine the value). When set at 2.4 volts, the actual threshold could be anywhere from 2.2V to 2.6V. See the BOD threshold table in the datasheet.

2.4V is well below the minimum that you want to run the battery down to. Look at the discharge curve in this article. A good value for low battery warning would probably be around 3.4V.

However, the 1.1V bandgap could actually vary from 1.0V to 1.2V from chip to chip. With a program that reports a battery voltage of 3.4V assuming a 1.1V bandgap, the actual battery voltage would be 3.09V if the chip’s bandgap is really 1.0V, and 3.71V if it’s really 1.2V. This isn’t taking into account any further inaccuracies due to temperature variations or ADC non-linearity.

So if the battery is really at 3.09V when we get the warning, there may not be enough charge left in the battery to have time to do something about it. If it’s really at 3.71V, we’re at full charge and the warning is useless.

I never meant it was exactly 2.4V, merely that it’s going to turn the chip off when it hits some consistent x.xV. So if we can detect it getting close to x.xV we can do something about it. Accuracy doesn’t matter - the chip is about to turn off whether it’s really 2.5, 2.4, 2.3… etc… the bandgap still serves as a warning of the brownout.

2.4V is well below the minimum that you want to run the battery down to.

If we’re going to use a LiPo we need to set the fuse to a much higher voltage to continue to get the brownout protection. Although the next setting is 3.4V, which might be too high if you think 3.4V is a good “warning” level… not a turn-off level.

So if the battery is really at 3.09V when we get the warning, there may not be enough charge left in the battery to have time to do something about it. If it’s really at 3.71V, we’re at full charge and the warning is useless.

True, that’s why I want to see it in the real world. The Gamebuino, for example, has battery polling methods, so they must not be entirely useless.

But there’s no way to determine what the absolute value of x.xV is for any given chip, so we can’t use it to determine the voltage of the battery as accurately as we need it. We have no internal reference that is accurate enough to determine the battery voltage, and thus estimate the time remaining, within a margin of error that’s acceptable.

You can’t base things on real world observations. They can change, for the same reasons as discussed in the clock speed discussion (future variations in chip manufacture, etc.). You have to assume that the full range specified by the manufacturer could occur. Otherwise, why wouldn’t they have a tighter specification in the first place?

However, although we don’t know the absolute value of the voltage of the bandgap for an arbitrary chip, it’s probable that whatever that voltage is will be stable enough, over temperature and time, to accomplish the goal, provided each unit is individually calibrated and the calibration is saved and used.

To summarise my thoughts on addressing the problem:

1. Ideally, attach an external reference to the AREF pin that is precise and stable enough for our purposes. Also, the battery’s positive lead would have to be connected to one of the analog input pins through a resistor divider.
2. If we don’t go with the above hardware solution, calibrate each unit and save the calibration in EEPROM as follows:
• Determine which is more stable over temperature and time: the bandgap or the 2.56V internal reference (the absolute voltage value is not important). If the 2.56V reference is more stable, it would be better to use it instead of the bandgap, but this would again require connecting the battery to an analog input through a resistor divider.

• If the bandgap is more stable or we don’t want to add circuitry to connect the battery to an analog input, then use the bandgap.

• Allocate a byte in the EEPROM for battery low detection calibration, and come up with a method to make sure it won’t be overwritten, such as what is being discussed here (or at least just declare some EEPROM space as reserved for system use only).

• To calibrate a unit you would load a battery calibration sketch. The sketch would have two modes: a calibration mode to determine the proper value and save it, and an adjust mode to allow the user to manually modify this value.

• Calibration mode would be run with the unit in an environment with a typical ambient temperature, say 22°C (72°F). You would disconnect the battery and power the unit with a stable, clean power supply set at the desired low battery voltage (hopefully the Arduboy would be designed so it was easy to do this). Calibration mode would run a loop that simulated the conditions of a typical high load, user program. For instance, display a quickly changing pattern that lights 75% of the pixels while sounding short beeps on the speaker two or three times per second. It would also continuously read (and perhaps average over a short time) and display a value obtained by using the ADC to read the input voltage. When the user feels that the unit has run long enough for the internal temperature and reading to stabilize, a button would be pushed to save the value in EEPROM.

• A different technique, that makes it easier to set up the unit for calibration, would be to just power the unit with a stable, clean, accurate 5V supply connected to the USB port. Calibration mode would then read the bandgap voltage with AREF set to Vcc. Since we know the reference is exactly 5V, we can determine the actual voltage of the bandgap for this chip. Alternatively, if the Arduboy has been designed to allow using the 2.56V reference to read Vcc via an analog input, we just have to read the (scaled) Vcc using the 2.56V reference. Again, knowing that Vcc is actually 5V, we can determine the actual voltage of the reference based on the reading we get. This reading can then be used to calculate the low voltage calibration parameter to be stored in EEPROM. The problem with this technique is that it may not be as accurate as using a supply voltage equal to that of a low battery, because both the bandgap and the 2.56V reference can change slightly when Vcc changes. Also, operating at a higher Vcc will change the internal temperature of the chip, which can also affect the bandgap and reference voltages.
• Adjust mode would display the stored value and allow the user to tweak the value if it was later found to be slightly incorrect, or if more or less time was desired before a low battery warning. The user could also write down the value so it could be manually restored if it accidentally became corrupted.

• A sketch wishing to implement a low battery warning would call a provided library function fairly frequently, say once every few seconds. The function would read the battery voltage and compare it to the value stored in EEPROM. Every time the function reads a value higher than the stored value, it resets an internal counter stored in a private variable. Each time the function reads a value at or below the stored value, it increments the counter, up to a certain threshold. If the threshold count has been reached, the function returns true to the sketch, indicating that the battery is low. Otherwise, the function returns false.

• It would be up to the sketch to display a warning, beep the speaker, or do whatever is desired when low battery is indicated. Or, another library function could be added to display, or otherwise handle, a low battery warning in a standard way, that could be used by any sketch.
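The 5V-supply calibration described above boils down to two pieces of arithmetic. A sketch of just that math (plain C++; the function names are mine, and the ADC reads themselves are only described in the comments):

```cpp
#include <cstdint>

// Step 1: with a known, accurate 5.000V supply on USB, read the bandgap
// channel with AVcc (= 5V) as the reference. Then count = 1023 * Vbg / 5000,
// so this chip's actual bandgap voltage in millivolts is:
uint16_t bandgapMvFromCal(uint16_t countAt5V) {
    return (uint16_t)(5000UL * countAt5V / 1023);
}

// Step 2: convert the calibrated bandgap into the ADC count corresponding
// to a chosen low-battery warning voltage. Later bandgap readings at or
// ABOVE this count mean Vcc has fallen to or below the warning level
// (lower Vcc => higher count).
uint16_t warnCount(uint16_t bandgapMv, uint16_t warnMv) {
    return (uint16_t)(1023UL * bandgapMv / warnMv);
}
```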
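The consecutive-readings check from the library-function idea above might look like this (a sketch; the threshold of 4 consecutive low readings is an arbitrary choice of mine):

```cpp
#include <cstdint>

// Debounced low-battery check, as described: a run of consecutive low
// readings is required before reporting low battery, and any single good
// reading resets the run. The threshold of 4 is only an example value.
const uint8_t LOW_READINGS_NEEDED = 4;

bool batteryLow(uint16_t readingMv, uint16_t storedThresholdMv) {
    static uint8_t lowCount = 0;         // private, persists between calls
    if (readingMv > storedThresholdMv) {
        lowCount = 0;                    // good reading: reset the counter
        return false;
    }
    if (lowCount < LOW_READINGS_NEEDED)
        lowCount++;                      // low reading: count it (capped)
    return lowCount >= LOW_READINGS_NEEDED;
}
```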

The absolute value doesn’t matter. The chip turns itself off based on the calculated value - so we can make guesses on that - if it moves slowly enough.

I’m not against a calibration tool if that helps. Is USB voltage always that precise, though?

My plan is to have 16 bytes available for system use. So far all I’ve got is brightness, sound on/off, and now battery calibration.

That’s why I said you would connect a stable, clean, accurate 5V supply to the USB port when doing the calibration. You would use a bench power supply, or something similar, that you would be sure was providing exactly 5V (within the tolerance needed to give an acceptable low battery calibration value).

Again, a better, but possibly harder to set up, method would be to power the unit at exactly the desired low battery voltage during calibration.

Can you explain in detail the technique you would use to do this? I’m not sure I’m following what you’re trying to say.

The code shown above. It uses the bandgap to measure voltage - so does the chip’s own brownout detection… so the absolute value doesn’t matter. We’d be using the exact same measurement (accurate or not) that the chip uses to decide when to turn itself off - making it the best possible way to guess the voltage without a dedicated circuit (as you suggested).

It doesn’t matter if it’s really 2.3 or 2.6… if the chip THINKS it’s 2.41 and it shuts off at 2.4, then that’s all we need to know. Accurate would be better, but doesn’t really change the equation much.

Summary: if it’s accurate enough for brown-out detection, it’s accurate enough for us.

So please manually work through the code that you propose to use and tell me what the battery voltage will be for the two cases when the brown out detection actually turns out to be 2.2V or 2.6V, when set to 2.4V.

I’m not sure I see your point. No one has really suggested changing the fuse… So if the chip THINKS it’s 2.41, that’s all that matters. I think you already did the math to show reality could vary a lot in the worst case, but we can’t do anything about that. The chip is going to brown-out shutoff at 2.4V (calculated) regardless of the ACTUAL voltage. So if we know the same calculated voltage the chip does, then we know when the chip plans on turning off - which is the whole point: to know when the chip plans on turning off.

If you really want to join a more real-time discussion you could join the IRC channel.

I’m sorry; maybe my misunderstanding and the reason for my confusion is the $64,000 question:
What actions and procedures do you feel a sketch should perform when it determines that the chip is about to turn off (such as informing the user, saving data, etc.)?

Sure, those are some good possibilities… would kind of be up to the app what it should do if it even cares. Though writing the EEPROM at the last second might be scary… we’d need to know it was more like last minutes I think.

If this is at all accurate as to how our LiPo might behave, it seems it might be possible to have an actual battery indicator.

OK, since you don’t want to be specific, let me ask in another way. In your opinion, how much time would be ideal between getting a low battery indication and the unit powering off due to brownout detection, assuming that you can only have a single time value?

Some people might want a fairly long time, like 10 to 20 minutes, so they can continue playing for a while, before connecting to USB to continue playing, or manually pausing and saving the game and then switching the unit off. However, for this case they would run the risk of losing their position or other data if they continue for too long and the unit shuts down abruptly, without warning and without saving anything.

Others may want just a few seconds to likely only give the sketch time to save important data and then display “battery too low” before automatically saving and pausing the game, then waiting for either power to be restored via USB or the unit to shut down due to brown out detection.

Both of the above situations could be desirable, but if only one low battery time could be determined, what would you personally want it to be?

IIRC the Nintendo 3DS battery lasts about 4-5 hours, the light turns red when you have about 20 minutes left, and the light starts flashing when you have about 1-2 minutes left. That seems like a good idea, right down to standardizing the use of the LED for that purpose. It doesn’t interrupt gameplay to the extent that you would need to pause everything or pop up a message or worry about how different games might want to handle it. Just a warning light that eventually starts flashing when it gets really low.


Well, I don’t think it would be just one value for starters… if the discharge curve looks anything like the Gamebuino’s we could tell them things like 30%, 20%, 10%, 5% and let the program decide what to do with that information. The program could decide to not even check the battery; it’s entirely the app’s decision. Really good apps probably should check, if the built-in lib doesn’t have some “os level” logic.
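If the discharge curve really is usable, the 30/20/10/5% idea could be a small lookup table with interpolation. A sketch with made-up break points (a real table would have to come from measuring our actual battery’s curve):

```cpp
#include <cstdint>

// Hypothetical LiPo discharge table (millivolts -> remaining percent).
// These break points are purely illustrative.
struct CurvePoint { uint16_t mv; uint8_t pct; };
const CurvePoint kCurve[] = {
    {4200, 100}, {3900, 60}, {3700, 30}, {3500, 10}, {3300, 0}
};
const int kCurveLen = sizeof(kCurve) / sizeof(kCurve[0]);

uint8_t batteryPercent(uint16_t mv) {
    if (mv >= kCurve[0].mv) return 100;
    for (int i = 1; i < kCurveLen; i++) {
        if (mv >= kCurve[i].mv) {
            // linear interpolation between the surrounding break points
            const CurvePoint &hi = kCurve[i - 1];
            const CurvePoint &lo = kCurve[i];
            return lo.pct +
                (uint32_t)(mv - lo.mv) * (hi.pct - lo.pct) / (hi.mv - lo.mv);
        }
    }
    return 0;  // below the last point: treat as empty
}
```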

It also might be nice if, when you first turned it on and the battery was super low, you got a notification - before your game even loads…

One negative here is that Kevin has said the LED uses more current than the entire OLED screen… so by flashing the LED we’re not only telling the user about the problem, but also potentially making it a lot worse. Flashing for “really low” sounds neat though.

We could flash a few times when the user has about 10-20 minutes left and hope they see it, and then flash continuously when it’s extremely low. It wouldn’t even have to be a fast flash, could be once every second. It’s piercingly bright so I don’t think we’d want to leave it on for extended periods of time anyway, it should always be a quick flash. Is it possible to set up an interrupt for that?
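It should be doable with a timer interrupt, but even a plain millis()-style check in the main loop keeps the flash short without blocking anything. The timing logic itself is trivial (the 50 ms pulse and 1 s period are my guesses at reasonable values):

```cpp
#include <cstdint>

// Non-blocking "quick flash once per second": given the current time in
// milliseconds (e.g. from millis() on the Arduboy), return whether the
// LED should be lit. A 50 ms pulse every 1000 ms keeps the average LED
// current low while staying visible.
bool ledShouldBeOn(uint32_t nowMs) {
    return (nowMs % 1000) < 50;
}
```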