Battery power level

Sure, those are some good possibilities… it would kind of be up to the app what to do, if it even cares. Though writing the EEPROM at the last second might be scary… we’d need to know it was more like the last few minutes, I think.

If this is at all accurate as to how our LiPo might behave, it seems it might be possible to have an actual battery indicator.

OK, since you don’t want to be specific, let me ask in another way. In your opinion, how much time would be ideal between getting a low battery indication and the unit powering off due to brownout detection, assuming that you can only have a single time value?

Some people might want a fairly long time, like 10 to 20 minutes, so they can continue playing for a while before either connecting to USB to keep playing, or manually pausing and saving the game and then switching the unit off. However, in this case they would run the risk of losing their position or other data if they continue for too long and the unit shuts down abruptly, without warning and without saving anything.

Others may want just a few seconds, likely only enough to give the sketch time to save important data and display “battery too low”, automatically pausing the game and then waiting for either power to be restored via USB or the unit to shut down due to brownout detection.

Both of the above situations could be desirable, but if only one low battery time could be determined, what would you personally want it to be?

IIRC the Nintendo 3DS battery lasts about 4-5 hours, the light turns red when you have about 20 minutes left, and the light starts flashing when you have about 1-2 minutes left. That seems like a good idea, right down to standardizing the use of the LED for that purpose. It doesn’t interrupt gameplay to the extent that you would need to pause everything or pop up a message or worry about how different games might want to handle it. Just a warning light that eventually starts flashing when it gets really low.

Well, for starters, I don’t think it would be just one value… if the discharge curve looks anything like the Gamebuino’s, we could tell them things like 30%, 20%, 10%, 5% and let the program decide what to do with that information. The program could decide to not even check the battery. Entirely the app’s decision. Really good apps probably should check it, if the built-in lib doesn’t have some “OS level” logic.
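Purely as a hypothetical sketch of what I mean (the names are made up, and the millivolt cutoffs are placeholders until someone measures a real discharge curve on actual hardware):

```cpp
#include <stdint.h>

// Hypothetical: the library reports a coarse level and the app decides what,
// if anything, to do with it. The millivolt cutoffs below are placeholders.
enum BatteryLevel {
  BATTERY_OK,
  BATTERY_30_PERCENT,
  BATTERY_20_PERCENT,
  BATTERY_10_PERCENT,
  BATTERY_5_PERCENT
};

BatteryLevel batteryLevel(uint16_t millivolts) {
  if (millivolts > 3750) return BATTERY_OK;
  if (millivolts > 3700) return BATTERY_30_PERCENT;
  if (millivolts > 3650) return BATTERY_20_PERCENT;
  if (millivolts > 3550) return BATTERY_10_PERCENT;
  return BATTERY_5_PERCENT;
}
```

An app that doesn’t care just never calls it.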

It also might be nice if, when you first turned the unit on with a super low battery, you got a notification - before your game even loads…

One negative here is that Kevin has said the LED uses more current than the entire OLED screen… so by flashing the LED we’re not only telling the user about the problem, but also potentially making it a lot worse. Flashing for “really low” sounds neat though.

We could flash a few times when the user has about 10-20 minutes left and hope they see it, and then flash continuously when it’s extremely low. It wouldn’t even have to be a fast flash; it could be once every second. It’s piercingly bright, so I don’t think we’d want to leave it on for extended periods of time anyway; it should always be a quick flash. Is it possible to set up an interrupt for that?

We don’t need to if people use the frame API. We could check it in our frame management code.
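In other words, no interrupt needed. Whether the check ends up inside the library’s frame code or in the sketch itself, the idea would look roughly like this - just a sketch, using Arduboy2-style frame helpers, a made-up readBatteryMillivolts() and guessed thresholds:

```cpp
#include <Arduboy2.h>

Arduboy2 arduboy;

// Placeholder for whatever measurement we end up with (bandgap trick,
// divider on an analog pin, etc.) -- not a real library function.
uint16_t readBatteryMillivolts() {
  return 3700;  // stub value so the sketch compiles
}

void setup() {
  arduboy.begin();
  arduboy.setFrameRate(60);
}

void loop() {
  if (!arduboy.nextFrame())
    return;

  // Only bother checking the battery about once per second, not every frame.
  if (arduboy.everyXFrames(60)) {
    if (readBatteryMillivolts() < 3500) {  // "really low" threshold is a guess
      arduboy.setRGBled(255, 0, 0);        // quick red blink only --
      delay(10);                           // the LED draws a lot of current
      arduboy.setRGBled(0, 0, 0);
    }
  }

  arduboy.clear();
  // ... draw the game here ...
  arduboy.display();
}
```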

I’ve now come to realise that you probably meant Gamebuino, not Gameduino, which is why I was confused (but never mentioned it) when you talked about it monitoring a battery.

I’ve now taken a look at the Gamebuino schematics. The reason that the Gamebuino can monitor the battery voltage accurately enough to determine its charge state is that it was designed to be able to do so.

It always runs the CPU from a regulated 3.3V supply, regardless of whether it’s running from battery or USB. (Actually, it only ever runs from the battery, but it can do so while the battery is being charged.) Having a stable and accurate 3.3V (+/- 1%) powering the CPU means it can use this as an exact reference for the ADC. The Gamebuino also has the raw battery positive terminal connected to an analog input through a resistor divider (as I proposed doing for the Arduboy), allowing the ADC to take a reading and calculate the battery’s true voltage accurately. The 3.3V is also connected to AREF, but I don’t think that’s required.
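To show why that works, here’s a rough sketch of the Gamebuino-style conversion (the analog pin and the 2:1 divider ratio are just assumptions for illustration, not the actual Gamebuino values):

```cpp
#include <Arduino.h>

// With the CPU running from an accurate, regulated 3.3V, the default ADC
// reference (Vcc) is already exact. Assume the battery's positive terminal
// feeds analog pin A3 through a divider that halves the voltage.
const uint16_t VREF_MV = 3300;  // regulated supply, +/- 1%
const uint8_t  DIVIDER = 2;     // two equal resistors halve the battery voltage

uint16_t readBatteryMillivolts() {
  uint32_t raw = analogRead(A3);                   // 0..1023, relative to Vcc
  return (uint16_t)(raw * VREF_MV * DIVIDER / 1023);
}
```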

However, @bateske has mentioned that the Arduboy’s CPU will be running at 3.7V, suggesting that it will be powered by the raw, unregulated voltage directly from the battery. Therefore Vcc cannot provide the accurate known reference voltage that we need to determine the battery state. And, we already know that the internal reference and bandgap are not precise enough either.

With no precise, stable reference, we’re left with individually calibrating and saving a value for each individual Arduboy, as I proposed.

However, I’ve realised that even if the Arduboy’s CPU is run directly from the battery, it may be easy to avoid needing individual calibration, with just a few simple, inexpensive hardware changes:
There appears to be a linear LDO regulator on the Arduboy in the photos shown on Kickstarter. It’s the small, black chip with five leads, surrounded by C7, C10 and R7, in the lower left corner. I’m guessing it’s a 3.3V regulator required for powering the display. If this is the case, its output could be connected to the CPU’s AREF pin, giving us the accurate reference that we need. Then all we need to do is tie the positive lead of the battery to an unused analog input through a two-resistor voltage divider. (The resistors should have a tolerance of 1% or better.) Who knows? Maybe this is already part of the design. @bateske has been pretty quiet about some of the production Arduboy’s specific hardware and wiring.
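If the regulator’s output were wired to AREF like that, the software side would be trivial. A rough sketch, assuming an (arbitrarily chosen) analog pin A3 and a 2:1 divider - all hardware details here are assumptions:

```cpp
#include <Arduino.h>

// Assumes the display's 3.3V regulator output is tied to AREF, and the
// battery's positive lead feeds A3 through a 1%-tolerance two-resistor
// divider that halves the voltage.
void setupBatteryMeasurement() {
  analogReference(EXTERNAL);  // use the 3.3V on AREF instead of the raw Vcc
  analogRead(A3);             // discard the first reading after switching references
}

uint16_t readBatteryMillivolts() {
  uint32_t raw = analogRead(A3);             // 0..1023, relative to AREF (3.3V)
  return (uint16_t)(raw * 3300UL * 2 / 1023);
}
```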

Again, accuracy is not so relevant if we’re using the same reference point as the chip itself… if the curve of our LiPo looks like theirs, then we’d see the discharge pattern even if all we had was the bandgap for reference. Again, it will require experimentation to see how useful it is. If we can get a dedicated voltage measuring circuit, that’s even better (and a great suggestion) - but if not, we’ll try and manage with what we have. I don’t see how calibration helps with the low point… since the shutoff is fixed at 2.4 volts as measured by the CPU without correction… but it might help with a more accurate battery level, such as 50% or 70%. I’ve never said I oppose calibration. I’ve only said I think the raw value has value for some uses.

My point has just been that using the bandgap has to be way better than nothing at all - and it’s as accurate as the chip’s own brownout detection.
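For anyone following along, the “simple measurement” I mean is the well-known trick of reading the internal bandgap against Vcc and back-calculating Vcc - which, on a unit running straight off the battery, is the battery voltage. A typical version looks roughly like this (the MUX bits shown are for the ATmega32U4; the 1125300 constant is just 1.1V * 1023 * 1000, i.e. it assumes the nominal bandgap value, which is exactly the accuracy question being debated):

```cpp
#include <Arduino.h>

// Estimate Vcc in millivolts by measuring the internal ~1.1V bandgap with
// Vcc as the ADC reference, then working backwards.
long readVccMillivolts() {
  ADCSRB &= ~_BV(MUX5);  // bandgap channel has MUX5 = 0 on the ATmega32U4
  // Select the bandgap input with AVcc as the reference.
  ADMUX = _BV(REFS0) | _BV(MUX4) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1);
  delay(2);                             // let the reference settle
  ADCSRA |= _BV(ADSC);                  // start a conversion
  while (bit_is_set(ADCSRA, ADSC)) { }  // wait for it to complete

  long result = ADC;        // 10-bit reading of the bandgap relative to Vcc
  return 1125300L / result; // 1.1 * 1023 * 1000 -- assumes a 1.1V bandgap
}
```

No extra parts needed, which is the whole appeal.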

No reason to keep discussing this with me - but please talk with @bateske about improving the circuits. If we get better voltage detection, we’ll use that. But when I have my dev unit I’m going to add support for the simple measurement and see how well it works in practice with different units and different batteries.

Give me just one of those uses. What is the goal and what procedure would be followed to accomplish it?

I know that at this point you’re probably seeing my latest posts as long-winded, redundant rants, but if you’re willing to listen one more time, I’d like to use whatever example you give to present a detailed explanation of why attempting to use just the raw, uncalibrated bandgap and Vcc to accomplish it would be difficult. This would be assuming that the bandgap could vary between different Arduboys, from the minimum to maximum values given in the specification (1.0V to 1.2V), and that the battery measurement technique would be based on:

I would use the Gamebuino battery discharge curve that you’ve posted.
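Just to give a rough sense of the scale of the problem in advance: the bandgap-based formula assumes the nominal 1.1V, so an uncalibrated reading is off by a factor of 1.1 divided by the unit’s actual bandgap. With a true battery voltage of 3.70V, a unit whose bandgap sits at the 1.0V end of the spec would report about 3.70 * 1.1 / 1.0 = 4.07V, while one at the 1.2V end would report about 3.70 * 1.1 / 1.2 = 3.39V. That spread of nearly 0.7V covers a large part of a LiPo’s usable voltage range.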

I already have, and I’ve said why “accuracy” does not matter on multiple occasions. So there’s nothing more to be said on the topic. Next, please.

You’ve already given an example use? Please just point me to a post of yours that gives it, then.

Anyone know what the magic constant here is all about?

@Dreamer3,

It’s explained in the comments section of the original article.

Thanks, super helpful.