Greyscale for Arduboy

FYI: if, instead of using the usual nextFrame() call in loop(), you busy-wait for a specific time (in microseconds) between buffer switches, and make that wait time interactively selectable, you can find a wait value that makes things run fine for a few seconds. The wait value is different for different Arduboy units, and not repeatable/stable even on the same unit, so of course something like this would not be acceptable for a released application. I tried this as part of my full-screen double-buffer tests, just to confirm my expectations before giving up on it.
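For anyone curious, here is a minimal sketch of the kind of test I mean, using the Arduboy2 library. The starting value of 14000 µs, the button mapping, and the image contents are illustrative only; also note the two full-screen buffers eat most of the ATmega32U4’s 2.5 KB of RAM, which is another reason this stayed a throwaway test.

```cpp
#include <Arduboy2.h>

Arduboy2 arduboy;
uint8_t frameB[1024];      // second image of the grey pair (frame A is the library buffer)
uint32_t waitUs = 14000;   // illustrative starting value, adjusted at runtime
bool showA = true;

void setup() {
  arduboy.begin();
  // draw the two planes of the grey image; here frame B is just a copy
  arduboy.fillRect(0, 0, 64, 64, WHITE);
  memcpy(frameB, arduboy.getBuffer(), sizeof(frameB));
}

void loop() {
  // make the wait time interactively selectable
  if (arduboy.pressed(UP_BUTTON))   waitUs += 10;
  if (arduboy.pressed(DOWN_BUTTON)) waitUs -= 10;

  // busy wait instead of the usual nextFrame() pacing
  uint32_t start = micros();
  while (micros() - start < waitUs) { }

  // switch buffers: push one of the two images to the display
  arduboy.paintScreen(showA ? arduboy.getBuffer() : frameB);
  showA = !showA;
}
```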

Regarding this magic value, I made another proposal in a different thread: maybe it is possible to characterize a good average value and the jitter around it, instead of using a fixed value, and then jitter randomly around that average in the update loop. I thought this might reduce flicker.
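In code the proposal is tiny; avgUs and jitterUs below are hypothetical numbers that would have to be measured per unit:

```cpp
// Instead of a fixed wait, jitter randomly around a measured average.
#include <Arduino.h>

const uint32_t avgUs = 14500, jitterUs = 200;  // hypothetical measured values

uint32_t nextWaitUs() {
  return avgUs - jitterUs + random(2 * jitterUs + 1);  // avg ± jitter
}
```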

Many thanks for the detailed answer and your work on the topic.
I am still curious about your comment below:

Does that mean the command can fail, or that we fail to send it at the right moment?

Does that mean the command can fail, or that we fail to send it at the right moment?

We fail to send it at the right moment.

Maybe it is possible to characterize a good average value and the jitter around it, instead of using a fixed value, and then jitter randomly around that average in the update loop. I thought this might reduce flicker.

It won’t work (reliably), and the jumping around will not be an improvement over using a constant to approximate the expected value. You have two systems (the MCU and the display controller), each with its own clock, and the phase relationship between the clocks is unknown and varying. To put rough numbers on it: at a refresh rate of around 100 Hz a frame lasts about 10 ms, so even a 1% mismatch between your timer and the display’s internal oscillator moves the swap point by roughly 100 µs per frame and walks it across the entire frame in about a second. You can only get reliable behaviour if you use the same clock for both (or derive one clock from the other), or set up a feedback loop (i.e. via a vsync signal).

OK, I see. So the clock drift between the MCU and the display controller cannot be compensated for, as there is no feedback and no common clock source.
I wonder if there is any other source of feedback that could be used (just kidding), like measuring voltage glitches on the OLED connector to see when it does vblanks, etc.? I assume there will be some variation, as the operation of the controller is very periodic. :star_struck:

Again, you thought this out very deeply! I am impressed. Nice work. I am a fan of outside-the-box thinking. :smiley:

I just want to note that we have tried on 4 separate occasions, with multiple factories, to get the vsync signal broken out, but they just won’t do it. Some even said it was impossible, which isn’t right.

I wonder if there is any other source of feedback that could be used (just kidding), like measuring voltage glitches on the OLED connector to see when it does vblanks, etc.?

lol :laughing:

Again, you thought this out very deeply! I am impressed. Nice work.

Why, thank you. I wasn’t expecting to succeed, and indeed I didn’t, but when I got my Arduboy last week I decided to try my luck and see if something had been overlooked with the display controller. The silver lining is that I learned a lot about this controller’s idiosyncrasies, and I have stumbled upon some commands’ side effects (i.e. hacks) that may be useful for demos, cutscenes or special effects some day.

@bateske Impossible probably means ‘too expensive’ in this case, but thank you for trying!

This actually is possible. I’ve placed an oscilloscope probe close to the display, where it acts as an antenna and picks up voltage spikes that are in sync with the display’s internal clock frequency. I don’t remember for sure, but I think the spike frequency matched the horizontal row timing.

However, to make use of this you would need pickup, amplification and decoding circuitry, which would probably be (unjustifiably) expensive to add and take up a fair amount of real estate.


@veritazz FYI I have pushed a commit to the repository that reduces flickering.

Thread, I command thee: get up and walk

Hi all, apologies for bringing this back from the dead (lies, I totally loved it), but I have updated the demo in this repo and I think some of you might be interested.

By using the zoom function of the display controller (doubling the line size), the demo can now show 2-bit grayscale full-screen while keeping the current RAM footprint for the framebuffer. The demo also cycles through different modes when the B button is pressed: half-screen centered -> half-screen top -> half-screen bottom -> half-screen bouncing up and down -> full-screen.

The demo uses 1/3 less CPU by keeping one of the half-frames up twice as long. It uses the display controller’s RAM as the frame buffer and switches between the two halves of that RAM using controller commands only, so as long as the image doesn’t change, the CPU doesn’t have to do much. A minimal sketch of the controller commands involved follows.
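For the curious, this is roughly what the zoom trick looks like on an SSD1306-style controller. The command values (0xD6 “Set Zoom In”, 0x40 | n “Set Display Start Line”) are from the SSD1306 datasheet; double-check them against your display before relying on this, and note that the naive alternation in loop() below will flicker, which is exactly what the timing discussion above is about.

```cpp
#include <Arduboy2.h>

Arduboy2 arduboy;

// pick which half of the controller RAM is displayed
void showHalf(bool second) {
  // "Set Display Start Line" (0x40 | n), n = 0 or 32
  arduboy.sendLCDCommand(0x40 | (second ? 32 : 0));
}

void setup() {
  arduboy.begin();
  // "Set Zoom In": each row is doubled, so a 128x32 image fills the
  // 128x64 panel and the controller's 1 KB RAM holds two such images
  arduboy.sendLCDCommand(0xD6);
  arduboy.sendLCDCommand(0x01);
}

void loop() {
  // alternate the two halves to see the grey effect (unsynced, flickery)
  showHalf(true);
  delay(8);
  showHalf(false);
  delay(8);
}
```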

I have a better phone, so this time I could capture a video. Still, it looks better on a real device!

untitled.m4v (1.5 MB)


Necropooooooost.

Wow! Uh, so is the flickering visible to the eye? It still doesn’t solve the vsync problem; it just reduces the memory footprint, is that right?

Yes, but it’s not that bad, because the entire screen is swapped by the controller during vsync. Please try it and let me know what you think. The flickering could be further reduced by adding an interactive calibration screen (I have tried this already) that lets the user adjust a time constant whenever they feel there is too much flicker. Upon exiting the calibration screen, the time constant is saved to EEPROM.

It allows the current-size framebuffer (1024 bytes) to be used for a 128x32-pixel grayscale screen mode. The mode would need a separate set of drawing and SPI transfer methods, because the layout of the framebuffer is different (one plausible layout is sketched below), but it uses very few extra CPU cycles compared to monochrome: the number of pixels to be drawn/transferred is the same as in monochrome mode, and switching between buffers on the controller is done with a two-byte controller command.
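To illustrate what “different layout” means, here is one plausible arrangement: two 512-byte bit-planes back to back, each using the usual SSD1306 page layout. This is an illustration of the idea, not necessarily the layout the repo actually uses.

```cpp
#include <stdint.h>

uint8_t gbuf[1024];   // plane 0 in bytes 0..511, plane 1 in bytes 512..1023

// set a 2-bit pixel (shade 0..3) in a 128x32 two-plane framebuffer
void setPixel2bpp(uint8_t x, uint8_t y, uint8_t shade) {
  uint16_t idx = (y / 8) * 128 + x;      // standard page layout per plane
  uint8_t  bit = 1 << (y & 7);
  if (shade & 1) gbuf[idx] |= bit;       else gbuf[idx] &= ~bit;        // plane 0
  if (shade & 2) gbuf[512 + idx] |= bit; else gbuf[512 + idx] &= ~bit;  // plane 1
}
```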

Another advantage is that the game’s frame rate is decoupled from the display updates that produce the grey effect, i.e. a game can run at any frame rate, even a very low one, while the switching between buffers on the display happens at a (fixed) high speed to achieve the grayscale display.

The technique boils down to the following (a rough sketch follows the list):
1- Start with a good default for the display buffer swap time constant, and use the value from EEPROM if available.
2- Provide a calibration screen to fine-tune the swap time constant and save it to EEPROM. The user opens the calibration screen whenever needed.
3- Use a timer with a delay derived from the swap time constant to swap the half-buffer being displayed (using a short display controller command). If a new framebuffer update has been posted by the application, transfer the buffer to the display controller, taking care to write to the half-buffer not currently being displayed.
4- The user application paints the framebuffer in the MCU’s RAM and posts it.
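Here is a rough, non-authoritative sketch of steps 1-4, building on the zoom setup above. The names (swapPeriodUs, postFrame(), service()), the EEPROM slot and the default value are all made up for illustration, and the actual SPI transfer of the hidden half is elided.

```cpp
#include <Arduboy2.h>
#include <EEPROM.h>

Arduboy2 arduboy;

const int SWAP_EEPROM_ADDR = EEPROM_STORAGE_SPACE_START; // illustrative slot
uint16_t swapPeriodUs;             // display buffer swap time constant
uint8_t  frame[512];               // 128x32 framebuffer painted by the app
volatile bool framePosted = false;
bool secondHalf = false;
uint32_t lastSwap = 0;

void setup() {
  arduboy.begin();
  arduboy.sendLCDCommand(0xD6);    // zoom in, as in the earlier sketch
  arduboy.sendLCDCommand(0x01);
  EEPROM.get(SWAP_EEPROM_ADDR, swapPeriodUs);        // step 1: stored value...
  if (swapPeriodUs == 0xFFFF) swapPeriodUs = 14500;  // ...or a good default
  lastSwap = micros();
}

// step 4: the application paints `frame` and then posts it
void postFrame() { framePosted = true; }

// step 3: call as often as possible (or drive from a timer interrupt)
void service() {
  if (micros() - lastSwap < swapPeriodUs) return;
  lastSwap += swapPeriodUs;
  secondHalf = !secondHalf;
  arduboy.sendLCDCommand(0x40 | (secondHalf ? 32 : 0)); // swap displayed half
  if (framePosted) {
    framePosted = false;
    // transfer `frame` into the half that is now hidden
    // (column/page addressing + SPI transfer elided for brevity)
  }
}

void loop() {
  service();
  // step 2 would go here: a calibration screen that adjusts swapPeriodUs
  // with UP/DOWN and saves it with EEPROM.put(SWAP_EEPROM_ADDR, swapPeriodUs)
}
```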

I have updated the repository with a couple of changes:

1- The A button toggles suspended rendering, i.e. the display’s frame buffer is no longer updated. The bouncing demo mode still bounces the screen up/down, because the effect is achieved via display controller commands instead of the CPU (see the sketch after the video).
2- The display controller buffer swap period can be adjusted with the UP/DOWN buttons to make the flicker almost disappear. New video attached.

period_adj.m4v (1.2 MB)
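For reference, bouncing the image without touching the frame buffer can be done with the SSD1306 “Set Display Offset” command (0xD3 followed by an offset byte); something along these lines, with the step size and limits made up for illustration:

```cpp
#include <Arduboy2.h>

Arduboy2 arduboy;
int8_t offset = 0, dir = 1;

// shift the whole displayed image vertically; two command bytes per step
void bounceStep() {
  offset += dir;
  if (offset <= 0 || offset >= 32) dir = -dir; // bounce a half-height image
  arduboy.sendLCDCommand(0xD3);                // "Set Display Offset"
  arduboy.sendLCDCommand(offset);              // vertical shift in rows
}

void setup() {
  arduboy.begin();
}

void loop() {
  bounceStep();
  delay(30);  // ~33 bounce steps per second
}
```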


Don’t say this, it’s misleading. The entire screen might be swapped by the controller (which does seem to be a win), but there is no way you can guarantee that happens during vsync (unless the controller magically does it, in which case there would not be any flickering or any need to calibrate anything). A software sync can’t stay in sync, because the CPU and display frequencies actually change with the amount of charge in the battery. I.e., the frequencies you are trying to synchronize are NOT constant.

Unless you could somehow take a battery reading and use it to adjust the sync accordingly.

This [decoupling] can be done in either case, just at the expense of CPU. You can have a frame rate of 10 and paint the screen 120 times a second, decoupled. Doing it via the zoom mode just makes it a VERY low-CPU operation, which is interesting.

You can’t “adjust the sync”; you’re adjusting the speed. There is no real sync. And the problem isn’t the speed, it’s the SYNC. Without true sync you’re fracked. Even if you invented a perfect mapping table and everything was exact, it’s too easy to get out of SYNC… and once you’re out of sync, even if your timing is PERFECT now, it looks even WORSE.

Perfect timing with NO sync is worse than random timing. Play with all the demos and get your timing just a little off, then get it perfect again. You can be left with ugly artifacts on the screen for MINUTES, and the closer you are to the perfect speed, the longer your display stays borked. :slight_smile:

What you’re suggesting only makes sense if you can get a TRUE sync PERIODICALLY. If you could sync up every 60 seconds, then keeping the display stable in between with a carefully calibrated timer might be worth a solid effort… but with NO way to capture a true sync, you’re just a ship without a rudder.

Call me crazy, but that sounds like the more out of sync it is, the better the grey will display. It’s obvious that I don’t have any real idea of how this really works.


You’re not crazy. The earlier grayscale examples that attempted this artificial SYNC also suffered from the same artifact, which showed up as a slowly moving scan line. So you are trading what would otherwise look like random flicker for something that is traceable by the eye.

This example is novel in that it saves on both CPU overhead and RAM.


There is some truth in that. Too bad that “make it perfectly out of sync” is the same type of problem as “make it perfectly in sync”, LOL. You can’t do either well without a sync signal.