Greyscale for Arduboy


Hi, I was wondering if anyone knows how to display greyscale images on the Arduboy. I have seen examples, but they only apply to the Arduboy dev kit.
Any help would be appreciated.


(Mike McRoberts) #2

It cannot be done. The display is made up of white OLED pixels that can either be at full brightness or off. No other values are possible.

What you have seen are examples of a pixel being flashed on and off very quickly to simulate grey. This is fine for small demos, but the Arduboy and its display cannot do this quickly enough to show grey in a game without lots of flicker.


(Mike McRoberts) #3

For now you’ll just have to simulate grey values using dither. Check out ‘Shadow Runner’ for an example of this.
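For illustration, dithering trades spatial resolution for apparent grey levels. Here is a minimal sketch of 2x2 ordered (Bayer) dithering in plain C++ (this is a generic example, not taken from Shadow Runner itself):

```cpp
#include <cstdint>

// 2x2 Bayer threshold matrix. For a grey level expressed as the
// number of lit pixels per 2x2 cell (0 = black .. 4 = white), a pixel
// is lit when its matrix entry is below that level. Viewed from a
// distance, the pattern approximates the intermediate intensity.
static const uint8_t BAYER2[2][2] = {
    {0, 2},
    {3, 1},
};

// Returns true if the monochrome pixel at (x, y) should be lit to
// approximate the given grey level (0..4).
bool ditherPixel(int x, int y, uint8_t grey) {
    return BAYER2[y & 1][x & 1] < grey;
}
```

Drawing a sprite with `ditherPixel` as a mask gives five apparent shades on a 1-bit display, at the cost of a visible checker texture up close.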


(Josh Goebel) #4

The Arduboy can most definitely do it fast enough. To be clear: the issue isn’t speed, it’s synchronization. You’d need to sync the display with the CPU (for best results), which isn’t possible without a vsync pinout.

There is nothing devkit-specific about the original demos other than that they were compiled against the original libs… a lot of people here should be able to easily port them to the production Arduboy if they really wanted to.



I don’t think this deserves its own topic so I’ll leave this here in case someone finds it interesting/useful. This is a demo of half-height double-buffered grayscale for the Arduboy. I’m pasting here a copy of the text which is also included with the source code.

= Arduboy ssd1306 GDDRAM half-height double buffering demo

== Theory of operation
The number of displayed lines refreshed is reduced by setting
ssd1306’s COM mux ratio, in the example code below the mux ratio
is set to 32 lines (half of the maximum). Each half-height frame is
uploaded to the GDDRAM region not being rendered, then
the set display line offset command is used to display the newly
uploaded data.
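For reference, the two command sequences this describes might look like the following sketch (byte values per the SSD1306 datasheet; `displayOffsetCmd` is a hypothetical helper, not from the demo's source, and I'm assuming "set display line offset" refers to the Set Display Offset command, 0xD3):

```cpp
#include <cstdint>

// Halve the number of displayed lines: Set Multiplex Ratio (0xA8)
// to 32 lines (the parameter byte is ratio - 1, hence 0x1F).
const uint8_t kSetHalfMux[2] = { 0xA8, 0x1F };

// Build the two-byte Set Display Offset (0xD3) command that shows one
// of the two 32-row halves of GDDRAM: buffer 0 -> rows 0..31,
// buffer 1 -> rows 32..63. While one half is being scanned out, the
// next frame is written into the other half; this command then flips
// the display to the freshly uploaded half.
void displayOffsetCmd(uint8_t buffer, uint8_t cmd[2]) {
    cmd[0] = 0xD3;                  // Set Display Offset
    cmd[1] = buffer ? 0x20 : 0x00;  // vertical shift: 0 or 32 rows
}
```

Each sequence would be clocked out over SPI with the D/C line low, like any other SSD1306 command.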

== Advantages and disadvantages
The set display line offset command takes effect between frames
so there is no tearing. However, the display is refreshed at
higher frequency (twice as fast in this example because the number
of lines is halved). When displaying grayscale, and in order to minimize
flickering, the contents of GDDRAM have to be updated at the
display’s frame rate. The exact frame rate is unknown, changes
with time (part tolerances, temperature, battery voltage) and there
is no mechanism available in “Arduboy V1” production hardware to
control ssd1306’s fosc or pace screen updates using some feedback
from ssd1306 [1]. This in practice means that while there is no tearing,
flickering is unavoidable to some degree, but this example may still
be useful for demos using grayscale, or for cutscenes.
If anyone uses this technique it would be nice to be credited for it :wink:
unless this turns out to be something everyone knew already and I was
just late to the party!

== Full screen grayscale
I have investigated using this technique to achieve full screen
grayscale by alternating ssd1306 refresh between the top and
bottom half of the screen and my conclusion is that it is not viable.
The technique involves reducing the mux ratio to half-height as above
but both the start line offset and the screen display offset are changed
during display blanking so they are applied at the same time
between frames. Both commands involved take effect during blanking, but
it appears the set display line offset command takes effect on current frame+2
(i.e. during the second blanking period after the command is sent) so this
has to be taken into account.
While in the half screen case only one command is sent, in the full screen
case two commands are sent back to back and it is possible for the second
to be received too late to take effect in the next frame depending on the
exact timing the command is sent in relation to the blanking period. [2]

Additionally, even if the non-atomicity of the two commands’ execution is ignored,
the implementation needs to make sure it is sending exactly one command sequence
per frame, which requires control of ssd1306’s fosc or a feedback mechanism to
notify the MCU of the blanking period, neither of which is possible on "Arduboy V1"
production hardware. So the same issue that makes tearing unavoidable
when rendering the full screen at once exists in this case, only the effect
is visually worse than tearing because when the glitch occurs the bottom half
of the screen is rendered in the top half for a single frame before being corrected.
The overall effect of the glitch is at best a ghosting image of one half of the screen
appearing in the other half once in a while, which is probably undesirable for
most applications unless the glitchy behaviour serves the theme of the application/demo.

[1] The relevant pins are not available on the connector.
[2] Interestingly the fact that the set display line offset command
takes effect one frame later than expected would help working around
the non-atomicity of two sequential commands in the case when the internal
registers they affect are latched between the reception of the two commands.
So this ‘unexpected’ (and undocumented) behaviour of the set display
line offset command may indeed be intentional.


(Michael Gollnick) #6

Very nice description. Can you share a video?
A question about the flickering. If you transfer data into the GDDRAM region not being rendered, why does it flicker?
I understood that you use the set display line command to tell the controller which part of the GDDRAM to display. Is this command executed on a frame boundary by the controller? If so, there should be no flicker, right? Maybe I did not understand it correctly? :wink:


(Kevin) #7

Very well written post, would also like to see some video! :smiley:



Can you share a video?

Sharing is not the issue but first I have to take a decent one. I tried and failed :grimacing:

Is this command executed on a frame boundary by the controller?

Yes, the command is executed by the controller on a frame boundary but see below.

If you transfer data into the GDRAM region not being rendered why does it flicker?

We are still turning pixels fully on and off, so the perceived intensity is an average somewhere between white and black, rather than each pixel actually being driven at an intermediate intensity. So there are two main sources of flickering:

1- Switching to the other half of GDDRAM on every display refresh cycle, without fail, cannot be guaranteed because we don’t have a vsync signal. This causes some pixel states to persist longer than others for a short while and makes the resulting apparent intensity flicker.
2- Even if you were able to do the above there would be some residual flickering depending on other factors, mainly the refresh rate of the display.
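As a toy illustration of source 1: the perceived intensity is roughly the average of a pixel's recent on/off states, so a single missed swap shifts that average noticeably (this is a simplified model, not code from the demo):

```cpp
// Toy model: a mid-grey pixel alternates on (1) and off (0) every
// display refresh. Its perceived intensity is the average over recent
// refreshes. If a swap is missed and one state persists an extra
// refresh, the short-term average jumps, which the eye sees as flicker.
double perceivedIntensity(const int* planeShown, int refreshes) {
    int lit = 0;
    for (int i = 0; i < refreshes; ++i)
        lit += planeShown[i];
    return static_cast<double>(lit) / refreshes;
}
```

With perfect alternation the average stays at 0.5; one persisted "on" state over four refreshes pushes it to 0.75 for that window.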



FYI: if, instead of using the usual nextFrame() call in loop(), you busy-waited for a specific time (in microseconds) between switching buffers and made the wait time interactively selectable, you could find a wait value that makes things run fine for a few seconds. The wait value will be different for different Arduboy units, and not repeatable/stable on the same unit, so of course it would not be acceptable to do something like this in a released application. I tried this as part of my full screen double buffer tests just to confirm my expectations before giving up on it.
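A rough sketch of that experiment, using std::chrono in place of the Arduino micros() clock (`BufferSwapper` and the period value are illustrative, not the actual test code):

```cpp
#include <chrono>
#include <cstdint>

// Instead of nextFrame(), spin until a hand-tuned number of
// microseconds has elapsed, then swap the displayed GDDRAM half.
// swapPeriodUs is the interactively found value; it differs per unit
// and drifts over time, which is why this is only a test rig.
class BufferSwapper {
public:
    explicit BufferSwapper(uint32_t swapPeriodUs)
        : periodUs_(swapPeriodUs),
          last_(std::chrono::steady_clock::now()) {}

    // Busy-waits out the remainder of the period, then returns the
    // index (0 or 1) of the GDDRAM half to display next.
    int waitAndSwap() {
        auto deadline = last_ + std::chrono::microseconds(periodUs_);
        while (std::chrono::steady_clock::now() < deadline) {
            // spin; on the Arduboy this loop would also poll buttons
            // so the user can nudge swapPeriodUs up or down
        }
        last_ = deadline;  // advance by exactly one period, no drift
        front_ ^= 1;
        return front_;
    }

private:
    uint32_t periodUs_;
    std::chrono::steady_clock::time_point last_;
    int front_ = 0;
};
```

Advancing `last_` by exactly one period (rather than re-reading the clock) keeps the average swap rate locked to the tuned constant even if individual iterations overshoot slightly.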


(Michael Gollnick) #10

Regarding this magic value, I made another proposal in a different thread. Maybe it is possible to find a good average value and the jitter around it, instead of using a fixed value. So in the update loop you would jitter randomly around this average. I thought this might reduce flicker.


(Michael Gollnick) #11

Many thanks for the detailed answer and your work on the topic.
I am still curious about your below comment:

Does that mean the command can fail or we fail to send it in the right moment?



Does that mean the command can fail or we fail to send it in the right moment?

We fail to send it at the right moment.

Maybe it is possible to find a good average value and the jitter around it instead of using a fixed value. So in the update loop you jitter randomly around this value. I thought this might reduce flicker.

It won’t work (reliably), and the jumping around will not be an improvement over using a constant to approximate the expected value. You have two systems (MCU and display controller), each with its own clock, and the phase relationship between the clocks is unknown and varying. You can only get reliable behaviour if you use the same clock for both (or derive one clock from the other), or set up a feedback loop (i.e. via a vsync signal).


(Michael Gollnick) #13

Ok I see. So the clock drift between the MCU and the display controller cannot be compensated as there is no feedback nor a common clock source.
I wonder if there is any other source of feedback that can be used (just kidding) like measuring some voltage glitches on the oled connector to see when it does vblanks etc? I assume there will be some variation as the operation of the controller is very periodic. :star_struck:

Again, you thought this out very deeply! I am impressed. Nice work. I am a fan of outside the box thinking. :smiley:


(Kevin) #14

I just want to note that we have tried on 4 separate occasions, with multiple factories, to get the vsync signal broken out, but they just won’t do it. Some even said it was impossible, which isn’t right.



I wonder if there is any other source of feedback that can be used (just kidding) like measuring some voltage glitches on the oled connector to see when it does vblanks etc?

lol :laughing:

Again, you thought this out very deeply! I am impressed. Nice work.

Why, thank you. I wasn’t expecting to succeed, and indeed I didn’t, but when I got my Arduboy last week I decided to try my luck and see if something had been overlooked with the display controller. The silver lining is that I learned a lot about this controller’s idiosyncrasies and I have stumbled upon some commands’ side effects (i.e. hacks) that may be useful for demos, cutscenes or special effects some day.

@bateske Impossible probably means ‘too expensive’ in this case but thank you for trying!


(Scott) #16

This actually is possible. I’ve placed an oscilloscope probe close to the display where it acts as an antenna which picks up voltage spikes that are in sync with the display’s internal clock frequency. I don’t remember for sure but I think the spike frequency matched the horizontal row timing.

However, to make use of this you would need pickup, amplification and decoding circuitry, which would probably be (unjustifiably) expensive to add and take up a fair amount of real estate.



@veritazz FYI I have pushed a commit to the repository which reduces flickering.



Thread I command thee: get up and walk

Hi all, apologies for bringing this back from the dead (lies, I totally loved it) but I have updated the demo in this repo and I think some of you might be interested.

By using the zoom function of the display controller (doubling the height of each line), the demo can now show 2-bit grayscale fullscreen using the same RAM footprint for the framebuffer. The demo also cycles through different modes when the B button is pressed: half-screen centered -> half-screen top -> half-screen bottom -> half-screen bouncing up and down -> full-screen.

The demo uses 1/3 less CPU by keeping one of the half-frames up twice as long. It uses the display controller’s RAM as the frame buffer and switches between the two halves of that buffer using controller commands only, so as long as the image doesn’t change the CPU doesn’t have to do much.

I have a better phone so this time I could capture a video. Still, it does look better on a real device!

untitled.m4v (1.5 MB)


(Kevin) #19


Wow! Uh, so is the flickering visible to the eye? It still doesn’t solve the vsync problem; this just reduces the memory footprint, is that right?



Yes, but it’s not that bad because the entire screen is swapped by the controller during vsync. Please try it and let me know what you think. The flickering could be further reduced by adding an interactive calibration screen (I have tried this already) for the user to adjust a time constant whenever they feel there is too much flickering. Upon exiting the calibration screen, the time constant is then saved to EEPROM.

It allows the current-size framebuffer (1024 bytes) to be used for a 128x32 pixel grayscale screen mode. The mode would need a separate set of drawing and SPI transfer methods because the layout of the framebuffer is different, but it uses very few extra CPU cycles compared to monochrome because, while the number of pixels to be drawn/transferred is the same as in monochrome mode, switching between buffers on the controller is done with a two-byte controller command.
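As a sketch of how 1024 bytes can cover 128x32 at 2 bits per pixel, assuming two SSD1306-style packed bitplanes (this layout is a guess for illustration, not necessarily the demo's actual one):

```cpp
#include <cstdint>
#include <cstddef>

// Hypothetical layout: the usual 1024-byte buffer split into two
// 512-byte planes, each packed SSD1306-style (one byte = 8 vertical
// pixels). A pixel's 2-bit grey level combines its bit in plane 0 and
// plane 1; alternating which plane is shown yields four levels.
const size_t kWidth = 128;
const size_t kHeight = 32;
const size_t kPlaneBytes = kWidth * kHeight / 8;  // 512
uint8_t frame[2 * kPlaneBytes];                   // 1024 bytes total

void setPixel(size_t x, size_t y, uint8_t grey /* 0..3 */) {
    size_t idx = (y / 8) * kWidth + x;   // SSD1306 page-major packing
    uint8_t bit = 1u << (y % 8);
    for (int plane = 0; plane < 2; ++plane) {
        uint8_t* p = &frame[plane * kPlaneBytes + idx];
        if (grey & (1u << plane)) *p |= bit; else *p &= ~bit;
    }
}
```

Since each plane has the same packing as the normal monochrome buffer, the existing SPI transfer path can push either plane unchanged.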

Another advantage is that the game’s frame rate is decoupled from updating the display to achieve the gray effect i.e. a game can run at any frame rate, even a very low one, while the switching between buffers on the display can happen at a (fixed) high speed to achieve grayscale display.

The technique boils down to:
1- Start with a good default for the display buffer swap time constant and use the value from EEPROM if available.
2- Provide a calibration screen to fine tune the display buffer swap time constant and save it to EEPROM. The user will use the calibration screen when needed.
3- Use a timer with a delay derived from the display buffer swap time constant to swap the half buffer being displayed (using a short display controller command). If a new framebuffer update has been posted by the application then transfer the buffer to the display controller taking care to write to the half buffer not being currently displayed.
4- The user application paints the framebuffer in the MCU’s RAM and posts it.
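The steps above could be sketched roughly like this, with the hardware calls reduced to recording stubs (all names here, including `sendDisplayOffset` and `spiTransferHalf`, are hypothetical, not the demo's real API):

```cpp
#include <cstdint>

// Sketch of the timer-driven swap loop from the steps above.
struct GrayscaleDriver {
    uint32_t swapUs = 9000;     // steps 1/2: swap time constant; the
                                // default is made up, and the real value
                                // would come from EEPROM via calibration
    int shownHalf = 0;          // GDDRAM half currently being displayed
    bool framePosted = false;   // set by the app, cleared by the timer

    uint8_t lastOffsetCmd[2] = {0, 0};  // last 0xD3 command "sent"
    int lastWrittenHalf = -1;           // last GDDRAM half "transferred"

    // Step 3: called from a timer every swapUs microseconds.
    void onSwapTimer(const uint8_t* frame) {
        shownHalf ^= 1;
        sendDisplayOffset(shownHalf);   // flip halves: 2-byte command
        if (framePosted) {
            // write into the half NOT currently being displayed
            spiTransferHalf(frame, shownHalf ^ 1);
            framePosted = false;
        }
    }

    // Step 4: the app paints its framebuffer in RAM, then posts it.
    void postFrame() { framePosted = true; }

    // Stand-ins for the real SPI routines:
    void sendDisplayOffset(int half) {
        lastOffsetCmd[0] = 0xD3;
        lastOffsetCmd[1] = half ? 0x20 : 0x00;
    }
    void spiTransferHalf(const uint8_t* /*frame*/, int half) {
        lastWrittenHalf = half;
    }
};
```

Because the timer owns the swapping, the application only ever touches its own RAM framebuffer and the `framePosted` flag, which is what decouples the game's frame rate from the grayscale refresh.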