Greyscale for Arduboy

I have updated the repository with a couple of changes:

1- A-button toggles suspended rendering, i.e. the display’s frame buffer is no longer updated. The bouncing demo mode still bounces the screen up/down because the effect is achieved via display controller commands instead of using the CPU.
2- UP/DOWN buttons adjust the display controller’s buffer-swap period, making flicker almost disappear. New video attached.

period_adj.m4v (1.2 MB)
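The UP/DOWN calibration step from item 2 could look something like this. This is only a hedged sketch of the idea, not the demo’s actual code: every name, the step size, and the clamp range are assumptions, and it is written hardware-free so it can run anywhere.

```cpp
#include <cstdint>
#include <algorithm>
#include <cassert>

// Illustrative sketch: the buffer-swap period starts near the display's
// nominal refresh interval and each button press nudges it by a small
// step; the user stops adjusting when flicker is least visible.
constexpr uint32_t STEP_US = 16;      // nudge per button press (assumed)
constexpr uint32_t MIN_US  = 12000;   // sanity clamps (assumed)
constexpr uint32_t MAX_US  = 20000;

enum class Button { Up, Down };

// Return the new swap period after one button press, clamped to range.
uint32_t adjustPeriod(uint32_t periodUs, Button b) {
    if (b == Button::Up) periodUs += STEP_US;
    else                 periodUs -= STEP_US;
    return std::clamp(periodUs, MIN_US, MAX_US);
}
```

On real hardware the new period would feed the timer that schedules the controller-side buffer swap; here it just returns the adjusted value.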


Don’t say this, it’s misleading. The entire screen might be swapped by the controller (which seems to be a win), but there is no way you can guarantee it happens during vsync (unless the controller magically does, in which case there would not be any flickering or any need to calibrate anything). A software sync can’t stay in sync because the CPU speed and display frequency actually change with the amount of charge in the battery. That is, the frequencies you are trying to synchronize are NOT constant.

Unless you can somehow have a battery reading and use that to adjust the sync accordingly.

This [decoupling] can be done in either case, just at the expense of CPU. You can have a frame rate of 10 and paint the screen 120 times a second, decoupled. Doing it via the zoom mode just makes it a VERY low CPU operation, which is interesting.
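The decoupling being described (a logic rate of 10 with the screen painted 120 times a second) can be sketched in a few lines. This is an illustrative simulation, not anyone’s actual loop; the tick granularity and counter names are assumptions.

```cpp
#include <cstdint>
#include <cassert>

// Sketch: game logic runs at LOGIC_HZ while the frame buffer is pushed
// to the display at PAINT_HZ, entirely independently.
constexpr uint32_t LOGIC_HZ = 10;   // game state updates per second
constexpr uint32_t PAINT_HZ = 120;  // buffer pushes per second

struct Counters { uint32_t logicRuns = 0; uint32_t paints = 0; };

// Simulate one second with a 1 ms tick, the way a main loop would track
// "next due" timestamps for each task. Note 1000/120 truncates to 8 ms,
// so the paint count comes out slightly above 120.
Counters simulateOneSecond() {
    Counters c;
    uint32_t nextLogic = 0, nextPaint = 0;
    for (uint32_t now = 0; now < 1000; ++now) {
        if (now >= nextLogic) { ++c.logicRuns; nextLogic += 1000 / LOGIC_HZ; }
        if (now >= nextPaint) { ++c.paints;    nextPaint += 1000 / PAINT_HZ; }
    }
    return c;
}
```

The point of the zoom-mode trick is that the "paint" step collapses to a couple of controller commands instead of a full SPI buffer push, which is why the CPU cost stays so low.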

You can’t “adjust the sync”. You’re adjusting the speed. There is no real sync. And the problem isn’t the speed, it’s the SYNC. Without true sync you’re fracked. Even if you invented a perfect mapping table and everything was exact, it’s too easy to get out of SYNC… and once you’re out of sync, even if your timing is PERFECT now, it looks even WORSE.

Perfect timing with NO sync is worse than random timing. Play with all the demos and get your timing just a little off then get it perfect again. You can be left with ugly artifacts on the screen for MINUTES - and the closer you are to the perfect speed, the longer your display stays borked. :slight_smile:

What you’re suggesting only makes sense if you can get a TRUE sync PERIODICALLY. Like if you could sync up every 60 seconds… could you keep the display stable in between with a carefully calibrated timer? That might be worth a solid effort… but with NO way to capture a true sync, you’re just a ship without a rudder.

Call me crazy, but that sounds like the more out of sync it is, the better the gray will display. It’s obvious that I don’t have any real idea of how this really works.


You’re not crazy. The earlier grayscale examples that attempted this artificial SYNC also suffered from this same artifact, and it would show up as a slowly moving scan line. So you are trading what would otherwise look like random flicker for something the eye can track.

This example is novel in that it saves on both CPU overhead and RAM.


There is some truth in that. Too bad that “make it perfectly out of sync” is the same type of problem as “make it perfectly in sync”, LOL. You can’t do either well without a sync signal.



@bateske LMAO damn, I almost peed my pants laughing there. :laughing:

All I can say Kevin is get back to work …


I do not believe it is misleading, for two reasons: first, because I explain in the source code how this works and that flickering cannot be avoided (without VSYNC). It’s just less flickering with less CPU usage (after manual calibration). And second:

because this is exactly what the controller does (and, from memory, I believe I even wrote that in the source code). Also, even when the controller swaps the buffer during vsync, flickering doesn’t go away completely, because without VSYNC you cannot guarantee you will send the swap commands at the right time, resulting in some frames staying visible longer than they should.

I know this and I have posted on this topic before. I am one of those annoying “it can’t be done without VSYNC” people. What I published here is a low-overhead full-screen (albeit half-vertical-resolution) gray technique based on double buffering on the display controller, which results in less flickering than relying on the CPU pushing updates while racing the display refresh.
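To make the controller-side double buffering concrete, here is a hedged sketch of the sort of thing an SSD1306-class controller allows. The 0x40 command family is from the SSD1306 datasheet (Set Display Start Line), but the function names and the assignment of the two halves are my assumptions, not the demo’s actual code.

```cpp
#include <cstdint>
#include <cassert>

// Sketch: SSD1306 GDDRAM holds 64 rows, so two 128x32 images can live
// in rows 0-31 and 32-63. The "Set Display Start Line" command
// (0x40 | line) selects which row the controller starts scanning from,
// so the whole visible image swaps with one command byte instead of a
// CPU-driven buffer copy.

// Build the start-line command byte for a given GDDRAM row (0-63).
constexpr uint8_t startLineCmd(uint8_t line) {
    return static_cast<uint8_t>(0x40 | (line & 0x3F));
}

// Toggle between the two 32-row halves; `front` records which is shown.
// On hardware the returned byte would be sent as a command over SPI.
uint8_t swapHalves(bool& front) {
    front = !front;
    return startLineCmd(front ? 32 : 0);
}
```

With this, the CPU’s per-swap cost is a single command byte, which is why the technique is so cheap compared to re-sending a frame buffer.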

FYI, I never claimed the decoupling is only possible this way. What I said is that the decoupling can be done with low CPU usage. And if you involve the CPU, the frequency and intensity of glitches is likely to increase.

Phew I did it again: I wrote too much! :stuck_out_tongue_closed_eyes:

Did anyone actually try the demo? While far from perfect or convenient (because of the calibration step), I think it may be usable, especially when the screen is not zoomed (i.e. half-height), because even without calibration the flickering is not very annoying (the controller refreshes the display twice as fast because there are half as many scanlines).


How is the buffer different for zoom mode? Would it not just be a 128x32 buffer instead?

Yes, but it’s not that bad, because the entire screen is swapped by the controller during vsync.

If true (which seems possible) that’s pretty cool, but too bad you’re stuck with 32 pixels of vertical resolution, seems like a big loss.

In pixels it’s 128x32, but with 2 bits per pixel there are different ways of organizing the buffers. The demo uses two 128x4-byte buffers. Given a gray level encoded as two bits b1b0, buffer0 holds the value of b0 for all pixels and buffer1 holds the value of b1 for all pixels. Grayscale drawing functions need to take that into account. Note that you could use existing functions to draw the same primitive twice into the two buffers, but that’s wasteful because the coordinate-to-offset computation would happen twice, whereas a layout-aware function can find the position in the second buffer by adding a constant offset.
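The planar layout described above can be sketched like this. It is illustrative rather than the demo’s actual code: the names are mine, but the structure (two 128x4-byte bit planes, SSD1306-style page addressing, constant-offset second plane) follows the description.

```cpp
#include <cstdint>
#include <cassert>

// One 128x32 image, 2 bits per pixel, stored as two bit planes of
// 128x4 bytes each. Plane 0 holds bit b0 of every pixel, plane 1 holds
// bit b1, and plane 1 starts at a constant offset, so one coordinate
// computation serves both planes.
constexpr int W = 128, PAGES = 4;   // 4 pages of 8 rows = 32 rows
constexpr int PLANE = W * PAGES;    // bytes per plane (512)
uint8_t gbuf[2 * PLANE];            // plane0 followed by plane1

void drawGrayPixel(uint8_t x, uint8_t y, uint8_t level) {
    // One offset computation; the second plane is PLANE bytes further on.
    uint16_t ofs = (y / 8) * W + x;      // SSD1306 page layout
    uint8_t  bit = 1 << (y & 7);
    if (level & 1) gbuf[ofs] |= bit; else gbuf[ofs] &= ~bit;
    ofs += PLANE;                         // constant-offset second plane
    if (level & 2) gbuf[ofs] |= bit; else gbuf[ofs] &= ~bit;
}

uint8_t readGrayPixel(uint8_t x, uint8_t y) {
    uint16_t ofs = (y / 8) * W + x;
    uint8_t  bit = 1 << (y & 7);
    return ((gbuf[ofs] & bit) ? 1 : 0) | ((gbuf[ofs + PLANE] & bit) ? 2 : 0);
}
```

Because each plane is already in the display’s native page format, each one can be streamed out over SPI unmodified, which is what makes this layout cheap.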

Conversely, if the grayscale level were encoded in adjacent bits of the same byte (4 pixels per byte), both the SPI transfer function (because it would have to unpack/select the bits) and the grayscale drawing functions would have to be designed to support that specific layout.

Yes, it feels that way to me too. Which reminds me (if memory serves) that the Commodore 64 had a multicolor sprite mode that resulted in sprites with half the horizontal pixel count.


If speed mattered, the only way to do this would be separate sprites and buffers (as you say you’re doing already). One benefit is that you don’t necessarily need any new drawing code. Bit-wise operations (shifting) on AVR are ridiculously slow. Mixing the buffers would require a lot of additional effort at render time to tear the buffers back apart and render just half the content.

Of course in a lot of cases speed doesn’t matter so much.

Really impressive demo. After manual calibration the flicker was greatly reduced. Wild to see gray work so well on Arduboy :smiley:


I just had a sorta-crazy idea. A lot of time has been spent trying to “dial in” the delay so it’s “just right”… but we know that’s impossible without a FR pin… and some games have done well just picking a static timer value and running with it (and accepting some flicker).

I wonder what it would look like if you dialed it in but then ran with a random timer offset… so if the “perfect” value was, say, 16ms, then you’d render between 15.75 and 16.26ms, randomly… or non-randomly in some repeating pattern… I wonder if that would produce a “nicer” flicker pattern.

Instead of the “scan line” flying by (quickly or slowly), this would have the effect of randomizing its appearance, for better or worse.
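The “tuned period plus random offset” idea sketched above could look like this. All names and numbers are illustrative, and a tiny xorshift PRNG stands in for Arduino’s random() so the sketch runs anywhere.

```cpp
#include <cstdint>
#include <cassert>

// Sketch: instead of repainting at exactly the tuned period, each
// repaint jitters around it, so the tear line lands in a different
// place each frame rather than crawling slowly across the screen.
constexpr uint32_t TUNED_US  = 16000;  // period found by manual tuning (assumed)
constexpr uint32_t JITTER_US = 250;    // +/- spread around it (assumed)

// Minimal xorshift32 PRNG; seed must be nonzero.
uint32_t rngState = 0x12345678u;
uint32_t xorshift32() {
    rngState ^= rngState << 13;
    rngState ^= rngState >> 17;
    rngState ^= rngState << 5;
    return rngState;
}

// Next repaint period, roughly uniform in
// [TUNED_US - JITTER_US, TUNED_US + JITTER_US].
uint32_t nextPeriodUs() {
    return TUNED_US - JITTER_US + xorshift32() % (2 * JITTER_US + 1);
}
```

A repeating non-random pattern would just replace the PRNG with a small table of offsets; whether either actually looks “nicer” than a fixed period is exactly the open question.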

I think I made a suggestion to this effect, but my solution was to allow the user to fiddle with the vsync until they were happy with it. Randomizing it would theoretically give a better “perception” of the tear, I think. Give it a try and let us know! :slight_smile:

Probably won’t find the time, but if someone else did I’d be happy to review the code and help think about if it was actually doing what they thought it was or not.

Yeah, the idea is you’d still let them “fiddle” but once you got it tuned in you’d switch to “exactly what you tuned +/- random offset” mode.

It might not really look that different from “slightly off tuned” (which is how Sensitive looked to me).

It seems Thumby users have come up with a (better?) method to keep the SSD1306 in sync. Technical details are interesting…