firmware.hex (61.2 KB)
I got a little time last weekend to get some of my proof-of-concept code verified, so I’m ready to start talking about my game jam entry. This is still very much a work in progress, but I’ll move it over to the Jam category once I’ve got the actual gameplay loop implemented.
My idea for Find the Story is one I’ve had for a while. I’ve loved the idea of making a coloring book app for the Arduboy because of the absurdity of it: the screen is black and white, so you don’t have many choices when it comes to coloring things in. However, there’s a game called Puzzler that I got on the Nintendo 3DS years ago, and one of its puzzle types is called “Fill in the Pix”. In it, you move across a large black-and-white space divided into lots of different regions. Some have dots in them, and you tap on those to fill in the area. When you’re done, you’re left with a black-and-white cartoon with some sort of punchline.
In taking this concept to the Arduboy, the biggest technical challenge was figuring out how to represent the bitmap. I wanted a bitmap much larger than the screen to give a sense of scale, so I picked 512x256, which is close to the size of the original Macintosh screen. I also wanted the drawing of this screen to be relatively quick, so I came up with an encoding scheme that converts the B/W bitmap into a 256-color bitmap, with each region represented as a different color value and a palette mapping in the decoder that determines whether each region is black, white, or gray. I ended up coding the external lines of the image as 0, internal lines as 1, unfilled regions as 2-127, and filled regions as 128-255.
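To make the value ranges concrete, here’s a minimal sketch of that decoder-side lookup in Python. The function name and the exact output colors are my own placeholders; the real decoder runs on the Arduboy, but the classification logic is the same:

```python
# Display "colors" the decoder can produce (placeholder names).
BLACK, WHITE, GRAY = 0, 1, 2

def region_color(value, fill_color=BLACK, unfilled_color=WHITE):
    """Map one encoded byte to a display color.

    0       -> external lines (always drawn black)
    1       -> internal lines (always drawn black)
    2-127   -> unfilled regions
    128-255 -> filled regions
    """
    if value <= 1:
        return BLACK        # both line types
    if value < 128:
        return unfilled_color
    return fill_color
```

Because each region has its own value, the palette mapping can flip a single region between filled and unfilled without touching the bitmap data itself.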
To produce my first bitmap, I started with some B/W line art and used a paint.net filter to create a 2-pixel outline on a layer, then deleted the original art and started drawing the lines freehand. Then I moved from paint.net to Grafx2, an open source tool that is similar to the classic Deluxe Paint and designed for working with 8-bit palette images. I made an “editing palette” where all the colors were different, plus a few preview palettes that would show me just the outlines or the filled image.
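The palettes can be generated programmatically rather than picked by hand. This is a hypothetical sketch of what they might look like, using my encoding ranges; the multipliers in the editing palette are arbitrary choices that spread the indices across distinct RGB values:

```python
def editing_palette():
    """Every index gets a distinct RGB triple so regions are easy to
    tell apart while editing. 67 is coprime with 256, so the red
    channel alone already makes all 256 entries unique."""
    return [((i * 67) % 256, (i * 131) % 256, (i * 29) % 256)
            for i in range(256)]

def preview_palette(show_fills=True):
    """Render the image as it would appear on the Arduboy:
    lines black, unfilled regions white, filled regions black."""
    pal = []
    for i in range(256):
        if i <= 1:                       # external/internal lines
            pal.append((0, 0, 0))
        elif i < 128:                    # unfilled regions
            pal.append((255, 255, 255))
        else:                            # filled regions
            pal.append((0, 0, 0) if show_fills else (255, 255, 255))
    return pal
```

With `show_fills=False` you get the outlines-only preview; with the default you see the finished cartoon.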
To convert this into something the Arduboy can use, I wrote a Python script that loads the PNG and outputs it as C source in a simple RLE format. For the testing code, I’m not yet loading from the FX chip, just storing it in program flash. Using that scheme, the dragon bitmap compresses to 15738 bytes. Right now, I’m saving the length in bytes of each row at the start of the row, but when converting this to FX, I’ll move all those row offsets to the start of the image so they can be read at once; then we only need to read the row data for the rows that will be shown on screen.
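The post doesn’t show the actual byte format, so here’s a hypothetical sketch of how a per-row RLE with a length prefix and a separate offset table could work, assuming simple (count, value) pairs with runs capped at 255:

```python
def rle_encode_row(row):
    """Encode one row of pixel values as (run_length, value) pairs."""
    out, i = [], 0
    while i < len(row):
        run = 1
        while i + run < len(row) and row[i + run] == row[i] and run < 255:
            run += 1
        out += [run, row[i]]
        i += run
    return out

def encode_image(rows):
    """Current scheme: each row's encoded length is stored just
    before that row's data."""
    data = []
    for row in rows:
        enc = rle_encode_row(row)
        data += [len(enc)] + enc
    return data

def row_offset_table(rows):
    """Planned FX scheme: gather every row's starting offset up front
    so the whole table can be read from SPI flash in one go, and only
    the rows visible on screen need their data fetched."""
    offsets, pos = [], 0
    for row in rows:
        offsets.append(pos)
        pos += len(rle_encode_row(row))
    return offsets
```

A 512-pixel row of one region value compresses to six bytes under this scheme (two capped runs plus a remainder), which is roughly the kind of saving that gets the dragon bitmap down to ~15 KB.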
I also still need to write my scaled drawing code that does the 4:1 downscale of the bitmap to show an overview of the image. That will be used when you’ve completed the puzzle.
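The downscale itself is easy to sketch. Here’s a minimal Python version assuming a flat 1-bit pixel buffer and a majority vote per 4x4 block; the real drawing code will be C++ working against the encoded data, so this only illustrates the idea:

```python
def downscale_4to1(pixels, width=512, height=256):
    """Shrink a 1-bit image 4:1 in each axis: every 4x4 block of the
    source becomes one output pixel, set if at least half of the 16
    source pixels are set."""
    out_w, out_h = width // 4, height // 4
    out = []
    for oy in range(out_h):
        row = []
        for ox in range(out_w):
            lit = sum(
                pixels[(oy * 4 + dy) * width + (ox * 4 + dx)]
                for dy in range(4) for dx in range(4)
            )
            row.append(1 if lit >= 8 else 0)  # majority of 16 samples
        out.append(row)
    return out
```

At 4:1, the 512x256 bitmap conveniently reduces to 128x64, exactly the Arduboy’s screen size.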
I’m planning another non-FX test soon, once I get some sort of cursor control implemented. This test just automatically scrolls around the edge of the picture with three different palette settings. It’s running mostly at 60fps, but occasionally you’ll see the TX LED light up, indicating a frame took a little long. I’ll do perf optimization once I’ve incorporated the FX data, since reading from SPI flash will be a bit less performant than reading from PROGMEM.