That could also be related to the contrast switching. Depending on where in the 2-plane frame cycle you are when you leave the title screen, the game’s gray shades will either use the higher- or lower-contrast plane of the L4_Contrast mode. Changing the mode to L3 could also fix that, as it doesn’t touch the contrast at all.
I wonder if the default configuration should use L3 to begin with.
Will it be possible to render directly from the SD card with this library? It would be excellent to be able to stream some hi-res, 4-colour images from the SD card, but I suspect the use of the screen memory will be an issue. @Mr.Blinky and @brow1067, is it possible?
The main concern is the tight rendering budget, but depending on what kind of drawing you want to do, I believe it should be possible. The refresh rate needs to be 120-150 Hz (the lib defaults to 135) to avoid strobing. At 135 Hz that gives a budget of about 7.4 ms per frame, of which roughly 1.2 to 1.4 ms is needed to transfer the buffer to display memory, depending on the sync mode used, so all rendering and game logic has to finish within about 6 ms.
Best-case example: at 18 cycles/byte you could fill the entire 1 KB buffer from the flash chip in about 1.2 ms, which is only ~20% of the rendering budget, allowing static fullscreen 4-color images.
EDIT: judging by the performance of @Mr.Blinky’s FX sprite-drawing routines, you could probably stream sprites as well. The rendering in that video runs at 60 fps, so stretching it to ~150 fps doesn’t seem impossible.
I haven’t looked into this new grayscale method yet (still catching up), but it is possible to stream 1K of image data directly from flash to the OLED display (without buffering), and there’s FX::displayPrefetch(), which reads data from flash while copying the display buffer to the OLED. The latter will be useful for fetching a background plane and then using FX::drawBitmap to draw the sprites on top.
Frankly, I don’t understand the magic happening under the covers in this or the FX library. However, I am surprised that FX::displayPrefetch() would work, as it seems to handle the copying of data to the OLED, and that is exactly where the greyscale library is also controlling the output.
Likewise streaming directly to the OLED would mess up the delicate timing of the frames that the gs library is handling.
Or … I could be totally misunderstanding how this whole thing works and how the two libraries interact with each other.