If you have a USBasp ICSP programmer and avrdude already installed (you should if you have the Arduino IDE installed), I wrote a super simple Windows batch script that automatically sets/verifies the correct fuse bits, writes the bootloader, and even flashes a game hex all in one go (you just plug in the USB and ICSP cable, drag the hex onto the .bat file, and it handles the rest). It's on my old laptop, so if it's something that could help you I can dig it up.
@JinxLynx daylight readability is important for me, so I probably won't go down that route, but I'm open to other LCD panels if they don't rely on the backlight to be readable…
Thanks. I had a quick look at the PDF and its interface is a bit different than the displays I've played with so far. It doesn't use standard SPI for the serial interface. But it's not that complicated to implement.
One important thing to remember about (most) LCDs is that their pixel response time is very slow compared to OLED, so fast games will be hard to play on them.
As for the bootloader, @Keyboard_Camper and @sjm4306 already gave you some tips. If you have the Homemade package installed, you can easily install the bootloader through the Tools menu by selecting the Homemade Arduboy board, then your Arduino board, Cathy3K and display type. You can use a USBasp, Arduino UNO or any other programmer supported by the IDE.
Like @Mr.Blinky, I have used similar 128x64 graphic LCDs in the past, and would strongly advise against using them for an Arduboy build. They usually have horrible viewing angles, produce a lot of ghosting if driven at more than a frame or two per second, and require a lot of pins to interface with.
A few years ago, I went through the effort of creating a shift register backpack that reduces the number of required interface pins down to 2, but that really slows down data transfer speeds and introduces some timing constraints that the code has to take into account as well. https://bitbucket.org/serisman/pcb-ks0108-128x64-glcd-backpack/src/master/
So, not very well suited for an Arduboy I’m afraid.
If you end up using the SAMD21G18a (or similar), you might just have enough RAM and CPU speed (especially if you can use a DMA channel to drive the LCD) to make one of those cheap color TFT LCD screens work out.
Something like this: https://www.aliexpress.com/item/p/32392376642.html
Edit: One challenge would be that the 128x64 resolution wouldn't cleanly map to the 220x170 resolution. So you would either have to deal with a black border, or an unevenly stretched image.
Oh yes, I completely agree. I think the same could be said about using such an upgraded MCU (and some of the other deviations in this thread) as well, though. It doesn’t really seem like an Arduboy at that point, even though some/most Arduboy games could be back-ported to play on it.
I was simply trying to respond with another potential display option that fulfills the cheaper-than-but-as-big-as-the-2.42" OLED criteria. In hindsight, because of the resolution scaling issue, the TFT display is not really a good option for back-ported Arduboy games, even though it feels like a better fit for the suggested MCU.
I guess my 2 cents are that I don't really see a compelling reason to move up to the SAMD MCU for an Arduboy to begin with, unless the platform were expanding more towards Meta/Pokitto territory (which doesn't seem likely). If more RAM or flash space is the driver, there are higher-end 8-bit AVRs available that are much closer and more compatible to the 32u4 than moving all the way to 32-bit ARM.

Or, just embrace the 'cart' SPI flash expansion that seems to be the likely future and will still allow for larger games (i.e. more levels and/or more graphics) that could remain directly backwards compatible (without the extra levels/graphics) with Arduboys that don't have the extra SPI flash. I think most games run out of flash space before RAM anyway, and it is probably the assets (i.e. graphics), not so much the program code, that push flash usage so high, so offloading these to external flash makes a lot of sense and should make larger and more complicated games possible.
Part of that is because there isn't enough RAM to do much with it,
so people tend to shove everything into progmem.
E.g. there isn’t enough RAM to store a particularly large map, so most people put their maps in progmem and thus we don’t have many games that generate their map ‘on the fly’.
@Botisaurus’s Arduwars is an example of a game where RAM started to become an issue.
The map was/is struggling to fit in RAM.
Time and time again we see people “wasting” a lot of time trying to find more program space (and frequently RAM, as well) to get a good concept working the way they would like it to. Although this can lead to improving the skills required to work with limited resources, it can also be frustrating for beginners.
Another common complaint is the restriction of only having one game loaded at a time, requiring hooking the Arduboy up to another device to change games.
IMHO the simplicity and size (and to some extent cost) of the Arduboy are big selling points. So, other than increasing available flash and RAM, and adding an SD slot for multiple games and for things such as game levels, I think a new Arduboy should be identical to the current one. Consider that using the same case (perhaps with minor modifications), display, battery, and most other components has economic advantages for production, especially if the original Arduboy were to remain in production alongside a new one.
Have you checked the price and availability of a higher-end AVR compared to SAMD or other ARM based chips with at least equivalent capability? Since most games stick to using the provided libraries for interfacing with the hardware, switching to a different processor architecture would, for the majority, only require a re-compile against ported libraries.
As I’ve said in the past, you have to be wary of feature creep.
RAM and PROGMEM were a real culture shock for me. For my first project I tried to make random maps for a roguelike and discovered real quick that you can't fit a 64x64 array of bytes into RAM (I actually used int first). While this encouraged me to learn bitwise operators, which I think is a good thing, it can be very daunting. Basically I like some of the restrictions of the Arduboy platform, but feel that more RAM and memory in general would be useful and wouldn't ruin the feel.
That is a fair point. The SAMD CPUs are definitely the cost optimized option at this point. But, can’t the same be said about the color TFT displays?
If the MCU has more available RAM and a game makes use of it, won’t it lose its backwards compatibility and be locked to the ‘new’ hardware version anyway? So, why not switch out to a nicer display at the same time?
But, I agree that now we are talking about a pretty significant difference from the Arduboy, and much closer to other platforms.
My first homemade used a Pro Micro with alternate wiring and a SH1106 OLED display. In practice, I found that over a third of the games I tried to put on there needed actual code changes for one reason or another. There are also a few games that still only provide a .hex file and not source code. These are the big reasons I decided to stay as close as possible to the original for my ArduBigBOY hardware. It only needs the SSD1309 patch which can be applied directly to .hex files. Much easier, and so far everything I have tried works fine without any fiddling around.
So, by all means play around with new hardware, but beware of what you are getting into. Also, keep in mind that this will probably be a one-off device and there won’t necessarily be many (if any) games that make use of the expanded hardware. The only way around this is to get massive adoption and fragment the platform/community.
Good point. But, I wonder how much of an issue this will continue to be once we have access to massive amounts of additional flash space through the SPI flash chip?
True. Although the SPI flash chip will help with the "trying to find more flash space" issue. And personally, I really like the limitations, which force one to hone their skills and find new ways of optimizing things. But I agree that it can stifle innovation, especially for beginners.
This is solved with the SPI flash chip and Cathy3k bootloader.
To allow the new system to easily run the hundreds of existing games, even those that talk directly to the display hardware. Again, the goal is just to provide more flash and RAM.
Still not totally following this logic. Also, one of the stated goals that I was originally replying to was cheaper/bigger display.
If the goal is easy compatibility with existing games, then neither the display NOR the cpu should be changed dramatically. That means stay with the SSD1306 or SSD1309 display, and stay with the 32u4 or similar 8-bit AVR.
If one is ok with modifying libraries for backwards compatibility and re-compiling everything, it isn’t much more work to switch out the display for something else at the same time as switching to a pretty different cpu. Either way there is a bit of hassle involved and there are going to be games that won’t work (without potentially significant changes).
I think this is a case where one can pick any two of “cost optimized”, “easy compatibility”, or “extra features”.
For “easy compatibility” and “extra features” (but NOT “cost optimized”), go with the 2.42" SSD1309, add an SPI flash, and/or swap in a AT90USB1286 or ATmega1284P (loses built-in USB) or just stick with the 32u4. (i.e. this is the ArduBigBOY)
For “cost optimized” and “extra features” (but NOT as much “easy compatibility”), switch out the display for a color TFT (or other cheap/larger display), add an SD card, and/or go with an ARM cpu. (i.e. what this thread seems to be about, and closer to what the Meta/Pokitto is)
For “cost optimized” and “easy compatibility”, just stay with the smaller SSD1306 display, 32u4, and maybe add an SPI flash. (i.e. exactly what the next Arduboy will probably end up being)
Flash space behaves more like progmem than RAM,
it would free up progmem by allowing for more constant data,
but it wouldn’t help with situations where writable memory is needed,
e.g. making large-ish modifiable maps.
Oh yes, I fully understand this.
I was responding to your comment that people tend to use up more flash space (for things like static maps) because otherwise they run out of RAM.
By adding a massive amount of available flash through an SPI flash chip, people can continue to do this (i.e. using pre-built maps instead of dynamically generated in-memory maps) while not running out of flash space as quickly, and keep RAM for things that actually need to be changeable. By the way, re-programming SPI flash blocks is fairly easy to do as well, so it still could potentially be used for dynamically generated maps (although probably not modifiable maps).
Obviously, having more of both would be better, but I was pondering how much the RAM limitation would really matter assuming the flash limitation wasn’t really there any more.
If I recall correctly, most of the games I tried to compile manually were much closer to running out of flash space than RAM.
It is, but you have to erase an entire block, so it’s of limited use.
I think it’s important for any game that wants a large-ish modifiable map.
I could imagine having more RAM would allow for more terraria-like games, or more in-game map editors.
It would also allow for more flexibility with procedurally generated dungeons and make it easier to have complex data structures (e.g. trees) that could be used for various improvements (e.g. better AI).
Yes, but like I said, part of the reason people run out of progmem faster is that they don't even attempt the games that would be likely to run out of RAM, because the ~1000 bytes currently available is barely enough to do anything more advanced with.
For example, a 128x64 map consisting of just 2 kinds of tiles would eat 1024 bytes of RAM.
There’s currently not enough RAM to pull something like that off and still have a frame buffer (unless the stack was kept tiny).
If there was double the RAM it would be possible to have a 128x64 map with 4 kinds of tiles and still have a frame buffer.
Also, RAM usage is harder to measure because the stack grows and shrinks at runtime.
Sometimes you can think you have enough RAM and then the stack grows particularly deep and it causes runtime bugs.
Stack is honestly pretty easy to reason about if you know how the function calls work and how arguments are pushed onto the stack. It’d be decently easy to manually inspect the library and say “Starting from loop() you need to leave x bytes for JUST core library functions.” I don’t think we use recursion or anything crazy, so the stack shouldn’t go THAT deep.
But you won’t know an exact figure unless you follow every possible branch of execution to figure out the maximum stack depth of the program.
Without a tool to do that it’s a lot of effort, and guessing is likely to be inaccurate.
Other people might though.
And you also have to account for additional stack use by any interrupt service routines that may occur when your mainline code is at its maximum stack depth.