Can you put Midnight Wild on it?
Wow! That is glorious!
With a lot of help from @Mr.Blinky I am close to finishing the prototype of my VGA / FX / NES controlled Arduboy.
A preview …
… and …
Next up: design a PCB and mount it in a Retro NES Pi case.
So the case has arrived and I have modded it. Gone are the USB controllers and HDMI port and in their places are an old school NES port and VGA port.
Just need to get my board made …
Dang, this is sweet! I wanna add the VGA-to-HDMI chip and make the Arduboy console!
Yeah, the VGA is quite limiting. I happen to have an old screen and a newish TV that accept VGA, but these are going to die one day.
@Vampirics suggested loading RetroArch onto a Pi, as it emulates the Arduboy. Then it already has HDMI and can use a USB controller. A lot simpler than the approach I have taken.
I thought HDMI was still fairly standard?
Is there something replacing HDMI that I’m not aware of?
Sorry I meant VGA not HDMI.
That’s what I was expecting, but the ‘these’ threw me.
Hardly anyone has mentioned DVI. Is DVI unpopular for some reason?
(Aside from the fact it doesn’t handle sound.)
I was under the impression that it was supposed to have superseded VGA, so I would have thought it's likely to still be hanging around for a while.
DVI looked to me to be the more business-facing standard, with HDMI being consumer oriented. In either case, though, the physical layer is different but the video protocol is the same.
DVI is also royalty free, so it's OK to send it over an HDMI connector. If you use actual HDMI signaling, even though it's been reverse engineered, technically you still need the license to sell a product that uses it.
That's the reason why many products just go the easy route and use HDMI encoder chips that are already certified and have the cost of licensing baked in.
Without meaning to go off on too much of a tangent…
I can’t say I’ve ever noticed that.
The only DVIs I’ve encountered have been in a home setting.
So the VGA1306 hardware would work for DVI, or could be adapted to work for DVI?
If DVI is likely to outlast VGA then it might be worth it since DVI is licence-free and DVI to HDMI converters exist.
AFAIK DVI and HDMI use the same video protocol, which is why converter dongles between the two standards are just pass-through without any conversion chips. For converting between VGA and DVI, though, you would need an active converter, as one signal is analog and the other is differential digital.
You can transmit DVI over an HDMI cable, so the converter just sets the pins as far as I know.
The FPGA that is being used is, I think, plenty capable of outputting DVI; it just needs to be reprogrammed.
It depends on whether there are enough resources for at least a line buffer for the differential output (though it may need at least a full frame buffer for the SSD1306 it's emulating). I really wish I had the time to keep up with FPGA development since graduating college; I instead went down the path of embedded micros and driver design.
When I read that a 25 MHz pixel clock was required (the same as used for the VGA board) for a 640 × 480 @ 60 Hz display I became hopeful, but then I read that the bit clock needed to be 10 times higher, requiring a 250 MHz clock. I'm not sure the FPGA used for the VGA board can handle that. I need to look into it more.
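The arithmetic behind those numbers can be sketched quickly. This is just a back-of-the-envelope check assuming standard-ish VESA blanking intervals (800 total pixels per line, 525 total lines per frame); the official VESA figure is 25.175 MHz, so the result here is only approximate:

```python
# Rough clock-rate arithmetic for 640x480 @ 60 Hz.
# Blanking values are assumed VESA-style timings; exact figures vary.
H_ACTIVE, H_BLANK = 640, 160   # 800 total pixels per line
V_ACTIVE, V_BLANK = 480, 45    # 525 total lines per frame
REFRESH_HZ = 60

pixel_clock = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * REFRESH_HZ
print(f"pixel clock ~ {pixel_clock / 1e6:.2f} MHz")   # ~25.2 MHz

# TMDS (the signaling used by DVI/HDMI) serialises 10 bits per pixel
# per channel, so the serial bit clock is 10x the pixel clock.
bit_clock = pixel_clock * 10
print(f"TMDS bit clock ~ {bit_clock / 1e6:.1f} MHz")  # ~252 MHz
```

So the 250 MHz figure is simply the 25 MHz pixel clock times the 10-bit TMDS serialisation factor, which is why DVI output is so much more demanding on the FPGA than VGA at the same resolution.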
If there are enough resources you can always implement a discrete PLL: feed it the base system clock and boost it to the desired frequency (depending on acceptable jitter, routing and available LBs).