Stuttering when using screen mirroring via Serial

I tried implementing screen mirroring in my game (as per Screen Mirroring Guide / How To Screenshot Your Game) and so far it almost works.

However, I am running into the issue that whenever I open the serial port and then disconnect from it (either by disconnecting the cable or closing the port again on the PC), my game starts to stutter horribly.
If I then reconnect the cable and listen to the port again, my game goes back to normal.

If I never open the serial port from my PC, my game runs fine even while not connected.

Does anyone know why this might be?

I haven’t tested what happens after a disconnect, but it sounds like it’s still trying to transmit screen data.
Since nothing is listening, the buffer fills up. When it tries to send data with a full buffer, it halts until either space is freed in the buffer or a timeout occurs. Since space is never freed, it waits out the timeout on every write and you get lots of stuttering.
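
To make that stall visible, here's a minimal sketch (untested, assuming the standard Arduboy2 API; the 30 fps frame rate is just illustrative) that times each mirrored frame and shows the result on screen:

#include <Arduboy2.h>

Arduboy2 arduboy;

void setup() {
  arduboy.begin();
  Serial.begin(9600);  // baud rate is ignored for native USB CDC
  arduboy.setFrameRate(30);
}

void loop() {
  if (!arduboy.nextFrame()) return;

  // Mirror the previous frame and time how long the write blocks.
  unsigned long start = millis();
  Serial.write(arduboy.getBuffer(), 1024);  // 128 * 64 / 8 = 1024 bytes
  unsigned long elapsed = millis() - start;

  arduboy.clear();
  arduboy.setCursor(0, 0);
  arduboy.print(elapsed);  // near 0 while the host reads
  arduboy.display();
}

If the theory is right, elapsed should stay near zero while a program on the PC is reading the port, and jump to the write timeout on every frame after an unclean disconnect.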

Which leaves me with two questions:

  • Why is the game not lagging before the serial port is opened for the first time?
  • Is there any way to detect when the serial port gets closed again, so I can either stop sending, or keep sending but have it fail immediately when the buffer is full?

It makes sense that it wouldn’t start transmitting until a serial port connection is opened.

In CDC.cpp:


size_t Serial_::write(const uint8_t *buffer, size_t size)
{
	/* only try to send bytes if the high-level CDC connection itself 
	 is open (not just the pipe) - the OS should set lineState when the port
	 is opened and clear lineState when the port is closed.
	 bytes sent before the user opens the connection or after
	 the connection is closed are lost - just like with a UART. */

	if (_usbLineInfo.lineState > 0)	{
		int r = USB_Send(CDC_TX,buffer,size);
		if (r > 0) {
			return r;
		} else {
			setWriteError();
			return 0;
		}
	}
	setWriteError();
	return 0;
}
After that, I don’t think lineState ever gets set back to 0 (a closed state).

It must be some glitch in how the serial library is implemented. It’s strange because it should only be using outputs, not waiting on any inputs from the host… so I’m not sure why disconnecting is what causes the problems.

Is there some kind of handshaking that occurs with serial? I don’t think there is any CRC or anything. @FManga may be on to something: the code may only dump the buffer once it detects an open connection, but other parts of the code aren’t prepared for a disconnect mid-transfer.

Just the USB stack handshake that tells the PC the port is for serial. Once that happens, a flag (lineState) is set and the library starts copying data into a buffer. The serial library doesn’t detect disconnects (as far as I can tell), so it keeps filling the buffer even after one. That’s when it starts waiting for the timeouts and the stuttering happens.
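
Incidentally, that same lineState flag is what makes “if (Serial)” return true on native-USB boards, so a sketch can gate its writes on it:

// Only mirror when the CDC line state says a port is open.
// Caveat: in the AVR core, Serial_’s bool operator includes a short
// delay, so checking it every frame has a cost. And since lineState
// is never cleared on an unclean disconnect, this check stops
// helping once the cable is yanked.
if (Serial) {
  Serial.write(arduboy.getBuffer(), 1024);
}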

Unless the serial library is modified, I think the only way to prevent the stutter is if the application on the other end were to reply every now and then to indicate it’s still listening.
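
A rough sketch of what that could look like on the Arduboy side (hypothetical: the one-ack-byte-per-frame scheme and the 500 ms window are made up for illustration, the PC-side viewer would have to be changed to send the acks, and an Arduboy2 instance named arduboy is assumed as above):

unsigned long lastAck = 0;

void mirrorScreen() {
  // Drain any ack bytes the viewer has sent back.
  while (Serial.read() >= 0) {
    lastAck = millis();
  }
  // Only transmit while the viewer has acked recently.
  if (millis() - lastAck < 500) {
    Serial.write(arduboy.getBuffer(), 1024);  // 128 * 64 / 8 = 1024 bytes
  }
}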

In that case, wouldn’t it be best to actually fix the underlying CDC library that Arduino uses?
It even says right in the comment: “the OS should set lineState when the port is opened and clear lineState when the port is closed.”

But reading that code, the lineState is never cleared again after initially being set.

Edit: this whole topic discusses this exact issue (with some promising solutions): https://forum.arduino.cc/index.php?topic=360286.0

As a band-aid fix, I used this in my code for now:

// Skip the frame entirely if the TX buffer has no free space.
if (Serial.availableForWrite() > 0) {
    Serial.write(arduboy.getBuffer(), /* (128 * 64 / 8) */ 1024);
}

This only makes the game stutter for one “timeout cycle” after unplugging the USB cable or closing the COM port on the host, and it runs fine after that.
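
A possible refinement (untested sketch; on the 32u4, availableForWrite() reports the free space in the USB endpoint buffer, which is always far smaller than a full frame): send the frame in chunks no larger than the space reported free, so write() should never block at all. The trade-off is that the rest of a frame gets dropped if the buffer fills mid-transfer, which the viewer on the PC would need to tolerate:

const uint8_t* buf = arduboy.getBuffer();
size_t remaining = 1024;  // 128 * 64 / 8 bytes
while (remaining > 0) {
  int space = Serial.availableForWrite();
  if (space <= 0) break;  // host stopped reading: drop the rest of this frame
  size_t n = min((size_t)space, remaining);
  Serial.write(buf, n);
  buf += n;
  remaining -= n;
}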