As the apartment shito is beginning to settle down, I have been able to start working on this.
My first goal was to fix the two annoying issues I described in the previous post. Namely, I was trying to get the pause command to pause all local melonDS instances simultaneously, instead of just the one that received the command. There is more to be done in the way of cross-instance sync, but this seemed an obvious starting point to me.
The first issue was due to the way the interface works. Originally, the only way to pause melonDS was through the interface (System->Pause). Later on, the pause hotkey was added. Hotkeys are checked and handled in the emu thread (separate from the UI thread), so to keep things simple, the pause hotkey would just send a signal to the main window, which would then behave as if the System->Pause menu command had been used. It's a bit of a roundabout way to handle this, but it has two advantages: it avoids duplicating code, and it keeps the UI state (the Pause checkmark) in sync without any extra bookkeeping.
When I started adding cross-instance pause, I made the pause command handler send a message to other melonDS instances through IPC. Then the other instances would receive that message and treat it the same as pressing the pause hotkey. Easy peasy.
Yeah, except doing so would cause these instances to send more pause messages, essentially entering a feedback loop.
So I had to add a separate handler for the IPC pause command to avoid this. Not the best solution, but it works.
Next problem was that during a local multiplayer game, cross-instance pause would interfere with the local multiplayer sync system, and could essentially cause some instances to get stuck. To deal with this, I had to add some more intelligence to the IPC comm layer to avoid waiting on instances that are paused. And it does the trick. Pausing a local multiplayer game may cause minor packet loss, due to the way this works, but I haven't seen any problems in my testing -- Nintendo's local multiplayer protocol is resilient, so this should be mostly fine.
There is more state that should be shared across melonDS instances, like the recent ROM menu. We'll get there. Now that the system is in place for cross-instance comm, it shouldn't be very difficult.
But for now, I want to build the base for netplay. So let's talk about this.
Due to the way local multiplayer games work on the DS, this netplay implementation is going to be somewhat different from typical netplay setups. The main thing to take into account is that while the local multiplayer protocol is resilient to missing packets, it cannot tolerate late ones: a packet is either received on time or not received at all. This is the main reason why it would not be feasible to simply extend this protocol over the network.
So instead, the entire local multiplayer network has to be emulated on each player's computer. For example, consider this case of a 3-player game:
Here, all 3 instances are running on player 1's computer, but player 1 only sees their corresponding instance -- the other players' instances will be kept hidden. Player 1 controls their instance, and their inputs are forwarded to the corresponding instances on player 2 and 3's computers. Similarly, instances 2 and 3 on player 1's computer receive inputs from players 2 and 3 and mirror them. Same deal for every player. Since a given instance can only be controlled by one player (just like how you'd have one player per DS in a real-world setting), the communication is mostly unidirectional. Then, since the local multiplayer comm layer does the job of keeping local instances in sync, we only have to worry about keeping players in sync with each other.
This has the downside that running multiple melonDS instances requires more powerful hardware, but we're confident that modern hardware can handle this.
So I'm currently working on building the basic netplay system. So far, I'm able to forward inputs to another instance over the network, but I have yet to structure this properly for a local multiplayer game.