I'm designing an industrial machine controller that has to deal with multiple servo drives and lots of IO channels. Since I'd run out of pins with only one Prop anyway, I decided to use one slave Prop per servo axis and one master Prop doing the trajectory planning. And BTW, as I'd also run out of memory, I decided to use an Intel NUC as the GUI and file/network server.
Each slave Prop sits on its own PCB. All slave PCBs are stacked on top of the master PCB with daisy-chained pin headers. The maximum board-to-board distance is under 8", with good grounding and no high currents nearby, so I don't expect noise problems. I have fewer than 8 pins left over, so I don't think parallel/multi-bit transmission would give any advantage.
I've seen a trick somewhere where the PHSA register is used to output a serial data stream at 20Mbit/s. But the highest data rate a Propeller can receive is 10Mbit/s, because shifting in one bit always takes two instructions (TEST and RCL), 4 clocks each. If all Propellers run from the same 5MHz crystal, there is no need to transmit an extra serial bus clock. As all slaves have to be served exactly once per time slice, there is also no need to transmit any address information: each slave knows its time slot. So I think I'll only need two IO pins per Prop, one "start of timeslice" strobe and a (bidirectional) data line. If I can afford two cogs for the communication I could do it full-duplex and transmit downstream and upstream data at the same time over two bus lines. But I don't think that will be necessary.
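For reference, here is how I understand the PHSA transmit trick (names like txData and TX_PIN are mine, and the details should be checked against the counter documentation): with CTRA in single-ended NCO mode and FRQA = 0, the pin follows PHSA[31], so every 4-clock instruction that shifts PHSA puts one bit on the wire, i.e. 20Mbit/s at 80MHz.

```
        or      dira, txMask          ' data pin is an output
        mov     ctra, ncoCfg          ' NCO mode, APIN = data pin
        mov     frqa, #0              ' FRQA = 0: PHSA must not self-increment

        mov     phsa, txData          ' MSB appears on the pin immediately
        shl     phsa, #1              ' next bit, 4 clocks later
        shl     phsa, #1              ' ...and so on, unrolled
        ' a NOP after each SHL halves this to 10Mbit/s, i.e. 8 clocks
        ' per bit, which is what the receiver can keep up with

ncoCfg  long    (%00100 << 26) | TX_PIN   ' TX_PIN is whatever pin you pick
txMask  long    1 << TX_PIN
txData  long    0
```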
So the (peak) data rate is 10Mbit/s or 8 clocks per bit. I send one start bit followed by 32 data bits. Now the question:
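Working out the per-word timing from those numbers (my arithmetic, assuming the usual 80MHz system clock from the 5MHz crystal with the 16x PLL):

```
' 1 start bit + 32 data bits   = 33 bits per word
' 33 bits * 8 clocks/bit       = 264 clocks per word
' 264 clocks / 80 MHz          = 3.3 us of bus time per slave per slice
```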
How do I know the exact timing of the WAITPEQ command? Given...
WAITPEQ dataInPin,dataInPin ' wait for start bit
TEST dataInPin,INA wc
... how many clocks pass between the triggering edge of WAITPEQ and the moment the TEST instruction samples the input pin? For the transmitter I can easily check the timing with a scope, but for the receiver it's difficult to tell the exact sampling time. For optimum reliability it would be best to sample in the middle of a data bit. If WAITPEQ and TEST sample the pins on the same clock (out of the 4 per instruction), then it would be best to insert two NOPs between the two instructions to skip the start bit and the first half of the first data bit.
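To make the plan concrete, this is the receiver I have in mind (a sketch only; rxMask/rxData are my names, and the two NOPs assume the WAITPEQ release and the TEST sample really do land on the same clock phase, which is exactly my open question). Note that the TEST/RCL pairs must be fully unrolled: a DJNZ loop would add 4 clocks per iteration and break the 8-clock cadence.

```
rx      waitpeq rxMask, rxMask        ' wait for start bit (line idles low)
        nop                           ' 2 NOPs = 8 clocks = 1 bit time:
        nop                           ' skip into the middle of data bit 0
        test    rxMask, ina wc        ' bit 31: sample pin into C (4 clocks)
        rcl     rxData, #1            '         shift into result (4 clocks)
        test    rxMask, ina wc        ' bit 30
        rcl     rxData, #1
        ' ... TEST/RCL repeated 32 times in total
```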
Is this correct? Does anyone have a better idea?