Here is the situation: I am sending out two bytes of data at 2400 baud, with an inverted signal, from a Stamp 2, using the command below:
I am receiving this data with a PIC.
My question is: what is the actual time lag between the end of byte1's stop bit
and the start bit of byte2?
I have discovered, the hard way, that there is a time lag between the two
bytes of data.
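For reference, here is a small sketch of the ideal frame timing at 2400 baud (assuming a standard 8N1 frame: one start bit, eight data bits, one stop bit). This does not answer the inter-byte gap question itself; any Stamp interpreter overhead between SEROUT bytes would be in addition to these figures.

```python
# Ideal 8N1 frame timing at 2400 baud. Any inter-byte lag
# from the Stamp's interpreter is added on top of this.
BAUD = 2400
bit_us = 1_000_000 / BAUD      # one bit period in microseconds (~416.7 us)
frame_us = 10 * bit_us         # start + 8 data + stop (~4166.7 us)

print(f"bit period : {bit_us:.1f} us")
print(f"8N1 frame  : {frame_us:.1f} us")
```

So each byte occupies about 4.17 ms on the wire; whatever extra gap you are seeing between bytes is time beyond that.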
Also, the Stamp manual gives the formula INT(1,000,000/baud) - 20 for setting
the Stamp's transfer rate. What does the '-20' represent?
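Evaluating the manual's formula for the 2400 baud case in the question gives the familiar BS2 baudmode value. The inverted-polarity offset of 16384 shown here is my recollection of the BS2 convention, not something stated in the post, so treat it as an assumption.

```python
# The Stamp manual's baudmode formula from the question:
# INT(1,000,000 / baud) - 20
def baudmode(baud):
    return int(1_000_000 / baud) - 20

print(baudmode(2400))           # true-polarity value for 2400 baud
# Assumption: on the BS2, adding 16384 selects inverted polarity,
# which would give 396 + 16384 = 16780 for 2400 baud inverted.
print(baudmode(2400) + 16384)
```

Note that the bit period at 2400 baud is about 417 µs, so the formula is essentially "bit period in microseconds, minus a fixed 20 µs" — which is the quantity the '-20' question is about.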
One more question: is the time lag between the two bytes the same for a
Stamp 1, and for SERIN commands?
For those who will ask: I am programming a PIC for dedicated PWM control of DC
motors. The PIC will output a steady PWM signal and will change it only when
instructed via serial input. The serial data carries direction and/or duty
cycle. It is a 4 MHz PIC.
All I am interested in is the timing lag between the bytes, not programming
techniques for the PIC.
Any info would be appreciated.