SERIN/SEROUT and binary arrays
In a message dated 5/6/2004 1:40:41 PM Eastern Daylight Time,
tedstoner@1... writes:
> Another question I have about SERIN not explicitly answered in the
> manual from what I read: if you set SERIN to read say 3 bytes and only
> receive 2, even if you have a timeout specified it will hang forever?
>
You could write
serin pin, baud, [serstring\10\"A"]
and it would end at 10 bytes or at the letter A, whichever came first.
Sid
[Non-text portions of this message have been removed]
Comments
In a message dated 5/6/2004 2:22:46 PM Eastern Daylight Time,
tedstoner@1... writes:
> Thanks Sid. The problem with sending an "A" is that the stream is
> binary and "A" could appear at any time as a valid byte in the data
> (where all possible bit combinations are possible). I could provide
> escape sequences, but that gets messy in terms of CPU time and RAM
> storage.
>
True, but you can send your binary data in decimal form, like
serout pin, baud, [16, 32, 64, 128, 256, "A"]
since the decimal data gets converted to binary before it is serouted.
and when the serin got the "A" it would stop receiving.
Sid
I have a couple of questions about SERIN and SEROUT with BS2. I'm
hoping I missed something.
If I want to read a binary array, I have to know the length to read I
think (even though the binary length is in the first byte I read).
Am I correct that the "\L" for the STR modifier must be a constant?
The manual doesn't say but I didn't have good results specifying a
var. Example:
buff var byte(10)
len  var byte
len = 4
serin p1\p2, baud, timeout, [STR buff\len]
It has to be
serin p1\p2, baud, timeout, [STR buff\10]
instead (or pick some other constant)? Same for serout? The first form
(with \len) did tokenize successfully.
Serout with a binary array is effectively hopeless since it will stop
sending if a 0x00 is encountered? So I have to loop sending a byte at
a time.
When reading, I was unable to read my length byte, then loop reading a
byte at a time and manually storing in an array, due to timing
problems I believe (even at 2400 baud and 2 stop bits).
Another question I have about SERIN not explicitly answered in the
manual from what I read: if you set SERIN to read say 3 bytes and only
receive 2, even if you have a timeout specified it will hang forever?
If that's true, it would have made more sense to me to timeout the
command once the timeout value was exceeded between any 2 bytes (or
have a separate timeout value for "interbyte" receiving).
Thanks for any help or clarifications.
Harry
Thanks Sid. The problem with sending an "A" is that the stream is
binary and "A" could appear at any time as a valid byte in the data
(where all possible bit combinations are possible). I could provide
escape sequences, but that gets messy in terms of CPU time and RAM
storage.
If the stream is ASCII then a scheme like that works as the SERIN
design intended. But ASCII data is inefficient in space and then you
are back to using more CPU and RAM.
Thanks.
Harry
--- In basicstamps@yahoogroups.com, Newzed@a... wrote:
> In a message dated 5/6/2004 1:40:41 PM Eastern Daylight Time,
> tedstoner@1... writes:
>
>
> > Another question I have about SERIN not explicitly answered in the
> > manual from what I read: if you set SERIN to read say 3 bytes and only
> > receive 2, even if you have a timeout specified it will hang forever?
> >
>
> You could write
>
> serin pin, baud, [serstring\10\"A"]
>
> and it would end at 10 bytes or at the letter A, whichever came first.
>
> Sid
>
>
Hi Sid. If the binary data is converted to ASCII, then the size of the
data transmitted will grow.
If I privately convert the data using base64 encoding (for example),
the size of the data increases by 33%. This is easy to encode/decode
on the PC, but would require custom decoding on the Stamp (blech).
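For what it's worth, the 33% figure is easy to confirm on the PC side with Python's standard base64 module (a quick sanity check, not Stamp code):

```python
import base64

# base64 emits 4 ASCII characters for every 3 input bytes,
# so binary data grows by one third (plus padding on the tail).
raw = bytes(range(9))            # 9 arbitrary binary bytes
encoded = base64.b64encode(raw)  # 12 ASCII bytes

print(len(raw), len(encoded))    # 9 12
```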
If my understanding of reality is correct, then I will probably end up
converting the binary protocol to ASCII in some form and use the end
character format as you suggest.
Lots of experimenting to try tonite still though ...
Thanks again.
Harry
--- In basicstamps@yahoogroups.com, Newzed@a... wrote:
> In a message dated 5/6/2004 2:22:46 PM Eastern Daylight Time,
> tedstoner@1... writes:
>
>
> > Thanks Sid. The problem with sending an "A" is that the stream is
> > binary and "A" could appear at any time as a valid byte in the data
> > (where all possible bit combinations are possible). I could provide
> > escape sequences, but that gets messy in terms of CPU time and RAM
> > storage.
> >
>
> True, but you can send your binary data in decimal form, like
>
> serout pin, baud, [16, 32, 64, 128, 256, "A"]
> since the decimal data gets converted to binary before it is serouted.
>
> and when the serin got the "A" it would stop receiving.
>
> Sid
>
>
>The manual doesn't say but I didn't have good results specifying a
>var. Example:
>
>buff var byte(10)
>len = 4
>serin p1\p2, baud, timeout, [STR buff\len]
I'm very surprised that didn't work. Check for other hardware or pbasic bugs.
>Serout with a binary array is effectively hopeless since it will stop
>sending if a 0x00 is encountered? So I have to loop sending a byte at
>a time.
Yes, that's right.
>When reading, I was unable to read my length byte, then loop reading a
>byte at a time and manually storing in an array, due to timing
>problems I believe (even at 2400 baud and 2 stop bits).
How about,
serin p1\p2, baud, timeout, [len, STR buff\len]
not sure about that, but the setup delay should be less than a loop.
>
>Another question I have about SERIN not explicitly answered in the
>manual from what I read: if you set SERIN to read say 3 bytes and only
>receive 2, even if you have a timeout specified it will hang forever?
No, the timeout delay applies to the interval between every single
byte. (Another confusion people have is that the timeout applies to
the entire message, but no, it is every byte.) Thus, if there is a
WAIT modifier, and there is a lot of noise on a serial line, that is
when it can get hung forever waiting for the WAIT string.
-- Tracy
A very simple work-around for this is to have a
specific character for 'end-of-message'. Now, as
you point out, this could occur any time. So, what
you do is have an 'escape' character.
So you encode: two 'escape' characters becomes
one 'escape' character. When your 'end-of-message'
character occurs in the data, you convert it to
an 'escape' character and some other character.
Now, you can use your 'end-of-message' character
because your encoding scheme will never send one
in the 'real' data.
If you pick your escape char and end-msg char
so they are 'rare', this can be used with
minimal processing overhead.
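For illustration, here is a PC-side sketch of that escape scheme in Python. The particular byte values for ESC, EOM, and the substitute are arbitrary picks for the example, not anything from the thread:

```python
# Escape-stuffing sketch: ESC in data -> ESC ESC; EOM in data -> ESC SUB.
# After encoding, the EOM byte only ever appears as the real terminator.
ESC, EOM, SUB = 0x1B, 0x0A, 0x00   # hypothetical byte choices

def encode(data: bytes) -> bytes:
    out = bytearray()
    for b in data:
        if b == ESC:
            out += bytes([ESC, ESC])   # literal escape char
        elif b == EOM:
            out += bytes([ESC, SUB])   # disguised end-of-message char
        else:
            out.append(b)
    out.append(EOM)                    # genuine terminator
    return bytes(out)

def decode(stream: bytes) -> bytes:
    out = bytearray()
    it = iter(stream)
    for b in it:
        if b == EOM:
            break                      # real end of message
        if b == ESC:
            nxt = next(it)             # assumes well-formed input
            out.append(ESC if nxt == ESC else EOM)
        else:
            out.append(b)
    return bytes(out)
```

On the Stamp the same logic would be a byte-at-a-time loop after SERIN terminates on the EOM value.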
--- In basicstamps@yahoogroups.com, "harrybstoner" <tedstoner@1...>
wrote:
> Hi Sid. If the binary data is converted to ASCII, then the size of the
> data transmitted will grow.
>
> If I privately convert the data using base64 encoding (for example),
> the size of the data increases by 33%. This is easy to encode/decode
> on the PC, but would require custom decoding on the Stamp (blech).
>
> If my understanding of reality is correct, then I will probably end up
> converting the binary protocol to ASCII in some form and use the end
> character format as you suggest.
>
> Lots of experimenting to try tonite still though ...
>
> Thanks again.
>
> Harry
>
>
>
> --- In basicstamps@yahoogroups.com, Newzed@a... wrote:
> > In a message dated 5/6/2004 2:22:46 PM Eastern Daylight Time,
> > tedstoner@1... writes:
> >
> >
> > > Thanks Sid. The problem with sending an "A" is that the stream is
> > > binary and "A" could appear at any time as a valid byte in the data
> > > (where all possible bit combinations are possible). I could provide
> > > escape sequences, but that gets messy in terms of CPU time and RAM
> > > storage.
> > >
> >
> > True, but you can send your binary data in decimal form, like
> >
> > serout pin, baud, [16, 32, 64, 128, 256, "A"]
> > since the decimal data gets converted to binary before it is serouted.
> >
> > and when the serin got the "A" it would stop receiving.
> >
> > Sid
> >
> >
Harry
--- In basicstamps@yahoogroups.com, Tracy Allen <tracy@e...> wrote:
> >Am I correct that the "\L" for the STR modifier must be a constant?
> >The manual doesn't say but I didn't have good results specifying a
> >var. Example:
> >
> >buff var byte(10)
> >len = 4
> >serin p1\p2, baud, timeout, [STR buff\len]
>
> I'm very surprised that didn't work. Check for other hardware or pbasic bugs.
Ok. I will retest that. Maybe other problems masked this. Things would
be very straightforward (and more efficient) if this works.
> >Serout with a binary array is effectively hopeless since it will stop
> >sending if a 0x00 is encountered? So I have to loop sending a byte at
> >a time.
>
> Yes, that's right.
Thanks for clarifying.
> >When reading, I was unable to read my length byte, then loop reading a
> >byte at a time and manually storing in an array, due to timing
> >problems I believe (even at 2400 baud and 2 stop bits).
>
> How about,
>
> serin p1\p2, baud, timeout, [len, STR buff\len]
> not sure about that, but the setup delay should be less than a loop.
That would be the ideal solution. I will test that.
> >Another question I have about SERIN not explicitly answered in the
> >manual from what I read: if you set SERIN to read say 3 bytes and only
> >receive 2, even if you have a timeout specified it will hang forever?
>
> No, the timeout delay applies to the interval between every single
> byte. (Another confusion people have is that the timeout applies to
> the entire message, but no, it is every byte.) Thus, if there is a
> WAIT modifier, and there is a lot of noise on a serial line, that is
> when it can get hung forever waiting for the WAIT string.
Externally all I saw was the serin command hung and my timeout branch
was not taken. At least one byte was received. But what you are saying
is if my command was of format:
a var byte
b var byte
c var byte
serin p1\p2, baud, 50, timeout_branch, [a,b,c]
and only 2 bytes are received (setting a and b), then if the 3rd byte
does not arrive within 50ms of the 2nd byte, then a branch to
"timeout_branch" should be taken?
As a follow-up, I found during testing last night that I could not get
reliable communications at 2400 baud. At 2400 baud, I only correctly
received the first 4 or 5 bytes of an 11-byte message; the other bytes
were received but incorrect. At 1200 baud they were received correctly.
If I can use the length modifier as described above, 2400 baud should
work. I found even the addition of one instruction to a byte-receive
loop threw off the timing.
I also suspect based on my experiences of the last week, and as
documented in several places, that the PC hardware/driver does not
play nice with flow control on a byte-by-byte basis.
Protean Logic (http://proteanlogic.com/) seems to make a buffering
chip, RSB509, that in theory would make a lot of this headache go
away. Anyone have experiences using this chip?
Thanks Allan, the escape scheme makes sense. Unfortunately, the binary
data I have will assume all possible values, so no one value is more
likely than another.
But, it did give me an idea that would work every time and cost only
an additional 4 bits (which could be buried in free bits in an
existing header byte that I have).
This works because my transmissions will always be less than 16 bytes.
So I pick any terminating byte value I want, say $FF. Then I terminate
the string with that terminating value. The stamp SERIN command knows
to terminate on this value as the STR operand \E.
The 4 bits I use for encoding will be an XOR (exclusive or) value. All
data bytes will be XOR'ed with this value (both lower and upper nibble
of each byte). The XOR value chosen will be such that the value of any
byte XOR'd with it will not be $FF. Given that there will be less than
16 bytes to encode, I am guaranteed that I can find an XOR value
between $0 and $F that will work.
The stamp needs to undo the encoding by XOR'ing all data back.
This scheme may or may not be preferable to a normal escape sequence.
It is desirable in that the command size is always fixed, with an
exact 1-to-1 relationship in size between encoded and unencoded data
(aside from the terminating byte). As described, the scheme also only
works on data streams of fewer than 16 bytes; more bits are required
as the data stream size limit grows.
It may use more CPU than a normal escape sequence but it is a constant
overhead. But if the XOR value is $0 you can skip the decode part.
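To make the idea concrete, here is a PC-side sketch in Python. The function names and the $FF terminator choice follow the description above but are otherwise my own:

```python
# XOR-nibble framing: find a 4-bit x so that no data byte, XORed with x
# in both nibbles, equals the terminator $FF. Each candidate x forbids
# exactly one byte value (0xFF ^ (x * 0x11)); with fewer than 16 data
# bytes and 16 candidates, some x is always safe.
TERM = 0xFF

def pick_xor(data: bytes) -> int:
    for x in range(16):
        mask = x * 0x11                  # x replicated into both nibbles
        if all(b ^ mask != TERM for b in data):
            return x
    raise ValueError("message must be shorter than 16 bytes")

def encode(data: bytes):
    assert len(data) < 16
    x = pick_xor(data)
    mask = x * 0x11
    body = bytes(b ^ mask for b in data)
    return x, body + bytes([TERM])       # x would ride in a spare header nibble

def decode(x: int, frame: bytes) -> bytes:
    mask = x * 0x11
    return bytes(b ^ mask for b in frame[:-1])  # drop terminator, undo XOR
```

On the Stamp side, as the message says, the decode is just the same XOR applied to each received byte.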
Hopefully I won't need this...
Thanks.
Harry
--- In basicstamps@yahoogroups.com, "Allan Lane" <allan.lane@h...> wrote:
> A very simple work-around for this is to have a
> specific character for 'end-of-message'. Now, as
> you point out, this could occur any time. So, what
> you do is have an 'escape' character.
>
> So you encode: two 'escape' characters becomes
> one 'escape' character. When your 'end-of-message'
> character occurs in the data, you convert it to
> an 'escape' character and some other character.
> Now, you can use your 'end-of-message' character
> because your encoding scheme will never send one
> in the 'real' data.
>
> If you pick your escape char and end-msg char
> so they are 'rare', this can be used with
> minimal processing overhead.
>
> --- In basicstamps@yahoogroups.com, "harrybstoner" <tedstoner@1...>
> wrote:
> > Hi Sid. If the binary data is converted to ASCII, then the size of the
> > data transmitted will grow.
> >
> > If I privately convert the data using base64 encoding (for example),
> > the size of the data increases by 33%. This is easy to encode/decode
> > on the PC, but would require custom decoding on the Stamp (blech).
> >
> > If my understanding of reality is correct, then I will probably end up
> > converting the binary protocol to ASCII in some form and use the end
> > character format as you suggest.
> >
> > Lots of experimenting to try tonite still though ...
> >
> > Thanks again.
> >
> > Harry
> >
> >
> >
> > --- In basicstamps@yahoogroups.com, Newzed@a... wrote:
> > > In a message dated 5/6/2004 2:22:46 PM Eastern Daylight Time,
> > > tedstoner@1... writes:
> > >
> > >
> > > > Thanks Sid. The problem with sending an "A" is that the stream is
> > > > binary and "A" could appear at any time as a valid byte in the data
> > > > (where all possible bit combinations are possible). I could provide
> > > > escape sequences, but that gets messy in terms of CPU time and RAM
> > > > storage.
> > > >
> > >
> > > True, but you can send your binary data in decimal form, like
> > >
> > > serout pin, baud, [16, 32, 64, 128, 256, "A"]
> > > since the decimal data gets converted to binary before it is serouted.
> > >
> > > and when the serin got the "A" it would stop receiving.
> > >
> > > Sid
> > >
> > >
(snip)
> How about,
>
> serin p1\p2, baud, timeout, [len, STR buff\len]
> not sure about that, but the setup delay should be less than a loop.
To follow up, the above does indeed work (contrary to my earlier
experience), and for me it works great at 2400 baud. However, for
error-checking purposes I wanted to instead do:
serin p1\p2, baud, timeout, [len]
IF (len > LEN_MAX) THEN BAD_LEN_RECEIVED
serin p1\p2, baud, timeout, [STR buff\len]
That did not work at 2400 baud but did work at 1200 baud.
Thanks again.
Harry