
SERIN via MAX232 Serial Port missing last character

john_exonets Posts: 3
edited 2006-10-25 16:59 in BASIC Stamp
Hi All,

I've looked and searched on here, but can't find anything about this, so please forgive me if this has been covered before :(

I'm trying to connect a base BS2 to an external terminal controller through a MAX232 signal converter, using the following code:

DEBUG CLS, CR, CR
'
' I/O Defs
'
RX CON 10
TX CON 11
RTS CON 12
CTS CON 13
'
' Constants
'
Baud96 CON 84 ' 9600-8-N-1-Non-Inverted
Baud96I CON 16468 ' 9600-8-N-1-Inverted
Baud12 CON 813 ' 1200-8-N-1
'
' Data
'
CC VAR Byte(10)
'
' MAIN CODE
'
CC(9) = 0
HIGH CTS ' Not ready to receive yet
LOW RTS

DEBUG ">>:", CR

main:
SERIN RX\CTS, Baud96, 2, main, [STR CC\9]
DEBUG STR CC, "|"
GOTO main

So: 9600 baud, non-inverted, 10-byte buffer. I added the "|" char to better show the missing end char... here is the debug output:

KANTRON|CS PACKET|CONTROLLE| III VERS|ON 5.1

(|) COPYRIG|T 1988-19|2 BY KANT|ONICS INC| ALL RIG|TS RESERV|D.

DUPLI|ATION PRO|IBITED WI|HOUT PERM|SSION OF |ANTRONICS|

so it reads in 9 bytes, "misses" one byte, and reads the next 9 bytes correctly. I'm using hardware flow control, so it *should* be receiving all bytes, but it seems to be missing one between reads. Adjusting the size of the buffer (5 to 20 bytes) does not seem to have any effect. Turning off flow control (i.e., not using the CTS/RTS signals) produces many more skipped chars.

If I try a different application that requires real-time byte-by-byte reads, it consistently reads one byte, skips the next one, and reads the third one.

Is my problem in the MAX232 signal converter, my code, or a limitation of the BASIC Stamp?

Thanks :)
-- John

Comments

  • allanlane5 Posts: 3,815
    edited 2006-10-23 20:45
    "STR CC\9" says to read 9 bytes, and put the result in CC. So it's doing exactly what you told it to do.

    Probably you want "STR CC\10" instead, to have it read 10 bytes and put them in CC[0] through CC[9].
  • john_exonets Posts: 3
    edited 2006-10-23 21:02
    The buffer size does not matter... it's in a loop to SERIN, so whether it's 1 or 20, the next byte after the SERIN is lost and not picked up by the next SERIN command. It's a 10-byte buffer, but I'm reading in 9 bytes in this example just to make sure there is no boundary issue of some sort.

    Thanks for responding :)
  • allanlane5 Posts: 3,815
    edited 2006-10-23 21:12
    I agree "the buffer size does not matter", I'm not talking about "the buffer size", I'm talking about the "str" modifier. If you tell it "CC\9", then it's going to read 9 bytes, no matter WHAT the 'size' of CC is.

    Oh, and note that the BS2 has no 'serial input buffer' which reads data 'in the background'. If you're not in a SERIN 'listening' for data, the BS2 is not going to receive the data. So right now you've TOLD it to only 'listen' for 9 bytes, then go on. Naturally, the tenth byte will have vanished in the 1 mSec or so it takes to go around the loop and start 'listening' in the SERIN statement again.

    Oh -- hardware control. It's completely possible that the 'sending' device is getting the 'CTS' low "too late" for it -- it may have already sent the tenth byte by the time the BS2 'signals' it that the SERIN is done.

    You might also want to change your timeout. At 9600 baud, each byte takes about 1 mSec. So 2 mSec is too short a time-out for 10 bytes.
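
    Putting both of those fixes together might look like this (a sketch only, reusing the pin names and constants from John's listing; the 50 mSec figure is an assumption, not a measured value):

    ```basic
    ' Sketch: read the full 10-byte buffer and give the whole
    ' transfer a roomier timeout (~1 mSec/byte at 9600 baud,
    ' so 50 mSec leaves plenty of slack for 10 bytes).
    main:
      SERIN RX\CTS, Baud96, 50, main, [STR CC\10]
      DEBUG STR CC\10, "|"    ' \10 so DEBUG doesn't need a null terminator
    GOTO main
    ```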

    Post Edited (allanlane5) : 10/23/2006 9:22:42 PM GMT
  • Tom Walker Posts: 509
    edited 2006-10-24 13:06
    allan,
    One small point. As I understand it, the timeout in a SERIN might more accurately be called a "time-without" or perhaps a "within". From what I have read (or at least what I recall at the moment), the timeout only applies to "dead time". If you set a timeout of 1 ms and you receive "noise" occurring at 0.5 ms intervals, SERIN won't time out until it hits the end of its input formatter.

    If my understanding is correct, then your previous 2 ms comment doesn't really apply...as long as the data starts arriving WITHIN 2 ms.

    Admittedly, I'm basing this on an understanding gleaned from back in the Yahoo group days...when the subject was raised several times.

    Perhaps a clarification from Parallax?

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Truly Understand the Fundamentals and the Path will be so much easier...
  • allanlane5 Posts: 3,815
    edited 2006-10-24 13:16
    In testing, I've found the "timeout" to occur, whether data is running or not.

    Thus, if I have a 10-character buffer taking a 'STR' value, and the timeout happens after three characters, the rest of the characters are not received.

    But that's just me, and I may have misinterpreted the results. So I'd like to hear from Parallax about this as well.
  • Tom Walker Posts: 509
    edited 2006-10-24 13:28
    FWIW, I recall a thread with the following characteristics:

    I think someone was in a noisy environment whose code was never leaving the SERIN, because noise kept arriving during a relatively long timeout. A Parallax representative stated that if you set up a SERIN with a 1-second timeout, then anything received within 1 second will reset the timeout timer.

    ...but I could be mis-remembering...

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Truly Understand the Fundamentals and the Path will be so much easier...
  • Chris Savage Parallax Engineering Posts: 14,406
    edited 2006-10-24 14:26
    Gentlemen,

    The timeout is reset by any data coming in. Even garbage will reset the timeout, which responds to any activity on the line. In a very noisy environment it is possible for timeout to never occur. I hope this helps. Take care.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Chris Savage
    Parallax Tech Support
  • allanlane5 Posts: 3,815
    edited 2006-10-24 14:46
    Well, there you go. Thanks Chris.

    So the 2 mSec timeout should not be a problem, unless the incoming data has 'pauses' in it. It looks more like the dropping of the CTS signal comes 'too late' to stop the sender. You might try a second "SERIN" to try to capture the 10th character, but that probably won't work either, since SERIN takes a few hundred microseconds to start up.

    One more thing -- it's possible the "STR" modifier DOES put a 'null' byte on the end of the string it captures. If this is true, and you want to capture a 10-byte string, you should size the buffer to receive 11 bytes, then set the STR modifier as "CC\10". Of course, you've already tested this condition by changing the buffer size, so this probably won't help.
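
    If that trailing-null guess is right, the sizing would look something like this (again just a sketch; whether STR actually appends a null is unverified here):

    ```basic
    ' Sketch: one spare byte in case STR appends a terminating null.
    CC VAR Byte(11)             ' 10 data bytes + 1 spare
    main:
      SERIN RX\CTS, Baud96, 2, main, [STR CC\10]
      DEBUG STR CC\10, "|"
    GOTO main
    ```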

    Post Edited (allanlane5) : 10/24/2006 2:50:52 PM GMT
  • john_exonets Posts: 3
    edited 2006-10-24 16:14
    OK, so there's no magical code fix for this issue... the BS2 is just missing the last byte, and the switching of the CTS line through the MAX232 chip is just not fast enough to stop the flow in time. Bummer.

    OK, so I guess I need to implement a hardware buffer (something like this: http://www.proteanlogic.com/applications/an012/rsb509.htm ) to ensure all the bytes are captured.

    Thanks for everyone's help.
    -- John
  • allanlane5 Posts: 3,815
    edited 2006-10-24 16:29
    Cute chip. Thanks for the link.
  • Unsoundcode Posts: 1,532
    edited 2006-10-25 16:59
    Hi John, the missing byte is lost during the time it takes for the DEBUG statement to output CC. The buffer in your link should work nicely and is probably the best route to take. Without a separate buffer, the only workaround I can see is if your terminal is able to buffer the data while you output to the display; you could try this by making CTS high immediately after the SERIN request and before the DEBUG statement. You could also try reducing the baud rate or using a faster processor.
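
    That CTS trick could be sketched like this (untested; it assumes the terminal honors CTS quickly enough to hold its next byte while DEBUG runs):

    ```basic
    ' Sketch: deassert flow control the instant SERIN returns, so
    ' the terminal pauses while we spend time in DEBUG, then loop
    ' back and let SERIN drive CTS again.
    main:
      SERIN RX\CTS, Baud96, 2, main, [STR CC\9]
      HIGH CTS                  ' hold off the sender during DEBUG
      DEBUG STR CC, "|"
    GOTO main
    ```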

    Jeff T.