SERIN issues... New to this.

02-04-2005, 11:18 PM
Hi everybody!

I just began working with a BS2sx, and I have encountered a problem...

A little background: the stamp is going to be controlling a heat pad, among other things... the idea is that a PC will be able to send a "target" temperature and the stamp will monitor the temperature and turn the heat pad control pin on or off as needed to keep the actual temp within an acceptable range of the target.

It works great if I hard code the target temperature. When I try to use SERIN to communicate with the PC over the USB port is when I have some trouble. Here is the section that is giving me issues...

cmd and cmdVal are both Bytes.
targetTemp is defined as a Word

'retrieves a current actual temp from heatpad

'Check for input
SERIN 16, 240, 10, NO_INPUT, [WAIT("*"), cmd, cmdVal]

IF(cmd = "T") THEN
targetTemp = cmdVal * 16 'temperature is stored in 1/16 degree units, hence 1600 is 100 Fahrenheit

The idea is one byte will signify the beginning of an incoming command. The next byte specifies what the command is... in this case "T" indicates that the next byte is a temperature.
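For reference, the PC side only has to put those three raw bytes on the wire. Here is a minimal, hypothetical Python sketch of building that packet (the function name and range check are my own illustration, not from the original code):

```python
def build_temp_command(degrees_f):
    """Build the 3-byte packet the Stamp expects: the '*' sync byte,
    the 'T' command byte, then the temperature as one raw byte."""
    if not 0 <= degrees_f <= 255:
        raise ValueError("temperature must fit in a single byte")
    return b"*T" + bytes([degrees_f])
```

With pyserial this would be sent as `port.write(build_temp_command(100))`, assuming `port` is an open `serial.Serial` object.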

I have it echoing the values read back with DEBUG, and OCCASIONALLY it will report back "Cmd: T Val: x" correctly. Other times, though, it appears to read complete garbage. I would guess that 1 in 7 times it reads the correct data.

I have tried tweaking the timeout value... I understand 10 ms is short, but I have gone as high as 100 ms. I have also tried it at 4800 baud.

Any ideas why this isn't working?

Thanks in advance for the help!

Jon Williams
02-05-2005, 12:17 AM
Have a look at the attached program; I think it will help. I opened up the timeout since you're using a terminal program (I think), and I used the DEC modifier in SERIN to take the command value. To reset the temperature you would type this in your terminal:


You need a non-decimal character at the end of the temperature for the DEC modifier to work.
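In case it helps to see that behavior spelled out, here is a rough Python model of what the DEC modifier does (my own illustration, not Parallax code): it ignores leading non-digits, accumulates digits, and relies on a trailing non-digit to know the number has ended.

```python
def dec_parse(chars):
    """Rough model of SERIN's DEC modifier: skip leading non-digits,
    accumulate digits, stop at the first non-digit after them."""
    value = None
    for ch in chars:
        if ch.isdigit():
            value = (value or 0) * 10 + int(ch)
        elif value is not None:
            break          # terminating non-digit: number is complete
    return value
```

On the Stamp the terminator matters because the serial stream never "ends" on its own; without a non-digit, SERIN keeps waiting for more digits.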

Jon Williams
Applications Engineer, Parallax
Dallas, TX USA

02-05-2005, 01:21 AM
Thanks for the help, Jon.

I haven't actually tried using the DEC modifier up to this point, primarily because that would require 3 times the number of bytes to represent the temperature. It seems insignificant, but the heat pad is incredibly quick to react (it can warm from room temp to 150 degrees F in under a second). I noticed that just outputting a handful of extra characters with DEBUG caused the temperature to fluctuate more. The pad doesn't take a temperature. It simply is turned on (in which case it gets hotter and hotter) or off. The stamp's job is to check it as often as possible and constantly set the pin high or low, as necessary, to keep the pad as close as possible to the desired temp.

The temperature is read each cycle of execution with an analog-to-digital converter... it simply converts a voltage differential to a digital signal.
The loop is essentially:
1)check for incoming commands and store target temp if needed,
2)poll ADC for current temperature,
3) check current temperature against the "target" temperature, and
4) if current temp is above target+X, turn pad off, if temp is below target-X turn pad on.
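Step 4 is classic bang-bang control with a deadband. A minimal Python sketch of that decision (the names and the deadband parameter are hypothetical, and temperatures are in the same 1/16-degree units as the Stamp code):

```python
def pad_command(current, target, band, pad_on):
    """Bang-bang control with a deadband: off above target + band,
    on below target - band, otherwise hold the previous state."""
    if current > target + band:
        return False   # too hot: turn the pad off
    if current < target - band:
        return True    # too cold: turn the pad on
    return pad_on      # inside the band: leave the pin alone
```

Holding the previous state inside the band keeps the pin from chattering on and off every cycle when the temperature sits right at the target.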

There will be other devices connected as well. This is actually being used for a haptic feedback glove, which contains heat pads and vibrators. Ultimately I was thinking of using a 2-byte bitmask-type command structure... bits 0-2 to select the command, bits 3-8 reserved for the on/off state of the vibrators etc., and bits 9-15 for an unsigned integral value for temp or whatever else is needed.
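That packing scheme is straightforward to sketch. A hypothetical Python version (field names are mine; note that bits 9-15 give only 7 bits, so the value field tops out at 127, i.e. whole degrees rather than 1/16-degree units):

```python
CMD_MASK = 0b0000000000000111   # bits 0-2: command selector
VIB_MASK = 0b0000000111111000   # bits 3-8: vibrator on/off flags
VAL_MASK = 0b1111111000000000   # bits 9-15: unsigned value (0-127)

def pack_command(cmd, vibs, value):
    """Pack the proposed 16-bit command word from its three fields."""
    assert 0 <= cmd < 8 and 0 <= vibs < 64 and 0 <= value < 128
    return cmd | (vibs << 3) | (value << 9)

def unpack_command(word):
    """Split a 16-bit command word back into its three fields."""
    return word & CMD_MASK, (word & VIB_MASK) >> 3, (word & VAL_MASK) >> 9
```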

And I am currently using the debug terminal to control it until I know what I am doing, but eventually the device will be controlled by applications running on a PC.

Initially I thought that the "garbage" I was getting was due to problems synching with the PC... that's when I added the WAIT("*"). If I have to eventually change to the decimal input I'll figure out a way to work it out, but I would like to avoid that if possible.

Could it have something to do with the fact that it is communicating through a USB port? I'm not much of an expert (I'm a complete novice) at serial communication. Does that require some special considerations?

I'm simply baffled about what could be causing this.

Jon Williams
02-05-2005, 01:37 AM
I don't think USB is the problem; I just tried my demo on a USB-BOE and it works the same as when using a standard COM port.

It's going to be very tricky using a terminal program and wanting to check the temperature very frequently. The reason is that humans are slow and need a bit of time to enter the characters. Since you're ultimately moving to a PC application (perhaps something written in VB) you could use available IO pins and an inverter (like our RS-232 AppMod) to communicate with the PC using flow control. The docs for the RS-232 AppMod show how.


Jon Williams
Applications Engineer, Parallax
Dallas, TX USA

02-05-2005, 02:18 AM
We may, indeed, end up having to look at an option like that.

Also, I'm not using the terminal to check the temperature. That is all done on the BS2, every cycle through the loop. All the terminal is being used for is to send the command to actually set the target temperature on the stamp. This only has to be done as often as we want the temperature to change.

I'm currently using macros in the PBASIC debug terminal. I have a macro set to "*Tx" where the decimal value of 'x' is 88 (for 88 degrees) and another for "*TX" where the decimal value for 'X' is 120. These would set the targetTemperature on the stamp to 88F and 120F, respectively. It should, therefore, be received just fine by SERIN... am I wrong? Maybe there is something about the way the macros are being sent that is causing problems. I don't know why this could happen, but that *could* account for garbage input. I just made the assumption that the "*Tx" macro would be sent in a single 3-byte packet.

As far as actually monitoring the temperature... I am currently using DEBUG to output a line consisting of the actual temp and the target temp. Once in a while, the macro will work and the DEBUG output will reflect the new target temperature correctly, as will the actual temperature over time.

I can't help but think this is somehow related to the timing between the stamp and the PC. When the timing actually works out right, the command comes through... it seems, though, that waiting for the leading "*" would have fixed that.
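For what it's worth, the WAIT("*") idea can be taken further on the PC side when watching a noisy stream: scan for the sync byte and discard anything between frames, keeping incomplete frames for the next read. A hypothetical Python sketch of that resynchronizing parser (my own, not from the thread's code):

```python
def extract_commands(buffer):
    """Scan raw bytes for '*'-framed 3-byte commands, discarding
    garbage between frames. Returns (commands, leftover_bytes)."""
    commands = []
    i = 0
    while i < len(buffer):
        if buffer[i:i+1] != b"*":
            i += 1               # garbage byte: resync on the next '*'
            continue
        if len(buffer) - i < 3:
            break                # incomplete frame: keep for next read
        commands.append((buffer[i+1:i+2], buffer[i+2]))
        i += 3
    return commands, buffer[i:]
```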

If you don't mind, a potentially silly question... I noticed that DEBUGIN uses "a special case of SERIN" with inverted input, etc... Is there any particular reason that it communicates with the debug terminal using inverted input? Is this something that could account for my problem... something related to the way the *terminal* is communicating as opposed to the way the stamp is? If so, it seems odd that it would occasionally work the way it is.

Also... sorry again to sound like a rank amateur... what is the purpose of inverted input?


Jon Williams
02-05-2005, 04:17 AM
You have to invert serial signals going through the programming/DEBUG port because there are hardware inverters inline.

Jon Williams
Applications Engineer, Parallax
Dallas, TX USA

Tom Walker
02-05-2005, 04:32 AM
This sounds like a look at the "Industrial Controls" documentation would be of benefit. Isn't one of the projects a PID heating kind of thing with a Stamp as the controller?


02-05-2005, 05:06 AM
Hmmm... I'll check it out.

I was thinking that my problem was more of a "newbie" type problem concerning the input. The temperature control works fine; it just isn't able to read the commands to change temps reliably. I'll look at it for sure, though.

A quick question...

Is it possible that I would be able to read the data more reliably if I read into a WORD instead of 2 separate BYTEs? I would just try it, but I'm not able to get to the hardware at the moment.

02-05-2005, 05:12 AM
RS-232 is a byte-oriented protocol, so the data will be sent and read as bytes either way. The BS2 has the DEC modifier, which can parse the incoming string "100" into a Word holding the value 100, so you'll use less space in the BS2.
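To illustrate the byte-oriented point: a 16-bit Word is just two bytes on the wire, while DEC-style input sends the number as ASCII digits and parses it into a single value on arrival. A small Python illustration (my own, not Stamp code):

```python
def word_from_bytes(high, low):
    """Reassemble a 16-bit Word from the two bytes it travels as."""
    return (high << 8) | low

# Raw binary: 1600 (100 F in 1/16-degree units) travels as two bytes.
raw_value = word_from_bytes(6, 64)        # 6 * 256 + 64 = 1600

# DEC-style: the number travels as ASCII digits, parsed on arrival.
dec_value = int(b"100".decode("ascii"))   # three bytes on the wire
```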