
[SOLVED] Serial Communication with LabVIEW

Bernardo Posts: 10
edited 2014-03-03 23:21 in BASIC Stamp
Hi,

I'm trying to use a BS2 on a HomeWork Board (USB) to drive a Continuous Rotation Servo while controlling [at least] the direction through LabVIEW.

The code I'm working with is attached, and I've tried debugging (even changing the latency timer of the USB port). I've also tried small snippets of code, such as:
' {$STAMP BS2}
' {$PBASIC 2.5}

Main:
  DO
    PAUSE 20                      ' ~20 ms between servo pulses
    PULSOUT 13, 650               ' 1.3 ms pulse on P13 (full speed in one direction)
  LOOP
and
(baud-mode constants T9600 and Inverted taken from the help-file example)
Baud            CON     T9600 + Inverted        ' match DEBUG (9600 baud, inverted)
SerialPin       CON     16                      ' programming port, same as DEBUG
comando         VAR     Byte                    ' received command

Main:
  SERIN  SerialPin, Baud, [comando]             ' wait for one byte
  SEROUT SerialPin, Baud, [comando]             ' echo it back
  GOTO Main
The change from DO...LOOP to GOTO reflects the fact that, more recently, I've been trying to use the fastest code possible (as far as I can tell), as speed is extremely important in the attached code.
While both the second snippet and the DEBUG and DEBUGIN commands work fine (notice I am using port 16, the same as the debugger, for serial communications), the motor is hard to start even with the simple snippet above. It requires a small push in the right direction to start, and only then does it run fine. So, first question:

1. I would like to know whether this reluctance to start indicates a problem with the servo (which is brand new), a timing issue in the code, or the battery (a 9V alkaline feeds both the board and the servo through the +5V Vdd pin) reaching the end of its life, or whether I should switch to an independent 4 x 1.5V pack for the servo.

Referring to the attached code, I am using Stamp_Serial_Comm.vi to communicate with the device.
As you can see, I have included a "-" in the SEROUT that acknowledges the received instruction. On the LabVIEW (v11.0, 32-bit) end, I append a "+" to the number I'm sending, to act as the first non-numerical symbol.
I ask the vi to receive four bytes after sending the message, two from the echo and two from the acknowledgement.
For instance, if I send
5+
I receive
5+1-
As expected: the echo, the remainder modulo 4 of the received value (5 % 4 = 1), and the acknowledgement sign.
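On the BS2 side the exchange boils down to something like this (a rough sketch of the idea only: the helper variable resto is mine, and the branching and servo pulses are left out):
comando VAR Byte
resto   VAR Byte

Main:
  SERIN  SerialPin, Baud, [DEC comando]                 ' decimal digits end at the "+"
  resto = comando // 4                                  ' remainder, e.g. 5 // 4 = 1
  SEROUT SerialPin, Baud, [DEC comando, "+", DEC resto, "-"]
  GOTO Main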
So I expect the variable Comando to take the new value, and Main to branch to the corresponding section.
But the servo, after a brief pause (which happens after the message is sent and received, or after the timeout warning on the vi), resumes spinning in the same direction (after a soft push in that direction; if I try the other direction it seems to require too much force, so I haven't managed to make it spin the intended way by force).
Notice I initialize Comando with the value 2 (even though it is supposed to start with 0 when everything works fine). If I start with other values (0, 1, 3, ...), Main branches correctly, so I can start the program and make the servo spin one way or the other, or stay still. If I use the commented-out debugs, I always get a string like:
62626262626262626262626262......
in the debug window, or a similar output with the echo and acknowledgement somewhere in the middle in the LabVIEW "Read String" field (reading ~10 bytes should suffice). I have already tried using DATA @0, 2 to place a Comando-like "variable" in EEPROM, with WRITE and READ to update Comando accordingly in the Main portion of the program, but to no avail.
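That EEPROM attempt looked roughly like this (SerialPin and Baud as above; the branching and servo part is left out):
Cmd     DATA @0, 2                            ' EEPROM byte at address 0, initial value 2
comando VAR  Byte

Main:
  READ  Cmd, comando                          ' load the last stored command
  SERIN SerialPin, Baud, [DEC comando]        ' receive a new one
  WRITE Cmd, comando                          ' store it back in EEPROM
  GOTO Main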
So the second question is:

2. Why isn't Main branching as intended? What am I doing wrong?

On top of that, communication is fiddly: sometimes it works and sometimes it fails, even after testing several baud rates. This should be mitigated by the feedback, as the main vi could detect the response and resend if required. But it still bugs me that the servo stops only after the timeout is reached on the vi.
I have also tried several timeouts for the SERIN, as this is what controls the 20 ms gap between pulses for the servo. It works with values from 18 to 20, after the same gentle push to start the servo.
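The structure I mean is the SERIN timeout doubling as the gap between servo pulses, roughly like this (the idea only, not the exact attached listing):
Main:
  SERIN SerialPin, Baud, 20, Refresh, [DEC comando]   ' 20 ms timeout, then refresh anyway
Refresh:
  PULSOUT 13, 650                                     ' keep the servo pulsed
  GOTO Main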
The third question is

3. Why does the servo only stop after the timeout on the vi? Shouldn't it stop as messages are received / sent by the BS2?

Finally:

4. Is there an easier way to do what I'm trying to do?

I looked through the forums; I already know there are no accessible interrupts in this development environment (previous experience with Arduino and ATmega left bad habits), and that it is tricky to combine features such as POLL (which is not even applicable to the BS2) with PWM and serial communications. I don't mind the servo stopping while a message is received, as most of the time the message will be to stop or to change the direction of rotation. But it must spin until it gets a new command from the vi, preferably at a smooth rate for a vi-controlled period of time.

The servo is to be used in a crankshaft configuration, to drive a dynamometer to and fro, which measures the force required to move it (it should not exceed 100 gf). The vi has other indicators that can be used to automatically correct drift after a few cycles, if required, so I am not particularly picky about it, but I would prefer if there were none.

Last but not least, I'm using Windows 7 Pro 32-bit on an Asus X5AVn laptop (Intel Core 2 Duo P8700 @ 2.53 GHz, 4 GB RAM, of which 1 GB is borrowed by the graphics card - an nVidia 9650M GT - which has another 1 GB dedicated). The HomeWork Board appears as USB Serial Port (COM6) in the device manager, with FTDI driver version 2.8.28.0 (18-01-2013). I also use a USB mouse on another USB port (the laptop has 3; the last one is unused or has a pen drive attached).

Thanks in advance for any help, and sorry if this is just a rookie mistake / bad programming habits or a bad choice of platform; I only started working on this last Friday (and didn't touch it over the weekend), so I admit a little more time could get me there. Still, if someone has had similar issues, any advice is welcome.

Best regards,
Bernardo Bordalo

Comments

  • Chris Savage Parallax Engineering Posts: 14,406
    edited 2014-02-10 09:56
    Hello,

    There's a lot of branching based on commands, which are sent as ASCII text. I think if I were writing this I might try something more like:
    DO
      SERIN SerialPin, Baud, 20, Main, [pulseWidth.HIGHBYTE, pulseWidth.LOWBYTE]
    Refresh:
      PULSOUT ServoPin, pulseWidth
      PAUSE 5
    LOOP
    

    This way you'd be passing a single pulse value to the servo and no decoding of the command would be necessary. The high/low byte order would have to match how you're sending from LabVIEW. Again, this is just how I would do it, given the information you have provided.
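    For instance, with pulseWidth = 650 the high byte is 2 and the low byte is 138 (650 = 2 x 256 + 138); if the bytes arrived reversed, the Stamp would read 138 x 256 + 2 = 35330, so LabVIEW has to send the high byte first with the order written above.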
  • Bernardo Posts: 10
    edited 2014-02-10 11:47
    Thank you Chris,

    Your code is much simpler and more elegant, and it looks like it can do the same thing; I'll give it a try tomorrow (I left the message on the board moments before leaving the Faculty).

    The reason for using a single number instead of the full pulseWidth as the message comes from my habit of programming such devices as state machines (even in LabVIEW I tend to use this kind of architecture), and from the possibility of using the same single number to enter different states, even ones not yet foreseen (modularity in mind: using other pins for other devices, etc.).
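    In PBASIC that style boils down to something like this (the labels and the two direction pulse values are just illustrative; pulseWidth, SerialPin, Baud and ServoPin as in your snippet):
    comando VAR Byte

    Main:
      SERIN SerialPin, Baud, 20, Refresh, [DEC comando]
      BRANCH comando, [Stopped, Clockwise, CounterClockwise]  ' one number selects the state
    Refresh:
      PULSOUT ServoPin, pulseWidth
      PAUSE 5
      GOTO Main
    Stopped:
      pulseWidth = 750                                        ' 1.5 ms = stay still
      GOTO Refresh
    Clockwise:
      pulseWidth = 850                                        ' illustrative value
      GOTO Refresh
    CounterClockwise:
      pulseWidth = 650                                        ' as in the first snippet
      GOTO Refresh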

    I suppose Main should be read as Refresh:
    DO
      SERIN SerialPin, Baud, 20, Refresh, [pulseWidth.HIGHBYTE, pulseWidth.LOWBYTE]
      Refresh:
      PULSOUT ServoPin, pulseWidth
      PAUSE 5
    LOOP
    
    By the way, won't the PAUSE 5 get in the way of the 20 ms between pulses for the servo? I'll try it tomorrow and report the results.

    And I know it is kind of pointless to press on with my own code when this example is better, but I would really like to understand what I did wrong and why I wasn't achieving the intended branching, so I can avoid it in the future.

    Thank you again, best regards,
    Bernardo Bordalo
  • Chris Savage Parallax Engineering Posts: 14,406
    edited 2014-02-10 15:25
    The servo requires a minimum of 20 ms in between pulses as per the typical specification (some can handle 10-15 ms); however, a little more doesn't hurt. Since we don't know how quickly the SERIN command will be serviced, we need to make sure there's some overhead in the delays. You may even need to increase the PAUSE 5 to a higher value if the servo starts stuttering. All of this really depends on how fast data is coming in. I don't recall without testing whether the SERIN variable is nulled during a timeout. If it is, this could be a problem, but I don't think it is. That would be easy enough to test.
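    Something along these lines would show it (just a sketch; the sentinel value and label are only for the test):
    pulseWidth VAR Word

    Test:
      pulseWidth = 1234                           ' sentinel value
      SERIN SerialPin, Baud, 2000, NoData, [pulseWidth.HIGHBYTE, pulseWidth.LOWBYTE]
    NoData:
      DEBUG DEC pulseWidth, CR                    ' still 1234 after a timeout?
      GOTO Test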
  • Bernardo Posts: 10
    edited 2014-02-11 03:45
    Good morning,

    I tested your code today, with the attached vi sending the PulseWidth, and added a manual echo from the device to make sure the number is received, as well as a WAIT("+") inside the SERIN to ensure that only valid values are accepted by the BS2.
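    That is, the receive/echo pair now looks roughly like this:
    SERIN  SerialPin, Baud, 20, Refresh, [WAIT("+"), pulseWidth.HIGHBYTE, pulseWidth.LOWBYTE]
    SEROUT SerialPin, Baud, ["+", pulseWidth.HIGHBYTE, pulseWidth.LOWBYTE, "-"]   ' manual echo of the raw bytes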

    However, the servo still refuses to change direction (or to start moving when initialized to stand still), even when the pulseWidth is received and echoed as intended.

    If I start with the servo stopped and make the vi run repeatedly, sending the PulseWidth, the servo responds in small jolts in the right direction.

    Is there any way that the
    pulseWidth = 750 ' Start stopped
    
    is being run inside the loop somehow?

    On a timeout it doesn't seem the variable is nulled; it could be 'reset' to the starting value, though that would be even weirder. I tested this (attached Test_Serin_Null.rar) and, to my surprise, when I send 650 (the value is only passed to pulseWidth if it is larger than 650) I get:
    +Š-+750-+650-
    despite having sent other values previously. The meaning is:
    +"Sent high and low bytes"-+"DEC pulseWidth "-+"DEC secondThing "-
    For any other value:
    +£-+675-+675-
    So when the value of secondThing is not given to pulseWidth, it somehow goes back to 750. EDIT: it seems to go back in any case, as the servo remains stopped whatever value is sent.
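    For context, the gist of the test (as described above; the full listing is in the attached rar) is:
    pulseWidth  VAR Word
    secondThing VAR Word

    pulseWidth = 750                                    ' start stopped

    Main:
      SERIN SerialPin, Baud, 20, Refresh, [WAIT("+"), secondThing.HIGHBYTE, secondThing.LOWBYTE]
      IF secondThing > 650 THEN pulseWidth = secondThing
      SEROUT SerialPin, Baud, ["+", secondThing.HIGHBYTE, secondThing.LOWBYTE, "-"]
      SEROUT SerialPin, Baud, ["+", DEC pulseWidth, "-", "+", DEC secondThing, "-"]
    Refresh:
      PULSOUT ServoPin, pulseWidth
      PAUSE 5
      GOTO Main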

    Am I assigning the values the wrong way? Maybe confusing memory pointers with the values they point to?

    Thanks,
    Bernardo Bordalo

    P.S.: On the bright side, I traced the weak response down to a bad contact in the wiring, so for anyone with a stuttering or hard-to-start servo: check your wiring.
  • Bernardo Posts: 10
    edited 2014-02-11 07:00
    In the meantime, I've found an example similar to what I intend to do here.

    The code "Intended to illustrate SERIN and use of the LOOKDOWN command" ran and I was able to use the DEBUG console to comunicate and obtain the correct answers, and when I changed the prompt line to
    SEROUT 16, 84+$4000, 10, ["Last was ", DEC CHOICE, ". Now enter a number which is less than 100", CR, 10]
    
    the returned value for choice was always right whether or not the timeout was reached.

    This knocked away the suspicion of "Unwanted resets from noise on the serial port ATN line", as described here.

    I can't really figure why the value is always changed back to its initial value in my case, I must be missing some detail.

    Thanks for the help so far,
    Bernardo
  • Bernardo Posts: 10
    edited 2014-02-11 08:19
    After some testing I finally checked whether the device was being reset after receiving the signal (I added a DEBUG "RESET!" before the main loop). To my surprise, it was reset not just once but twice:
    +R-
  • Bernardo Posts: 10
    edited 2014-02-11 08:49
    The FAQ available here sums up three possible solutions to the ATN pin being asserted after SERIN:
    1. disconnect the DTR to ATN pin connection when trying to communicate serially to the BASIC Stamp II
    2. explicitly set the DTR pin to low after selecting the port if the terminal package or development tool allows this
    3. modify the programming connection to include a 0.1 uF capacitor in series with the DTR to ATN pin connection.
    According to it, the last two options allow the use of the same connection for both programming and serial communications. So one can use a property node, as indicated here, to control the value of the DTR pin.

    However, there are three options:
    1. Unknown
    2. Unasserted (high)
    3. Asserted (low)
    that provide the respective outputs:
    1. [nothing]
    2. +R-RESET!
    3. +R-
  • Bernardo Posts: 10
    edited 2014-02-11 09:25
    Apparently I'm not alone, though 9 years late... I've already tried delaying the data I send after opening the port in LabVIEW, by as much as 20 ms, but the results were the same.

    Would changing SerialPin to another value help? I mean, not using the debug channel?

    What bugs me is that the reset appears to happen only after SEROUT, so I removed the VISA Close. Nothing changed. Then I removed the VISA Read - and now it no longer resets!

    The disadvantage is that I have no feedback as to whether the message was received, and the only way I can ensure it was is to repeat the message enough times until the servo starts; then I can stop sending and the servo keeps moving.

    I'll try to find out why VISA Read causes the reset and report back. If I manage to receive messages without resetting the device, I'll post the solution and edit the top message with a link to that post in bold, as well as mark the thread 'solved'.

    As it stands, only the problem has been found, not the solution.

    Hopefully this will be helpful to somebody else, best regards,
    Bernardo
  • Chris Savage Parallax Engineering Posts: 14,406
    edited 2014-02-11 15:58
    It sounds like the DTR line is being toggled by the software. When this happens it will reset the BASIC Stamp Microcontroller. DTR is used for this purpose.
  • Bernardo Posts: 10
    edited 2014-02-12 03:15
    Yes, both VISA Read and VISA Close are somehow using the DTR line.

    I'm trying to find the setting to change this; meanwhile I can use the servo without feedback - I just have to send the message enough times to ensure it was received, and hope the drift isn't so great that it has to be corrected too often.

    Thanks for the help and feedback,
    Bernardo
  • Bernardo Posts: 10
    edited 2014-02-19 07:28
    Hi,

    It took me a while to come back to the issue after settling for brute-force instruction sending (send the message with no feedback). After all, I still had to configure the lock-in, the sourcemeter and the DAQ to work together.

    But today I returned to the issue and I think I solved it. What I did was add a pause in the BS2 after SERIN and SEROUT, and, on the LabVIEW side, wait for the whole response to arrive, read the number of bytes available, and only then read. I also flush the buffer before sending anything and after receiving anything.
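    On the BS2 side the change is essentially this (the PAUSE values here are just illustrative):
    Main:
      SERIN  SerialPin, Baud, 20, Refresh, [WAIT("+"), pulseWidth.HIGHBYTE, pulseWidth.LOWBYTE]
      PAUSE 10                                          ' pause after SERIN
      SEROUT SerialPin, Baud, ["+", DEC pulseWidth, "-"]
      PAUSE 10                                          ' pause after SEROUT before looping
    Refresh:
      PULSOUT ServoPin, pulseWidth
      PAUSE 5
      GOTO Main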

    Attached are the VIs and the BS2 code I'm using; for any questions related to them you can contact me at up200704656[at]fc.up.pt

    Thanks for all the insights (particularly the "Since we don't know how quickly the SERIN command will be serviced we need to make sure there's some overhead in delays"), best regards,
    Bernardo

    EDIT: the loop where 'Send Instruction' waits for the correct response should have a 'number of attempts' limit, so that if the device stops responding for any reason the VI doesn't get stuck there. Use with care and make the required changes.
  • Bernardo Posts: 10
    edited 2014-03-03 16:39
    Hi,

    Just a finishing note: if you feed the servo from the 9V battery (through one of the +5V Vdd pins on the board), the voltage sag when the load varies can also reset the board, as it is momentarily browned out.

    So find another power source for the servo, or try a large capacitor across the supply to ride out the current surges drawn by the servo.

    Best regards,
    Bernardo
  • Genetix Posts: 1,754
    edited 2014-03-03 23:21
    Mike Green has mentioned several times that a servo can draw up to 1 A, and that a 9V battery cannot continuously deliver that much current.

    The BOE-Bot uses a 4-AA battery holder, but Parallax also sells a 5-AA battery holder for use with rechargeable batteries.