
Stingray Raspberry Pi trying to see!

ratronic Posts: 1,451
edited 2014-09-09 13:16 in Robotics
I am working with a Stingray robot that has a Raspberry Pi bolted to it. The Pi serial port ttyAMA0 is connected to the Propeller to pass object tracking data from a USB webcam.

The serial tracking data is sent after every captured video frame from a Python program that is using OpenCV2 to mask a particular color (orange in this program).

Then it finds the x, y centroid of the largest detected orange object in the video frame and sends a flag of !&, followed by one byte each for the x and y screen locations, and then a word, z, indicating how large the object is, to the Propeller. This video only shows the Stingray's rear x/y camera platform trying to keep physical track of the detected object. I am attaching a test program called RaspiPropTest.spin that displays the tracking data x, y, z and the FPS (frames per second) in the Parallax serial terminal. The attached Python program called Detect.txt should be saved as Detect.py on the Pi.

With the Detect.py program I get ~12 to 13 FPS because I am displaying the small videos on the Pi's HDMI monitor. The Raspbian OS on the Pi causes the frame rate to fluctuate; for example, when you move the Pi's mouse the frame rate drops to around 7. The best I've been able to get is a little over 14 FPS with frames resized to 110 by 80.

The video has to play full screen to see the x/y camera platform on the back of the Stingray try to follow the orange ball on top of the Scribbler2. The ball falls off the Scribbler and the camera keeps track. In the end, though, as I am waving the ball around with a grabber, it loses track. I attached all of the programs I'm using, as I am just starting on this adventure. My current experiment with moving the whole Stingray to track the orange ball is the top Spin program in the zip file with the center method and its call removed and the blocked-out portion restored.

Sorry if you can't see the camera platform moving in the video.
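
For reference, the packet itself is tiny: the two flag characters !&, one byte each for x and y, then the z size sent as a low byte followed by a high byte. Here is a stripped-down sketch of just the sending side (not the full Detect.py; the port name is whichever one you are using, ttyAMA0 or ttyUSB0):

    import serial

    ser = serial.Serial('/dev/ttyAMA0', 115200)   # or '/dev/ttyUSB0'

    def send_track(x, y, z):
        # x: 0..109 column of the object centroid, y: 0..79 row, z: object area (word)
        packet = bytearray(b'!&')       # two-character flag
        packet.append(x & 0xFF)         # x as one byte
        packet.append(y & 0xFF)         # y as one byte
        packet.append(z & 0xFF)         # z low byte
        packet.append((z >> 8) & 0xFF)  # z high byte
        ser.write(packet)

    # example: object centered in the 110x80 frame, area 1234
    send_track(55, 40, 1234)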

Comments

  • RobotWorkshop Posts: 2,307
    edited 2014-06-28 19:32
    Nice job on this project! It is nice to see you have the two boards working together. Have you tried connecting the Propeller board to the Pi with the USB connection? You should be able to use that both for loading new Propeller code and for talking back and forth. I was going to add a small powered USB hub on my robot and go this route.

    Robert
  • ratronic Posts: 1,451
    edited 2014-06-28 19:47
    Thanks Robert. It took a bit of searching to find that method of detecting colors. As far as connecting to the Pi goes, you can choose either route. Using the serial port on the Pi does take a little extra setup at the start, so choose the easy way; that's my motto.
  • Bill Henning Posts: 6,445
    edited 2014-06-30 20:51
    Nice work Dave!

    I hope to add one of my Pi cameras to Elf soon, but work has kept me from it so far.

    Bill
  • ratronic Posts: 1,451
    edited 2014-07-13 16:20
    This video is a little better to see than the last one. It is still using the Detect.py from the first post. One thing I learned is that consistent lighting is a must. It also helps to have no color close to your set color in the picture other than your target object. I made the Spin program wait until it reacquires an object before it moves the AX-12s. The tracking is still slow.

    I have been trying to build a C++ program on the Pi with OpenCV, but with no luck so far. It will be interesting to see what Bill comes up with using the Raspberry Pi camera.
    Pub Main

      dyn.start(DYNA, 1_000_000)                'start the Dynamixel servo bus driver
      fds.start(RXPIN, TXPIN, 0, 115_200)       'serial link to the Pi
      initRobot

      repeat
        centerCamera

    Pub centerCamera                            'step the camera platform until the
                                                'detected object is centered
      x := dyn.getsrvpos(CAMX)                  'current pan position
      y := dyn.getsrvpos(CAMY)                  'current tilt position

      repeat                                    'pan axis

        objectDetect
        case xAxis
          0..49   : x += SKIP                   'object left of frame center
                    dyn.setsrvpos(CAMX, (x <#= 900))  'limit pan position to 900 max

          50..60  : xAxis := 55                 'object centered horizontally
                    quit

          61..109 : x -= SKIP                   'object right of frame center
                    dyn.setsrvpos(CAMX, (x #>= 400))  'limit pan position to 400 min

      repeat                                    'tilt axis

        objectDetect
        case yAxis
          0..34  : y += SKIP                    'object above frame center
                   dyn.setsrvpos(CAMY, (y <#= 900))

          35..45 : yAxis := 40                  'object centered vertically
                   quit

          46..79 : y -= SKIP                    'object below frame center
                   dyn.setsrvpos(CAMY, (y #>= 400))

    I have also attached a program called detexp.py that lets you tweak your detected color. It shows larger masked and real video pictures, with scrollbars added to adjust the low and high threshold values for H (Hue - HL, HH) and the lower threshold values for S (Saturation - SL) and V (Value - VL). You can get a little more info on the HSV colorspace here.

    When you have your color masked well, you can use those numbers in the Detect.py program to send your detected color object's coordinates and size to the Propeller.

    To use the Python programs, save the .txt files as .py files on the Pi. To use Python with OpenCV2 on the Pi, enter sudo apt-get update && sudo apt-get upgrade, then enter sudo apt-get install libopencv-dev python-opencv. Reboot and you can use these Python programs with a USB webcam. The Pi is not quick, so you have to be patient using the mouse to adjust the values.
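
    The idea behind detexp.py, stripped down to a rough sketch (this is not the attached program, just the same trackbar-plus-inRange approach with assumed window names):

    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)                      # USB webcam
    cv2.namedWindow('mask')
    for name, init, maxval in (('HL', 0, 179), ('HH', 179, 179),
                               ('SL', 0, 255), ('VL', 0, 255)):
        cv2.createTrackbar(name, 'mask', init, maxval, lambda v: None)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        lower = np.array([cv2.getTrackbarPos('HL', 'mask'),
                          cv2.getTrackbarPos('SL', 'mask'),
                          cv2.getTrackbarPos('VL', 'mask')])
        upper = np.array([cv2.getTrackbarPos('HH', 'mask'), 255, 255])
        mask = cv2.inRange(hsv, lower, upper)      # white where the color matches
        cv2.imshow('mask', mask)
        cv2.imshow('real', frame)
        if cv2.waitKey(1) & 0xFF == 27:            # Esc quits
            break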
  • NWCCTV Posts: 3,629
    edited 2014-07-13 19:54
    use these Python programs with a USB webcam
    But not the Pi Cam?
  • ratronic Posts: 1,451
    edited 2014-07-14 08:24
    Andy, I am just starting to figure this stuff out. Bill Henning says he is going to work on the Pi camera module for his system on RoboPi. This guy has the Raspberry Pi camera module working very well with his balancing robot here - http://letsmakerobots.com/node/38610 . It kind of makes me feel embarrassed about how far I've gotten so far.
  • ratronic Posts: 1,451
    edited 2014-07-31 13:29
    I have been experimenting with the Activity Board and a regular servo plugged into the P16 jack. The servo has the RPi camera module mounted to it (in a case) to pan the camera. The Activity Board needs external power for the servo. At the moment I am using SimpleIDE on the RPi to work on/load the Spin program via the Pi's USB port ttyUSB0, and the Spin program uses the same USB port to monitor the Pi's object detection coordinates. The Python program tst.txt should be saved as tst.py on the Pi. This program outputs around 2 - 3 updates a second. The detected object's x coordinate in the camera's video frame runs from 0 at the left edge to 109 at the right edge, with center = 55. This is very crude but works, and for someone with a pan/tilt servo setup it could be a start. The servo keeps track by keeping the object in the center of the x axis, so the camera follows the object.

    EDIT: The Python code shows small video pictures, with the real video picture having a small red box around the detected object. The other small video shows the masked objects in black and white - viewed on an HDMI video monitor.
    Con                  
                                                               
      _CLKMODE = XTAL1 + PLL16X                              
      _XINFREQ = 5_000_000
      
      RXPIN = 31  'prop USB to ttyUSB0 on Pi 
      TXPIN = 30  'prop USB to ttyUSB0 on Pi
      SERVO = 16  'prop output to camera x servo
    
    Var
    
      long xAxis, yAxis, zAxis, x, y
         
    Obj
    
      srv : "Servo32v7"  
      fds : "FullDuplexSerial"
                                
    Pub Main
    
      x := 1_500
      srv.start
      srv.set(SERVO, x)
      fds.start(RXPIN, TXPIN, 0, 115_200)
      
      repeat
        
        objectDetect
        case xAxis
          0..39   : x += 50                     'object well left of frame center - big step
                    srv.set(SERVO, x <#= 2_000) 'limit pulse width to 2_000 max

          40..49  : x += 2                      'object slightly left - small step
                    srv.set(SERVO, x <#= 2_000)

          61..70  : x -= 2                      'object slightly right - small step
                    srv.set(SERVO, x #>= 1_000) 'limit pulse width to 1_000 min

          71..109 : x -= 50                     'object well right of frame center - big step
                    srv.set(SERVO, x #>= 1_000)
        
    Pub objectDetect | zl, zu                   'object detect tracking info from pi  
                                                'xAxis = 0 to 109 horizontal lft/rgt  
      repeat                                    'yAxis = 0 to 79 vertical up/dwn      
        if fds.rxcheck == "!"                   'zAxis = 0 to word size sm/lg
          if fds.rxtime(10) == "&"                                                   
            xAxis := fds.rxtime(10)                                                  
            yAxis := fds.rxtime(10)                                                  
            zl := fds.rxtime(10)                                                     
            zu := fds.rxtime(10)                                                     
            zAxis := zl + zu << 8                                                     
            return
    
  • Rsadeika Posts: 3,837
    edited 2014-07-31 16:16
    It looks like you are coming along nicely with your project. Since I am considering using a USB webcam for my project, could you describe in some detail how you have your webcam set up and how you are using it? Also, could you tell me what brand your webcam is? I take it that you have your webcam plugged into the Raspberry Pi - so can you view streaming video from your webcam, and where do you view it?

    Ray
  • ratronic Posts: 1,451
    edited 2014-07-31 16:48
    Ray, the USB webcam is a Logitech C270 and plugs straight into the Pi. If you are using the latest distribution of Raspbian (2014-06-20), then the only thing you need to download to the Pi to use any of the Python programs I posted is the OpenCV library.

    To do that, at the command prompt in the Pi terminal type sudo apt-get install libopencv-dev python-opencv . That takes a few minutes to finish. So far I get better results using the USB webcam; I get 12 - 13 responses a second. As far as which port you want to send the serial data out of, this is the line in the Python code for sending out the USB port -
    ser = serial.Serial('/dev/ttyUSB0', 115200)
    

    This is the line for sending out of the Pi's built-in serial port -
    ser = serial.Serial('/dev/ttyAMA0', 115200)
    

    EDIT: I view it on an HDMI monitor. Also, I would try whatever webcam you have; it may work. The webcam I listed prints around 7 errors before it starts up. I have a Creative Live webcam that gives no errors.
  • ratronic Posts: 1,451
    edited 2014-08-06 11:06
    Here is a program, detexpicam.py, to adjust the threshold values for the Raspberry Pi camera module to tweak your color. It is similar to detexp.py in post #5 for the USB webcam, but this one is for the Raspberry Pi camera module as used in post #8. I'm posting some screenshots of the Pi desktop with video screens showing a green and an orange ball sitting on the floor, and how I tweaked the threshold values to detect the green or orange ball. There is also a shot of Detect.py running on the desktop to show how small the videos are.

    EDIT: This does not adjust any camera settings. It only adjusts the program for what the camera is sending, though the camera settings can be changed using the picamera module, which is used in this program.

    EDIT2: The picture of small videos is actually the Pi camera running tst.py from post #8, but Detect.py gives similar videos from USB webcams.
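
    For anyone wondering how the picamera module hands frames to OpenCV, this is roughly the pattern (only a sketch, not the attached detexpicam.py; the resolution here is just an assumption):

    import cv2
    import picamera
    import picamera.array

    with picamera.PiCamera() as camera:
        camera.resolution = (320, 240)
        with picamera.array.PiRGBArray(camera, size=(320, 240)) as raw:
            for frame in camera.capture_continuous(raw, format='bgr',
                                                   use_video_port=True):
                img = frame.array                    # numpy BGR image, ready for OpenCV
                hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
                # ...cv2.inRange() thresholding goes here, as in detexp.py...
                cv2.imshow('picam', img)
                if cv2.waitKey(1) & 0xFF == 27:
                    break
                raw.truncate(0)                      # clear the buffer for the next frame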
  • ratronic Posts: 1,451
    edited 2014-08-21 09:42
    I have found a raspicam driver here. I followed the instructions up through the part about loading the driver at bootup; I did not load the last 3 packages on that page. I did try them, but I did not need them for what I'm doing. After installing this driver, all of my programs above that use a USB webcam will use the Raspberry Pi camera module in its place, since the driver creates the device /dev/video0. The way I have it set now, the Propeller is reporting 29 - 30 object detection location/area packets per second, and there is no detectable latency - maybe a few milliseconds. It is lower resolution video than what I have been getting.

    I will post a video of the Stingray's arm vertically tracking/following an object when I get it tuned in a little better.
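
    The nice part is that nothing in the Python programs has to change - once the driver has created /dev/video0, the usual OpenCV capture call picks up whichever camera is behind it (a tiny sketch, assuming the default device index):

    import cv2

    # With the raspicam V4L2 driver loaded, the Pi camera module shows up as
    # /dev/video0, just like a USB webcam would.
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        print(frame.shape)    # rows, cols, channels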
  • ratronic Posts: 1,451
    edited 2014-08-22 13:00
    I was able to get the Stingray's arm to follow the object vertically and the whole robot to keep the detected object in the middle of its sight. The area returned changes depending on the light source and fluctuates too much, so I have been unable so far to use the camera for distance following like I did with a pair of ping sensors here.
    Con                        
                                                             
      _CLKMODE = XTAL1 + PLL16X                              
      _XINFREQ = 5_000_000
      
      DYNA  = 0           'dynamixel servo bus i/o
      CAMX  = 1           'dyna servo id x camera platform 
      CAMY  = 2           'dyna servo id y camera platform 
      MARM  = 3           'dyna servo id main arm
      LGRP  = 4           'dyna servo id left gripper 
      RGRP  = 5           'dyna servo id right gripper 
      RSRV  = 12          'right ping servo output                         
      LSRV  = 13          'left ping servo output                           
      RPNG  = 14          'right ping sensor i/o                            
      LPNG  = 15          'left ping sensor i/o                             
      PWMB  = 24          'stingray pwm base output pin
      WSPD  = 80          'stingray wheel speed limit (0-255)
        TX  = 30          'tracking transmit output (to Pi USB0)
        RX  = 31          'tracking receive input   (to Pi USB0)    
        
    Var
      
      long x, y, xAxis, yAxis, zAxis
            
    Obj
     
      fds    : "FullDuplexSerial"
      dyn    : "DynaComV4"
      srv    : "Servo32v7"
      pid[2] : "PID"
      pwm    : "PWMx8"         
          
    Pub Main | lo, ro
    
      fds.start(RX, TX, 0, 115_200) 
      dyn.start(DYNA, 1_000_000)
      dyn.setsrvspdall(300)
      pwm.start(PWMB, %0000_1111, 2_000) 
      srv.start
      initRobot
      pid[0].init(2.5, 0.0, 0.4, 55.0, 0.0, 0.0, 0.0)
      pid[1].init(2.5, 0.0, 0.4, 55.0, 0.0, 0.0, 0.0)
     
      fds.rxflush     
      repeat
        objectDetect                              'wait for a tracking packet from the Pi
        zAxis -= 50                               'object area from the Pi (not used further here)
        dyn.setsrvpos(MARM, 600 - yAxis << 2)     'arm follows the object vertically
        lo := -pid[0].calculate(xAxis)            'PID on x position; opposite-sign wheel
        ro := pid[1].calculate(xAxis)             'outputs turn the robot to keep it centered
        lo := -WSPD #> lo <# WSPD                 'clamp wheel speeds to +/-WSPD
        ro := -WSPD #> ro <# WSPD 
        moveRobot(lo, ro)
                   
    Pub initRobot                               'set initial robot servo positions
    
      srv.set(RSRV, 1500)      'right ping face forward
      srv.set(LSRV, 1000)      'left ping face forward
      dyn.setsrvpos(CAMX, 512) 'lower# = cw, higher = ccw 0-1023
      dyn.setsrvpos(CAMY, 490) 'lower# = down, higher# = up
      dyn.setsrvpos(MARM, 370) 'main arm level with ground
      dyn.setsrvpos(LGRP, 550) 'left gripper claw open
      dyn.setsrvpos(RGRP, 490) 'right gripper claw open
      waitcnt(clkfreq*2+cnt)   'delay for servo position change 
    
    Pub moveRobot(leftWheel, rightWheel)      'set the speed of stingray's left and right wheels, -255 max.reverse to 255 max.forward, 0=stop    
                                                                                              
      ifnot leftWheel & $8000_0000                                      
        pwm.duty(PWMB + 1, leftWheel & $ff)                            
        pwm.duty(PWMB, 0)                                                          
      else                                                                             
        pwm.duty(PWMB + 1, 0)                                                      
        pwm.duty(PWMB, (0 - leftWheel) & $ff)                        
                                                                                       
      ifnot rightWheel & $8000_0000                                 
        pwm.duty(PWMB + 2, rightWheel & $ff)                           
        pwm.duty(PWMB + 3, 0)                                                      
      else                                                                             
        pwm.duty(PWMB + 2, 0)                                                      
        pwm.duty(PWMB + 3, (0 - rightWheel) & $ff)                
      
    Pub objectDetect | zl, zu                   'object detect tracking info from pi
                                                'xAxis = 0 to 109 horizontal lft/rgt
      repeat                                    'yAxis = 0 to 79 vertical up/dwn
        if fds.rxcheck == "!"                   'zAxis = 0 to word size sm/lg
          if fds.rxtime(10) == "&"  
            xAxis := fds.rxtime(10)   
            yAxis := fds.rxtime(10)   
            zl := fds.rxtime(10)       
            zu := fds.rxtime(10)       
            zAxis := zu << 8 + zl        
            return
    
  • erco Posts: 20,256
    edited 2014-08-22 15:23
    Beautiful job Dave! Great tracking.
  • Whit Posts: 4,191
    edited 2014-08-22 19:40
    Amazing Dave! The best part is when you drop the ball and it goes rolling across the table and the bot stays right on it!

    Thanks for the update.
  • mindrobots Posts: 6,506
    edited 2014-08-23 05:24
    Impressive!

    How much our little friends can learn when we help them along!

    Thanks for sharing your project with us!
  • Bill Henning Posts: 6,445
    edited 2014-08-23 06:03
    Looks great!

    I am getting closer to working with the Pi camera module, my TODO list is starting to get under control.
  • ratronic Posts: 1,451
    edited 2014-08-23 08:51
    Thanks everybody. I do have to state that I'm a beginner at this stuff, so there is probably a better way to go about it. Whit, 'dropping the ball' was not planned, but I was surprised too that the robot kept track.

    This is all just playing around, but now that the Stingray has some vision I will have to figure out something to do with it!
  • ratronic Posts: 1,451
    edited 2014-08-27 10:17
    One last mention for this thread - if you have installed the raspicam driver described in post #12, then you can use this program to find the threshold values for your color and light conditions with either the Raspberry Pi camera module or a USB webcam. It picks the USB webcam if it is plugged in at powerup, or the Pi cam if the USB cam is unplugged. I also attached screenshots of both cameras. This program also shows the frame rate in the real video picture. The Pi cam has lower resolution with hardly detectable latency, and the USB webcam has great resolution with about a second of latency. The frame rates shown for both cameras are with a picture size of 320x240; smaller video sizes increase the frame rate. When I use 110x80 for object detection, the frame rate flickers between about 15 and 30 FPS while sending the detected object's serial info from the Pi cam module. The attached .txt is supposed to be saved as .py .

    Edit: Answered the next post in the thread listed.
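
    One way to get that kind of USB-first fallback is simply to try the preferred video device and fall back to the other one, roughly like this (a sketch only - the device indexes are assumptions, not necessarily how the attached program does it):

    import cv2

    def open_camera():
        # Try /dev/video1 first (often the USB webcam when both cameras are
        # present), then fall back to /dev/video0 (the Pi camera via the
        # raspicam V4L2 driver). Adjust the order for your setup.
        for index in (1, 0):
            cap = cv2.VideoCapture(index)
            if cap.isOpened():
                return cap
            cap.release()
        raise RuntimeError('no camera found')

    cap = open_camera()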
  • Whit Posts: 4,191
    edited 2014-08-29 11:13
    Dave,

    Could you take a look at this thread and give me some input? http://forums.parallax.com/showthread.php/157157-S2-ROS-ala-ChrisL8-and-Arlo-Brainstorming-Thread

    Since you are interfacing a Pi and Propeller, I thought you might have some advice. Are you communicating with the Pi via SSH? Any tips would be appreciated.
  • ratronic Posts: 1,451
    edited 2014-08-31 08:36
    Whit, here is a little more detail about my setup than I gave in the other thread. For that last video I posted, the Raspberry Pi B+ is set up to run the Python program det.py at bootup. As for the Spin program stngfllw.spin, I used SimpleIDE Version 0.9.64 on the Pi to load the program to EEPROM via USB. The R-Pi is powered from the Stingray's 5V on the P1 connector, so when the power switch on the Stingray is turned on both boards start; the Stingray finishes first and waits for data via USB from the Pi after it finishes booting.

    I have a wireless keyboard/mouse USB dongle and a wireless WiFi USB dongle, so I can work on the Pi directly with an HDMI monitor connected and/or over an SSH connection to my computer via WiFi. As I mentioned in the other thread about the WiFi dongle, you can also use it to connect and control the Pi from the internet. I have no idea how to do that at the moment. I will be happy to answer any other questions you might have (if I can!). Have lots of fun with your project, Whit!
  • Rsadeika Posts: 3,837
    edited 2014-08-31 10:20
    ...you can also use it to connect and control the Pi from the internet. I have no idea how
    to do that at the moment.
    Not sure if you are using Windows or Linux, but in Windows, at the Start prompt, type in 'mstsc', this is a Windows version of remote desktop, then you will be asked for the ip address. Of course you have to have done 'sudo apt-get install xrdp' on the Raspberry Pi and an ifconfig. If you are using a Linux desktop then just choose the remote desktop program. On my setup, this is my preferred way of programming the Activity Board using SimpleIDE.

    Ray
  • ratronic Posts: 1,451
    edited 2014-08-31 11:20
    Thanks so much for that, Ray! I thought the Pi might be too slow to do that. But now I have a graphical Pi desktop on my Windows computer that I can check video stuff with.
  • Rsadeika Posts: 3,837
    edited 2014-08-31 12:37
    I thought the Pi might be too slow to do that.
    I am also using a class 10 uSD card on my Pi B+, which makes a big difference. I have used some of the standard SD cards, and they seem to slow down the Pi, or at least it seems like it is not as responsive.

    Ray
  • ratronic Posts: 1,451
    edited 2014-08-31 14:28
    I only use class 10 cards with the Pi, and I should probably let this thread know that my Pi is always clocked at 800 MHz instead of the default 700 MHz.
  • ValeT Posts: 308
    edited 2014-08-31 15:00
    By any chance do you know if I can communicate with the Activity Board using Java?

    I am trying to use Java to communicate with the Activity Board and can't seem to get it to work. Since you were able to get it to work on a RPi with Python, do you think it is possible to do it with Java?
  • ratronic Posts: 1,451
    edited 2014-08-31 15:02
    I have no Java knowledge; maybe someone else will step in.
  • Whit Posts: 4,191
    edited 2014-09-01 05:54
    Rsadeika wrote: »
    Not sure if you are using Windows or Linux, but in Windows, at the Start prompt, type in 'mstsc', this is a Windows version of remote desktop, then you will be asked for the ip address. Of course you have to have done 'sudo apt-get install xrdp' on the Raspberry Pi and an ifconfig.

    Ray - Holy smokes! This is awesome. CAN'T wait to try this. Why haven't I found this info anywhere else? I was going to try TightVNC, but there were so many security warnings that I got a little scared to try it. Thanks so much for this tip!

    By the way, a friend of mine recommended the class 10 uSD cards when I was starting with my Pi. It is all I have and my Pi is very responsive.

    Whit+
  • ratronic Posts: 1,451
    edited 2014-09-09 13:16
    I have figured out that using the command sudo apt-get install python-opencv only gets OpenCV 2.4.3 which is the version I used for everything above.

    The 2.4.3 version only outputs 64x64 pixel frames using the Pi camera module. I have since installed version 2.4.9 (latest working version) which I actually built on the Pi.

    With 2.4.9 you are no longer restricted to 64x64 frames (I am using 640x480). Even though I only resize to 110x80 for object detection the video is better looking now using 2.4.9. FYI.
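
    If you want to check which OpenCV version you actually ended up with, and ask the capture for the bigger frames, something like this works from Python (the 640x480 request is just an example; OpenCV 2.4 keeps the property names under cv2.cv):

    import cv2

    print(cv2.__version__)                          # e.g. '2.4.9'

    cap = cv2.VideoCapture(0)
    # Request a 640x480 frame (OpenCV 2.4 property names live under cv2.cv).
    cap.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH, 640)
    cap.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT, 480)
    ok, frame = cap.read()
    if ok:
        print(frame.shape)                          # rows, cols, channels actually delivered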