Stingray Raspberry Pi trying to see!
I am working with a Stingray robot that has a Raspberry Pi bolted to it. The Pi's serial port (ttyAMA0) is connected to the Propeller to pass object-tracking data from a USB webcam.
The serial tracking data is sent after every captured video frame from a Python program that uses OpenCV2 to mask a particular color (orange in this program). It then finds the x, y centroid of the largest detected orange object in the frame and sends the Propeller a flag of !&, then a byte each for the x and y screen locations, then a word z giving how large the object is.

This video only shows the Stingray's rear x/y camera platform trying to keep physical track of the detected object. I am attaching a test program called RaspiPropTest.spin that displays the tracking data x, y, z and the FPS (frames per second) in the Parallax Serial Terminal. The attached Python program Detect.txt should be saved as Detect.py on the Pi. With Detect.py I get about 12 to 13 FPS because I am displaying the small videos on the Pi's HDMI monitor. The Raspbian OS causes the frame rate to fluctuate; for example, when you move the Pi's mouse the frame rate drops to around 7. The best I've been able to get is a little over 14 FPS with frames resized to 110 by 80.

The video has to play full screen to see the x/y camera platform on the back of the Stingray try to follow the orange ball on top of the Scribbler 2. The ball falls off the Scribbler and the camera keeps tracking. In the end, though, as I wave the ball around with a grabber, it loses track. I attached all of the programs I'm using, as I am just starting on this adventure. If you remove the center method and its call from the top Spin program in the zip file and restore the commented-out portion, you get my current experiment with moving the whole Stingray to track the orange ball. Sorry if you can't see the camera platform moving in the video.
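For anyone who wants to see the shape of the loop before opening the attachments, here is a minimal sketch of the detect-and-send side. It assumes pyserial, and the HSV range, baud rate, and little-endian word order are placeholders - check them against the attached Detect.py:

    import cv2
    import numpy as np
    import serial
    import struct

    ser = serial.Serial('/dev/ttyAMA0', 115200)   # Pi serial port to the Propeller
    cap = cv2.VideoCapture(0)                     # USB webcam

    lower = np.array([5, 100, 100])    # orange-ish HSV range - tune with detexp.py
    upper = np.array([15, 255, 255])

    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        frame = cv2.resize(frame, (110, 80))      # small frames keep the FPS up
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lower, upper)
        # OpenCV 2.4 returns (contours, hierarchy)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            biggest = max(contours, key=cv2.contourArea)
            m = cv2.moments(biggest)
            if m['m00'] > 0:
                x = int(m['m10'] / m['m00'])      # centroid x, 0..109
                y = int(m['m01'] / m['m00'])      # centroid y, 0..79
                z = min(int(m['m00']), 65535)     # object size as a 16-bit word
                ser.write(b'!&' + struct.pack('<BBH', x, y, z))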
Comments
Robert
I hope to add one of my Pi cameras to Elf soon, but work has kept me from it so far.
Bill
color close to your set color in the picture other than your color object. I made the Spin program wait until it reacquires an object before it moves the AX-12s. Tracking is still slow.
I have been trying to build a C++ program on the Pi with OpenCV but with no luck so far. It will be interesting to see what Bill comes up with using the Raspberry Pi camera.
I have also attached a program called detexp.py for tweaking your detected color. It shows larger masked and real video pictures, with scrollbars added to adjust the low and high threshold values for H (Hue - HL, HH) and the lower threshold values for S (Saturation - SL) and V (Value - VL). You can get a little more info on the HSV colorspace here. When you have your color masked well, you can plug the numbers into the Detect.py program to send your detected object's coordinates and size to the Propeller.
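If you want to roll your own tuner, the heart of this kind of adjustment is just OpenCV trackbars feeding cv2.inRange. A bare-bones sketch (the starting values are arbitrary):

    import cv2
    import numpy as np

    cv2.namedWindow('mask')
    nop = lambda x: None
    # HL/HH = hue low/high (0..179 in OpenCV), SL/VL = sat/value lower thresholds
    cv2.createTrackbar('HL', 'mask', 5, 179, nop)
    cv2.createTrackbar('HH', 'mask', 15, 179, nop)
    cv2.createTrackbar('SL', 'mask', 100, 255, nop)
    cv2.createTrackbar('VL', 'mask', 100, 255, nop)

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        lo = np.array([cv2.getTrackbarPos('HL', 'mask'),
                       cv2.getTrackbarPos('SL', 'mask'),
                       cv2.getTrackbarPos('VL', 'mask')])
        hi = np.array([cv2.getTrackbarPos('HH', 'mask'), 255, 255])
        cv2.imshow('mask', cv2.inRange(hsv, lo, hi))
        cv2.imshow('video', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):   # press q to quit
            break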
To use the Python programs, save the .txt files as .py files on the Pi. To use Python with OpenCV2 on the Pi, enter sudo apt-get update && sudo apt-get upgrade, then enter sudo apt-get install libopencv-dev python-opencv. Reboot and you can use these Python programs with a USB webcam. The Pi is not quick, so you have to be patient using the mouse to adjust the values.
well with his balancing robot here - http://letsmakerobots.com/node/38610. Kind of makes me feel embarrassed about how far I've gotten so far.
The Activity Board needs external power for the servo. At the moment I am using SimpleIDE on the R-Pi to work on and load the Spin program via the Pi's USB port ttyUSB0, and the Spin program uses the same USB port to monitor the Pi's object-detection coordinates. The Python program tst.txt should be saved as tst.py on the Pi. This program outputs around 2 - 3 updates a second. The detected object's x coordinate runs from 0 at the left edge of the frame to 109 at the right edge, with center = 55. This is very crude but works, and for someone with a pan/tilt servo setup it could be a start. The servo tracks by keeping the object centered on the x axis so that the camera follows the object.
EDIT: The Python code shows small video pictures, with the real video picture having a small red box around the detected object. The other small video shows the masked objects in black and white. I am using an HDMI video monitor.
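The centering idea is simple enough to show in a few lines. The real logic lives in the Spin program, but sketched in Python it amounts to something like this (the dead-band width here is just a guess):

    CENTER = 55        # middle of the 0..109 x range
    DEADBAND = 5       # don't chase tiny wobbles

    def pan_step(x, servo_pos):
        # Nudge the pan servo one step toward the object whenever it
        # drifts outside the dead band around center.
        if x < CENTER - DEADBAND:
            servo_pos -= 1
        elif x > CENTER + DEADBAND:
            servo_pos += 1
        return servo_pos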
Ray
To do that, at the command prompt in the Pi terminal type sudo apt-get install libopencv-dev python-opencv. That takes a few minutes to finish. So far I get better results using the USB webcam; I get 12 - 13 responses a second. As for which port you send the serial data out of, the Python code has one line for sending out over USB and one line for sending out the Pi's built-in serial port.
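With pyserial the two variants look something like this (the baud rate here is just an example - match whatever the Spin side expects):

    import serial
    # USB serial to the Propeller board:
    ser = serial.Serial('/dev/ttyUSB0', 115200)
    # Or the Pi's built-in serial port header pins:
    ser = serial.Serial('/dev/ttyAMA0', 115200)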
EDIT: I view it on an HDMI monitor. Also, I would try whatever webcam you have; it may work. The webcam I listed puts out around 7 errors before it starts up. I have a Creative Live! webcam that gives no errors.
but this is for the Raspberry Pi camera module as used in post #8. I'm posting some screen shots of the Pi desktop with video screens showing a green and an orange ball sitting on the floor, and how I tweaked the threshold values to detect the green or orange ball. There is also a shot of Detect.py running on the desktop to show how small the videos are.
EDIT: This does not adjust any camera settings; it only adjusts the program for what the camera is sending. The camera settings themselves can be changed using the picamera module, which is used in this program.
EDIT2: The pix of small videos is really the Pi camera running tst.py from post #8, but Detect.py gives similar videos from USB webcams.
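For the curious, a few of the knobs the picamera module exposes look like this (the values here are only examples):

    import picamera

    camera = picamera.PiCamera()
    camera.resolution = (320, 240)
    camera.brightness = 55          # 0..100
    camera.awb_mode = 'auto'        # white balance: 'auto', 'fluorescent', ...
    camera.exposure_mode = 'auto'   # or 'night', 'backlight', ...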
I did try them but did not need them for what I'm doing. After installing this driver, all my programs above that use a USB webcam will use the Raspberry Pi camera module in its place, because the driver creates the device /dev/video0. The way I have it set now, the Propeller is reporting 29 - 30 object detection location/area packets per second, and there is no detectable latency - maybe a few milliseconds. It is lower resolution video than what I have been getting. I will post a video of the Stingray's arm vertically tracking/following an object when I get it tuned in a little better.
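That is what makes the driver so convenient: once /dev/video0 exists, the same OpenCV capture code runs unchanged:

    import cv2
    cap = cv2.VideoCapture(0)    # now the Pi camera module via /dev/video0
    ok, frame = cap.read()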
on the light source and fluctuates too much, so I have been unable to use the camera so far for distance following like I did with a pair of Ping sensors here.
Thanks for the update.
How much our little friends can learn when we help them along!
Thanks for sharing your project with us!
I am getting closer to working with the Pi camera module, my TODO list is starting to get under control.
This is all just playing around but now that Stingray has some vision I will have to figure out something to do with that!
light conditions with the Raspberry Pi camera module or a USB webcam. It picks the USB webcam if it is plugged in at powerup, or the Pi cam if the USB cam is unplugged. I also attached screen shots of both cameras. This program also shows the frame rate in the real video picture. The Pi cam has lower resolution with hardly detectable latency, and the USB webcam has great resolution with about a second of latency. The frame rates shown for both cameras are at a picture size of 320x240; smaller video sizes increase the frame rate. When I use 110x80 for object detection, the frame rate flickers between about 15 and 30 FPS while sending serial detected-object info from the Pi cam module. The .txt is supposed to be .py.
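The pick-the-camera-at-powerup behavior can be done with a fallback roughly like this sketch (the device indices are assumptions - check which /dev/video* node each camera gets on your setup):

    import cv2

    cap = cv2.VideoCapture(0)         # USB webcam usually takes /dev/video0
    ok = cap.isOpened() and cap.read()[0]
    if not ok:
        cap.release()
        cap = cv2.VideoCapture(1)     # fall back to the Pi cam's device node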
Edit: Answered next post in the thread listed.
Could you take a look at this thread and give me some input? http://forums.parallax.com/showthread.php/157157-S2-ROS-ala-ChrisL8-and-Arlo-Brainstorming-Thread
Since you are interfacing a Pi and Propeller, thought you might have some advice...Are you communicating with the Pi via SSH? Any tips would be appreciated.
As for the Spin program stngfllw.spin, I used SimpleIDE Version 0.9.64 on the Pi to load the program to EEPROM via USB. The R-Pi is powered from the Stingray's 5v on the P1 connector. So when the power switch on the Stingray is turned on, both boards start; the Stingray finishes first and waits for data via USB from the Pi after it finishes booting.
I have a wireless keyboard/mouse USB dongle and a WiFi USB dongle, so I can work on the Pi directly with an HDMI monitor connected and/or over an SSH connection from my computer via WiFi. As I mentioned in the other thread, you can also use the WiFi dongle to connect to and control the Pi from the internet, though I have no idea how to do that at the moment. I will be happy to answer any other questions you might have (if I can!). Have lots of fun with your project, Whit!
Ray
Ray - I am trying to use Java to communicate with the Activity Board, and I can't seem to get it to work. Since you were able to get it to work on an R-Pi with Python, do you think it is possible to do it with Java?
Ray - Holy smokes! This is awesome. CAN'T wait to try this. Why haven't I found this info anywhere else? I was going to try TightVNC, but there were so many security warnings that I got a little scared to try it. Thanks so much for this tip!
By the way, a friend of mine recommended the class 10 uSD cards when I was starting with my Pi. It is all I have and my Pi is very responsive.
Whit+
The 2.4.3 version only outputs 64x64-pixel frames using the Pi camera module. I have since installed version 2.4.9 (the latest working version), which I actually built on the Pi. With 2.4.9 you are no longer restricted to 64x64 frames (I am using 640x480). Even though I only resize to 110x80 for object detection, the video looks better now using 2.4.9. FYI.
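You can check which build Python sees with:

    python -c "import cv2; print cv2.__version__"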