RoboPi+Create - Page 3 — Parallax Forums

RoboPi+Create


Comments

  • Rsadeika Posts: 3,837
    edited 2014-07-23 06:25
    I thought I would update my progress so far. I have created a small C program for the Propeller side of this:
    /*
      CXBase.c
     
    */
    #include "simpletools.h"
    #include "simpletext.h"
    #include "fdserial.h"
    
    serial *term;
    serial *create;
    
    void menu(void);
    void CsafeM(void);
    void On_LED(void);
    void Off_LED(void);
    
    int main()
    {
      // Add startup code here.
    /*                       Rx Tx Mode BAUD*/
      create = fdserial_open(8, 9,  0,  57600);
      pause(1000);
      term = fdserial_open(31,30,0,115200);
      pause(250);
    
      char inBuff[40];
     
      while(1)
      {
        // Add main loop code here.
        //print("> ");
        writeStr(term,"> ");
        //gets(inBuff);
        readStr(term,inBuff,40);
        if(!strcmp(inBuff,"help")) menu();
        else if(!strcmp(inBuff,"safem")) CsafeM();
        else if(!strcmp(inBuff,"onled")) On_LED();
        else if(!strcmp(inBuff,"offled")) Off_LED();
        else
        {
          //print("Invalid Command\n");
          writeStr(term,"Invalid Command\n");
        }
      }  
    }
    
    void menu()
    {
      //print("Menu - \n");
      writeStr(term,"Menu - help\n");
    }
    
    void CsafeM()
    {
      fdserial_txChar(create,128);  // Create Open Interface opcode 128: Start
      pause(250);
      fdserial_txChar(create,131);  // opcode 131: Safe mode
    }
    
    void On_LED()
    {
      high(26);
    }
    
    void Off_LED()
    {
      low(26);
    }
    
    This is just to test the communication process between the Activity Board and the RPi. The one thing I did notice is that when I had 'term = serial_open(...)', I was not getting any communication between the two units; only when I changed it to fdserial_open did things start to work. I wonder what is going on with that?

    On the RPi side, I am using:
    #!/usr/bin/env python3
    # pyCXbase.py
    
    from tkinter import *
    import tkinter as ttk  # note: this aliases plain tkinter, not the themed tkinter.ttk module
    import serial
    import threading
    import queue
    import time
    import datetime
    
    ser = serial.Serial(port='/dev/ttyUSB0',baudrate=115200)
    
    def goodbye(*args):
    	ser.close()
    	quit()
    def on_led(*args):
    	out_data = b'onled\n'
    	ser.write(out_data)
    def off_led(*args):
    	out_data = b'offled\n'
    	ser.write(out_data)
    
    class SerialThread(threading.Thread):
    	def __init__(self, queue):
    		threading.Thread.__init__(self)
    		self.queue = queue
    	def run(self):
    		while True:
    			if ser.inWaiting():
    				text = ser.readline(ser.inWaiting())
    				print(text)
    
    class App(ttk.Tk):
    	def __init__(self):
    		ttk.Tk.__init__(self)
    		self.geometry("400x300")
    		self.wm_title("CreateBot Control Program")
    		ttk.Button(self,text="Quit", command=goodbye).grid(column=1,row=1)
    		ttk.Button(self,text="On LED", command=on_led).grid(column=1,row=6)
    		ttk.Button(self,text="Off LED", command=off_led).grid(column=3,row=6)
    
    		thread = SerialThread(self)
    		thread.start()
    
    app = App()
    app.mainloop()
    
    This is a quick and dirty tkinter GUI program; so far it is working as expected. I have also installed, on the RPi, a wx development package, with which I will now start developing the GUI that I will really be using. A steep learning curve on the wx stuff, not as simple as doing it with the Windows Visual xxx software.
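    For reference, below is a minimal sketch of the queue-based pattern that the SerialThread class hints at: the reader thread pushes incoming lines onto a Queue and the Tk mainloop polls the queue with after(). This is only an illustration of the idea (the poll_serial name and the 100 ms poll interval are mine), not something I have tested on the robot yet.

    #!/usr/bin/env python3
    # Sketch: hand serial data from a reader thread to tkinter through a Queue.
    import queue
    import threading
    import tkinter as tk

    import serial

    ser = serial.Serial(port='/dev/ttyUSB0', baudrate=115200, timeout=1)

    class SerialThread(threading.Thread):
        def __init__(self, rx_queue):
            threading.Thread.__init__(self, daemon=True)
            self.rx_queue = rx_queue

        def run(self):
            while True:
                line = ser.readline()  # returns b'' after the 1 s timeout
                if line:
                    # Decode here so the GUI thread only ever sees strings.
                    self.rx_queue.put(line.decode('ascii', 'replace'))

    class App(tk.Tk):
        def __init__(self):
            tk.Tk.__init__(self)
            self.rx_queue = queue.Queue()
            self.text = tk.Text(self, height=10, width=48)
            self.text.grid(column=1, row=1)
            SerialThread(self.rx_queue).start()
            self.poll_serial()

        def poll_serial(self):
            # Drain whatever the reader thread queued, then re-schedule.
            while not self.rx_queue.empty():
                self.text.insert(tk.END, self.rx_queue.get())
            self.after(100, self.poll_serial)

    app = App()
    app.mainloop()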

    Ray
  • Rsadeika Posts: 3,837
    edited 2014-07-28 06:55
    I was doing some more testing this weekend, and I noticed that the combination of the RPi and the Activity Board really draws a lot of power from my power bank. I wonder if the same is occurring with the RPi and RoboPi board?

    I was also trying to find a good video streaming program for the Pi camera module; there is not much out there. Since most of the examples that I have seen are browser based, I am now considering making the Propeller program XBee based, for direct control of the robot chassis, and having the video streaming handled by the RPi as a job by itself. I am not sure how complicated that would become, because I have not found a good functioning video streaming program. The combination that I see is a browser window that would display the live video and a separate terminal GUI window which would control the robot chassis. I am not a good enough programmer to come up with a single program to do both. So, if that works, then the Robot Control Board would also have to have an XBee connector.

    Ray
  • Rsadeika Posts: 3,837
    edited 2014-07-29 13:16
    The last couple of days I have been testing, for my purposes, the capabilities of the pi camera module for use in video streaming. My testing shows that it is not up to the job! The still picture part of the camera is very good and I will probably use that for different things.

    So now I probably need a wireless webcam that would fit nicely on the robot chassis, but as always the power supply source will be an issue. If anybody has some good suggestions or a lead, I will check it out. This is a totally new area for me; I have never worked with a webcam before, so it will probably be a steep learning curve for me, and hopefully it will not take too much time to learn how to work it.

    Ray
  • Bill Henning Posts: 6,445
    edited 2014-07-29 13:30
    If you want to save power, use a Model A, or a new Model B+ ... I just use a larger power bank :)
    Rsadeika wrote: »
    I was doing some more testing this weekend, and I noticed that the combination of the RPi and the Activity Board really draws a lot of power from my power bank. I wonder if the same is occurring with the RPi and RoboPi board?

    I was also trying to find a good video streaming program for the Pi camera module; there is not much out there. Since most of the examples that I have seen are browser based, I am now considering making the Propeller program XBee based, for direct control of the robot chassis, and having the video streaming handled by the RPi as a job by itself. I am not sure how complicated that would become, because I have not found a good functioning video streaming program. The combination that I see is a browser window that would display the live video and a separate terminal GUI window which would control the robot chassis. I am not a good enough programmer to come up with a single program to do both. So, if that works, then the Robot Control Board would also have to have an XBee connector.

    Ray
  • Bill Henning Posts: 6,445
    edited 2014-07-29 13:31
    I'd suggest reducing the resolution to VGA (640x480), or use a USB web cam.
    Rsadeika wrote: »
    The last couple of days I have been testing, for my purposes, the capabilities of the pi camera module for use in video streaming. My testing shows that it is not up to the job! The still picture part of the camera is very good and I will probably use that for different things.

    So now I probably need a wireless webcam that would fit nicely on the robot chassis, but as always the power supply source will be an issue. If anybody has some good suggestions or a lead, I will check it out. This is a totally new area for me; I have never worked with a webcam before, so it will probably be a steep learning curve for me, and hopefully it will not take too much time to learn how to work it.

    Ray
  • Rsadeika Posts: 3,837
    edited 2014-07-29 13:59
    If you want to save power, use a Model A, or a new Model B+ ... I just use a larger power bank.
    I am already using a Model B+, plus the power bank is 10 Ah. How much larger of a power bank can you get before it starts looking like a 12 V SLA 9 Ah battery?
    I'd suggest reducing the resolution to VGA (640x480), or use a USB web cam.
    I have been to some websites that describe using a USB web cam with the RPi, and it did not sound pretty, plus it might not do what I want it to do. I am thinking that the response time between the video stream capture and what you see on your screen may be a critical point; I do not want the robot responding after a few seconds of elapsed time.

    Ray
  • Bill Henning Posts: 6,445
    edited 2014-07-29 14:17
    10 Ah should be plenty - how much run time are you getting? Could the RPi cam increase power usage that much? (it's possible)

    I'd think that the power draw for the Activity Board would be similar to RoboPi's power draw.

    I am getting 5h+ for RPi, RoboPi, WiFi stick from a 5400mAh pack - much more life than the batteries that power the motors on the bots.

    Gotcha, you want low lag. I've seen command line switches to run the Pi camera at a lower resolution, perhaps that will help.

    I should be experimenting with the Pi camera soon, I'll post my results. I probably won't have time to play with my Create for a few weeks :(
    Rsadeika wrote: »
    I am already using a Model B+, plus the power bank is 10 Ah. How much larger of a power bank can you get before it starts looking like a 12 V SLA 9 Ah battery?


    I have been to some websites that describe using a USB web cam with the RPi, and it did not sound pretty, plus it might not do what I want it to do. I am thinking that the response time between the video stream capture and what you see on your screen may be a critical point; I do not want the robot responding after a few seconds of elapsed time.

    Ray
  • Rsadeika Posts: 3,837
    edited 2014-07-30 06:39
    I have been doing some more testing with the pi camera streaming; on the RPi, there is a problem with the VLC media player for viewing the playback. When I view the xxx.h264 on my Kubuntu desktop, using VLC, I get a correct representation of the captured stream. I just did a test with stream capture with me walking into the camera view and back out; the lag is very bad, and it looks like I was walking on the moon, or in some very low gravity conditions. I tried two different frames-per-second settings, 35 and 90, which did not make any noticeable difference.

    The other part of the problem is getting the RPi VLC to work using: | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/}' :demux=h264
    The RPi complains that there is something wrong with that command. Of all the examples that I have seen, everybody uses that command and reports that it is working for them; well, it does not work for me. I have come to the conclusion that something has changed with the latest VLC that I installed, and it is not obvious what that could be. The docs are very terse and not clear as to the correct way of using the command-line options to make it work.

    I was hoping that I could set it up so I could test this on the robot, just to see how bad the lag is in real time, but I cannot get it to stream to my Kubuntu desktop. Not sure what the next step with this is...
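    One more thing I may try, instead of fighting VLC's RTSP output, is the plain socket streaming recipe from the picamera documentation: record H.264 straight into a network connection at a lower resolution and read the raw stream with a player on the desktop. Below is a rough sketch of the Pi side (this assumes the picamera Python module is installed; port 8000 and the 640x480 / 24 fps settings are just where I would start):

    #!/usr/bin/env python3
    # Sketch: stream raw H.264 from the Pi camera over a TCP socket.
    import socket

    import picamera

    with picamera.PiCamera() as camera:
        camera.resolution = (640, 480)  # lower resolution to keep the lag down
        camera.framerate = 24

        server = socket.socket()
        server.bind(('0.0.0.0', 8000))
        server.listen(1)

        # Wait for the desktop player to connect, then pour H.264 into the socket.
        connection = server.accept()[0].makefile('wb')
        try:
            camera.start_recording(connection, format='h264')
            camera.wait_recording(60)  # stream for 60 seconds
            camera.stop_recording()
        finally:
            connection.close()
            server.close()

    The picamera docs show playing the other end by piping the socket into mplayer or VLC; I have not verified yet whether the lag is any better this way than with the cvlc pipeline above.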

    Ray
  • ratronic Posts: 1,451
    edited 2014-07-30 08:35
    Ray, I have been trying to get the RPi camera streaming video. Not too much luck so far. I do have a Python program giving me around 2 - 3 frames per second, sending object-detection information from the RPi camera to the Propeller's serial port. I think I will have to wait for the more knowledgeable people to get into this.
  • Rsadeika Posts: 3,837
    edited 2014-08-03 05:02
    I have finally reached a point where I have decided to put the video streaming process on hold. At this point there is no viable solution that I like, so I will not pursue this any further for now. Hopefully the Raspberry Pi community will come up with a satisfactory compromise that I would be able to add to my robot project.

    What I have going now is XBee-centric manual control of the robot. On my Kubuntu desktop I have a Python GUI program that uses the XBee to communicate with the Activity Board that controls the robot. With the XBee models that I am using I have a range of 60 to maybe 100 feet, which means I do not have to worry about loss of communication like I would with WiFi. I am also working on having two major operations at this time: the manual control and a roaming-centric control.

    The manual control is very straightforward in its approach; it is the roaming-centric control that is giving me trouble. Should I make the RPi the main unit of control, or do I make the Propeller the control unit? Of course, with the Propeller there is that fear of running out of resources very quickly, which would include available memory and COGs. I have not made a decision yet; I am sort of stuck at this point.

    Ray
  • Rsadeika Posts: 3,837
    edited 2014-08-10 03:08
    After a little distraction, something called a Be MICRO CV FPGA (still have not received the thing yet), I did make a little progress with the robot. I am probably going with an Activity Board-centric implementation. Also, before I try to implement the roaming feature, I have decided to add a crude interpreter. This thing will basically take a script file, which will have commands, interpret it, and have the robot respond accordingly. This is mainly an adaptation for this robot, since I already worked out the code on my ActivityBot many months ago.

    What I have now is, on my desktop, a simple Python GUI robot control program. The robot control program has an interpreter, so after putting together a script file, I will be able to control the robot, or at least move it, to a specific location. Next I will need to do some repeatability tests: after programming a course, run it a few times to see if it always goes to, or at least ends up at, the same place. Now to refine the interpreter segment of code...

    Ray
  • Rsadeika Posts: 3,837
    edited 2014-08-26 03:15
    After viewing the chrisL8 building ARLO video, I am considering a direction change as it applies to my robot chassis. I was really impressed with what the Kinect and ROS would add to my Create robot, especially if I were to add the picamera to the mix. I priced the Kinect, and that is not very cheap, plus I would have to get a netbook on the bot. Doable, I think, but that means going vertical, and I am not sure how much weight the chassis can support.

    This would be one heck of a project, if I were to change direction, but it would end up being one very functionally capable robot. I am giving this a lot of thought. Not sure what I would have to hear to put me over the edge.

    Ray
  • Whit Posts: 4,191
    edited 2014-08-29 11:34
    Ray, Dave and Bill,

    Would you guys mind checking in on this thread - http://forums.parallax.com/showthread.php/157157-S2-ROS-ala-ChrisL8-and-Arlo-Brainstorming-Thread (Ray's already found it)? There is much in common between the work you guys are already doing and Ray's last post here.

    I could do this with Arlo, but was thinking of shooting for an even lower cost alternative using the S2 (which I love and am very familiar with).

    Would love to hear warnings, suggestions, possible new ideas - since y'all are a good bit ahead of me - especially as regards the Pi. I just got a B+ about a month ago and have been learning and studying Bill's stuff a bit.
  • Rsadeika Posts: 3,837
    edited 2014-09-02 04:39
    Just an update of what I am doing. Below is a very, very simple Python serial terminal program, which I am using on my Windows 7 box. It is simple enough that I type in a known command and my robot responds accordingly. I have decided to give some time to a command-line program to control the robot; the Python GUI program works, but I have come up against a wall in terms of expanding it, mainly trying to figure out how to use the other widgets like Entry, Combobox, ..., etc.

    Since I am not a Python expert, I wonder if somebody can point out why I am not able to display incoming strings? In the Python GUI program I am able to successfully run some script files, so I will be attempting the same thing for this command-line version. Since the Activity Board C program is now at ~9000 bytes in length, I have plenty of room for expansion on the Activity Board. I will have to refresh my memory as to how to start a new COG, which will be handling a Ping sensor or more, and maybe the compass module.

    In summary, the program below works like a serial terminal program, except for the part where it is not displaying incoming data. Running into a lot of excepts lately.

    Ray
    #!/usr/bin/env python3
    # Command line serial terminal
    
    import time
    import serial
    
    # Connect to my XBee module on COM3:
    ser = serial.Serial('COM3:',baudrate=9600)
    
    
    print("Hello")
    instr = 1
    
    while 1:
    	instr = input("> ")
    	if instr == 'quit':
    		ser.close()
    		exit()
    	else:
    		ser.write(bytearray(instr,'ascii'))
    		ser.write(b'\r')
    		out = ''
    
    		while ser.inWaiting() > 0:
    			out = ser.readline()
    			time.sleep(.25)
    			if out != b' ':  # This part is not working as expected
    				print("> ")
    				print(out)
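    What I think I need to try next for the display problem is giving the port a read timeout and just printing whatever comes back, along the lines of the sketch below (untested; the 0.2 second timeout and the port name without the trailing colon are guesses on my part):

    #!/usr/bin/env python3
    # Sketch: command-line terminal with a read timeout so replies get printed.
    import serial

    ser = serial.Serial('COM3', baudrate=9600, timeout=0.2)  # timeout so readline() returns

    while True:
        instr = input("> ")
        if instr == 'quit':
            ser.close()
            break
        ser.write(bytearray(instr, 'ascii'))
        ser.write(b'\r')

        # readline() now returns after a newline or after the 0.2 s timeout,
        # so an empty reply just falls through instead of hanging the loop.
        reply = ser.readline()
        while reply:
            print(":", reply.decode('ascii', 'replace').rstrip())
            reply = ser.readline()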
    
  • Rsadeika Posts: 3,837
    edited 2014-09-06 05:14
    I added some stuff to the Python terminal program; it now has a script function, so you can open up a txt file with your script commands. The method is one command per line, terminated with a CR, and the last line of the file is '::'. The Python program opens up the designated txt file and starts to read it one line at a time until it reaches a '::', then it closes the file and continues with the program.
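    As an illustration, a script file can be as simple as the lines below (these particular commands are just the ones from my earlier test program; the real command set depends on what the Activity Board UI program understands):

      onled
      offled
      ::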

    This is designed to work mainly with my C UI program on the Activity Board; that program has the complementary commands necessary to interpret the script file commands that are being sent to the robot.

    In the Python program, I think I have enough comments that you could adapt the code for your own requirements. The reason I am using Python is that I can now use the code, with some minor adjustment, on a Windows box or a Linux box. For Windows, there is a program called py2exe which will take the Python program and turn it into an exe that does not need a Python installation to run.
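    For reference, the py2exe side is just a tiny setup script, roughly like the one below (the piterm.py filename is only an example name for the terminal program), and then python setup.py py2exe builds the exe into a dist folder:

    # setup.py - build a standalone exe from the terminal program with py2exe
    from distutils.core import setup
    import py2exe  # registers the 'py2exe' command with distutils

    setup(console=['piterm.py'])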

    Now I am looking at re-writing the Activity Board C UI program to make it much easier to adapt for use with other robot forms, like an ActivityBot. I will keep the main commands standard, but will make it easier to change the code that is associated with a main command. Since there is no combo board available for the ActivityBot, the functionality might be limited. What would be nice is a combo board (Raspberry Pi + Propeller) that would be able to fit on the offsets that are on the ActivityBot.

    What I envision is a programming system that can be used with an ActivityBot or an S2 or a Create or an ..., etc.

    Ray
    #!/usr/bin/env python3
    # Command line serial terminal
    
    import time
    import serial
    
    # Connect to my XBee module on COM3:, Windows 7
    ser = serial.Serial('COM3:',baudrate=9600)
    
    ####################
    # Menu procedure
    ####################
    def run_menu():
    	print("Menu - quit, help, script ")
    ####################
    
    ####################
    # Run a script program procedure
    ####################
    def script():
      instrg = 1
      instrg = input("File Name: ")  # Get the script file name. ex - "doit.txt"
      print(instrg)
      out_data = b'runprog\n'
      ser.write(out_data)
      time.sleep(0.5)
    #  trun=open(r"D:\programming\Python\test1run.txt",'r') # r = Suppress escape mechanism
      trun=open("D:\\programming\\Python\\"+(instrg),'r')  # Open the file
      while 1:
        line = trun.readline() # Read a line from script file
        if not line:
          break
        print(line)  # Show the commands
        ser.write(bytearray(line,'ascii'))  # Send the command
        while True:
          time.sleep(0.5)
          if ser.inWaiting(): # Wait for acknowledge
            text = ser.readline(ser.inWaiting())
            if text == b'#':
              break  # Continue the while loop
          break  # Finished
      trun.close()  # Close the file
    ####################
    
    ####################
    # Intro title
    print("PiTerm Terminal Program")
    print("Type 'help' for menu")
    
    # Variable declaration
    instr = 1
    
    # Main program loop
    while 1:
    	instr = input("> ") # Get keyboard input
    	if instr == 'quit':
    		ser.close()
    		exit()
    	if instr == 'help':
    		run_menu()
    	if instr == 'script':
    		script()
    	else:
    		ser.write(bytearray(instr,'ascii'))  # Send command
    		ser.write(b'\r')  # Send a CR
    
    		out = ' '
    		time.sleep(0.5)
    		while ser.inWaiting() > 0:  # Check for incoming data
    			
    			out = ser.readline(ser.inWaiting)
    			
    			if out != '':  # This part is not working as expected
    				
    				print(":",(out))
    
    ####################
    
  • Bill Henning Posts: 6,445
    edited 2014-09-06 06:31
    FYI,

    http://www.mikronauts.com/robot-zoo/piboe/

    I've even managed to make the stack shorter, by mounting the Raspberry Pi on the "EZasPie 300" protoboard without the intermediate "EZasPi (B)" - I'll take a pic of the shorter bot and upload it later.

    PiBOE.jpg

    Rsadeika wrote: »
    What would be nice is a combo board (Raspberry Pi + Propeller) that would be able to fit on the offsets that are on the ActivityBot.

    What I envision is a programming system that can be used with an ActivityBot or an S2 or a Create or an ..., etc.
  • Publison Posts: 12,366
    edited 2014-09-06 07:03
    Bill,

    You're going to make Humanoido jealous. :)
  • Rsadeika Posts: 3,837
    edited 2014-09-10 07:50
    The attached zip contains the robot UI for the Activity Board, and of course it is PropGCC. I decided to add a little redundancy, meaning you can control the robot via XBee or the Raspberry Pi. I have noticed that even with the three WiFi hotspots that I have set up I still run into some dead areas, and the same is true of the XBee. I guess now I will always have a way of talking to the robot, I hope.

    The code is heavy in repetition of functions, which at some point I will have to look into. I also use three instances of fdserial, and now I am not sure how many COGs those are using. I cannot remember if the fdserial is built like the Spin version, which I believe uses two COGs per instance. So, at this point I just may be at the limit of available COGs, but I am not sure. I am starting to wonder if I will have to put more things on the Raspberry Pi, in order to lighten the load on the Activity Board. Now that the B+ has four USB sockets, I have two available, so maybe I should move the XBee from the Activity Board to the Raspberry Pi. But that means I will have to rewrite, or at least modify, my Python terminal program to be able to run the XBee in a thread; am I ready for that?
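    If I do move the XBee over, the threading part should not be too bad; something along the lines of the sketch below is what I have in mind (the /dev/ttyUSB0 port name and 9600 baud for the XBee are assumptions on my part):

    #!/usr/bin/env python3
    # Sketch: background reader thread for an XBee plugged into the Pi's USB.
    import queue
    import threading

    import serial

    xbee = serial.Serial('/dev/ttyUSB0', baudrate=9600, timeout=1)
    rx_queue = queue.Queue()

    def xbee_reader():
        # Runs in the background and pushes complete lines onto the queue.
        while True:
            line = xbee.readline()
            if line:
                rx_queue.put(line.decode('ascii', 'replace').rstrip())

    threading.Thread(target=xbee_reader, daemon=True).start()

    while True:
        # Show anything the XBee sent since the last prompt, then prompt again.
        while not rx_queue.empty():
            print(":", rx_queue.get())
        cmd = input("> ")
        if cmd == 'quit':
            break
        xbee.write(bytearray(cmd, 'ascii') + b'\r')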

    I would consider the code that is attached to be in a pre-alpha state, more like a proof of concept at this point. It does contain some ideas that could be implemented in other places, I hope. I have some comments, but to me a lot of it seems self-explanatory; any questions, just ask. I am also open to suggestions as to how to do it better, so if you have any ideas, I would like to read them.

    Ray