Limited AI robot

I would like to try building a limited AI robot, using my spare Roomba 500, a WX Activity Board, and the SimpleIDE PropGCC language.

I am not sure how far you can get using the P1 and PropGCC, but I might give it a shot. I know there are memory limitations that would hinder the experiment, but maybe something could be developed where specific functions are stored on the SD card and then loaded and used as needed. Has anybody done a RAM disk using the SD card? I am not sure whether that would even be feasible with this setup.

Since the forum has been somewhat quiet, I am not sure if I will get any responses, or even any interest in something like this. Probably the most important aspect of this discussion would be: how far can you go with AI, using the P1 with PropGCC?

Ray

Comments

  • The SimpleIDE still supports XMM using an SD card for loading program code dynamically. I've never actually tried that mode, so I couldn't say how much it's slowed down by paging code from the card. I don't know how much room would be available in hub RAM, but it's probably less than what you'll want. How about adding some additional RAM for data? There are 64K SRAMs that interface via I2C (slower) or SPI (faster). These could be put on the little breadboard, maybe 3 or 4 of them, using maybe 8 or 9 I/O pins. It would be pretty straightforward to create a database in SRAM that could be backed up to the SD card, then reloaded to perform garbage collection.
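A database in external SRAM with an SD-card backup could start very simply. Here is a minimal sketch in plain C of fixed-size records with delete-and-compact "garbage collection"; an ordinary byte array stands in for the external SRAM, and all names and sizes are illustrative, not any existing driver API.

```c
#include <string.h>
#include <stdint.h>

#define REC_SIZE 16   /* fixed record size in bytes            */
#define MAX_RECS 64   /* 1 KB of stand-in "SRAM" for the sketch */

static uint8_t sram[REC_SIZE * MAX_RECS]; /* stand-in for external SRAM */
static uint8_t used[MAX_RECS];            /* allocation map             */

/* Store a record; returns the slot index, or -1 if full. */
int db_put(const uint8_t *rec) {
    for (int i = 0; i < MAX_RECS; i++)
        if (!used[i]) {
            memcpy(&sram[i * REC_SIZE], rec, REC_SIZE);
            used[i] = 1;
            return i;
        }
    return -1;
}

void db_del(int slot) { used[slot] = 0; }

/* "Garbage collection": pack live records to the front, as one
   might do while reloading the database from the SD backup. */
int db_compact(void) {
    int dst = 0;
    for (int src = 0; src < MAX_RECS; src++)
        if (used[src]) {
            if (src != dst) {
                memcpy(&sram[dst * REC_SIZE], &sram[src * REC_SIZE], REC_SIZE);
                used[src] = 0;
                used[dst] = 1;
            }
            dst++;
        }
    return dst;  /* number of live records */
}
```

On real hardware the memcpy calls would become SRAM read/write transfers, but the bookkeeping stays the same.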
  • Mike Green wrote: »
    using maybe 8 or 9 I/O pins.
    Actually, zero I/O pins for I2C (as they can be shared with the boot EEPROM), and 3+(number of chips) I/O pins for SPI...

  • Mike Green Posts: 22,828
    edited October 29
    Here's a datasheet for a Microchip SPI SRAM with 128K bytes and a 4-bit bus, plus one line for a clock and one for a chip select for each chip. It is available in PDIP-8 packages and has a 32-byte page size.
  • Wuerfel_21 Posts: 185
    edited October 29
    I have one of these and can confirm that they work very well with the Prop (although I have only used them in 2-bit mode with Spin drivers, and thus sub-megahertz clocks, so far - no idea how well they work at 20 MHz). In practice, the page size can be ignored, as the only mode you actually want to use is sequential mode. I don't know why the other two modes (single byte and paged) even exist.
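For anyone writing a PropGCC driver, the SPI protocol itself is tiny. A hedged sketch, assuming a part like Microchip's 23LC1024 (whose datasheet gives READ = 0x03, WRITE = 0x02 with a 24-bit address - verify against the datasheet for your exact part), of building the command frame you would clock out before the data in sequential mode:

```c
#include <stdint.h>
#include <stddef.h>

/* Command opcodes per the 23LC1024 datasheet (verify for your part). */
#define SRAM_CMD_READ  0x03
#define SRAM_CMD_WRITE 0x02

/* Fill buf with the 4-byte header clocked out before a sequential
   read or write: opcode, then 24-bit address, MSB first. */
size_t sram_cmd_frame(uint8_t *buf, uint8_t cmd, uint32_t addr) {
    buf[0] = cmd;
    buf[1] = (addr >> 16) & 0xFF;
    buf[2] = (addr >> 8) & 0xFF;
    buf[3] = addr & 0xFF;
    return 4;   /* bytes to shift out before the payload */
}
```

After shifting out this header with chip-select held low, you stream data bytes for as long as you like; the chip auto-increments the address in sequential mode.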
  • Since I plan to use PropGCC, I do not recall seeing any C drivers for the Microchip SPI SRAM. I would like to stay away from writing C drivers for anything; I am a very lousy programmer for that kind of stuff. In fact, my last attempt at converting a Spin ADC driver to PropGCC did not work out as expected.

    Ray

  • I completely forgot about the board selection ACTIVITYBOARD-SDXMMC. If I am remembering correctly, it sort of allows you to increase the available RAM for your programs, but I cannot remember at what cost. The memory model is still CMM, but now you are implementing SDXMM. I guess I will have to give it a try and see what happens.

    So far I have a preliminary hardware build-up of my Roomba; I will get around to taking some pictures. I will be using the RTC module and temperature module, and will be adding a compass module. Alongside the Activity Board I have a small breadboard, so I can add more stuff. I also added a separate voltage regulator and a voltage divider breakout board.

    Since I am using my WX Activity Board, I will be doing an HTML program. Now I am starting to think about adding my Raspberry Pi Zero board with a camera module, to give this robot some vision. I am not sure how I would be able to tie this into the Propeller WiFi HTML program.

    Ray
  • A big setback: ACTIVITYBOARD-SDXMMC is no longer supported in SimpleIDE. In fact, when I chose that board selection, the compiler came back with an error and did not compile. It also automatically reverted the board selection to the plain ACTIVITYBOARD.

    There was talk about Parallax no longer supporting the XMM memory model, which I had completely forgotten about. Now, one of the options is to go with the Microchip SPI SRAM; I am not sure if I want to go that route. Thinking about this, I believe Parallax was selling some kind of RAM chip; if it is still available and they have PropGCC drivers, I could be interested.

    I see nobody has made any comments about the AI part: what sort of algorithms would be available for the AI using PropGCC?

    Ray
  • Ray,

    Actually, I was wondering what you meant by "AI" or even "limited AI".

    Nowadays, artificial intelligence often implies neural networks and/or "deep learning", all of which is going to need a lot more memory and speed than the Propeller has.

    Now you mention adding a Raspberry Pi, which has a lot more space and speed.

    Of course, there is a lot of "intelligent" behavior one can achieve with much simpler traditional algorithms: wall following, maze solving, all kinds of things.

    What do you have in mind?

  • I consider the top end of AI to be neural networks and/or "deep learning". The low end of AI, or as it applies here, limited AI, is to be determined. That is why I want to do an experiment with the equipment that I described in my first post.

    I was also trying to get a feel for whether C could be used for this kind of stuff, PropGCC in this case. I did some searches on the internet and have read some descriptions of the low end of AI: it could be as simple as C code that asks for your name and age and then gives some appropriate responses, that sort of thing.

    My experiment could involve navigation and object avoidance, robot travel from one location to another specified location, and possibly some datalogging with onboard sensors, plus an appropriate response. That would be for starters.

    "Simpler traditional algorithms" - how is that expressed in C code? I have no idea what that means as applied to code. In fact, I have not seen any example code for that. I guess that would be the essence of AI: some code from which you could derive an inference, and then apply a specific/non-specific response.
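As one illustration of what a "traditional algorithm" can look like in C: a priority-ordered set of if/else rules, a crude subsumption-style arbiter. The sensor fields and thresholds below are made up for the sketch, not from any existing driver.

```c
/* Hypothetical sensor snapshot - field names are illustrative. */
typedef struct {
    float battery_v;   /* from the ADC voltage divider */
    int   obstacle_cm; /* from the Ping sensor         */
} Sensors;

typedef enum { GO_CHARGE, AVOID, WANDER } Action;

/* The highest-priority rule that fires wins. */
Action decide(const Sensors *s) {
    if (s->battery_v < 11.1f) return GO_CHARGE;  /* sustenance first  */
    if (s->obstacle_cm < 20)  return AVOID;      /* then flight       */
    return WANDER;                               /* default behavior  */
}
```

This is the inference-then-response pattern in its simplest form: each call reads the current state and returns one action, and adding a behavior is just inserting a new rule at the right priority.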

    I am adding a compass module to my robot hardware. So maybe the first thing the robot could figure out is how to map the surroundings, as it applies to its robot base unit. Since I have an RTC on board, that could be used for timing distance traveled. I also have the SD card available; I am not sure how I could create a database on the SD card for reference data for the robot to use. Since I will be creating an HTML program for use with the robot, I am not sure if that could be used in some way to enhance the robot's behavior. I will probably be adding a Raspberry Pi Zero with the camera option, for some vision capabilities. I am not sure how I will be communicating with the Propeller. This is just a starting point, I think.

    But first off, I need to get a sense of what AI algorithms would look like written in C. I am not sure if anybody here has done anything like this.

    Ray
  • David Betz Posts: 12,915
    edited October 30
    XMM should still work with the underlying command line tools. I believe it has been removed from SimpleIDE. There is an XMM driver for SPI SRAM as well as SPI flash. I don't recommend the SD XMM driver as it is very slow.
  • My take on this is that as soon as you feel the need for XMM or SPI RAM/ROM or whatever in order to get big code running on the Propeller, it's time to think about other solutions that can run the code you want without such expensive, complex kludges - and do it faster and cheaper.

    In this case it seems we have a Raspberry Pi and a Propeller. Perfect: the Propeller can do what it is good at, providing real-time deterministic interfaces to sensors and actuators - for example, Ping sensors and PWM motor controls. All the kinds of things that a Linux-running Raspberry Pi is very bad at.

    Meanwhile, the Pi can run all the higher-level code, which is likely in C/C++, Python or whatever. It can do computer vision and so on - all the things the Propeller is not suited for.

    Communication between the two can be as simple as a serial link to the UART on the Pi's GPIO connector.
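For that serial link, a line-oriented text protocol is about the simplest thing that works. A sketch of formatting a telemetry line on the Propeller side (the key names and format are my own invention, not any standard); the Pi side can parse it with sscanf or a Python split:

```c
#include <stdio.h>
#include <string.h>

/* Format one telemetry line, e.g. "V=11.87,T=21.5,H=273\n".
   Keys are illustrative: battery volts, temperature C, heading deg. */
int telemetry_line(char *buf, int len, float volts, float temp_c, int heading) {
    return snprintf(buf, len, "V=%.2f,T=%.1f,H=%d\n", volts, temp_c, heading);
}
```

Newline-terminated ASCII lines are easy to debug with a terminal program on either end, and the same function works whether the bytes go out a hardwired UART or the WiFi module.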

    A match made in heaven.

    Ray, I don't understand what you mean by "HTML program". HTML is not a programming language. It's a document markup language.
  • HTML program (web page): my shorthand for referring to the web page that will be associated with the robot.

    Some nomenclature: robot name, TheRobot1 (TR1). Somewhere down the line, this might come in handy. I want to keep away from giving TR1 any human qualities, so I will refer to the onboard Propeller as the activity level complex (ALC), not to be confused with a spinal column. The ALC will consist of ALB - COG0 … AL7 - COG7. ALB will, of course, be associated with the main() function. AL1 - AL7 will be associated with the different tasks that it will be performing and keeping track of.
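The AL1-AL7 split can be captured in a task table even before deciding how each maps onto a cog (in PropGCC each function would go to cogstart; this portable sketch just dispatches them from a loop). The names follow Ray's ALC scheme; the task body is a stub.

```c
typedef void (*task_fn)(void);

typedef struct {
    const char *name;   /* ALC label */
    task_fn     run;    /* task body */
} AlTask;

static int battery_checks = 0;
static void al1_battery(void) { battery_checks++; }  /* stub: watch voltage */

static AlTask alc[] = {
    { "AL1", al1_battery },
    /* AL2..AL7 added as the robot gains capabilities */
};

/* ALB's main loop would call this (or start each entry in its own cog). */
void alc_run_all(void) {
    for (unsigned i = 0; i < sizeof alc / sizeof alc[0]; i++)
        alc[i].run();
}
```

Keeping the tasks in one table makes it easy to add, remove, or reassign an ALn without touching the dispatch code.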

    A general observation of living things is that the primary instincts, in my opinion, are fight/flight and/or sustenance (food and water). As applied to the robot, sustenance (battery voltage) will get top priority over fight/flight, at the moment.

    So, before the robot gets any movement capabilities, the chief concern is the battery voltage. On the robot I have an extra little breadboard, and one power bar is dedicated to raw power that comes in from the robot's battery. From this power bar I have connections to the Activity Board power connector and a voltage regulator, for now. I also have a voltage divider breakout board connected to the power bar and to the ADC on the Activity Board, which provides the voltage levels of the robot battery.

    At this point I can create my first little project: code for AL1 to keep track of the voltage levels, and an algorithm for determining and assigning voltage-level responses. I will, at this time, probably add a red LED and have the algorithm determine when to turn it on, meaning carry the robot over to the charging station before everything stops working. Of course, as the robot gets movement capabilities, the voltage levels will become extremely important.
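A sketch of that AL1 voltage rule. One detail worth adding is hysteresis (two thresholds), so the LED cannot flicker when the reading hovers around the trip point; the 11.1 V figure is Ray's, the 11.4 V release threshold is my assumed margin.

```c
/* Latching low-battery indicator with hysteresis.
   Trips below 11.1 V, clears only above 11.4 V (assumed margin). */
typedef struct { int led_on; } BattMon;

void batt_update(BattMon *m, float volts) {
    if (volts < 11.1f)      m->led_on = 1;
    else if (volts > 11.4f) m->led_on = 0;
    /* between the thresholds: keep the previous state */
}
```

AL1 would call batt_update with each ADC reading and drive the red LED pin from led_on.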

    Ray
  • Ray,
    One thing you might want to look at is RobotBasic. RB runs on a Windows PC and was designed to remotely control a robot, receive data back from the robot, analyze the data, and then send updated commands to the robot. The intent is to do the calculation-intensive processing on the PC. The micro is programmed using its own language and IDE, and the RB program sends the parameters needed by the micro's program over a serial interface. The nice thing is that you can write a program and test it connected through a hardwired serial interface, and once it is debugged, by simply changing the serial ID number in the RB program (and using the correct micro serial commands) the program can be run wirelessly. There are a number of RB books showing how to interface to different micros, including the Propeller (Spin) and R-Pi.

    The RB web site has a lot of info: robotbasic.org

    I wrote a SimpleIDE C program to control an ActivityBot with GUI controls on a remote touch-screen PC, using RB and a Bluetooth radio on the ActivityBot. The wheel speed commands are echoed back to the PC and graphically displayed on the PC screen. The details and programs are here:

    https://forums.parallax.com/discussion/164454/activitybot-simpleide-bluetooth-and-robot-basic

    Hope this helps.
    Tom
  • Tom wrote: »
    I wrote a Simple IDE C program to control an ActivityBot with GUI controls on a remote touch screen PC using RB and a Bluetooth radio on the ActivityBot.

    When Parallax released the first WX Activity Board and WX WiFi, I immediately wrote a SimpleIDE C program with an accompanying web page to control one of my Roomba robots. I did not have a worthy enough tablet for mobile control of the robot, so I was confined to the desktop computer and the robot had to be within sight. That was maybe two years ago, and a lot of tech has changed since then. That is why I am experimenting again.

    One of the reasons I am working with the Roomba is that the robot has its own controller and robot commands, so I do not have to create my own robot commands, or only at a very minimum. Maybe some kind of PropGCC robot language could evolve. Basically, I do a serial hookup with the Roomba via the Activity Board, and I am ready to go. The other reason for choosing the Roomba is the charging station: at some point I will have discovered a way for the Roomba to decide when to head for the charging station and start a charging session automatically - limited robot AI.
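The Roomba's built-in command set is its Open Interface (SCI): single-byte opcodes followed by data bytes over that serial hookup. A hedged sketch of building the documented Drive command (opcode 137, with velocity and radius as 16-bit two's-complement values, high byte first; verify the details and the "drive straight" radius value against the OI spec for your model):

```c
#include <stdint.h>

/* Build a Roomba OI Drive packet into buf (5 bytes).
   velocity: -500..500 mm/s; radius: -2000..2000 mm, with
   0x7FFF meaning "drive straight" per the OI spec. */
int roomba_drive(uint8_t *buf, int16_t velocity, int16_t radius) {
    buf[0] = 137;                                  /* Drive opcode */
    buf[1] = (uint8_t)((uint16_t)velocity >> 8);   /* high byte    */
    buf[2] = (uint8_t)(velocity & 0xFF);
    buf[3] = (uint8_t)((uint16_t)radius >> 8);
    buf[4] = (uint8_t)(radius & 0xFF);
    return 5;                                      /* bytes to send */
}
```

The five bytes then go out the Activity Board's serial pins to the Roomba; the two's-complement encoding means reverse is just a negative velocity.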

    While I am adding hardware to my robot, which is not an easy proposition, I am starting to think along the lines of visual capabilities for the robot. Not for the robot to see, necessarily, but for me to be able to see via the onboard camera when I do manual control of the robot from my desktop computer. At the moment, probably one of the cheapest and easiest ways is with the Raspberry Pi. Since there are two camera versions available for the Raspberry Pi, a NoIR (night vision) and a regular camera (daytime vision), plus a couple of Raspberry Pi Zero boards, you could place both on the robot and have daytime and nighttime manual viewing capabilities at a very reasonable cost.

    Home robot capabilities are starting to be within the reach of the individual consumer, on a cheap budget for this kind of stuff. Maybe home robotics will, after a forty-year nap, become an interest again for the non-technical home consumer.

    Ray
  • So far I am testing the capabilities of the Roomba battery. I cannot remember the rating of the battery that is installed, but I will have to pull it and replace it with a battery rated at 14.4 V and 4.5 Ah.

    I now have the following hardware on the Roomba:
    WX Activity Board (Parallax)
    WX WiFi (Parallax)
    RTC module (Parallax)
    Temp/Humid module (Parallax)
    Compass (Parallax)
    Ping (Parallax)
    red LED
    Voltage regulator
    Voltage divider
    Raspberry Pi 3
    After a full charge overnight, I turned on everything and noted the voltage: 11.87 V. I have an arbitrary recharge point set at 11.1 V, at which point I will carry the robot over to the charger and note how long it takes to recharge back to, I think, 13.9 V. This is basically data collection to see how well or poorly the battery source performs.

    I have not even considered giving any movement capabilities to the robot just yet. I want to get a general feel for how much battery power is being used just to power the electronics. I did implement the red LED to give me a visual as to when it hits the 11.1 V or lower level. I have noticed that at times the red LED turns on and, after a couple of seconds, goes off; so it looks like something is putting a major draw on the battery at times. When the red LED stays constantly on, it will be time to carry the robot over to the charger.
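That on/off flicker suggests the ADC reading dips briefly under load spikes. Averaging a few samples before testing the threshold smooths it out; a minimal sketch (the window size of 8 is an arbitrary choice):

```c
#define WIN 8  /* samples to average - arbitrary choice */

typedef struct {
    float samples[WIN];
    int   idx, count;
} VoltAvg;

/* Push a new ADC voltage reading and return the running average
   over the last WIN samples (fewer until the window fills). */
float volt_avg(VoltAvg *v, float reading) {
    v->samples[v->idx] = reading;
    v->idx = (v->idx + 1) % WIN;
    if (v->count < WIN) v->count++;
    float sum = 0;
    for (int i = 0; i < v->count; i++) sum += v->samples[i];
    return sum / v->count;
}
```

Feeding the averaged value, rather than the raw reading, into the LED threshold check means a two-second current spike no longer trips the indicator.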

    Ray


  • Mike Green wrote: »
    Here's a datasheet for a Microchip SPI SRAM with 128K bytes and a 4-bit bus plus 1 for a clock and 1 for a chip select for each chip. Available in PDIP-8 packages and has a 32-byte page size.

    Wow, thanks for sharing!