Autonomous Vehicle — Parallax Forums

Autonomous Vehicle

General Cedric Posts: 18
edited 2010-04-06 02:55 in Robotics
I am building an autonomous vehicle and I am using this part www.acroname.com/robotics/parts/R145-SRF08.html for sensing. I was wondering if it is possible to have the robot only respond to the sensor within a certain distance. Like maybe 8 feet instead of the maximum 20 feet.

Also, I wanted to know if I could somehow use the sensors to map the environment. This way I could have the robot take the most logical path toward a GPS coordinate.

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
-General Cedric


"Wikipedia is your Friend"-Me

Post Edited (General Cedric) : 12/19/2008 4:19:57 PM GMT

Comments

  • Mike Green Posts: 23,101
    edited 2008-12-15 02:35
    The rangefinder you mentioned is very similar to the PING))) sensor from Parallax.

    Your program can discard readings outside of any range you decide. Either sensor will provide the distance to the first echo it receives, which is the closest object.
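
    For instance, a minimal sketch in Python of that kind of range gating (read_distance_cm() here is just a placeholder for whatever routine actually polls your sensor):

        MAX_RANGE_CM = 244   # roughly 8 feet; anything farther is ignored

        def nearest_obstacle(read_distance_cm, max_range_cm=MAX_RANGE_CM):
            # read_distance_cm() is a hypothetical helper that returns the
            # sensor reading in centimeters, or None if no echo came back.
            d = read_distance_cm()
            if d is None or d > max_range_cm:
                return None      # too far away: treat it as "nothing there"
            return d             # obstacle inside the window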

    Other people have experimented with autonomous vehicles using a variety of sensors to map out the immediate environment. Keep in mind that both IR distance sensors and ultrasonic distance sensors have limitations. Some objects show up better with one than the other and some objects don't show up well to either one. Video analysis is often used in sophisticated robots to detect objects and map the environment, but the hardware is relatively expensive and requires a lot of bandwidth and memory. The software is very sophisticated and resource intensive.

    A lot can be done with things like the CMUcam and similar devices along with a combination of ultrasonic, IR, and tactile sensors.
  • SRLM Posts: 5,045
    edited 2008-12-15 06:04
    Mapping the environment will probably turn out to be more limited than you expect with an ultrasonic sensor. The Ping))) has a ~30 degree field of view if I remember right, so your angular resolution is limited to roughly that. That's not to say you can't do some interesting things with it.
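
    To put rough numbers on that (assuming a 30 degree total cone):

        import math

        BEAM_ANGLE_DEG = 30.0    # assumed total cone angle of the Ping)))

        def beam_footprint_cm(distance_cm, beam_angle_deg=BEAM_ANGLE_DEG):
            # Width of the sonar cone at a given range: two objects closer
            # together than this can't be separated in a single reading.
            half_angle = math.radians(beam_angle_deg / 2.0)
            return 2.0 * distance_cm * math.tan(half_angle)

        # At 1 m the cone is already ~54 cm wide; at 3 m it is ~160 cm wide.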
  • General Cedric Posts: 18
    edited 2008-12-15 15:11
    Just to elaborate, the robot will probably be built using the 12 V motor kit from Parallax.

    coecsl.ece.uiuc.edu/ge423/spring04/group7/group7.html
    I found that link while searching some more. It seems to use ultrasonic sensors mounted on a servo to map the environment.
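
    If you go that route, the core loop is just: sweep the servo, record (angle, distance) pairs, and convert them to x/y points. A rough Python sketch (sweep_servo_to() and read_distance_cm() are placeholders for whatever hardware routines end up driving the servo and sonar):

        import math

        def sweep_scan(sweep_servo_to, read_distance_cm,
                       start_deg=-90, end_deg=90, step_deg=5):
            # Returns a list of (x, y) points in cm, relative to the robot,
            # one per servo position that returned an echo.
            points = []
            for angle in range(start_deg, end_deg + 1, step_deg):
                sweep_servo_to(angle)
                d = read_distance_cm()
                if d is None:
                    continue                 # no echo at this bearing
                a = math.radians(angle)
                points.append((d * math.cos(a), d * math.sin(a)))
            return points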

    I would like for real-time mapping and navigation to be possible. Any ideas?


    P.S. I was originally going to use the Parallax sensor, but this one has a range of 20 feet instead of 10. However, I might use the Parallax model because I can get it at the local RadioShack.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    -General Cedric


    "Wikipedia is your Friend"-Me

    Post Edited (General Cedric) : 12/16/2008 2:20:10 AM GMT
  • Mike Green Posts: 23,101
    edited 2008-12-15 15:54
    1) Remember that the ultrasonic beam is cone-shaped. The further away the object is, the broader the beam. Objects have to be bigger to reflect enough sound, and the position of the object is less accurate.

    2) Notice that the link you provided is for a system for mapping a maze made up of particleboard sheet walls which provide large reflective areas for the ultrasound. This is not any kind of "real world" environment. If you have an environment in mind that's similar with large flat walls and obstacles that are also relatively large, relatively flat, and "bright" for sound reflections, then you should do well.

    I don't mean to sound negative, but this is a difficult problem, depends very much on the environment for the robot, and gets very very complex the more arbitrary the environment becomes.
  • SRLM Posts: 5,045
    edited 2008-12-15 19:03
    Take a look at the DARPA Grand Challenge and Urban Challenge contests. University-level teams solved the exact same problem that you want to: map an environment. It (currently) takes many processors and accurate lasers to make a really good map of the environment from a stationary position. If you are moving (doing something like SLAM), you can use ultrasonics, but note that you can't path-plan then.
  • Interact Posts: 79
    edited 2008-12-15 22:24
    This is what a scan looks like with a wall on one side and a bottle on the other:

    http://forums.parallax.com/showthread.php?p=683124

    The cone shape of the sound makes for a distorted map of objects. If you are pointing the Ping))) at a wall at an angle, the nearest point that the sound hits on the wall will be measured. So as you move away from that wall while maintaining the same angle, the distance read will not increase by nearly as much, because the cone is getting larger as the distance increases.
    [attached image: scan plot, 670 x 753]
  • Tom C Posts: 461
    edited 2008-12-15 23:16
    Interact,

    Not to hijack this thread, as I think this question is germane to the mapping topic being discussed, but did you ever post your optimized Compass Code?

    I am presently working on a robot that will autonomously navigate around my single level home using ultrasonics, IR sensors and a compass after generating a 2D map to be stored on a data logger.

    Regards,
    TCIII

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    If you are going to send a Robot to save the world, you better make sure it likes it the way it is!
  • Interact Posts: 79
    edited 2008-12-15 23:43
    Yes I did post the compass routine:
    http://forums.parallax.com/showthread.php?p=683839

    I'm just getting back into the robotics project I left sitting on the bench 9 months ago.

    That's how it goes as a professional athlete. No time for fun in the summer. I don't even recognize half the stuff on the bench.
  • SRLM Posts: 5,045
    edited 2008-12-16 03:48
    The simplest approach would be to head in the right direction towards the next waypoint and make sure that it doesn't hit anything along the way. There is no need to map out every crevice and rock when there is no chance of the robot hitting it.
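
    In practice that boils down to comparing the bearing to the waypoint against the compass heading, something like this sketch (flat-earth approximation, which is fine over sidewalk-scale distances):

        import math

        def bearing_to_waypoint(lat, lon, wp_lat, wp_lon):
            # Compass bearing (0-360, 0 = north) from the robot's GPS fix to
            # the waypoint, using a flat-earth approximation.
            d_lat = wp_lat - lat
            d_lon = (wp_lon - lon) * math.cos(math.radians(lat))
            return math.degrees(math.atan2(d_lon, d_lat)) % 360.0

        def steering_error(heading_deg, bearing_deg):
            # Signed error in degrees: positive means "turn right".
            return (bearing_deg - heading_deg + 180.0) % 360.0 - 180.0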
  • General Cedric Posts: 18
    edited 2008-12-16 14:28
    Well. I want the robot to stay on the sidewalk. I have seen some interesting robots that use a single webcam to identify sidewalks, and I would like to use a system like that. The camera would be used to tell the robot which areas are safe. Then the GPS and PING))) sensors would guide the robot to the coordinates while staying in the safe area. How complicated does this sound?


    Here is a link to a paper on sidewalk tracking
    users.csc.calpoly.edu/~jseng/ccsc_paper.pdf

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    -General Cedric


    "Wikipedia is your Friend"-Me
  • Mike Green Posts: 23,101
    edited 2008-12-16 15:17
    Way more complicated than you can do with a Stamp. At the very least, you need something with enough memory to hold a couple of video images. You're really talking about an Embedded Linux system which could use a webcam since it would have a full function USB port.
  • General Cedric Posts: 18
    edited 2008-12-16 16:31
    Well. My intention was to have a laptop on the robot.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    -General Cedric


    "Wikipedia is your Friend"-Me
  • SRLM Posts: 5,045
    edited 2008-12-16 16:51
    Time to start learning a full-blown computer language. Expect the project to take at least a year of hard work: first to learn the language, then to interface and develop the webcam operation, then to interface the motors and a robot of suitable size, then to put it all together into a working robot. Robot navigation and mapping is a lifetime career for professional computer scientists. It's a challenging problem that takes lots of processing power and clever algorithms, and it still doesn't always work. Notice that in the paper they mention that the robot fails because it doesn't work in shadow. Toss in more months of work...
  • General Cedric Posts: 18
    edited 2008-12-16 17:05
    Well. I might eventually get this program called MATLAB and the Image processing toolbox.

    For now, however, I need a simple navigation solution that keeps the robot on sidewalks and allows it to move to a GPS coordinate with PING))) sensors for object avoidance.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    -General Cedric


    "Wikipedia is your Friend"-Me
  • General Cedric Posts: 18
    edited 2008-12-16 17:08
    P_sidewalk = S(h, s) / (S(h, s) + B(h, s))

    This algorithm was in the research paper. Couldn't I use that?

    It seems like, with a webcam or CMUcam, it would be "simple" to implement this in a program that detects the hue and saturation of an area.
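
    If S(h, s) and B(h, s) are just histogram counts of sidewalk and background training pixels at hue h and saturation s (which is how I read the paper), the per-pixel test is only a few lines of Python:

        def sidewalk_probability(h, s, sidewalk_hist, background_hist):
            # sidewalk_hist and background_hist map (hue, sat) bins to counts
            # gathered from hand-labeled training pixels; building them from
            # example images is the real work.
            s_count = sidewalk_hist.get((h, s), 0)
            b_count = background_hist.get((h, s), 0)
            if s_count + b_count == 0:
                return 0.5           # never seen this color: no opinion
            return s_count / float(s_count + b_count)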

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    -General Cedric


    "Wikipedia is your Friend"-Me

    Post Edited (General Cedric) : 12/16/2008 5:14:16 PM GMT
  • SRLM Posts: 5,045
    edited 2008-12-16 17:36
    Simple? You have to process several frames per second, you have to decide which spots are safe to drive on, and you have to account for things like debris in the road, shadows, similarly colored objects in the way, and so on. It's probably not all that difficult to detect one color (or range), but being able to handle real-world problems is tough. Take a look at this paper from Stanford to see what they came up with. It's the exact same thing you're looking at; in fact, your paper was probably derived from this project.

    The easiest solution for your project is probably to give it waypoint following capability, then place waypoints along the sidewalk to keep it in place.
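
    A waypoint follower itself can be very simple: a list of coordinates and an "arrived" radius, roughly like this sketch (flat-earth math; tune the radius to your GPS accuracy):

        import math

        ARRIVED_RADIUS_M = 3.0       # close enough to call a waypoint reached

        def distance_m(lat, lon, wp_lat, wp_lon):
            # Flat-earth distance in meters; fine for sidewalk-scale hops.
            d_lat = (wp_lat - lat) * 111320.0
            d_lon = (wp_lon - lon) * 111320.0 * math.cos(math.radians(lat))
            return math.hypot(d_lat, d_lon)

        def current_target(position, waypoints):
            # Drop waypoints as they are reached; return the active target,
            # or None once the route is finished.
            lat, lon = position
            while waypoints and distance_m(lat, lon, *waypoints[0]) < ARRIVED_RADIUS_M:
                waypoints.pop(0)
            return waypoints[0] if waypoints else None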
  • Tom C Posts: 461
    edited 2008-12-16 21:25
    General Cedric,

    The attached BS2 code, using a Parallax GPS and a Hitachi HM55B compass, will give you an idea of what it takes to navigate a toy car around a predefined course.

    Regards,

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    If you are going to send a Robot to save the world, you better make sure it likes it the way it is!

    Post Edited (Tom C) : 12/16/2008 9:31:00 PM GMT
  • Tom C Posts: 461
    edited 2008-12-16 21:30
    Interact,

    Thanks for posting the compact compass code link, but how does "CompassTable CON 0" define the compass table as a table of data (array)?

    Should it be "CompassTable DATA (16)"?

    Regards,
    TCIII

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    If you are going to send a Robot to save the world, you better make sure it likes it the way it is!

    Post Edited (Tom C) : 12/16/2008 9:37:29 PM GMT
  • Interact Posts: 79
    edited 2008-12-16 22:57
    I am answering the compass question in the original thread:
    http://forums.parallax.com/showthread.php?p=683839
  • General Cedric Posts: 18
    edited 2008-12-18 14:28
    SRLM - The only problem with the paper you posted is that it requires more than a single camera to detect sidewalks. I wanted to use a camera to detect sidewalks. The PING))) sensors are only for obstacle avoidance.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    -General Cedric


    "Wikipedia is your Friend"-Me
  • SRLM Posts: 5,045
    edited 2008-12-18 16:37
    The paper I posted used a single camera and a laser array (in practice). The laser determined which areas are safe to drive on based on elevation, and the camera takes that data and makes some assumptions about where the road is. I think the team did use two cameras on their car for stereoscopic vision, but the process only needs one.
  • General Cedric Posts: 18
    edited 2008-12-18 16:40
    My Tech Drafting teacher says that if I build this system, he might let me incorporate it onto a go-kart that was made several years ago.

    So:
    I am now planning to drop the road following algorithm. The go-kart will have the power to move over rougher terrain. However, this probably means more sensors will be required to keep the robot on a safe path.

    Please post any suggestions. I am interested in your ideas!

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    -General Cedric


    "Wikipedia is your Friend"-Me
  • General Cedric Posts: 18
    edited 2008-12-18 19:42
    I decided on the base software: Player and Stage. I also intend to use a neural network.

    I will code the neural network in Python, and I will have the laptop on the go-kart.
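
    For what it's worth, here is a toy sketch of the sort of thing I mean: a tiny feedforward net in NumPy that maps a few sonar readings to a steering value. The weights are random placeholders, so this is only a skeleton; training it (or deciding whether a neural net is even the right tool here) is the real work:

        import numpy as np

        class TinyNet(object):
            # 3 sonar inputs -> 4 hidden units -> 1 steering output in [-1, 1].
            def __init__(self, n_in=3, n_hidden=4):
                self.w1 = np.random.randn(n_in, n_hidden) * 0.1
                self.w2 = np.random.randn(n_hidden) * 0.1

            def steer(self, sonar_cm):
                # Crude scaling of the distances, then two tanh layers.
                x = np.asarray(sonar_cm, dtype=float) / 300.0
                hidden = np.tanh(x.dot(self.w1))
                return float(np.tanh(hidden.dot(self.w2)))   # -1 = hard left, +1 = hard right

        # e.g. net = TinyNet(); net.steer([120, 250, 80])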

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    -General Cedric


    "Wikipedia is your Friend"-Me
  • General Cedric Posts: 18
    edited 2010-04-06 02:34
    This project is finally finished! I'll be posting source code and pictures as soon as I get the time.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    -General Cedric


    "Wikipedia is your Friend"-Me
  • P!-Ro Posts: 1,189
    edited 2010-04-06 02:55
    Wow, you really had to dig far down to find this thread again, didn't you? I'll look forward to the pictures. Any project with that much time spent on it is definitely worth seeing!

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Pi aren't squared, pi are round. Cornbread are squared!