PropBOE Ping))) DAR — Parallax Forums

bee_man Posts: 109
edited 2012-05-16 14:56 in Robotics
While everyone was having fun at the EXPO, I was stuck here in Illinois, so I decided to play with my new PropBOE and Ping))) module. I could not find a DAR in the OBEX for the Prop, so I whipped up this one. It could do a lot more, but it basically works the same as the BASIC version.


PropBOE_Ping)))_DAR.zip

Comments

  • Martin_H Posts: 4,051
    edited 2012-04-15 16:47
    Cool, how about adding a video display with the Ping)))'s output! I had a crazy week and collapsed into a pile of pudding yesterday. But today I found some time to work on my 4x4.
  • bee_man Posts: 109
    edited 2012-04-16 18:22
    Martin_H wrote: »
    Cool, how about adding a video display with the Ping)))'s output!
    (YouTube video: IYPCAw2uSqE)
  • Duane Degn Posts: 10,588
    edited 2012-04-16 18:28
    Very cool!

    I didn't realize you could sweep an area with a Ping that quickly.

    Thanks for posting the code and adding the video.
  • bee_man Posts: 109
    edited 2012-04-16 18:43
    Duane Degn wrote: »
    Very cool!

    I didn't realize you could sweep an area with a Ping that quickly.

    You can go faster if you just want to avoid obstacles. I had to add wait states to let the servo settle; otherwise the angle of the Ping))) was not the actual angle I was sampling. A stepper would be more accurate; the servo seems to jump around sometimes.
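The settle-then-sample pattern described above can be sketched in Python pseudocode (the original project is in Spin; `set_servo_deg`, `read_ping_cm`, and the settle time are assumptions for illustration, not bee_man's actual code):

```python
import time

SETTLE_MS = 150  # assumed servo settle time; tune for your hardware

def sweep_angles(steps, span_deg=180):
    """Equally spaced servo angles for one sweep, endpoints included."""
    return [i * span_deg // (steps - 1) for i in range(steps)]

def scan(read_ping_cm, set_servo_deg, steps=19):
    """Point the servo, wait for it to settle, then sample the Ping))).

    read_ping_cm and set_servo_deg are hypothetical hardware callbacks;
    the wait keeps the reported angle honest, at the cost of sweep speed.
    """
    readings = []
    for angle in sweep_angles(steps):
        set_servo_deg(angle)
        time.sleep(SETTLE_MS / 1000.0)  # let the servo stop moving first
        readings.append((angle, read_ping_cm()))
    return readings
```

Dropping the settle wait speeds up the sweep but, as noted above, the echo may then be taken while the servo is still in motion, so the plotted angle no longer matches the real one.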
  • Martin_H Posts: 4,051
    edited 2012-04-16 18:53
    That is a great demo of the PropBOE. The image produced is much better than the BS2 Ping))) DAR character-cell display.
  • graffix Posts: 389
    edited 2012-04-16 23:02
    Nice job. Any plans on adding a compass to the ping?
  • bee_man Posts: 109
    edited 2012-04-17 07:07
    graffix wrote: »
    Nice job. Any plans on adding a compass to the ping?

    Sorry, the only compass modules I have are attached to penguins. That would be cool, though: you could plot the objects as they relate to your position (north, northeast, east, etc.).
  • dgately Posts: 1,629
    edited 2012-05-16 12:48
    bee_man,

    I've been playing with a similar config and code on a QuickStart-based BOEBot, and I noticed in your example (the video plotting) one of the issues that perplexes me in my own setup. When the box and the can are at their furthest distance from the bot, there is some space between them, but the plot does not show that space. I assume the width of the ultrasonic beam and the reflections, especially from a curved object such as the can, fill in the space.

    Since I would like the bot to travel towards an open space, do you have any ideas on how to better "see" that space? Other than narrowing the beam of the Ping))), or noting the difference in distance of two objects, I'm not sure how to deal with this. And though two objects have some space between them, if they are equidistant from the Ping))) sensor I may not be able to figure out that there is a space between them.

    I guess that I could set my scanning for wider angle steps (instead of 10 steps per 180 degrees, I could try 4-6)...

    Also note that I tend to use an odd number of scans as that allows an exact-middle reading from straight in front of the bot.

    Any ideas?

    And, thanks for providing your code sample!
    dgately
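dgately's odd-count trick can be verified with a quick sketch (Python rather than Spin; a minimal, assumption-light angle calculation): with an odd number of equally spaced samples across 180°, one sample lands exactly at 90°, dead ahead, while an even count straddles it.

```python
def scan_angles(steps, span_deg=180.0):
    """Equally spaced scan angles from 0 to span_deg, endpoints included."""
    return [i * span_deg / (steps - 1) for i in range(steps)]
```

For example, `scan_angles(5)` gives 0, 45, 90, 135, 180, so the middle reading looks straight ahead, whereas `scan_angles(4)` gives 0, 60, 120, 180 and never samples the center.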
  • bee_man Posts: 109
    edited 2012-05-16 14:56
    dgately wrote: »
    bee_man,

    When the box and the can are at their furthest distance from the bot, there is some space between them, but the plot does not show that space. I assume the width of the ultrasonic beam and the reflections, especially from a curved object such as the can, fill in the space. Since I would like the bot to travel towards an open space, do you have any ideas on how to better "see" that space?

    I noticed the exact problem when I was testing the code. You could set, say, 12 objects around your robot at equal distances and it would think it is surrounded with no way out. The Ping))) triggers on the first reflection sensed; without access to the raw data you can't separate the overlapping reflections from two objects and something more distant behind them. As the robot moves, the distances change and the two objects can eventually be resolved.

    The solution is to base your direction decision on far objects and, if surrounded, move closer and then determine whether there are openings. I would set up a test with two objects separated by more than the width of the bot and see how close you have to be before you can resolve them as separate objects. Then I would base my close-range move decisions on objects within that range.
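The steer-toward-far-objects strategy above could be sketched like this (Python, not the poster's Spin code; `blocked_cm` and the function name are illustrative assumptions):

```python
def choose_heading(readings, blocked_cm=30):
    """Pick a heading from a sweep of (angle_deg, distance_cm) readings.

    Returns (angle, 'go') for the most open direction, or (None, 'advance')
    when every echo is within blocked_cm -- the bot should creep forward
    and rescan, since nearby objects may then resolve into separate echoes.
    """
    best_angle, best_dist = max(readings, key=lambda r: r[1])
    if best_dist <= blocked_cm:
        return None, 'advance'
    return best_angle, 'go'
```

The key design point is that the decision trusts only the far readings: anything close may be a merged echo from several objects, so "surrounded" triggers a cautious advance-and-rescan rather than a turn.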