PropBOE Ping))) DAR
While everyone was having fun at the EXPO, I was stuck here in Illinois, so I decided to play with my new PropBOE and Ping))) module. I could not find a DAR in the OBEX for the Prop, so I whipped up this one. It could do a lot more, but it basically works the same as the BASIC version (a rough sketch of the sweep-and-plot idea is below).
PropBOE_Ping)))_DAR.zip
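For anyone who wants the gist without opening the archive, here is a rough host-side sketch (Python, not the Spin in the ZIP) of what the DAR does: take a distance reading at each servo heading across a 180 degree sweep, then convert each (angle, distance) pair to x/y for plotting. The sample values and the angle convention are made up for illustration.

```python
import math

# A few made-up (servo angle, Ping distance) samples; the real code in
# the ZIP gets these from the Ping))) as the servo sweeps.
samples = [(0, 80.0), (45, 62.0), (90, 30.0), (135, 62.0), (180, 80.0)]

def to_xy(angle_deg, dist_cm):
    """Polar to Cartesian: 0 deg = hard right, 90 deg = straight ahead,
    180 deg = hard left, with the bot at the origin."""
    a = math.radians(angle_deg)
    return dist_cm * math.cos(a), dist_cm * math.sin(a)

for angle, dist in samples:
    x, y = to_xy(angle, dist)
    print(f"{angle:3d} deg  {dist:5.1f} cm  ->  x = {x:6.1f}, y = {y:6.1f}")
```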
Comments
I didn't realize you could sweep an area with a Ping that quickly.
Thanks for posting the code and adding the video.
You can go faster if you just want to avoid obstacles. I had to add wait states to let the servo settle; otherwise the angle of the Ping was not the actual angle I was sampling. A stepper would be more accurate, as the servo seems to jump around sometimes.
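In case it helps, here is a minimal sketch of that settle-delay idea (Python for the loop shape only; `point_servo`, `read_ping_cm`, and the timing numbers are hypothetical placeholders you would swap for your own servo and Ping))) routines and tune to your hardware): command the new angle, wait long enough for the horn to stop moving, then fire the Ping. Scaling the wait with the size of the step is what lets you go faster when you only care about obstacle avoidance.

```python
import time

def point_servo(angle_deg):
    """Placeholder: command the servo to angle_deg."""
    pass

def read_ping_cm():
    """Placeholder: trigger the Ping))) and return the distance in cm."""
    return 100.0

def sweep(angles_deg, settle_ms_per_deg=3.0, min_settle_ms=40.0):
    """Step through the headings, letting the servo settle before each
    measurement so the reported angle matches the sampled angle."""
    readings = []
    last = angles_deg[0]
    for angle in angles_deg:
        point_servo(angle)
        # Bigger steps need more settle time; tiny steps still get a floor.
        settle_ms = max(min_settle_ms, abs(angle - last) * settle_ms_per_deg)
        time.sleep(settle_ms / 1000.0)
        readings.append((angle, read_ping_cm()))
        last = angle
    return readings

print(sweep([0, 22.5, 45, 67.5, 90, 112.5, 135, 157.5, 180]))
```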
Sorry, the only compass modules I have are attached to penguins. That would be cool, though. You could plot the objects as they relate to your position: North, Northeast, East, etc.
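If someone does wire up a compass, the conversion is just an offset and a wrap: add the bot's heading to each scan angle measured relative to the bot, then bin the result into the eight compass points. A quick sketch of that arithmetic (the names and conventions here are made up for illustration):

```python
POINTS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def absolute_bearing(heading_deg, scan_angle_deg):
    """Combine the bot's compass heading with a scan angle measured
    relative to the bot (0 = straight ahead, positive to the right)."""
    return (heading_deg + scan_angle_deg) % 360

def compass_point(bearing_deg):
    """Bin a 0-360 bearing into the nearest of eight compass points."""
    return POINTS[int((bearing_deg + 22.5) // 45) % 8]

# Bot facing 30 degrees; object seen 45 degrees to its right.
b = absolute_bearing(30, 45)
print(b, compass_point(b))   # 75 E
```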
I've been playing with a similar config and code on a QuickStart-based BOEBot. I noticed in your example (the video plotting) one of the issues that perplexes me in my own example. When the box and the can are at their furthest distance from the bot, there is some space between them, but the plot does not show that space. I assume the width of the ultrasonic beam and the reflections, especially from a curved object such as the can, fill in the space. Since I would like the bot to travel towards an open space, do you have any ideas on how to better "see" that space? Other than narrowing the beam of the Ping, or noting the difference in distance of two objects, I'm not sure how to deal with this. And even though two objects have "some" space between them, if they are equidistant from the Ping sensor I may not be able to figure out that there is a space between them.
I guess that I could set my scanning to use wider angle steps (instead of 10 steps per 180 degrees, I could try 4-6)...
Also note that I tend to use an odd number of scans, as that allows an exact-middle reading straight in front of the bot (see the sketch after this post).
Any ideas?
And, thanks for providing your code sample!
dgately
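On the odd-sample-count point above, here is a tiny sketch of the angle math (assuming evenly spaced headings across a 180 degree sweep, endpoints included): any odd count lands one sample exactly on 90 degrees, dead ahead, while an even count straddles it.

```python
def scan_angles(num_samples, sweep_deg=180):
    """Evenly spaced headings across the sweep, endpoints included."""
    step = sweep_deg / (num_samples - 1)
    return [round(i * step, 1) for i in range(num_samples)]

print(scan_angles(9))   # 22.5-degree steps, includes 90.0 (straight ahead)
print(scan_angles(5))   # coarser 45-degree steps, still includes 90.0
print(scan_angles(6))   # even count: 36-degree steps, no exact 90.0 sample
```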
I noticed the exact same problem when I was testing the code. You could set, say, 12 objects around your robot at equal distances and it would think it is surrounded with no way out. The Ping triggers on the first reflection it senses; without access to the raw data you can't account for the multiple reflections from two objects and something more distant between them. As the robot moves, the distances change and the two objects can eventually be resolved.

The solution is to base your direction decision on far objects, and if you appear surrounded, move closer and then determine whether there are openings. I would set up a test with two objects separated by more than the width of the bot and see how close you have to be before you can resolve that the two objects are separate. Then I would base my close-range move decisions only on objects within that range (a rough sketch of that test follows below).
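Here is a rough sketch of that "how close before the gap shows up" test, using a simple cone model for the beam. The half-angle is an assumed parameter you would measure with your own setup, and the farthest-reading rule at the end is just one way to act on the result.

```python
import math

def max_resolving_range_cm(gap_cm, beam_half_angle_deg):
    """Farthest range at which a gap between two equidistant objects can
    still come back as 'no echo' when the sensor points into the middle
    of it. Cone model: the beam footprint is about 2*R*tan(half_angle)
    at range R, so the gap only shows once that footprint fits inside it."""
    return gap_cm / (2.0 * math.tan(math.radians(beam_half_angle_deg)))

def pick_heading(scan):
    """Head toward the farthest reading; the caller can creep forward
    and rescan if nothing looks open enough to squeeze through."""
    return max(scan, key=lambda s: s[1])

# Example: a 40 cm gap with an assumed 15-degree half-angle beam.
print(round(max_resolving_range_cm(40, 15), 1), "cm")   # ~74.6 cm

# (angle_deg, distance_cm) samples from one sweep
scan = [(0, 35.0), (45, 120.0), (90, 38.0), (135, 36.0), (180, 34.0)]
print(pick_heading(scan))   # (45, 120.0): the most open direction
```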