Propeller appropriate for humanoid robot?
TylerS
Posts: 7
Hello,

I'm a robotics hobbyist beginner looking for some advice from more experienced programmers. My plan is to design, build, and program a humanoid robot. I'm currently in the early design stage, where I am trying to plan out the hardware and software I will use during development.

There are three approaches I am considering, and I am having trouble deciding which one to take.
1. Use only a Propeller for all processing, decision making, and control. This includes sensor reading, processing, decision making, and motor/servo/actuator control. At least initially the sensors would consist only of range finders, pressure/contact sensors, and an accelerometer/gyro for balance. Eventually I may look at adding image processing.
2. Use a Propeller for lower-level sensor reading and motor/servo/actuator control, but offload the processing/decision making to a wirelessly connected PC running software such as Microsoft Robotics Studio or whatever else can perform this task (please avoid Microsoft bashing in this post and keep it on topic). One concern with this strategy: if I wanted to do high-level image processing, would the wireless connection be able to transmit live video for more detailed PC processing?
3. Have an onboard PC directly connected to the Propeller. (This solves any potential wireless issues.)
In reality I will probably just use strategy 1 (Propeller only) until I want to do something the Propeller can't handle. (Maybe by then V2 will be out and I won't have to use a PC!)

So my questions are...
1. Is the Propeller capable of the kinematic calculations for legs? (I believe inverting matrices is involved in these kinds of calculations.) Can the Propeller do image processing (basic blob and edge detection) in addition to servo output (at least 14 servos), kinematic calculations, and sensor readings? I don't have enough experience to translate specs into actual capabilities.
2. What programming language would you guys recommend? I think I would prefer to learn C and program in that, but most of the objects are not available in C. I would really like to learn a transferable programming skill set that I could use in the future (grad school, etc.).
3. Which of the strategies above would you recommend?
Some background on me: I have limited programming experience with MATLAB and Excel VBA macros, a bachelor's degree in Mechanical Engineering, and a minor in Applied Mathematics. I'm out of school and working full time, and I would prefer to spend money on good software development tools to save time if they are more useful than the freeware options. What programs would you guys recommend? I've read up on ICCV7, ViewPort, and Microsoft Robotics Studio.

Does anyone know of someone who has made a humanoid based on the Propeller, or will I be the first? (Could be bad for me... lol)
Comments
Microsoft Robotics Studio doesn't do that much for you, particularly for the type of project you're considering. I would recommend dividing up tasks as follows:
1) One Propeller, probably the Propeller Robot Control Board, to control motors, servos, actuators, and sensors. It may be able to do the kinematic calculations; I'm not sure whether it would be fast enough. Floating point is done in software, and basic operations (+, -, *, /) all execute in around 40 us per operation. (A rough timing sketch follows this list.)
2) A Propeller is probably not a great choice for image processing. At least one forum member has been working on a video camera interface. The main problem is that there's not enough memory to store an image unless it's fairly small. With a little external hardware it's possible to store larger images in external RAM and then process them, but it's still not the best platform. Hanno's ViewPort has provisions for video capture to a PC, where the image processing can be performed.
3) There are some very small, low-power Linux platforms, like the Gumstix, that would be ideal for this sort of project. The Propeller could do the low-level control while the Gumstix does the image processing from a webcam plus the high-level control functions.
4) Spin would be the best programming language to use for the Propeller. It's easy to learn, and a lot of what you'd need is already available as objects in the Propeller Object Exchange. On the Gumstix you might use a variety of packages and languages depending on the task; there are ready-made tools for image processing.
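To put a number on point 1, here is a rough, self-contained timing sketch (my own illustration, not Parallax code). It assumes the stock FloatMath and FullDuplexSerial objects that ship with the Propeller Tool and an 80 MHz clock. FloatMath is the Spin-based float library; the PASM-based Float32 object from the library is considerably faster, so it is worth timing both before deciding whether to do the kinematics on board.

{{ FloatBench.spin - rough benchmark sketch; assumes the stock "FloatMath" and
   "FullDuplexSerial" library objects and an 80 MHz clock. }}
CON
  _clkmode = xtal1 + pll16x
  _xinfreq = 5_000_000

OBJ
  fm   : "FloatMath"
  term : "FullDuplexSerial"

PUB Main | a, b, c, ticks, i
  term.start(31, 30, 0, 115_200)            ' serial output on the programming pins
  waitcnt(clkfreq * 3 + cnt)                ' give the PC terminal time to connect

  a := fm.FFloat(355)                       ' two arbitrary operands
  b := fm.FFloat(113)

  ticks := cnt
  repeat i from 1 to 1_000
    c := fm.FMul(a, b)                      ' 1,000 software floating-point multiplies
  ticks := cnt - ticks

  term.str(string("approx us per FMul: "))
  term.dec(ticks / (clkfreq / 1_000_000) / 1_000)
  repeat                                    ' park here so the output stays visible

The reported figure includes Spin loop overhead, so treat it as an upper bound per multiply.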
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Will work for Propeller parts!
Welcome to Propeller World!
Sounds like you're headed for a fun challenge. Your best bet is to start small and build up on successes. The Propeller is ideal for controlling and interfacing with lots of actuators and sensors, and multiple cogs make it easy to split your project into doable tasks. I've done simple vision processing using a grayscale camera that finds the x/y location of barcode patterns, and I've used that to control my balancing robot. Here's my Circuit Cellar article, which covers robots, vision, and ViewPort: http://www.circuitcellar.com/archives/viewable/224-Sander/index.html
ViewPort includes the PropCV vision engine, integration with OpenCV (used by the DARPA winners for computer vision on PC hardware), and a fuzzy logic engine plus control panel.
Hanno
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Co-author of the official Propeller Guide- available at Amazon
Developer of ViewPort, the premier visual debugger for the Propeller (read the review here, thread here),
12Blocks, the block-based programming environment (thread here)
and PropScope, the multi-function USB oscilloscope/function generator/logic analyzer
There is a simulation environment inside Microsoft Robotics Studio.
Have a look at this video; it is also the subject of an article in the Nov 2009 issue of Servo.
http://www.youtube.com/profile?user=john30340#p/u/1/Q94WKdn3uF8
Also see this video for an example of reverse and forward kinematics using RobotBASIC's extensive matrix math operations:
http://www.youtube.com/profile?user=john30340#p/u/8/AukHmkZqmys
Also see this video by Ratronics that shows RB being used to communicate with Propellers to control a multi-servo real robotic arm:
http://www.youtube.com/watch?v=CILzXm80rrI
Also see this video that demonstrates the EXTENSIVE image processing abilities of RobotBASIC for things like edge detection, color finding, comparing, etc., using ONE-LINE commands:
http://www.youtube.com/profile?user=john30340#p/u/22/GV55FM1DJy4
See this video that makes a puppet follow colored objects with its eyes and head and also say the color of the object it sees:
http://www.youtube.com/watch?v=LwvspYFXJMM
Also see this video about how RobotBASIC was used to do PID control of a space station model using the Propeller. This is an old video that actually uses the BS2, but an article coming out soon in Nuts & Volts will show the Propeller version of the programs:
http://www.youtube.com/profile?user=john30340#p/u/24/oxqlTaJy31M
Also notice how, in many of these programs, the EXTENSIVE ANIMATION ability of RobotBASIC is used to create SIMULATIONS (e.g. robot arm, humanoid robot, space station). To see how simple it is to develop animation programs in RB (especially if you have experience in VB), see this video:
RobotBASIC also has a robot simulator that is nowhere near as sophisticated as MSRS's, but that is in fact an advantage (many people attest to that). Have a look at this video to see how simple it is to develop a simulated robot that uses infrared sensors to circumscribe a circular path around a circular object. The 10-minute video shows how to develop the program from scratch and goes through various stages of improving the algorithm:
http://www.youtube.com/profile?user=john30340#p/u/12/27Gt3IgdcMc
On the website www.RobotBASIC.com you will find lots more material, plus links to three books that use RB, ranging from complete novice to first-year college level, with numerous projects, some of which you can see in the fourth video mentioned above.
The idea is that you use the Propeller with Spin/PASM to develop the hardware-level control of actuators (motors) and to interrogate transducers (sensors). You send the transducers' readings to the PC running an RB program over a wired or wireless serial link (e.g. Bluetooth) or over TCP/UDP (e.g. WiFi). The RB program then does the number crunching using all the power of RB (e.g. matrix inversion, curve fitting). The program displays the data for the user in a GUI and also sends back to the Propeller whatever data it needs to drive the actuators in the appropriate proportions. The cycle, of course, repeats continuously; the Propeller side of this loop is sketched below. Another advantage of RB is that you can simulate all of this BEFORE you build the hardware. That way you hone all the algorithms without the expense and the mishaps, and THEN port some of the algorithms over to the Propeller in Spin and implement the comms. This is exactly the kind of thing demonstrated in the Space Station project in the video mentioned above, which will also be the subject of the article in Nuts & Volts (coming out very soon).
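To make that cycle concrete, here is a minimal sketch of the Propeller side of the loop (my own illustration, not from the article). It assumes the standard FullDuplexSerial object; ReadSensor and SetServo are hypothetical stubs you would replace with your real transducer and actuator drivers, and the RB program on the PC does the matching read/compute/write on its end.

{{ PropSideLoop.spin - sketch only; ReadSensor and SetServo are placeholders. }}
CON
  _clkmode = xtal1 + pll16x
  _xinfreq = 5_000_000

OBJ
  pc : "FullDuplexSerial"

PUB Main | reading, command
  pc.start(31, 30, 0, 115_200)         ' wired link; a Bluetooth module would simply use other pins
  repeat
    reading := ReadSensor              ' 1) interrogate a transducer
    pc.dec(reading)                    ' 2) send the reading to the PC as decimal text
    pc.tx(13)                          '    carriage return as a simple delimiter
    command := pc.rxtime(50)           ' 3) wait up to 50 ms for the PC's number-crunched reply
    if command <> -1
      SetServo(command)                ' 4) drive the actuator accordingly, then repeat

PRI ReadSensor
  return 0                             ' stub: real sensor driver goes here

PRI SetServo(value)
  return value                         ' stub: real actuator driver goes here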
Also, in the two videos below you can see how a BoeBot was made to follow a line on the floor by adding ONE line of code to a program that was developed entirely in the RB robot simulator. The very same simulator program was made to work on a REAL robot (a BoeBot) over a Bluetooth link with just that one line of additional code:
http://www.youtube.com/profile?user=john30340#p/u/28/i5JT4WdMofQ
http://www.youtube.com/profile?user=john30340#p/u/27/vftgmZQCheA
So in summary... my advice (for what it is worth) is to use the Propeller + Spin (PASM much later) to control the hardware and a PC running RobotBASIC to do the math crunching.
You should proceed in steps.
1- Learn the Propeller + Spin and how to make it do projects like controlling motors and reading sensors.
2- Learn how to comm between the Propeller and the PC using FullDuplexSerial (FDS) and the Parallax Serial Terminal (PST); a minimal sketch follows this list.
3- Learn RB and how to make it do some simulations and animations.
4- Learn how to comm between the Propeller and RB over wired links; see this PDF about how to do it:
    http://www.robotbasic.org/resources/RobotBASIC_To_PropellerChip_Comms.pdf
5- Learn how to use the Easy Bluetooth Module from Parallax to comm between the Propeller and RB on the PC.
6- Develop a protocol (or use RB's) for sending/receiving commands and data back and forth.
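For step 2, the first sketch can be as small as this (assuming the stock "Parallax Serial Terminal" object that comes with the Propeller Tool; FullDuplexSerial works the same way with its own start parameters):

{{ HelloPST.spin - just proves the Propeller-to-PC link works. }}
CON
  _clkmode = xtal1 + pll16x
  _xinfreq = 5_000_000

OBJ
  pst : "Parallax Serial Terminal"

PUB Main | i
  pst.Start(115_200)                   ' uses the programming pins by default
  waitcnt(clkfreq * 3 + cnt)           ' time to click Enable in the terminal window
  i := 0
  repeat
    pst.Str(string("count = "))
    pst.Dec(i++)
    pst.NewLine
    waitcnt(clkfreq / 2 + cnt)         ' two lines per second

Once numbers are flowing to the terminal, swapping PST for an RB program on the other end of the link is mostly a matter of agreeing on a delimiter (step 6).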
Now you would be ready to start INTEGRATING your smaller projects into one big one that does all the complex actions you need. You would also be better able to DEBUG such a complex system, since you know the COMPONENTS work and only the integration needs to be sorted out.
I hope this helps
Samuel
In general terms I would be inclined to use one, or probably more, Props on the robot for scooping up data from sensors and managing actuators. Pipe all of this over to a PC where you can develop and debug your ideas/algorithms in comfort. When you get to the point that it behaves as you like, take a look at the size of the code on the PC and the rate at which it needs to run. From that you can determine how much processing power you need to install in your robot to make it autonomous, and hence which platform, Prop or otherwise, is suitable.
Consider that in 10 years' time, when you have perfected this, the PC you are using now can perhaps be replaced with a Prop V :)
It may be possible to do a limited amount of image processing on a Prop, but given that your goal is to make a robot, perhaps it's best to devote your time to that task and use something that will give you image processing power without a great deal of effort.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
For me, the past is not over yet.
http://www.parallax.com/tabid/773/Default.aspx
Now this is a topic close to my heart: I too have the goal of building a humanoid robot. I want it full sized, able to do vision processing to some degree and audio processing of sounds and human speech. It has to be able to talk (it would be really great if it could sing!) and it has to be able to walk on two legs without bumping into things and falling over. It would be great if it could be taught how to use its hands and arms to manipulate its environment.
I started looking into how one would go about developing something like this about a year ago and came to some conclusions...
Even though there are plenty of people and teams around the world developing humanoids, the task is still very expensive and very difficult.
I have a budget of around $150 a week to put into this project, so the best-priced solutions are a must.
I came to the same conclusion as SamMishal on this one. I believe the Propeller chip, in conjunction with Parallax's easy-to-use and reusable add-on sensors, is the best solution to begin with, because there are so many people writing code for this amazing product. Take this as an example: I started looking into how I would write a speech synthesis program for this thing as one of my first projects... AND LO! There is already quite a good one written, AND IT CAN SING! AND it's FREE! AND you can fool about with the code yourself and change every aspect of it. Then you have the sensors: if I were to sit down and write code to read, say, the H48C Tri-Axis Accelerometer, it would take a week of my spare time. But thanks to the Propeller community I had the H48C up and running in the terminal screen in an afternoon.
My advice is to take a bottom-up approach to a project like this. Use the Prop chips and the sensors to learn with, adding more complexity as you go. Start off with an easy-to-work-with body for your bot, like a basic box with 2 drive motors, and add 2 RC servos for 'head' movement. Believe me, it doesn't sound like much, but there are months of work in taking even this format as far as you can with the Prop chip(s) and all the sensors and add-ons out there for them.
Then, when you can't possibly squeeze any more out of the Prop chips, add an Eee PC on top of your mobile box, wire it into your Prop chip(s), and run RobotBASIC on it (I can't recommend this programming language highly enough). Why an Eee PC? Because some models weigh around 800 grams, cost around $400 AUD, are only the size of a large paperback book, and, best of all, have solid-state drives in them. Believe me, your bot will take a pounding as you test it in an outdoor environment (mine has been rained on, has driven itself through puddles, and I've had to wipe mud off the screen and keyboard).
Then, when you have taken this as far as you can, look at dismantling it and adding, say, an extra 3 drive motors (totaling 5) and building a quadruped: 4 motors, each running its own cams to make levers work in a leg-like motion (have a good look at cheap wind-up toys that walk on 4 legs), and a 5th to steer by pivoting between the hips and rib cage (take your dog out to an oval and throw a ball for it, and take a good look at its leg cycles and how it turns).
Then, when you have taken that as far as you can, start replacing the cam legs with servos in each joint (very, very expensive unless you make your own), and when you have this contraption stable, start teaching it/modifying it to stand up on 2 legs. Consider this your android's personal evolution. This way you get gratification at each step as you go and you spread your costs out over a long period of time (if you go for the biped straight off, most of the gear you buy will sit on a shelf for months before you ever get to use it, you will get disheartened at your slow progress, and your friends/family will laugh at you because they are not seeing much happening).
Robotics should be fun and inspire others to want to have a crack too!
It is a very old trick to program movements by measuring and recording (aka logging) the movements of real people. That has been used since the 1960s for electronic cartoon animation and for Disneyland's "Animatronics"(TM) robots. There are two ways: one is to use a camera to track white dots on an otherwise dark-clad person, the other is to put wired position sensors on a real person, in what is called a "harness". There are many simple ways to sense position, but no off-the-shelf units are available, and my ideas seem to only make sense to me anyway. For example, a do-it-yourself sonar VOR system seems like a good and simple "duct-tape"/"odds and ends" hack to me, but others may have fresh ideas that will work with what's in a 2009 odds-and-ends bin. For the camera method I recommend using a slowed-down video camera, because it is much easier to follow 10 frames a second than regular analog video. I'd start with a CCD sensor, but the 2009 junk bin is likely to have a lot of webcams in it, which gives you a choice of operating the webcams themselves or removing and operating their CCD sensors, whichever is easier to do now.
Before I learned much about the Propeller, in a silly daydream I imagined it would be great for androids, since four cogs might be reserved for controlling each of the arms and legs. Whether that makes any sense or not, I still think a Propeller would be a great android brain.
In my generation, BASIC and ASM were enough to control the world,
then came the Virus Infected Mouse-OS Dictatorship***,
then came the great and awesome Spin and PASM.
***I saw a demo scene piece written on an old Commodore last night called "Progress Without Progress", and another called "Robot Liberation", on YouTube. I recommend those, but only because of the topic; stuff done with the maximum power of old computers may or may not impress you.
I've wondered if you could use tri-axis accelerometers positioned in the middle of your long bones and a few on the body, one on the back of each hand, one on the top of each foot, and one on your forehead. You would also have to take accurate measurements of your body. The accelerometers would be calibrated to zero in a standing-at-attention position; then, when movement occurs, the accelerometer data is cross-referenced with the body's measurements. I think it would take around 15 accelerometers to do this at a minimum, but probably more. Naturally you would use Prop chips and the H48C Tri-Axis Accelerometers, maybe an SD card to record the data, or a radio link. Again, RobotBASIC could be used to interpret the data and make a visual representation on screen.
I think we are getting away from the topic a bit here, though. We are talking about using human motions in an avatar machine.
TylerS, how big do you want to make this thing? If you're going to make it so a laptop can be mounted in it, it would have to be at least the size of a 3-year-old child. If you're happy with making a little robo fellow, you might want to look into buying a Robonova shell and servos and then adapting it to run on the Prop chips (this would be quite easy to do).
Check this out: http://www.lynxmotion.com/Category.aspx?CategoryID=91
I looked into going down this path, but it's not enough for me; I want a real android, not a cute high-tech toy.
Thanks for the great tips and advice. I'm really impressed with what I've seen of the Propeller community so far.
After doing more reading and research, it looks like I'll definitely be using the Propeller for data acquisition and actuator control, and I'll probably end up getting ViewPort for debugging. ViewPort can give real-time sensor readings, correct? (Does real-time variable control mean real-time sensor readings and the ability to change servo angles, etc., in real time?)
Also, what resolution of color image could the Prop transfer wirelessly (or wired, I guess) at 30 fps? I'm guessing not much, due to limited memory. This may mean that I have no choice but to add an onboard CPU with image analysis capabilities (once I want to do imaging).
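(A rough back-of-envelope, using assumed numbers rather than anything measured: a 115,200-baud serial link moves about 11.5 kB/s, while even a small 160x120 8-bit grayscale frame is 19.2 kB, so 30 fps of that would need on the order of 4.6 Mbit/s before any protocol overhead. So live video over the kind of serial/Bluetooth link the Propeller usually uses looks out of the question; at best I could send occasional low-resolution snapshots or features extracted on board.)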
I've been thinking about the calculations I want to do and the control methods, and I'm positive I'll need something considerably more powerful than the Propeller, linked wirelessly at a minimum. (Now transmission speed becomes an issue, because I will need to send the sensor data, process where to put the next step, and then transfer that back to the robot before it falls.)
For the end result I'm thinking of going toward one of the humanoid robotics competitions. There are several at RoboGames (Mech Warfare) with a max weight of 5 kg. RoboCup has a TeenSize league with 100-160 cm tall bots. Right now I'm leaning toward the Mech Warfare competition / RoboCup KidSize (30-60 cm), because the robot could serve either purpose and it would be significantly cheaper to build. (Can't use servos on a 100 cm tall bot.)
My only reservation about RobotBASIC is that the simulations won't be nearly accurate enough for what I want to do. It seems like Microsoft Robotics Studio's simulator would work well. Since RobotBASIC is free I'll at least play with it, but I'm not sure I'll want to stick with it.
If I learn C for programming the Propeller, would I also be able to use C for the PC part of the control? This could be an advantage, as opposed to learning Spin and then learning another language for the PC. Aren't there more powerful development tools if I use C? Are any real-time sensor-reading and debugging tools (like ViewPort is for Spin) available for C?
Lastly, I'm planning to spend a good deal of money on this thing, probably 2 to 3k by the time I'm done. I already have 7 old servos, an IR sensor, and a Ping sensor, plus two Prop Proto USB boards and a Propeller Professional Development Board that I've just never used. (I picked them up as door prizes at some events.) I've got plenty to get started.
I've seen the Robonova doing some very cool gymnastics with no sensors at all, just going from pose to pose. I've seen it doing other cool stuff with just a gyro and a tri-axis accelerometer too.
Yes, ViewPort lets you both monitor and control variables in real time.
This means you can use a ViewPort dial to move a hobby servo while monitoring the value of a pressure sensor. Of course, you can also graph both values in a virtual oscilloscope or other instruments, all with just one additional line of code in your program. See my signature for the review and link.
Hanno
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Co-author of the official Propeller Guide- available at Amazon
Developer of ViewPort, the premier visual debugger for the Propeller (read the review here, thread here),
12Blocks, the block-based programming environment (thread here)
and PropScope, the multi-function USB oscilloscope/function generator/logic analyzer
How hard can it be to replicate that? I don't think you need a "robot brain" at all.
Don't worry about choosing between Spin and C: if you learn C then you have learnt programming, and Spin will offer no problems; you can learn it in a day or so.
Cheers,
Graham
There is a Propeller object available from CrustCrawler for talking to Bioloid actuators. They use the Propeller as an interface between any micro and the Bioloid, but the object can be used to have the Propeller directly control the actuators.
http://forum.crustcrawler.com/phpBB3/viewtopic.php?f=12&t=1050
http://forum.crustcrawler.com/phpBB3/viewtopic.php?f=12&t=1054
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
BioProp: Robotics - Powered by Bioloids and controlled by the Propeller
I'm about to build a Lynxmotion BRAT with the Propeller Robot Controller, but I'm really interested to know whether anyone has ported a successful walking gait to the PRC yet.