Very impressive robot vision possibility!
I was over at robots.net and I saw this very cool system being developed by CMU: www.cs.cmu.edu/~dhoiem/projects/popup/index.html. What it does is approximate depth from a single image and generate a 3D representation of it. The animations at the top of the page will give you an idea of what this looks like. I played around with it a bit and I have to say, it's pretty amazing.
I took a single picture of a box on my desk with my digital camera (first attached image), and after running it through the program it gave me a .wrl (VRML) file of the box. I took a view from the side to show what it looks like depth-wise (second attached image).
The only thing I did to get from the photo to the model was run the picture through an image segmenter to get the .ppm file, then run the original .jpg and the .ppm file through CMU's program, and there it was! Segmenters are very common (they're used in almost all machine vision systems; a Google search will turn up several); I used EDISON: www.caip.rutgers.edu/riul/research/code/EDISON/index.html. Very cool if I must say so myself.
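If anyone wants to script that two-step pipeline instead of running the tools by hand, here's a rough sketch of the idea in Python. The executable names and flags ("edison_cli", "photo_popup", "-o") are just placeholders I made up for illustration; both tools ship with their own interfaces, so you'd adapt the commands to however you actually invoke the segmenter and CMU's program on your machine.

```python
# Hypothetical wrapper around the pipeline described above:
# photo (.jpg) -> segmenter -> segmentation (.ppm) -> CMU pop-up -> model (.wrl)
import subprocess
from pathlib import Path

def photo_to_vrml(photo_jpg: str, out_dir: str = "out") -> Path:
    """Segment a photo, then feed photo + segmentation to the pop-up tool."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    seg_ppm = out / (Path(photo_jpg).stem + "_seg.ppm")
    model_wrl = out / (Path(photo_jpg).stem + ".wrl")

    # Step 1: image segmentation (e.g. EDISON) producing a .ppm segmentation map.
    # "edison_cli" is a placeholder command, not the tool's real interface.
    subprocess.run(["edison_cli", photo_jpg, "-o", str(seg_ppm)], check=True)

    # Step 2: CMU's program takes the original .jpg plus the .ppm segmentation
    # and writes out a VRML (.wrl) model. "photo_popup" is also a placeholder.
    subprocess.run(
        ["photo_popup", photo_jpg, str(seg_ppm), "-o", str(model_wrl)],
        check=True,
    )
    return model_wrl

if __name__ == "__main__":
    print("Model written to", photo_to_vrml("box_on_desk.jpg"))
```

The point is just that the whole photo-to-model step could be automated with one call per image, which is what you'd want if you ever tried to run it repeatedly on a robot.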
I'm not exactly sure how to integrate this into a mobile robot, but if someone figures it out, I think it would be a breakthrough.
Anyway, I thought the robotics people here might want to see this, so I thought I'd share.
Post Edited (Whelzorn) : 6/20/2006 9:10:49 PM GMT
Comments
It's amazing the kind of discoveries that are rolling out every month in robotics. I have said in the past, and I'm saying it now: we are in the early era of robotics, much like the early years of the personal computer. Once robots reach the point of offering some practical benefit, everyone's going to want one.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Mike
Anyway, I totally agree about robotics making the next huge impact on society. I have the most faith in companies like White Box Robotics and iRobot, but it's nice to dream about one day seeing complex humanoids like ASIMO being used for practical purposes.
I am thinking about trying to make this "more mobile" myself, as I am very interested in robot vision tech like this.
Anyway, it's good to see some other interest in this stuff!
Post Edited (Whelzorn) : 6/22/2006 7:05:23 PM GMT