Robot Vision/Camera

SB316 Posts: 33
edited November 2010 in Accessories
Hey, I have been interested in robotic vision for a while, and I've been wondering how one would implement vision on a robot. I have already seen a tutorial on methods used in applications such as blob tracking, but I don't know how to actually get a camera onto a robot. Could I hack into a USB camera? (USB has 4 pins: GND, 5V power, and the D+/D- data pair.) And could one use the robot's microcontroller to process images, or do you need a connection to an external computer? One more thing: does anyone know of some good robot vision processing software?

Thanks,

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
SB 3:16
Comments

  • 14 Comments
  • Mike Green Posts: 22,004
    edited January 2010
    Could one use the robot's microcontroller to process images?
    No. It's possible to do some processing, like blob tracking, using a fast microcontroller with enough memory attached, but you'd be using a low resolution image subset for this (like 128 x 64 pixels) because of speed and memory constraints.
    Could I hack into a USB Camera?
    No. You'd need to use a USB-host capable microcontroller and, because of throughput limitations, you'd at best be able to work with a subset of the image. If you really mean "hack into", like bypassing the USB interface, you might be able to, but would probably need a lot of hardware and speed / throughput would still be an issue.
    Do you need to have a connection to an external computer?
    Probably.
    Does anyone know of some good robot vision processing software?
    Have you looked at the CMUcam website? www.cs.cmu.edu/~cmucam/
    Have you looked at the ViewPort website? hannoware.com/viewport/features.php
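The low-resolution blob tracking Mike describes boils down to thresholding a small frame and averaging the coordinates of the pixels that pass. Here is a minimal sketch in Python (illustrative only; `track_blob` and the tiny test frame are my own, not part of CMUcam or ViewPort):

```python
# Minimal blob-tracking sketch: threshold a tiny grayscale frame,
# then report the centroid and pixel count of the bright "blob".
# On a microcontroller you would do the same arithmetic over a
# 128x64 (or smaller) frame buffer.

def track_blob(frame, threshold=128):
    """Return (centroid_x, centroid_y, pixel_count) of pixels >= threshold,
    or None if no pixel passes."""
    sum_x = sum_y = count = 0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                sum_x += x
                sum_y += y
                count += 1
    if count == 0:
        return None
    return (sum_x / count, sum_y / count, count)

# A 4x4 "frame" with a bright 2x2 patch in the lower-right corner:
frame = [
    [0, 0,   0,   0],
    [0, 0,   0,   0],
    [0, 0, 200, 210],
    [0, 0, 220, 255],
]
print(track_blob(frame))  # (2.5, 2.5, 4)
```

A real system would also need to get the pixels into memory in the first place, which is exactly the hard part the CMUcam-style modules solve for you.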
  • SB316 Posts: 33
    edited January 2010
    Thanks for getting back so fast (like what, 12 minutes after I posted :) ). I looked at the two websites. Maybe I should have asked more specifically: I'm a super Mac person. Is there anything for the Mac?

    Thanks,

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    SB 3:16
  • Mike Green Posts: 22,004
    edited January 2010
    I really don't know what's available for the Mac. I'm also a Mac user, but I have Windows XP on my MacBook that I run using Parallels Desktop for use with things like ViewPort, 12Blocks, PropBasic, PropScope, etc. You can also use VirtualBox (free) with Windows. If you don't already have it, see if you can find an old copy of XP to keep around. I mostly use MacBS2 for the Stamps and BST for the Propeller.
  • wyzard28 Posts: 24
    edited January 2010
    I've got one of these babies waiting in a box, ready to mount on the new pan & tilt I'm building for my Stingray: http://www.surveyor.com/blackfin/

    I, too, run off an iMac, and use VMware Fusion to drop into Windows XP to run most of the development software, including RoboRealm, a very nice image processing suite: http://www.roborealm.com/

    Once I get things up and running, I'm planning to upgrade to the Surveyor's SVS (Stereo Vision System): http://www.surveyor.com/stereo/stereo_info.html
  • Botdocter Posts: 271
    edited April 2010
    What is the deal with RoboRealm? Is it free or not?

    And are there any (other) free programs that do things like RoboRealm or ViewPort?

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    1 Parallax Propeller Robot Control Board
    1 Memsic MX2125 accelerometer/ tilt
    1 Parallax Ping))) ultrasonic sensor

    a few motors and a whole lot of chaos!
  • SB316 Posts: 33
    edited April 2010
    What I saw at the RoboRealm website is that it costs money to buy, but you can get it for free if, during the 30-day free trial they offer, you post a project you did using it somewhere as advertisement.

    That's what I saw, anyway.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    SB 3:16
  • hover1 Posts: 1,927
    edited April 2010
    OpenCV is free.

    Jim
    I have three propellers
  • Robert T Posts: 71
    edited April 2010
    I think the source code for the Surveyor Vision System is open source and can be downloaded from their website.
  • nomad Posts: 247
    edited April 2010
    hi Botdocter,

    I think OpenCV is for you:

    It's open source, and you can use it with Windows XP or better, or with Linux.

    It has a good computer-vision community, comes with good sample source code, and there's a book, "Learning OpenCV", from O'Reilly.

    I use it on my bot with a USB webcam on a laptop (sitting on the bot) under Linux, a Propeller Robot Control Board for the motors, an interface to a Propeller Professional Development Board with 7 x Ping))) and 1 x HM55B compass module, and a serial interface (USB-to-RS232) to the laptop.

    With a second webcam you can even control your bot with your head or your hand :-) (or with a PS2 mouse, a touchscreen, or a Wiimote :-)

    regards, nomad
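In a split setup like nomad's, the laptop does the vision work and the Propeller board only needs to receive simple motor commands over the serial link. A sketch of the laptop side in Python; the 4-byte packet format here (0xA5 start byte, two speed bytes, checksum) is entirely made up for illustration, since the real protocol is whatever your code on the board expects:

```python
# Sketch of the laptop side: vision code decides on a motor command
# and encodes it for the Propeller board over a serial link.
# The framing (start byte, left speed, right speed, checksum) is a
# made-up example protocol, not anything standard.

def make_motor_packet(left, right):
    """Encode signed speeds (-127..127) into a 4-byte framed packet."""
    if not (-127 <= left <= 127 and -127 <= right <= 127):
        raise ValueError("speed out of range")
    l = left & 0xFF          # two's-complement byte
    r = right & 0xFF
    checksum = (l + r) & 0xFF
    return bytes([0xA5, l, r, checksum])

packet = make_motor_packet(100, -100)  # spin in place
print(packet.hex())  # a5649c00

# With pyserial installed, sending it would look something like:
#   import serial
#   port = serial.Serial("/dev/ttyUSB0", 115200)
#   port.write(packet)
```

On the Propeller side, a Spin object would read four bytes, verify the checksum, and drive the motors accordingly.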
  • erco Posts: 14,444
    edited May 2010
    I bought RoboRealm for $39, long after this link should have expired: http://www.roborealm.com/woot/rovio.php
    The PayPal link still works, so it's probably still good. An easy download. They give you free updates for a year.

    I installed it and it works on my PC, but I have not really had a chance to play with it yet.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    "If you build it, they will come."
    You'll find me in the new Robotics forum.
  • linuxnt Posts: 4
    edited November 2010
    Hello!
    I started out with the Prop not long ago and got hooked really good!
    Now I want to find a solution for how to position a tool. Think of it as placing the gas nozzle into your car; that's about the precision needed. Also, I will be able to rely somewhat on stored coordinates, so it's only the final approach I need to "see".
    Mike Green wrote: »
    Could one use the robot's microcontroller to process images?
    No. It's possible to do some processing, like blob tracking, using a fast microcontroller with enough memory attached, but you'd be using a low resolution image subset for this (like 128 x 64 pixels) because of speed and memory constraints.

    I'd like to do that! I want to experiment with vision and don't need it to be particularly fast or high precision. Should I aim to get the data into cog RAM as a matrix (small, just black/white zeros/ones, one pixel represented by a bit, probably 64x32 or so), or is it hub RAM I must use?
    Could I hack into a USB Camera?
    No. You'd need to use a USB-host capable microcontroller and, because of throughput limitations, you'd at best be able to work with a subset of the image. If you really mean "hack into", like bypassing the USB interface, you might be able to, but would probably need a lot of hardware and speed / throughput would still be an issue.
    Do you need to have a connection to an external computer?
    Probably.
    Does anyone know of some good robot vision processing software?
    Have you looked at the CMUcam website? www.cs.cmu.edu/~cmucam/
    Have you looked at the ViewPort website? hannoware.com/viewport/features.php

    Yes, and ViewPort, though an excellent thing, does require a PC (no?).
    CMUcam would probably solve the need, but I'd like to learn some along the way.

    I'd like to know if I can use something like the module on the CMUcam, but scaled to be feasible on the Propeller!

    Anything to help me get going is VERY appreciated. Many thanks in advance!
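On the cog RAM vs. hub RAM question above: a 64x32 black/white frame at one bit per pixel is only 64 x 32 / 8 = 256 bytes, i.e. 64 longs, so it fits easily in hub RAM and would even fit in a cog's 496 longs alongside a little code. A Python sketch of the packing arithmetic (illustrative only, not Spin; the helper names are mine):

```python
# Pack a 64x32 one-bit image into 32-bit words, two words per row,
# and read pixels back out. 64*32 bits = 256 bytes = 64 longs, so a
# frame this size is small by Propeller standards (496 longs of cog
# RAM, 32 KB of hub RAM).

WIDTH, HEIGHT = 64, 32
WORDS_PER_ROW = WIDTH // 32

def set_pixel(image, x, y, on):
    word = y * WORDS_PER_ROW + x // 32
    bit = x % 32
    if on:
        image[word] |= 1 << bit
    else:
        image[word] &= ~(1 << bit)

def get_pixel(image, x, y):
    word = y * WORDS_PER_ROW + x // 32
    return (image[word] >> (x % 32)) & 1

image = [0] * (HEIGHT * WORDS_PER_ROW)   # 64 longs, all black
set_pixel(image, 40, 10, True)
print(get_pixel(image, 40, 10), get_pixel(image, 0, 0))  # 1 0
print(len(image), "longs =", len(image) * 4, "bytes")    # 64 longs = 256 bytes
```

The same shift-and-mask indexing translates directly to Spin or PASM over a long array.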
  • Hanno Posts: 1,130
    edited November 2010
    Hi linuxnt,
    Most people use ViewPort on a PC for debugging, real-time graphing of variables/pin states and to control devices with OpenCV. However, it also comes with 2 objects you can run on your Propeller that don't require a PC connection:

    PropCapture will capture NTSC video at full frame rate at configurable resolutions into memory (max is 240x200 at 16 grayscales). Normally you would stream this to ViewPort for analysis with OpenCV or just to look at yourself. However, you can also use:

    PropCVFilter applies simple computer vision filters to video frames in hub RAM: things like Sobel filters, min/max, and so on.

    You can combine both (as I did in my DanceBot) to create an autonomous robot that uses vision to interact with people or other things in its environment. Very fun!
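For a sense of what a filter like the Sobel one does: it convolves each pixel's 3x3 neighborhood with two small gradient kernels and combines the responses into an edge magnitude. A plain-Python sketch (illustrative only, not the actual PropCVFilter code):

```python
# Sobel edge-magnitude sketch: convolve each interior pixel with the
# horizontal and vertical 3x3 Sobel kernels and combine the responses.
# A frame-buffer filter does the same arithmetic over hub RAM.

GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel

def sobel(frame):
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = gy = 0
            for dy in range(3):
                for dx in range(3):
                    p = frame[y + dy - 1][x + dx - 1]
                    gx += GX[dy][dx] * p
                    gy += GY[dy][dx] * p
            out[y][x] = abs(gx) + abs(gy)    # cheap magnitude, no sqrt
    return out

# A vertical edge: dark left half, bright right half.
frame = [[0, 0, 255, 255]] * 4
print(sobel(frame)[1])  # [0, 1020, 1020, 0] -- strong response at the boundary
```

Min/max filters are even simpler: replace the convolution with `min()` or `max()` over the same 3x3 neighborhood.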

    Read more:
    in my Circuit Cellar article: http://www.circuitcellar.com/archives/viewable/224-Sander/index.html
    in the official propeller book: http://www.amazon.com/Programming-Customizing-Multicore-Propeller-Microcontroller/dp/0071664505

    Hanno
    Professional IDE to edit, debug, and run SPIN, PropBasic and C: ViewPort
    Visual programming language: 12Blocks
    Multi-function Oscilloscope/LSA/Function Generator: PropScope
    500-page book of Propeller projects: Programming and Customizing the Multicore Propeller
    Blog: http://onerobot.org/blog
  • linuxnt Posts: 4
    edited November 2010
    Hanno wrote: »
    Hi linuxnt, ...

    Hi Hanno!
    Well - what do you know! It turned out to be a good idea to ask :)

    I have just quickly gone through the suggested links and found them very promising. I'll take some time to go through them more deeply, and I'll probably get even more curious and eager to get on with my own experimenting.

    Thanks a lot for the tips!