I originally mentioned that I was designing a low-cost Laser Range Finder (LRF) for Parallax back in their 2009 catalog. The project has been moving along sporadically in a vacuum since then and I thought it would be fun (and hopefully informative) to use this Forum thread as a way to keep folks updated with my progress and to solicit any comments along the way.
There's a danger in publicly documenting an in-progress design, since a lack of consistent updates can make people think the project has been abandoned. Not true in this case - I'm working on this design in parallel with some other projects. Sometimes I'll have more time to focus and, hence, make strides quickly; other times I'll be working on other things or waiting for parts/materials to arrive. I'll try to post whenever I have something useful to share or run into a development problem.
Most of the past year has been spent researching and evaluating various range-finding methods and trying to settle on a design direction that is suitable for a low-cost/hobbyist environment.
My original plan way back at the beginning was to use the time-of-flight method (http://en.wikipedia.org/wiki/Time-of-flight and http://www.repairfaq.org/sam/laserlia.htm#liarfbl) to measure the time between laser light leaving the emitter and being received by the detector. I built a high-speed time-to-digital converter using the ACAM GP2 (http://www.acam-usa.com/GP2.html) and a Parallax SX, and had a Data Delay Devices 3D3608 programmable pulse generator (http://www.datadelay.com/asp/oscpg.asp) generating pulses in a range of 14 ns to 1.3 µs that were to be used to trigger the laser driver circuitry.
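For reference, the time-of-flight arithmetic itself is trivial; the hard part is the timing. A quick Python sanity check (just the physics, not the actual GP2/SX firmware) shows why sub-nanosecond timing resolution is required:

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the target given the measured round-trip time."""
    return C * round_trip_s / 2.0

def tof_time_s(distance_m: float) -> float:
    """Round-trip time for a target at the given distance."""
    return 2.0 * distance_m / C

# A target 1 m away returns the pulse in ~6.67 ns, so centimeter-scale
# accuracy demands timing resolution on the order of tens of picoseconds.
print(tof_time_s(1.0) * 1e9)   # ~6.67 (ns)
print(tof_distance_m(10e-9))   # ~1.5 m for a 10 ns round trip
```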
Here's a short video I made for Ken Gracey/Parallax in mid-2009 just to demonstrate the subsystems: http://www.youtube.com/watch?v=BlQhr8Jtl_A
I had also considered phase-shift measurements, which compares the phase shift between the outgoing modulated laser and its reflected light.
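As a rough sketch of the phase-shift math (assuming a sinusoidally modulated beam; the modulation frequency below is illustrative, not from any actual design), the measured phase difference maps directly to distance, with an ambiguity once the phase wraps:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_shift_distance_m(phase_rad: float, mod_freq_hz: float) -> float:
    """Distance from the phase shift between the outgoing modulated
    laser and its reflection: d = c * phase / (4 * pi * f)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range_m(mod_freq_hz: float) -> float:
    """Beyond c / (2f) the phase wraps and the range is ambiguous."""
    return C / (2.0 * mod_freq_hz)

# At 10 MHz modulation, a 90-degree shift corresponds to ~3.75 m,
# and readings repeat every ~15 m.
print(phase_shift_distance_m(math.pi / 2, 10e6))  # ~3.75
print(unambiguous_range_m(10e6))                  # ~15.0
```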
The high-speed circuitry required for both systems is non-trivial and too much in the analog domain for me (I'm primarily an embedded/digital engineer), so I scrapped this approach. Both designs would also need specialized optics, and the circuitry would be too finely tuned and precise for any user-based modifications/hacks.
I decided to go with optical triangulation, whereby the distance to a targeted object is calculated with simple trigonometry from the geometry between the centroid of the laser light, the camera, and the object. The most compelling example is the Webcam Based DIY Laser Rangefinder (http://sites.google.com/site/todddan...m_laser_ranger), and my design is based, in theory, on this implementation.
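The underlying math is the same as in that write-up: with the laser mounted a fixed baseline h from the camera and aimed parallel to its optical axis, the dot's pixel offset from the image center maps to an angle theta, and the range is D = h / tan(theta). A minimal Python sketch, with placeholder calibration constants (not values from my module):

```python
import math

def triangulation_range(pixels_from_center: float,
                        baseline_m: float = 0.06,
                        rad_per_pixel: float = 0.0008,
                        rad_offset: float = 0.01) -> float:
    """Range to target via optical triangulation.

    The laser sits a fixed baseline from the camera axis and projects
    parallel to it.  The reflected dot appears some number of pixels
    from the image center; converting that offset to an angle gives
    D = h / tan(theta).  rad_per_pixel and rad_offset are calibration
    constants determined empirically (placeholder values here).
    """
    theta = pixels_from_center * rad_per_pixel + rad_offset
    return baseline_m / math.tan(theta)

# Near targets push the dot far from center (large theta, good
# resolution); distant targets crowd it toward the image center.
print(triangulation_range(100))  # nearer target, ~0.66 m
print(triangulation_range(10))   # farther target, ~3.33 m
```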
Optical triangulation for range finding has been discussed previously elsewhere on the Parallax forums, but nothing out there exists as a fully developed, compact, easy-to-use module:
- Phil Pilgrim using a TSL1401-DB linescan imaging sensor (June 2009)
- Discussion of time-of-flight and explanation of triangulation (~November 2006)
My first foray into this technique used a Position Sensitive Detector (PSD), as described in Roger Johnson and Chris Lentz's 2-D Optical Position Sensor article (Circuit Cellar #152, March 2003). A PSD gives excellent resolution and accuracy (0.0001" and 0.001", respectively), but only over a limited detection range, since the sensor is designed for sensing light shined directly onto its face, not for capturing reflected light.
This approach ultimately turned into the Laser Position Sensor module (Parallax part #NFS001), which we recently released as open source with no plans to manufacture due to the cost of the PSD sensor (~$30 in volume) and expected low-volume sales.
My finalized design direction is to use a Propeller as the core, an Omnivision OVM7690 640x480 CMOS camera module (http://www.ovt.com/products/sensor.php?id=45), and a laser diode. The OVM7690 is a really nice, compact device with a fixed-focus, integrated lens. The LRF module will be open source to the extent possible. Omnivision requires an NDA in order to obtain data sheets and communication information, so we'll need to work with them further to figure out what we can publicly release.
I particularly like this approach, as the design combines a few separate subsystems that could ultimately end up being used as separate pieces - the CMOS camera system and laser driver system. I also hope that by using a Propeller, folks will take advantage of hacking/modifying the module for more vision/camera/imaging applications above and beyond the basic laser range finding functionality.
To me, simplicity for the user is key. Like previous modules I've designed for Parallax (Emic Text-to-Speech, RFID Reader and Read/Write, GPS Receiver), the LRF will have a simple serial communications interface for sending and receiving commands. Most likely it will be a 4-pin device: VCC (5V), SIN, SOUT, and GND.
In May 2009, early in my experimentation with optical triangulation, I made a video for Ken/Parallax to demonstrate a prototype using a CMUcam2 and a Freescale QG8. It worked surprisingly well given the low CMUcam2 resolution (176x255): I could get 1/4-inch accuracy over a distance range of 7 to 40 inches: http://www.youtube.com/watch?v=-h5ctq7dE9k
I think range and accuracy will dramatically improve with 640x480 resolution. Further experimentation will be necessary to determine the ideal width of the module (the distance between the laser diode and the camera), which is an engineering trade-off of size vs. measurement range. Possibly the camera and laser diode subsystems could be on a single module, but scored, so hackers/customers who want a different configuration or larger width could easily snap the board in half and re-calibrate.
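To illustrate that trade-off, here's a rough sensitivity estimate derived from the triangulation geometry (D = h / tan(theta), small-angle approximation; the per-pixel angle is a placeholder, not a measured value):

```python
def meters_per_pixel(distance_m: float, baseline_m: float,
                     rad_per_pixel: float = 0.0008) -> float:
    """Approximate range error caused by a one-pixel centroid error.

    With D = h / tan(theta) and small theta, the sensitivity dD/dpixel
    is roughly D^2 * rad_per_pixel / h: resolution degrades with the
    square of distance but improves linearly with the baseline h.
    (rad_per_pixel is a placeholder calibration constant.)
    """
    return distance_m ** 2 * rad_per_pixel / baseline_m

# Doubling the baseline halves the error at any given distance --
# the essence of the size-vs-range trade-off in choosing module width.
for h in (0.03, 0.06):
    print(h, meters_per_pixel(1.0, h))
```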
Work to date
Since the video of the CMUcam2 and Freescale QG8 was made, I've successfully put together a version using the CMUcam2 and Propeller:
I've ported the triangulation math from the QG8 to the Propeller, so I can calculate the range to an object given the pixel offset from the center of the laser dot to the center of the camera frame (which the CMUcam2 returns from its color tracking function call). For what it's worth, I'll be releasing the code and hastily-drawn schematic for this version once I make some more progress and have some time to go back and clean things up.
The prototypes using the CMUcam2 proved that I'm on the right path with optical triangulation, but I needed to continue development using the actual OVM7690 camera. Instead of relying on the high-level image processing that the CMUcam2 provides, I'll need to create the color tracking routines directly on the Propeller. So, I built a hardware development "platform" using a Propeller Proto Board, the OVM7690, and a custom PCB holding the associated control/interface circuitry:
Last week, I finished up the camera communications interface, so I can now send commands to and configure the OVM7690.
I feel like I've made some very good progress so far.
Up next is to finish the start-up/initialization routines for the camera, complete my evaluation of suitable laser diode/driver circuitry, and try to capture some data.
I also plan on diving into Hanno Sander's DanceBot vision tracking code (http://www.circuitcellar.com/archive...der/index.html and in Chapter 7 of the Programming and Customizing the Multicore Propeller Microcontroller book, Parallax part #32316).
Hanno's ViewPort application (http://hannoware.com/viewport/) supports the OpenCV library for computer vision, which should give me a nice head-start on dealing with the actual color tracking of the laser dot (once I can successfully capture a video frame from the OVM7690).
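In the meantime, here's a toy illustration of what the color tracking boils down to - a Python stand-in, not anything from ViewPort or my Propeller code: threshold on strongly red pixels, then take the centroid of the ones that qualify.

```python
def find_laser_dot(frame, red_threshold=200):
    """Locate the centroid of a red laser dot in an RGB frame.

    frame is a list of rows, each row a list of (r, g, b) tuples.
    Pixels with a dominant, bright red channel are treated as part of
    the dot; their coordinates are averaged into a centroid.
    Returns (x, y) or None if no pixel passes the threshold.
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if r >= red_threshold and r > g and r > b:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# 4x4 test frame with a bright red 2x2 "dot" in the lower right
black = (0, 0, 0)
red = (255, 20, 20)
frame = [[black] * 4 for _ in range(4)]
for y in (2, 3):
    for x in (2, 3):
        frame[y][x] = red
print(find_laser_dot(frame))  # (2.5, 2.5)
```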
Whew - that was a really long post! If you've read this far, let me know if you have any initial thoughts and whether you like the idea of me posting development progress to the Forums!