
PropCAM: A Propeller imaging camera AVAILABLE NOW!


Comments

  • bambino Posts: 789
    edited 2006-06-21 15:53
    Phil, I think your program would cure male baldness. I've pulled mine out numerous times over just that. The program that my manager wants, though, wouldn't throw an exception for any warm-blooded creature. But other than that, we're on the same page.

  • simonl Posts: 866
    edited 2006-06-22 12:33
    Yay! Nice one Phil -- I sooooo want one (or more!) of these!

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Cheers,

    Simon
  • parsko Posts: 501
    edited 2006-06-22 13:33
    Phil,

    How fast could you refresh that screen with a new image?

    -Parsko
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-06-22 17:23
    Parsko,

    Image acquisition is a two-step process. It consists of an integration period, followed by a readout. The integration time can range from 0.2mSec to 20mSec and is the time required for the pixels to gather light — akin to exposure time for film. After the integration period, readout from the sensor to the Propeller's hub-resident image buffer occurs in 1.6mSec. The total cycle time from one acquisition to the next is given by

      Cycle Time = Integration Time + Readout Time + Overhead

    The overhead is about 4.7mSec and results from the acquisition management being coded in Spin. Therefore the quickest cycle time is 0.2 + 1.6 + 4.7 = 6.5mSec, or about 153 fps. The slowest is 20 + 1.6 + 4.7 = 26.3mSec, or about 38 fps.
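
    As a rough Spin sketch of that arithmetic (illustrative only, not the PropCAM driver code; the constants are the approximate figures above, expressed in microseconds):

      CON
        READOUT_US  = 1_600                      ' sensor-to-hub readout time
        OVERHEAD_US = 4_700                      ' approximate Spin management overhead

      PUB CycleTimeUs(integrationUs)
        ' Total time for one integrate-and-acquire cycle, in microseconds.
        return integrationUs + READOUT_US + OVERHEAD_US

      PUB FramesPerSec(integrationUs)
        ' Approximate frame rate: FramesPerSec(200) -> 153, FramesPerSec(20_000) -> 38.
        return 1_000_000 / CycleTimeUs(integrationUs)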

    Because an NTSC video screen refreshes at approximately 30 fps, even the longest exposure time will fit comfortably within one full frame (two interlaced fields) period. That's why I specified a frame rate of 30 fps, though it can be much faster than that in reality.

    I've programmed several acquisition modes into the camera's software:

      1. Snapshot
      2. Continuous
      3. Timed
      4. Synced

    Snapshot mode allows the Spin programmer to begin an integrate-and-acquire cycle to obtain a single frame. In the other modes, a separate cog exposes frames continuously. They can either be picked off into the image buffer by the programmer (frame-grabbed) or written to the buffer after each exposure. In continuous mode, frames are exposed as fast as possible. In timed mode, the programmer can set a time interval to fix the fps rate. In synced mode, a software PLL keeps the 1.6mSec acquisition interval in the video output's vertical blanking interval, regardless of the exposure time. This eliminates flicker, as the picture can update only when it's not being displayed. This mode provides the best-quality real-time display on the video monitor.

    When the subject is illuminated by fluorescent lights, a timed interval set to a multiple of 1/30 second is recommended to minimize flicker. You'd think that the synced mode would work just as well. But the NTSC frame rate isn't exactly 1/30 sec, so frame refreshes drift relative to the 60Hz (actually, in most cases, 120Hz) light output. Plus, the PLL exhibits some jitter.

    In order to obtain the best-quality images, the programmer can select one of the autoexposure modes to keep the average pixel value as close to a preset value as possible. This can be done by regulating the exposure time, the camera's internal analog gain, or both.
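
    A minimal Spin sketch of the exposure-time half of that idea (illustrative only, not the PropCAM driver; it assumes one 4-bit pixel per byte of the hub image buffer and ignores the analog-gain control):

      VAR
        long integrationUs                       ' current exposure time, in microseconds

      PUB AutoExpose(bufAddr, nPixels, target) | i, sum, avg
        ' Average the 4-bit pixel values (0..15) over the frame.
        sum := 0
        repeat i from 0 to nPixels - 1
          sum += byte[bufAddr][i] & $0F
        avg := sum / nPixels
        ' Nudge the integration time toward the preset target, clamped to the
        ' 0.2-to-20 mSec range given earlier.
        if avg < target
          integrationUs := (integrationUs + 100) <# 20_000
        elseif avg > target
          integrationUs := (integrationUs - 100) #> 200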

    Well, that was a long-winded answer to a very short question. I hope it helps.

    -Phil
  • Kaos Kidd Posts: 614
    edited 2006-06-22 18:27
    Phil:
    I love it...
    Is there a target cost? (Not that anyone will hold anyone to it... just a rough idea...)

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Just tossing my two bits worth into the bit bucket


    KK
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-06-22 18:57
    KK,

    No target cost yet. There are two huge variables that have to be nailed down first: assembly costs and the lens. Domestic lens suppliers charge obscene markups on stuff they buy cheaply overseas. So I've got to find a supplier in the Orient who can provide the correct lens at a reasonable price. (I've got a thousand lenses in stock, but they were bought for a different project, and the focal length is wrong.) I've gone through this process in the past and had good luck getting data sheets and samples from both China and Korea. (The lenses on the TCS230 color module were selected this way and come from Korea.) This time around, just getting my emails answered has proven frustrating. Last night I even phoned my Korean supplier for more data and pricing, and they promised to email me "tomorrow". We'll see...

    -Phil
  • parsko Posts: 501
    edited 2006-06-22 21:28
    Phil,

    Your answer was pristine! Altogether, a fabulous project. Here in Holland, it seems popular for people to put small, simple cameras outside their doors and route them to their TVs. I can imagine this would be a perfect fit for that! It looks to be a fun toy :)

    -Parsko
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-06-23 07:20
    Here's an example of using a histogram as a means of object recognition. One of the very basic things the PropCAM is programmed to do is obtain a brightness histogram over all, or a portion of, the captured image. Since we're dealing with 16 gray levels, the histogram will have 16 bins. A Spin program was written to capture and remember the histograms from two different pictures: Alexander Hamilton ($10 bill) and Andrew Jackson ($20 bill). It was then presented with these and other pictures, comparing histograms with the two that were saved. Assuming there was an adequate match, it would identify the picture; otherwise, it would indicate "neither". Attached are example photos of the output. To the right of each image is a bar graph showing its brightness histogram. On the bottom left is a list of stats showing the current camera settings and feedback values. (The camera was in autoexposure mode to eliminate any bias from overall brightness.) On the right is the identified picture and two error values: one comparing Hamilton; the other, Jackson.

    One of the advantages of histogram analysis such as this is that it's a global metric and relatively insensitive to rotations, as the photos suggest. Now I admit, this is a very contrived example, amounting to little more than a parlor trick. It's done under controlled conditions with controlled lighting using two-dimensional pictures. Out in the wild, with real three-dimensional heads, it would be hopeless. But still, it's a valid technique that might be used as part of a heuristic for identifying and sorting currency — to name just one example.
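
    For anyone curious what the comparison amounts to, here's a minimal Spin sketch (illustrative only, not the program described above; it assumes one 4-bit pixel per byte of the hub image buffer):

      VAR
        long hist[16]                            ' one bin per gray level

      PUB BuildHist(bufAddr, nPixels) | i, pix
        ' Tally every pixel's 4-bit gray value into one of 16 bins.
        longfill(@hist, 0, 16)
        repeat i from 0 to nPixels - 1
          pix := byte[bufAddr][i] & $0F
          hist[pix]++

      PUB HistError(refAddr) | i, err
        ' Sum of absolute bin differences against a saved reference histogram,
        ' e.g. a remembered Hamilton or Jackson histogram.
        err := 0
        repeat i from 0 to 15
          err += ||(hist[i] - long[refAddr][i])
        return err

    The smaller of the two error values picks the match, and a cap on that error is what lets the program answer "neither".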

    Anyway, work continues apace. Now, if I could just find the right lens...

    -Phil

    Update: No sooner did I finish typing, "... the right lens..." than I got two emails from lens companies in the Orient. Finally! Now things are starting to happen.

    Post Edited (Phil Pilgrim (PhiPi)) : 6/23/2006 7:58:12 AM GMT
  • Kaos Kidd Posts: 614
    edited 2006-06-23 13:49
    Phil,
    Thanks for the answer, and yeah, I do understand... I'll just sit and wait... :0
    Man, this is awesome...

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Just tossing my two bits worth into the bit bucket


    KK
  • simonl Posts: 866
    edited 2006-06-27 12:55
    Hi Phil,

    I'm hoping that I'll be able to attach one of these to the underside of my helicopter, track up to three 'objects' (contrast areas) on the ground, and use the data to hold the heli' in position.

    Do you think that'll be possible with a PropCAM?

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Cheers,

    Simon
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-06-27 14:39
    Hi Simon,

    If the objects are very contrasty and readily identifiable, it may be possible. One of the challenges of using machine vision outdoors, of course, is that you don't have control over the lighting. Clouds scudding by and shadows from trees, buildings, and the like can easily fool the best of systems.

    -Phil
  • bambino Posts: 789
    edited 2006-06-27 14:54
    I know it will be a while before we see any code on this, but where could I find some good info on connecting imaging devices to microcontrollers in general?
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-06-27 15:09
    Bambino,

    In my experience, each imaging device has its own unique requirements, so there are no "general" rules to follow. The cheapest and most common imaging devices are the ones that output video (NTSC or PAL) for a video monitor. I've seen board cameras like this advertised for as low as $19. Unfortunately, these are quite a bit more difficult to interface to a microcontroller than, say, a device with digital outputs that's designed for the purpose. But even the latter come in several flavors: those whose data rates can be controlled by the microcontroller (slaves), those which output a stream of data that the microcontroller has to deal with on a continuous basis (masters), or some hybrid of the two. Some devices are strictly parallel output, some serial, and some provide a choice between the two.

    I'm really hoping that the PropCAM will level some of the speedbumps people encounter while getting their feet wet in machine vision.

    Cheers!
    Phil
  • Cliff L. Biffle Posts: 206
    edited 2006-06-28 04:56
    I'm looking forward to seeing what people do with this. You can get a lot of mileage out of the SX28-based CMUcams, and while they're rather more powerful than any one COG in the Propeller, the extra RAM opens up a world of possibilities.

    I hope to get back to my OV6620 integration project once work slows back down.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-07-04 18:13
    Lens samples are starting to arrive from China now. So I've been busy testing them. I spec'd a rather wide angle lens (2.8mm f.l.), as I thought this might be the most useful for robotics apps. It gives a field of view nearly equal to the subject distance and would be equivalent to a 39mm lens on a 35mm camera. But with wide-angle construction comes "barrel distortion". This semi-fisheye effect is a consequence of a lens's squeezing a wide angle of incidence from the subject into a narrow exit angle onto the sensor. For things like relative position tracking, it's probably not a problem. But for measurement apps, it can be an issue if not corrected for.

    The attached photos illustrate what's happening. The PropCAM is viewing a pattern consisting of three identical ovals. Its software computes the apparent area of each oval (in pixels) and displays it in the grid on the lower right. The "distortion" figure compares the average area of the corner ovals to that of the center. The barrel distortion is pretty obvious in the screen capture. (The straight lines are overlaid on the live video by the software as an aid to aligning the subject.)
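
    A minimal Spin sketch of that measurement (illustrative only, not the actual software; it assumes one 4-bit pixel per byte, a 128-pixel-wide buffer, a rectangular window placed over each oval, and two corner ovals plus one center oval):

      CON
        COLS = 128                               ' assumed image width, in pixels

      PUB OvalArea(bufAddr, x0, y0, w, h, thresh) : area | x, y
        ' Count the pixels darker than thresh inside the window covering one oval.
        repeat y from y0 to y0 + h - 1
          repeat x from x0 to x0 + w - 1
            if (byte[bufAddr][y * COLS + x] & $0F) < thresh
              area++

      PUB DistortionPct(corner1, corner2, center)
        ' Average corner-oval area expressed as a percentage of the center-oval area.
        return ((corner1 + corner2) * 50) / center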

    So where to from here? Methinks a different lens. One of my supplier-hopefuls tells me their 3.6mm lens will cure the problem. This would be equivalent to a 51mm lens on a 35mm camera, which is close to a "standard" field of view. So another round of samples (and their associated shipping costs) looms. Meanwhile, progress on the image processing software continues apace. The next step there is a blob locating routine.

    -Phil
  • Andrew Smith Posts: 18
    edited 2006-07-05 04:19
    Phil,

    Keep up the great effort! I'm following this thread, eagerly awaiting a release date!

    Do you have any feel for where your pricing may come in?

    Thanks!

    Andrew

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Andrew
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-07-05 05:01
    Hi Andrew,

    Thanks for the encouragement! Not sure about the pricing yet. A lot of that is riding on the lens and the choice of interfacing hardware (cable vs. adapter board). Obviously, I'd like to keep costs down as much as practical, without compromising performance or convenience.

    Cheers!
    Phil
  • Bean Posts: 8,129
    edited 2006-07-07 03:04
    Phil,
    Any clues about a form factor? I've been playing with my LPKF milling machine to make a pan-tilt fixture. I'd like to make it fit your camera. It will probably have about a 2.5"x2.5" mounting surface. Do you think the camera board will be bigger than that?

    Bean.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Cheap 4-digit LED display with driver IC www.hc4led.com

    Low power SD Data Logger www.sddatalogger.com

    "I'm a man, but I can change, if I have to, I guess" The Red Green Show
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-07-07 04:12
    Bean,

    The PropCAM board is 1.35" x 1.35" with four 1/8" mounting holes on 1" centers. The optical axis is at the center of the board. (It's the same form factor as the TCS230 color sensor board, but with a different connector.) The connector is a 12-pin (6 x 2) 2mm shrouded header, with the pins coming out of, and perpendicular to, the backside of the board. The centerline of the connector along its long axis is coincident with a line connecting the two bottom mounting holes.

    -Phil
  • Mike Green Posts: 23,101
    edited 2006-07-07 04:46
    Phil,
    I'm really impressed with what you're doing. Thanks for putting all the effort in. I look forward to buying a PropCAM (or two) when they're available.

    Mike
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-07-07 06:26
    Thanks, Mike! I've been working these past couple days on object acquisition and tracking. Attached is a short video clip (QuickTime loop — zipped) of a large marble rolling across a sheet of paper. When the PropCAM software acquires and tracks the marble, it draws a white crosshair on top of it. The tracking was done at 30fps. The attached photo shows the setup.

    -Phil
  • Bean Posts: 8,129
    edited 2006-07-07 11:38
    Phil,
    Wow, that's a lot smaller than it looks in the pictures. Maybe I'll use the Parallax mini servos? The fixture is going to be huge compared to the camera board.

    Bean

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Cheap 4-digit LED display with driver IC www.hc4led.com

    Low power SD Data Logger www.sddatalogger.com

    "I'm a man, but I can change, if I have to, I guess" The Red Green Show
  • Bean Posts: 8,129
    edited 2006-07-08 02:09
    Phil,
    I assume 4-40 mounting screws are okay?
    I plan to have mounting holes for the Ping))) also. Ping on top, camera on bottom, or should I have the camera on top?

    Bean.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Cheap 4-digit LED display with driver IC www.hc4led.com

    Low power SD Data Logger www.sddatalogger.com

    "I'm a man, but I can change, if I have to, I guess" The Red Green Show
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-07-08 04:25
    Yup. 4-40. I'd put the Ping on top. That way any cable from the PropCAM is less likely to get in its way.

    -Phil
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-07-08 20:11
    Bill,

    Thanks! But seriously, 100+? Care to divulge your app? (You can PM me if you like. 'Just wanna make sure the PropCAM can handle what you want to do with it.)

    -Phil
  • Kaos Kidd Posts: 614
    edited 2006-07-10 15:04
    Ummm, sorry for the 'off-topic question', but... how does a Propeller and a cam reduce your hydrocarbon consumption by 5 to 25%? Anti-theft??

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Just tossing my two bits worth into the bit bucket


    KK
  • Kaos Kidd Posts: 614
    edited 2006-07-11 14:05
    Wastehl:
    It explains it very well. Thanks for the insight...

    That old bit bucket is just never gonna get full!

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Just tossing my two bits worth into the bit bucket


    KK
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2006-07-11 23:09
    The image processing functions are nearly completed. The most recent addition is helpful for motion detection. With the PropCAM, it's possible to capture only even lines, only odd lines, or both. Moreover, you can assign even lines from the camera to odd lines in the image/display buffer, or vice versa, if you like. What this enables is capturing even lines to even lines and then, after a short delay, even lines to odd lines. What you are left with is an interlaced image, with the even lines from one snapshot and the odd lines from another. If there was no motion in the image between snapshots, the even lines and odd lines will look the same. If something moved, there will be a readily discernible discrepancy. By comparing the lines in all line pairs, one can not only count how many pixels differ from their mates by a certain threshold, but also determine where in the scene these differences occur.

    The attached video captured from the Propeller shows me waving a pen. The top line displays how many pixels changed by at least 1/2 of full scale. If this number is over 20, the background color changes from green to red, and a new "base" snapshot is taken. There's a little crosshair cursor that follows the centroid of the motion. Since it gets partially clobbered with each new snapshot, it's a little hard to see. Also, the image quality appears to suffer, since the effective vertical resolution is cut in half to 48 rows instead of 96 to accommodate the line pairing. You may want to pause and single-step the video to see the details.
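
    For the curious, the line-pair comparison boils down to something like this minimal Spin sketch (illustrative only, not the actual code; it assumes one 4-bit pixel per byte, a 128-pixel-wide buffer, and the 96-row frame mentioned above):

      CON
        COLS = 128                               ' assumed image width, in pixels
        ROWS = 96

      VAR
        long centroidX, centroidY

      PUB MotionStats(bufAddr, thresh) : count | row, col, a, b, sumX, sumY
        ' Even rows hold the base snapshot, odd rows the later one. Count the
        ' pixels in each line pair that differ by at least thresh gray levels,
        ' and track the centroid of those changed pixels.
        sumX := 0
        sumY := 0
        repeat row from 0 to ROWS - 2 step 2
          repeat col from 0 to COLS - 1
            a := byte[bufAddr][row * COLS + col] & $0F
            b := byte[bufAddr][(row + 1) * COLS + col] & $0F
            if ||(a - b) => thresh
              count++
              sumX += col
              sumY += row
        if count
          centroidX := sumX / count
          centroidY := sumY / count

    With 16 gray levels, thresh would be 8 for the half-of-full-scale criterion, and a count over 20 is what triggers the new base snapshot described above.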

    -Phil
  • jamma Posts: 33
    edited 2006-07-12 23:36
    Pretty cool, Phil. Extrapolating Bill's app and applying it to a hot issue, you could set up a few thousand of these across the Mexican border to alert a central monitoring site of human movement in places where none is expected. Just need them to work off solar power, communicate wirelessly (Zigbee mesh?), and see across the infrared and visible spectrum (for night-time border crossings). Wouldn't really need to discriminate between humans and large animals since this is just a first line filter for a human monitor: pop up a screen for them and start sending a live stream for analysis.

    Of course, to really stop the problem, you'd have to install the cameras in the myriad tunnels that have been dug between the countries...

    Still, beats the dumb fence idea and might sell a lot of Props and PropCams (the real goal)!!
  • steve_b Posts: 1,563
    edited 2006-07-13 15:24
    Just curious if this will false-trigger on sunlight motion? Or if sunlight blocked by a cloud suddenly shines through and triggers it....

    it's cool nonetheless!

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔

    Steve

    "Inside each and every one of us is our one, true authentic swing. Something we was born with. Something that's ours and ours alone. Something that can't be learned... something that's got to be remembered."