Video Processing (going the other way) - Page 2 — Parallax Forums

Video Processing (going the other way)


Comments

  • cgracey Posts: 14,133
    edited 2006-06-16 18:57
    Cliff L. Biffle said...

    The 'vbase' value -- when you say "number of longs loaded," do you mean the longs of the machine code at 0x0020, or the overall word size of the programmed image?
    I mean the overall size of the programmed image, in longs. If you create a binary file with that word valid, you can use the Propeller Tool to open it and load it into a Propeller. If you are working entirely within your own tools, you can ignore it. Just a chance to walk before having to run.

    And I should probably ask, since you've been so helpful: do y'all have any objections to third-party tools targeting the Propeller? It'd be non-commercial; my employer doesn't take kindly to commercial side projects. :)
    It's a free world! Do what interests you. I'm anxious to see what you come up with.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔


    Chip Gracey
    Parallax, Inc.
  • Cliff L. Biffle Posts: 206
    edited 2006-06-16 21:25
    Chip Gracey said...
    It's a free world! Do what interests you. I'm anxious to see what you come up with.

    Some parts may be freer than others; reverse engineer enough interfaces, and you start building a collection of DMCA takedown notices. *sigh*

    Thanks!
  • cgracey Posts: 14,133
    edited 2006-06-16 23:11
    Man, don't get me started! If Americans could just go SIX MONTHS without consuming anything from the RIAA and MPAA, this stuff would all wind down to nothing and America would be a better, freer place. It's like the last days of Rome when our government bends over backwards to pander to an industry that peddles vice.
    Cliff L. Biffle said...
    Chip Gracey said...
    It's a free world! Do what interests you. I'm anxious to see what you come up with.

    Some parts may be freer than others; reverse engineer enough interfaces, and you start building a collection of DMCA takedown notices. *sigh*

    Thanks!
    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔


    Chip Gracey
    Parallax, Inc.
  • Cliff L. Biffle Posts: 206
    edited 2006-06-17 06:41
    It's not just big media; Adobe, Sony, and others are doing the same to software and hardware.

    But I think we're officially off-topic now. :)
  • Mike Green Posts: 23,101
    edited 2006-06-17 19:36
    Cliff,
    Back to the original topic ... I've ordered an OV6620 module to play with and found a few articles on the web for interfacing it, including the AVR-cam description and software and a report on an interface for a class project by Inaki Oiza. Do you have any suggestions or thoughts on interfacing the OV6620 to the Propeller for low-resolution (windowed or QCIF) B&W image processing?
    Mike
  • Cliff L. Biffle Posts: 206
    edited 2006-06-18 23:02
    Mike,

    I hope to play with this for real in the near future, but here's what I've managed to divine thus far.

    You can definitely do image processing, even in color, from the OV6620 on the Propeller. You will likely have to stay away from SPIN for most tasks; the OV6620's data rates push the Propeller's IO capabilities, and you need all the speed you can get.

    You should be able to consume the full-rate 16-bit data stream (~8Msamples/sec) from the 6620 using two COGs, either by taking turns consuming lines and writing to shared RAM, or by alternating samples. If you're willing to skip every other sample (which the CMUcam does), you could do it with one COG.
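    To make the one-versus-two-COG tradeoff concrete, here is a quick back-of-the-envelope check. It assumes the stock 80 MHz Propeller system clock and 4 clocks per PASM instruction, with the ~8 Msamples/s rate from the paragraph above:

```python
# Timing budget for sampling the OV6620's full-rate stream on a Propeller.
# Assumptions: 80 MHz system clock, 4 clocks per PASM instruction,
# ~8 Msamples/s 16-bit stream from the camera.
clock_hz = 80_000_000
clocks_per_instr = 4
mips = clock_hz // clocks_per_instr          # 20 million instructions/s per COG

sample_rate = 8_000_000                      # full-rate samples per second
instr_per_sample = mips / sample_rate        # 2.5 instructions per sample

# Two COGs taking turns (by line or by sample) double the effective budget:
instr_per_sample_two_cogs = 2 * instr_per_sample   # 5.0
```

    At 2.5 instructions per sample, a single COG can barely read and store each sample, which is why dropping every other sample makes a one-COG capture feasible.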

    My current scheme: reserve a few (3-4) line-sized buffers in shared RAM, and write them sequentially in a circular-buffer scheme. This will give the other COGs several fields' worth of time to process the data. In this arrangement, you can use the OV6620's internal clock, which reduces your circuitry. Use WAITPEQ to watch for the HREF signal, consume a line of data, lather, rinse, repeat.

    For this kind of scheme, you'd need to dedicate 16 lines to input (or 8 for BW), 1 line to watch the HREF signal, 2 lines to drive the camera's I2C (though you could share this with the EEPROM), and possibly another line to monitor the VSYNC signal. You'd also have to be certain that both systems have reliable clocks -- synchronized to within one 8MHz camera cycle. Alternatively, you could generate the camera's clock signal from one of the Propeller's counter outputs.
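    Tallying that pin budget (counts from the paragraph above; the Propeller has 32 I/O pins):

```python
# I/O pin budget for direct capture from the OV6620.
pins = {
    "pixel data (16-bit color)": 16,   # 8 would suffice for B&W
    "HREF": 1,
    "I2C SCL/SDA (shareable with the boot EEPROM)": 2,
    "VSYNC (optional)": 1,
}
total = sum(pins.values())   # 20 of the Propeller's 32 pins
```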

    If you reduce the incoming data, either by ignoring some of the channels, downsampling, or windowing, you'll have an easier time of it. The OV6620 can also be underclocked down to 10MHz if you use an external clock source, which reduces the 16-bit data rate to 5Msamples/sec.

    You could also fit a frame of 8-bit luminance data in the Propeller's shared RAM if you code carefully and stick to QCIF. (It'll consume about 26K.) This'd give you some more flexibility in terms of processing -- multipass algorithms and the like. I'm not actively working on this, because I'm a glutton for punishment and writing line-oriented realtime processing algorithms is my idea of fun.
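    The arithmetic behind that figure (QCIF is 176 x 144, and the Propeller has 32 KB of hub RAM):

```python
# QCIF 8-bit luminance frame vs. the Propeller's 32 KB hub RAM.
width, height = 176, 144        # QCIF resolution
frame_bytes = width * height    # one byte per pixel, luminance only
# frame_bytes = 25344, close to the ~26K cited above

hub_ram = 32 * 1024
remaining = hub_ram - frame_bytes   # ~7 KB left for code and variables
```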

    And as a third option, you could use an external framebuffer, which the Propeller could read at its leisure. The CMUcam guys use the AL422, which is dual ported: you can slave one port directly to the camera, and read from the other one at the same time. It seems a nice chip, though it's only available in a 28-pin SOP package, which is a little outside my soldering skills.

    If you're doing BW exclusively, you might look at the 6120 (I believe that's its number), the BW version of the 6620. It has much higher light sensitivity.

    That's a braindump, but I hope it made some sense.
  • Mike Green Posts: 23,101
    edited 2006-06-18 23:37
    Thanks for your thoughts. They tell me that all kinds of things are possible with the 6620 and the Propeller. I notice that Chip has suggested using a 6 MHz Xtal rather than the 5 MHz usually used with the Propeller. That would help a little with processing power.
  • Cliff L. Biffle Posts: 206
    edited 2006-06-18 23:57
    Yeah, at 96MHz, you can get an even three instructions per pixel from the 6620, allowing you to buffer a line into COG-local storage in real time. (Of course, then you need a whole mess o'cycles to put it anywhere useful; hence the two-COG approach.)
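    The arithmetic behind the three-instructions-per-pixel figure (the Propeller's PLL multiplies the crystal by up to 16, and most PASM instructions take 4 clocks):

```python
# Why a 6 MHz crystal gives an even instruction budget per pixel.
crystal_hz = 6_000_000
pll_mult = 16
clock_hz = crystal_hz * pll_mult       # 96 MHz system clock
mips = clock_hz // 4                   # 24 million instructions/s per COG
pixel_rate = 8_000_000                 # OV6620 full-rate samples/s
instr_per_pixel = mips / pixel_rate    # exactly 3
```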
  • linuxgeek Posts: 45
    edited 2006-09-03 21:31
    Phil Pilgrim (PhiPi) said...
    Chip's right. I have been working on this kind of thing -- for more than 20 years! :) Machine vision, as the field is called, is hard. Yet calling it hard is a gross generalization that encompasses things like 3D navigation, bin picking of randomly-oriented parts, and face recognition, to name some of the real stinkers. But the field also includes much simpler things, such as sizing, position and orientation sensing, and barcode reading.

    Would it be workable to process a video image on the fly as you send the image data out a wireless link to a larger computer?

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    I can handle complexity... It's the SIMPLE things that confound me.
    The Dynaplex Network - Home of The Octabot Project
    http://www.thedynaplex.org