Propeller Backpack: COLOR NTSC Capture

Phil Pilgrim (PhiPi) Posts: 22,773
edited 2013-07-08 - 22:34:38 in Propeller 1
In a previous thread, I presented a grayscale NTSC image capture program for the Propeller Backpack, along with a hint that color capture might be possible. Indeed it is, with the same circuitry. The only thing I changed was to replace the 330-ohm external resistor with 560 ohms. This permits a lower gain for the sigma-delta ADC, enabling it to capture more detail from the chroma burst at the beginning of each line. Here are a couple of the images I've captured from the system:

For years (literally), I was hung up on the notion that you needed to genlock to the NTSC color burst to make color capture possible. I know now that this is not necessary. Borrowing from the work I did on the Propeller AM receiver, I realized that all that was necessary was an I/Q demodulator (i.e. a synchronous detector) to sample the color burst at the beginning of each line at the chroma frequency (3.579545 MHz) and then again for each pixel in the line. Once you have the I and Q components of each, you can compute the chroma (I and Q in the YIQ color space) from the equations:

hue = atan2(qburst, iburst) - atan2(qpixel, ipixel)
saturation = sqrt(ipixel² + qpixel²)

I = sin(hue) * saturation
Q = cos(hue) * saturation

From these and the gray level (Y), you can compute the RGB components of each pixel, where Y, I, and Q are suitably (i.e. empirically) scaled to produce a 0 .. 255 range for R, G, and B:
R = 1.969 Y + 1.879 I + 1.216 Q
G = 1.969 Y - 0.534 I - 1.273 Q
B = 1.969 Y - 2.813 I + 3.354 Q

(formulae from Video Demystified)
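Putting the formulas together, the per-pixel conversion can be sketched in Python (the actual analysis code in this thread is Perl on the PC; the clamping/rounding and the assumption that the inputs are pre-scaled for a 0..255 result are mine):

```python
import math

def yiq_to_rgb(y, i_pix, q_pix, i_burst, q_burst):
    """Convert one pixel's demodulated values to RGB using the formulas
    above (coefficients from Video Demystified, empirically scaled).
    Assumes inputs are already scaled so R, G, B land in 0..255."""
    # Hue: the pixel's chroma phase relative to the color-burst phase.
    hue = math.atan2(q_burst, i_burst) - math.atan2(q_pix, i_pix)
    # Saturation: the chroma amplitude (Cartesian length of the I/Q pair).
    saturation = math.sqrt(i_pix**2 + q_pix**2)
    i = math.sin(hue) * saturation
    q = math.cos(hue) * saturation
    r = 1.969*y + 1.879*i + 1.216*q
    g = 1.969*y - 0.534*i - 1.273*q
    b = 1.969*y - 2.813*i + 3.354*q
    clamp = lambda v: max(0, min(255, int(round(v))))
    return clamp(r), clamp(g), clamp(b)
```

With zero chroma (i_pix = q_pix = 0) the pixel comes out gray, as expected for a plain luma signal.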

To sense the I and Q components of each color burst and each pixel, I created two clock outputs at the chroma frequency that are 90° out of phase. Each is combined, XOR fashion, in a logic-mode counter that serves as a mixer/demodulator and sums the response over the programmed integration time. I chose four chroma cycles (i.e. two pixel widths) as the integration time, with I and Q staggered two pixels apart so I could do the chroma detection in one cog. This does result in some chroma smearing (evident in the above images), so I might split the I and Q demodulation into two cogs, so that each samples just one pixel's chroma at a time, without staggering samples.
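To make the XOR mixing concrete, here is a rough Python model of one logic-mode counter. This is an illustration only: the bitstream here is a hypothetical 1-bit input sampled at the 80 MHz system clock, standing in for the sigma-delta feedback stream.

```python
import math

FCHROMA = 3.579545e6   # NTSC chroma subcarrier frequency, Hz
FCLK    = 80e6         # Propeller system clock, Hz

def mix_and_count(bitstream, phase_deg):
    """Model one logic-mode counter: XOR a 1-bit input stream with a
    square-wave local oscillator at the chroma frequency and count the
    1s over the integration window.  phase_deg = 0 gives the in-phase
    (I) oscillator; phase_deg = 90 gives the quadrature (Q) one."""
    total = 0
    for n, bit in enumerate(bitstream):
        angle = 2 * math.pi * FCHROMA * n / FCLK + math.radians(phase_deg)
        lo = 1 if math.sin(angle) >= 0 else 0   # NCO output as a square wave
        total += bit ^ lo
    return total

def demodulate(bitstream):
    """Return (i, q) mixer sums, centered so that an uncorrelated input
    yields roughly zero."""
    half = len(bitstream) // 2
    return (mix_and_count(bitstream, 0) - half,
            mix_and_count(bitstream, 90) - half)
```

Feeding it a bitstream locked to the I oscillator drives the I sum to one extreme while the Q sum stays near zero.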

Anyway, I just wanted to provide an early preview of what I've been working on. Neither the Spin source nor the Perl analysis and image-generation code is "forum-ready" yet, but both will follow in the days to come.



  • Tubular Posts: 4,155
    edited 2011-10-18 - 20:46:38
    Incredible, Phil.

    We really need to revive OBC's cookbook with a special section on "1 resistor recipes". Take 1 backpack, add one 330 ohm for B&W, or one 560 ohm for colour. I've got a recipe for 220kohm coming
  • potatohead Posts: 10,121
    edited 2011-10-18 - 20:50:11
    This is excellent!!

    Very well done Phil..
  • Cluso99 Posts: 16,909
    edited 2011-10-18 - 23:54:26
    WOW Phil!!!!
    This is incredible. Way to go ;)
  • Duane Degn Posts: 10,341
    edited 2011-10-19 - 00:38:20
    Cluso99 wrote: »
    WOW Phil!!!!
    This is incredible. Way to go ;)

    Double ditto!

    My jaw dropped when I saw those pictures. Amazing!

  • Sapieha Posts: 2,964
    edited 2011-10-19 - 02:33:50
    Hi Phil..

  • RossH Posts: 4,692
    edited 2011-10-19 - 03:37:59
    Very impressive, Phil.

    Or, as my daughter would say: "Awesome!"

  • Rayman Posts: 11,485
    edited 2011-10-19 - 04:15:23
    Very, very neat. Great work.
  • Dr_Acula Posts: 5,484
    edited 2011-10-19 - 04:19:32
    Amazing. I can feel a little Gadget Gangster board coming on. RCA socket and just a few components, and your robot can see!
  • Bean Posts: 8,119
    edited 2011-10-19 - 04:29:39
    That is amazing work. Could you provide some more theory? And maybe some waveform diagrams?
    I'm not really understanding how it works, but it looks great.

  • Rayman Posts: 11,485
    edited 2011-10-19 - 06:12:36
    A while ago, I had some cheap cameras for <$10. It'd be very nice to see if those would work...

    Phil, can you capture a lower resolution on the Prop chip? Can you capture faster that way?
    I think a very small resolution, say 64x48 or maybe even smaller, would be the largest the Prop could process in real time...
  • ericball Posts: 774
    edited 2011-10-19 - 07:33:22
    Excellent job Phil! As has been stated before, sometimes all that is required to get something accomplished is to tell someone it's impossible...

    You might want to use the saturation of the burst to do AGC on the color signal, just like you're doing phase correction.
  • Perry Posts: 253
    edited 2011-10-19 - 07:39:50
    Rayman wrote: »
    A while ago, I had some cheap cameras for <$10. It'd be very nice to see if those would work...

    Phil, can you capture a lower resolution on the Prop chip? Can you capture faster that way?
    I think a very small resolution, say 64x48 or maybe even smaller, would be the largest the Prop could process in real time...

    Just about any cheap camera will work. I have been switching back and forth between a cheap camera and a satellite receiver to get test inputs.
    I have been capturing video and audio at greater than 64x48 resolution (greyscale) for some time now with "stupid video capture".

    Lately I have taken a side trip and started hacking over Eric Ball's 8bpp driver to get greater pixel depth, so that these images can be displayed on the Propeller.

    Philtastic work!!!

    This is the first approach I tried. How many colors are you getting?

    I like to do this in real time and have been using a Protoboard at a 108 MHz clock, so I have more time to process the incoming/outgoing data.

    With low resolutions it should be possible to build lookup tables to do the conversions in real time.
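The lookup-table idea might look like this in sketch form (Python; the table size, chroma scale, and dict representation are arbitrary choices for illustration, using the RGB coefficients quoted earlier in the thread):

```python
STEPS = 16  # quantization levels per axis -> 16*16*16 = 4096 entries

def clamp8(v):
    """Clamp to an unsigned byte."""
    return max(0, min(255, int(v)))

def build_lut():
    """Precompute packed RGB for every quantized (Y, I, Q) triple, trading
    memory for speed so no per-pixel trig or multiplies are needed."""
    lut = {}
    for yi in range(STEPS):
        for ii in range(STEPS):
            for qi in range(STEPS):
                y = yi * 255 / (STEPS - 1)     # luma 0..255
                i = (ii - STEPS // 2) * 8      # signed chroma, arbitrary scale
                q = (qi - STEPS // 2) * 8
                r = clamp8(1.969 * y + 1.879 * i + 1.216 * q)
                g = clamp8(1.969 * y - 0.534 * i - 1.273 * q)
                b = clamp8(1.969 * y - 2.813 * i + 3.354 * q)
                lut[(yi, ii, qi)] = (r << 16) | (g << 8) | b
    return lut
```

On the Propeller this would be a flat hub-RAM array indexed by packed Y/I/Q bits rather than a dict, and coarser steps would shrink it further.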

  • mindrobots Posts: 6,506
    edited 2011-10-19 - 08:22:25
    Amazing, Phil!

    These guys (Centeye) had an interesting presentation at the Open Hardware Summit about what they were doing with low-resolution (16x16) cameras. They have a "shield" with one of their cameras on it. It's in the never-ending queue of fun things to look at.

    Since Phil is a Dreamer and a Doer, maybe he can take something of use away from this instead of my shoulda/coulda/woulda approach.
  • Phil Pilgrim (PhiPi) Posts: 22,773
    edited 2011-10-19 - 08:47:13
    Thanks, guys!

    Today I'm going to try doubling the chroma resolution (to match that of luma) by rearranging cogs to have each of the I/Q local oscillators and its associated mixer in one cog, rather than the oscillators in one cog and the mixers in another. After that, I shall try to flesh out the presentation with more theory and some code.

    I'm still doing the pixel computations on the PC in Perl, since it's easier to try different things in a fast, native floating-point environment. The Propeller is simply capturing the necessary data at this point and passing it on. Once I'm satisfied with the results, I can begin writing Prop code to do the YIQ-to-RGB color conversion internally.

    More to come!

  • ElectricAye Posts: 4,561
    edited 2011-10-19 - 08:55:50
    Nice gadget! Once Humanoido's Big Brain gets some eyes like this, it'll insist on having homemade oatmeal and Windex for breakfast every morning.
  • Hanno Posts: 1,130
    edited 2011-10-19 - 12:15:16
    Awesome job Phil!
    If you're willing, I'd love to integrate your color frame grabber with ViewPort to stream video at 1 Mbps -- with my grayscale grabber I get ~10 frames/second. Once the video is inside ViewPort, the integrated OpenCV computer vision toolkit can recognize things like human faces and send the coordinates back to the Prop. Several years ago Circuit Cellar published my article with a grayscale grabber and Propeller-based computer vision -- I'm sure they'd be interested in an update from you...
  • ratronic Posts: 1,451
    edited 2011-10-19 - 13:39:02
    Phil, I can't wait to see how you go about this!
  • Hanno Posts: 1,130
    edited 2011-10-19 - 14:08:37
    Here's a color NTSC camera that runs on 6-12V. The sensor has a resolution of 628x582 px -- Phil's algorithm should be capable of supporting the vertical resolution. Supposedly it goes down to 0.2 lux.

    It costs $11.83, including worldwide delivery.

  • Phil Pilgrim (PhiPi) Posts: 22,773
    edited 2011-10-19 - 14:55:30
    I rewrote the capture program to obtain an I and Q chroma value for each pixel, rather than staggering them across two offset pixels apiece. This resulted in 50% more data for each capture. Here are the results:

    Individual I and Q chroma values per pixel.

    Individual I and Q chroma values with (0.25, 0.5, 0.25) weighted average across three pixels.

    Original staggered I and Q chroma values.

    In the top image, you can see that the chroma values are not smeared. This is particularly apparent in the dark background. In the middle image, the weighted averaging helps to smooth them a bit. But I'm not convinced that either is better than the bottom (original-method) image. In fact, I think the bottom image is more pleasing. As a consequence, I think I'll stick with the original program.
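For reference, the (0.25, 0.5, 0.25) weighted average described above is a three-tap filter across each chroma value and its two horizontal neighbors; a minimal Python sketch:

```python
def smooth_chroma(values):
    """Weighted average (0.25, 0.5, 0.25) across each sample and its left
    and right neighbors; the two endpoints are passed through unchanged."""
    out = list(values)
    for n in range(1, len(values) - 1):
        out[n] = 0.25 * values[n - 1] + 0.5 * values[n] + 0.25 * values[n + 1]
    return out
```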

  • Cluso99 Posts: 16,909
    edited 2011-10-19 - 15:37:32
    Phil: I have to agree. I looked at the pics first, then read the captions, and lastly your comments. Right from the start I had the impression that #3 was the best.

    Let me know if you get to the point of wanting to try multiple Props in parallel. My modules stack easily for running Props purely in parallel. I want to try VGA this way, but never enough time!

    I too am very interested in seeing your simple block explanations of the workings behind your code. I do understand it somewhat, basically because of your other explanations of DSP and the research I did following those posts. It's a joy to find out that those maths I did so many years ago (~40) actually have an application that I may end up using after all.
  • Phil Pilgrim (PhiPi) Posts: 22,773
    edited 2011-10-19 - 20:42:56
    Here's an explanation of the theory behind the Propeller Backpack color capture. First the schematic, showing the external 560-ohm resistor (all other components being included in the Backpack itself):


    The sync is detected as a logic value on A22. The 0.1uF capacitor is charged by an occasional DUTY-mode output from A19 to clamp the signal on A22, so that the sync tips and nothing else go below the logic threshold. The video signal also transits the external resistor and the MOSFETs (which are always "on") to the sigma-delta ADC formed from pins A12 (feedback), A14 (input), and A17 (filter cap -- always pulled low). The ADC's counter provides the luma (Y) value for each pixel.

    A color video signal includes not only sync and luma levels, but also chroma (color) information. The chroma is provided by a 3.579545 MHz subcarrier that rides upon the luma signal. (In fact, if you low-pass filter a color signal, you will end up with a grayscale signal that's still compatible with B/W monitors and receivers. Therein lies the brilliance of the NTSC and PAL color standards: they are backwards compatible with the earlier black-and-white standards.) The hue of any pixel is determined by the phase of the chroma subcarrier, relative to that of the "color burst" at the beginning of every scan line; the saturation, by its amplitude. Here is a photo of the oatmeal box, over which I've superimposed a trace of the video signal corresponding to a particular scan line:


    After the low-going horizontal sync, in a section of the waveform called the "backporch", lies the color burst, which is a phase reference for the chroma information to follow. You will notice, as the word "OATS" is scanned, that areas of high brightness have a high amplitude and that areas of rich (i.e. more saturated) color (e.g. red vs. white or black) have a higher high-frequency component. The latter is the chroma subcarrier.

    In the Propeller Backpack, the amplitude and phase of the subcarrier at any point in the scan are determined by "mixing" the signal with two oscillators (NCO counters) of the same frequency but with a 90° phase offset. These are output on pins A24 and A25, which are not otherwise used by the Backpack. This is adequate to determine both the amplitude and phase of the subcarrier over any interval within the scan line. The mixing is done by a pair of counters, each of which takes the ADC feedback output and XORs it with either the in-phase (I) oscillator or the quadrature-phase (Q) oscillator and counts the number of times this results in a logical "1". The average phase of the signal during that interval, relative to the I and Q oscillators, is given by the arctangent of the sine (I) component and the cosine (Q) component as counted by those counters. The phase relative to the color burst can then be computed by finding the color burst phase relative to the I and Q oscillators and subtracting. The saturation is just the Cartesian distance (square root of the sum-of-squares) of the I and Q terms.
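In code form, the counter-to-phase step described above might be sketched as follows (Python; the counter totals and window size are hypothetical, and the sign convention follows the hue formula given at the start of the thread):

```python
import math

def counts_to_iq(count_i, count_q, n_samples):
    """Center the raw mixer-counter totals (0..n_samples) on zero to get
    signed I and Q components."""
    return count_i - n_samples / 2, count_q - n_samples / 2

def pixel_phase_and_saturation(pixel_counts, burst_counts, n_samples):
    """Phase of a pixel's chroma relative to the color burst (burst phase
    minus pixel phase), plus saturation as the Cartesian length of the
    pixel's I/Q pair."""
    ip, qp = counts_to_iq(*pixel_counts, n_samples)
    ib, qb = counts_to_iq(*burst_counts, n_samples)
    phase = math.atan2(qb, ib) - math.atan2(qp, ip)
    saturation = math.hypot(ip, qp)
    return phase, saturation
```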

    Here is a scope trace of a color burst, the I and Q oscillator outputs, and the ADC feedback output:


    You will notice that the ADC output is more predominantly low when the video signal is high, and vice-versa. It should also be apparent that this effect is very subtle, due to the fact that each cycle of the chroma frequency includes fewer than twenty-three 80 MHz Propeller clock intervals. This results in a fairly low signal-to-noise ratio and accounts for the proliferation of chroma noise in the acquired images. Errors in measuring the phase of the color burst will result in horizontal color striping in the acquired image; errors in the phase of each pixel, in color blotchiness within a scan line.

    Here is a block diagram that ties the system together:


    The output data going to the PC for construction of the image consists of luma (Y) data for each pixel, and interleaved chroma data, such that each pixel shares its I and Q chroma data with its right and left neighbor. In the PC, a Perl program does the work of computing the actual YIQ color signal relative to the color burst data and converting it to RGB. What remains is to have the Propeller do this work, so that the final image can be produced internally.

    Hopefully, within the next day or so, I will have a Windows exe ready to download that receives data directly from the Propeller Backpack and displays the resulting image. Stay tuned!

  • Cluso99 Posts: 16,909
    edited 2011-10-20 - 01:12:28
    Nice explanation as always Phil :)
  • ratronic Posts: 1,451
    edited 2011-10-20 - 08:37:47
    I never realized the Propeller's sigma delta ADC could capture video.

    Edit: COLOR VIDEO!
    Edit2: Doesn't this kind of qualify for the 'PropCam'?
  • Bill Henning Posts: 6,445
    edited 2011-10-20 - 09:04:57
    Amazing work Phil. Had to pick up my jaw off the floor...
  • GordonMcComb Posts: 3,366
    edited 2011-10-20 - 13:20:22
    Parallax already sells a product with a CMOS camera and lens on it, so the parts are already in-house to make a compact camera for this. A small breakout board or daughterboard for the Backpack should be straightforward. The problem with all those cheapo cameras is that you really never know what output you're going to get, as it's seldom 1 V p-p. That makes troubleshooting a bit hard for people who don't understand video.

    Oh, and yeah, good work Phil, but you knew that already. Tell Browz to treat you to dinner this weekend to celebrate. I understand he has a penchant for cheap fish and chips joints.

    -- Gordon
  • Phil Pilgrim (PhiPi) Posts: 22,773
    edited 2011-10-20 - 14:00:48

    Ditto the cheap cam caveat, for exactly the reason you've stated. In most situations, the problem can be corrected by supplying a 75-ohm load when the unloaded P-P is more than the 2V spec. But I'm not sure that would work in this case, due to the added series resistance in the video path before it reaches the output connection. One might have to add a load directly to the camera output before it gets to the Backpack.


    'Not sure where to go from here. So far, it's only a proof of principle that the Propeller can grab the necessary raw data. Beyond that, what would make it useful, rather than a mere curiosity?

  • Perry Posts: 253
    edited 2011-10-20 - 14:25:43

    'Not sure where to go from here. So far, it's only a proof of principle that the Propeller can grab the necessary raw data. Beyond that, what would make it useful, rather than a mere curiosity?


    Phil, I have been working on "stupid video capture" for some time. I just posted a 16-bit NTSC video driver.

    I have added new functions to the Pixelator, and it is becoming quite the audio/video platform:

    changing in/out volume on the fly
    speeding/slowing output play.

    It could be used as a video doorbell, an animal-tracking recorder, time-lapse recording,

    even a data recorder: the audio channel can be some other signal to be presented alongside the video.

    I'm hoping I can incorporate your color method.

    But other than my pet project: how about adding GPS and SD, and you have a package tracker that takes pictures periodically!

    Don't worry; once people see something that can't be done, new users/uses pop up ....

    I have been using a 108 MHz clock lately; the increase in speed gives more pixel depth.
  • GordonMcComb Posts: 3,366
    edited 2011-10-20 - 15:28:45
    Phil, There are a couple of applications that I can think of, in the Backpack itself, or in conjunction with another Propeller. I'm sure many of these have already occurred to you, but here's a short list.

    Basic and intermediate vision analysis comes to mind, of course. I'm sure you've already looked at what Kwabena has done with the CMUcam4. And of course Hanno's pioneering work.

    Since you're in color space now, you could do simple color blob analysis, looking for blobs of a specific size or larger, and reporting their pixel position (e.g. center of blob, plus X and Y dimensions).

    With other tricks, you can find edges for simple object detection. With a grating etched in both planes, you can use a laser to create a topographical map of dots; the distance between the dots indicates distance. The usefulness of this system depends on how bright you can get the laser, and how well you can filter out all but its wavelength.

    Frame rate doesn't need to be super fast, and you can have modes where you skip lines and pixels, and you certainly don't need to deal with both video fields. A lot of video analysis doesn't even require capturing a full frame (or even field) into a buffer. You can do much of this in real time, storing at most only a few lines' worth of video, like the time base correctors of yore. All you're really looking for most of the time is the lowest and highest values.

    Since there's good tonality in the image, the sensor could also be useful as a smart "compound eye" to look for and follow the brightest object. Outfitted with a wide angle lens the field of view could be greatly increased, acting as a very nice proximity detector, motion detector, you name it.

    -- Gordon
  • Phil Pilgrim (PhiPi) Posts: 22,773
    edited 2011-10-20 - 15:33:36
    One of the things that's been nipping at me is that all the yellows were coming out pink for some reason. Today I've been fiddling with the chroma computation and decided to try a trig identity for angle differences, rather than going the arctangent route. It's so much simpler, since there are no trig functions to compute at all -- just multiplies and divides. In the process, the yellows suddenly came out right:

    attachment.php?attachmentid=86104&d=1319061243 (old) vs. attachment.php?attachmentid=86146&d=1319149753 (new)

    'Not sure why, exactly (and the Quaker guy looks a bit jaundiced), but I'll take it!
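The angle-difference trick works out as follows. With sin t = q/|v| and cos t = i/|v|, the identities sin(a-b) = sin a cos b - cos a sin b and cos(a-b) = cos a cos b + sin a sin b make the pixel magnitude cancel, leaving only one square root (the burst magnitude) per scan line. A Python sketch of the idea (variable names are mine):

```python
import math

def pixel_iq(ip, qp, ib, qb):
    """Compute a pixel's I and Q relative to the burst with no arctangent:
    I = saturation*sin(hue) and Q = saturation*cos(hue) reduce to a cross
    product and a dot product divided by the burst magnitude."""
    burst_mag = math.hypot(ib, qb)            # one sqrt per scan line
    i_out = (qb * ip - ib * qp) / burst_mag   # saturation * sin(hue)
    q_out = (ib * ip + qb * qp) / burst_mag   # saturation * cos(hue)
    return i_out, q_out
```

This agrees with the atan2 route: for burst (ib, qb) = (3, 4) and pixel (ip, qp) = (1, 0), both give I = 0.8 and Q = 0.6.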

  • Tharkun Posts: 65
    edited 2011-10-21 - 06:48:17

    @Phil: Very nice work!

    Does anyone know which MOSFETs could be used when building my own Propeller Backpack? (modifying a Prop dev board)
    On the schematic, n-channel depletion MOSFETs are shown!?

    Thanks in advance