Propeller Backpack: COLOR NTSC Capture
Phil Pilgrim (PhiPi)
Posts: 23,514
In a previous thread, I presented a grayscale NTSC image capture program for the Propeller Backpack, along with a hint that color capture might be possible. Indeed it is, with the same circuitry. The only thing I changed was to replace the 330-ohm external resistor with 560 ohms. This permits a lower gain for the sigma-delta ADC, enabling it to capture more detail from the chroma burst at the beginning of each line. Here are a couple of the images I've captured from the system:
For years (literally), I was hung up on the notion that you needed to genlock to the NTSC color burst to make color capture possible. I know now that this is not necessary. Borrowing from the work I did on the Propeller AM receiver, I realized that all that was necessary was an I/Q demodulator (i.e. synchronous detector) to sample the color burst at the beginning of each line at the chroma frequency (3.579545 MHz) and then again for each pixel in the line. Once you have the I and Q components of each, you can compute the chroma (I and Q in the YIQ color space) from these equations:
hue = atan2(qburst, iburst) - atan2(qpixel, ipixel)
saturation = sqrt(ipixel² + qpixel²)
I = sin(hue) * saturation
Q = cos(hue) * saturation
From these and the gray level (Y), you can compute the RGB components of each pixel, where Y, I, and Q are suitably (i.e. empirically) scaled to produce a 0 .. 255 range for R, G, and B:
R = 1.969 Y + 1.879 I + 1.216 Q
G = 1.969 Y - 0.534 I - 1.273 Q
B = 1.969 Y - 2.813 I + 3.354 Q
(formulae from Video Demystified)
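To make the arithmetic concrete, here is a minimal Perl sketch of the above conversion, assuming the raw burst and pixel I/Q sums and the luma value are already in hand. The variable names and sample values are mine, not those in the actual analysis code, and the empirical scaling mentioned above is glossed over:

    use strict;
    use warnings;

    # Hypothetical raw inputs: burst and pixel I/Q sums from the mixer
    # counters, plus the pixel's luma (Y). Sample values are arbitrary.
    my ($iburst, $qburst) = (120, -45);
    my ($ipixel, $qpixel, $y) = (8, 3, 60);

    my $hue        = atan2($qburst, $iburst) - atan2($qpixel, $ipixel);
    my $saturation = sqrt($ipixel**2 + $qpixel**2);

    my $i = sin($hue) * $saturation;
    my $q = cos($hue) * $saturation;

    # YIQ -> RGB with the coefficients above, clamped to 0..255.
    sub clamp { my $v = shift; $v < 0 ? 0 : $v > 255 ? 255 : int($v) }

    my $r = clamp(1.969 * $y + 1.879 * $i + 1.216 * $q);
    my $g = clamp(1.969 * $y - 0.534 * $i - 1.273 * $q);
    my $b = clamp(1.969 * $y - 2.813 * $i + 3.354 * $q);

    print "RGB = ($r, $g, $b)\n";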
To sense the I and Q components of each color burst and each pixel, I created two clock outputs at the chroma frequency that are 90° out of phase. Each is combined, XOR fashion, in a logic-mode counter that serves as a mixer/demodulator and sums the response over the programmed integration time. I chose four chroma cycles (i.e. two pixel widths) as the integration time, with I and Q staggered two pixels apart so I could do the chroma detection in one cog. This does result in some chroma smearing (evident in the above images), so I might split the I and Q demodulation into two cogs, so that each samples just one pixel's chroma at a time, without staggering samples.
Anyway, I just wanted to provide an early preview of what I've been working on. Neither the Spin source, nor the Perl analysis and image-generation code are "forum-ready" yet, but they will follow in the days to come.
-Phil
Comments
We really need to revive OBC's cookbook with a special section on "1-resistor recipes". Take one Backpack, add one 330 ohm for B&W, or one 560 ohm for colour. I've got a recipe for 220k ohm coming.
Very well done Phil..
This is incredible. Way to go
Double ditto!
My jaw dropped when I saw those pictures. Amazing!
Wow.
Impressive
Or, as my daughter would say: "Awesome!"
Ross.
That is amazing work. Could you provide some more theory? And maybe some waveform diagrams?
I'm not really understanding how it works, but it looks great.
Bean
Phil, can you capture at a lower resolution on the Prop chip? Can you capture faster that way?
To me, a very small resolution, say 64x48 or maybe even smaller, would be the largest the Prop could process in real time...
You might want to use the saturation of the burst to do AGC on the color signal, just like you're doing phase correction.
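For example (a hypothetical sketch in the same Perl style as the conversion above; the nominal burst level is a constant you'd pick empirically):

    my $nominal_burst = 100;               # empirical reference amplitude
    my ($iburst, $qburst) = (120, -45);    # this line's burst sums
    my $saturation = 8.54;                 # a pixel's chroma amplitude
    my $gain = $nominal_burst / sqrt($iburst**2 + $qburst**2);
    $saturation *= $gain;                  # line-by-line chroma AGC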
Just about any cheap camera will work. I have been switching back and forth between a cheap camera and a satellite receiver to get test inputs.
I have been capturing video and audio at greater than 64x48 resolution (grayscale) for some time now with "stupid video capture".
Lately I have taken a side trip and started hacking on Eric Ball's 8bpp driver to get greater pixel depth, so that these images can be displayed on the Propeller.
Phil:
Philtastic work!!!
This is the first approach I tried. How many colors are you getting?
I'd like to do this in real time and have been using a Protoboard at a 108 MHz clock, so I have more time to process the incoming/outgoing data.
With low resolutions it should be possible to build lookup tables to do the conversions in real time.
again
PHILTASTIC !!!!
These guys (Centeye - http://centeye.com/products/ardueye-shields-for-arduino/ ) had an interesting presentation at the Open Hardware Summit about what they're doing with low-resolution (16x16) image-processing cameras. They have a "shield" with one of their cameras on it. It's in the never-ending queue of fun things to look at.
Since Phil is a Dreamer and a Doer, maybe he can take something of use away from this instead of my shoulda/coulda/woulda approach.
Today I'm going to try doubling the chroma resolution (to match that of luma) by rearranging cogs to have each of the I/Q local oscillators and its associated mixer in one cog, rather than the oscillators in one cog and the mixers in another. After that, I shall try to flesh out the presentation with more theory and some code.
I'm still doing the pixel computations on the PC in Perl, since it's easier to try different things in a fast, native floating-point environment. The Propeller is simply capturing the necessary data at this point and passing it on. Once I'm satisfied with the results, I can begin writing Prop code to do the YIQ-to-RGB color conversion internally.
More to come!
-Phil
If you're willing, I'd love to integrate your color frame grabber with ViewPort to stream video at 1 Mbps; with my grayscale grabber I get ~10 frames/second. Once the video is inside ViewPort, the integrated OpenCV computer vision toolkit can recognize things like human faces and send the coordinates back to the Prop. Several years ago Circuit Cellar published my article with a grayscale grabber and Propeller-based computer vision. I'm sure they'd be interested in an update from you...
Hanno
It costs $11.83, including worldwide delivery:
http://www.dealextreme.com/p/ntsc-mini-surveillance-av-camera-628x582px-6019
Hanno
Individual I and Q chroma values per pixel.
Individual I and Q chroma values with (0.25, 0.5, 0.25) weighted average across three pixels.
Original staggered I and Q chroma values.
In the top image, you can see that the chroma values are not smeared. This is particularly apparent in the dark background. In the middle image, the weighted averaging helps to smooth them a bit. But I'm not convinced that either is better than the bottom (original-method) image. In fact, I think the bottom image is more pleasing. As a consequence, I think I'll stick with the original program.
-Phil
Let me know if you get to the point of wanting to try multiple Props in parallel. My modules stack easily for running Props purely in parallel. I want to try VGA this way, but never enough time!
I too am very interested in seeing your simple block explanations of the workings behind your code. I do understand it somewhat, basically because of your other explanations of DSP and the research I did following those posts. It's a joy to find out that the maths I did so many years ago (~40) actually has an application that I may end up using after all.
The sync is detected as a logic value on A22. The 0.1uF capacitor is charged by an occasional DUTY-mode output from A19 to clamp the signal on A22, so that the sync tips and nothing else go below the logic threshold. The video signal also transits the external resistor and the MOSFETs (which are always "on") to the sigma-delta ADC formed from pins A12 (feedback), A14 (input), and A17 (filter cap -- always pulled low). The ADC's counter provides the luma (Y) value for each pixel.
A color video signal includes not only sync and luma levels, but also chroma (color) information. The chroma is provided by a 3.579545 MHz subcarrier that rides upon the luma signal. (In fact, if you low-pass filter a color signal, you will end up with a grayscale signal that's still compatible with B/W monitors and receivers. Therein lies the brilliance of the NTSC and PAL color standards: they are backwards compatible with the earlier black-and-white standards.) The hue of any pixel is determined by the phase of the chroma subcarrier, relative to that of the "color burst" at the beginning of every scan line; the saturation, by its amplitude. Here is a photo of the oatmeal box, over which I've superimposed a trace of the video signal corresponding to a particular scan line:
After the low-going horizontal sync, in a section of the waveform called the "back porch," lies the color burst, which is a phase reference for the chroma information to follow. You will notice, as the word "OATS" is scanned, that areas of high brightness have a high amplitude and that areas of rich (i.e. more saturated) color (e.g. red vs. white or black) have a higher high-frequency component. The latter is the chroma subcarrier.
In the Propeller Backpack, the amplitude and phase of the subcarrier at any point in the scan are determined by "mixing" the signal with two oscillators (NCO counters) of the same frequency but with a 90° phase offset. These are output on pins A24 and A25, which are not otherwise used by the Backpack. This is adequate to determine both the amplitude and phase of the subcarrier over any interval within the scan line. The mixing is done by a pair of counters, each of which takes the ADC feedback output and XORs it with either the in-phase (I) oscillator or the quadrature-phase (Q) oscillator and counts the number of times this results in a logical "1". The average phase of the signal during that interval, relative to the I and Q oscillators, is given by the arctangent of the sine (I) component and the cosine (Q) component as counted by those counters. The phase relative to the color burst can then be computed by finding the color burst's phase relative to the I and Q oscillators and subtracting. The saturation is just the Cartesian distance (square root of the sum of squares) of the I and Q terms.
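For anyone who wants to play with the principle away from the hardware, here is a small Perl simulation of the XOR-based synchronous detection (my own illustration, not code from the capture program): it mixes a hard-limited subcarrier with in-phase and quadrature square waves, sums the XOR outputs over four cycles, and recovers the phase with atan2. The recovery is only approximate, since square-wave-on-square-wave correlation is triangular rather than sinusoidal:

    use strict;
    use warnings;
    use constant PI => 4 * atan2(1, 1);

    my $true_phase        = 0.7;    # radians: the phase we hope to recover
    my $samples_per_cycle = 100;
    my $cycles            = 4;      # integration interval, as in the capture

    my ($isum, $qsum) = (0, 0);
    for my $n (0 .. $cycles * $samples_per_cycle - 1) {
        my $t      = 2 * PI * $n / $samples_per_cycle;
        my $signal = sin($t + $true_phase) > 0 ? 1 : 0;    # 1-bit input stream
        my $ilo    = sin($t) > 0 ? 1 : 0;                  # in-phase LO
        my $qlo    = cos($t) > 0 ? 1 : 0;                  # quadrature LO
        $isum += ($signal ^ $ilo);    # XOR mixing, as the counters do
        $qsum += ($signal ^ $qlo);
    }

    # XOR counts are high when the signals disagree; convert them to
    # signed correlations centered on zero before taking the arctangent.
    my $half = $cycles * $samples_per_cycle / 2;
    my ($i, $q) = ($half - $isum, $half - $qsum);
    printf "recovered phase = %.2f rad (actual %.2f)\n", atan2($q, $i), $true_phase;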
Here is a scope trace of a color burst, the I and Q oscillator outputs, and the ADC feedback output:
You will notice that the ADC output is more predominantly low when the video signal is high, and vice-versa. It should also be apparent that this effect is very subtle, due to the fact that each cycle of the chroma frequency includes fewer than twenty-three 80 MHz Propeller clock intervals. This results in a fairly low signal-to-noise ratio and accounts for the proliferation of chroma noise in the acquired images. Errors in measuring the phase of the color burst will result in horizontal color striping in the acquired image; errors in the phase of each pixel, in color blotchiness within a scan line.
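To put a number on that: 80 MHz ÷ 3.579545 MHz ≈ 22.35, so each chroma cycle yields fewer than 23 one-bit ADC samples for the mixers to integrate.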
Here is a block diagram that ties the system together:
The output data going to the PC for construction of the image consists of luma (Y) data for each pixel, and interleaved chroma data, such that each pixel shares its I and Q chroma data with its right and left neighbor. In the PC, a Perl program does the work of computing the actual YIQ color signal relative to the color burst data and converting it to RGB. What remains is to have the Propeller do this work, so that the final image can be produced internally.
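As a rough illustration of how the PC side might expand the shared chroma back to per-pixel values (this is just one plausible reading of the interleaving described above, not the actual stream format):

    use strict;
    use warnings;

    # Hypothetical stand-ins for the interleaved stream: one chroma sum per
    # two-pixel window, with the Q windows offset one pixel from the I windows.
    my $npixels = 8;
    my @i_raw = (10, 12, 14, 16, 18);
    my @q_raw = (20, 22, 24, 26, 28);

    my (@i_pix, @q_pix);
    for my $x (0 .. $npixels - 1) {
        $i_pix[$x] = $i_raw[ int($x / 2) ];        # I window covering pixel x
        $q_pix[$x] = $q_raw[ int(($x + 1) / 2) ];  # staggered Q window
    }
    print "pixel $_: I=$i_pix[$_], Q=$q_pix[$_]\n" for 0 .. $npixels - 1;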
Hopefully, within the next day or so, I will have a Windows exe ready to download that receives data directly from the Propeller Backpack and displays the resulting image. Stay tuned!
-Phil
Edit: COLOR VIDEO!
Edit2: Doesn't this kind of qualify for the 'PropCam'?
Oh, and yeah, good work Phil, but you knew that already. Tell Browz to treat you to dinner this weekend to celebrate. I understand he has a penchant for cheap fish and chips joints.
-- Gordon
Ditto the cheap cam caveat, for exactly the reason you've stated. In most situations, the problem can be corrected by supplying a 75-ohm load when the unloaded P-P is more than the 2V spec. But I'm not sure that would work in this case, due to the added series resistance in the video path before it reaches the output connection. One might have to add a load directly to the camera output before it gets to the Backpack.
All,
'Not sure where to go from here. So far, it's only a proof of principle that the Propeller can grab the necessary raw data. Beyond that, what would make it useful, rather than a mere curiosity?
-Phil
Phil, I have been working on "stupid video capture" for some time. I just posted a 16-bit NTSC video driver.
I have added new functions to the Pixelator, and it is becoming quite the audio/video platform:
changing in/out volume on the fly
speeding/slowing output play.
It could be used as a video doorbell, animal-tracking recorder, or time-lapse recorder,
or even a data recorder; the audio channel can carry some other signal to be presented alongside the video.
I'm hoping I can incorporate your color method.
But other than my pet project: how about adding GPS and SD, and you have a package tracker that takes pictures periodically!
Don't worry: once people see something that "can't be done," new users/uses pop up ....
Perry
P.S.
I have been using a 108 MHz clock lately; the increase in speed gives more pixel depth.
Basic and intermediate vision analysis comes to mind, of course. I'm sure you've already looked at what Kwabena has done with the CMUcam4. And of course Hanno's pioneering work.
Since you're in color space now, you could do simple color blob analysis, looking for blobs of a specific size or larger, and reporting their pixel position (e.g. center of blob, plus X and Y dimensions).
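In Perl terms, the simplest single-pass variant might look like the following sketch (the frame layout and the "is it red?" test are made up for illustration; a real version would use tuned thresholds):

    use strict;
    use warnings;
    use List::Util qw(min max sum);

    # Hypothetical 6x4 frame of [R,G,B] pixels: black background, one red blob.
    my @frame = map { [ map { [0, 0, 0] } 0 .. 5 ] } 0 .. 3;
    $frame[ $_->[1] ][ $_->[0] ] = [255, 0, 0] for ([2, 1], [3, 1], [2, 2], [3, 2]);

    my (@xs, @ys);
    for my $y (0 .. $#frame) {
        for my $x (0 .. $#{ $frame[$y] }) {
            my ($r, $g, $b) = @{ $frame[$y][$x] };
            if ($r > 200 && $g < 80 && $b < 80) {   # crude color match
                push @xs, $x;
                push @ys, $y;
            }
        }
    }

    # Report blob center and X/Y extents, as suggested above.
    if (@xs) {
        printf "center (%.1f, %.1f), size %dx%d\n",
            sum(@xs) / @xs, sum(@ys) / @ys,
            max(@xs) - min(@xs) + 1, max(@ys) - min(@ys) + 1;
    }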
With other tricks, you can find edges for simple object detection. With a grating etched in both planes, you can use a laser to create a topographical map of dots; the distance between the dots indicates distance. The usefulness of this system depends on how bright you can get the laser, and how well you can filter out all but its wavelength.
Frame rate doesn't need to be super fast, and you can have modes where you skip lines and pixels; you certainly don't need to deal with both video fields. A lot of video analysis doesn't even require capturing a full frame (or even field) into a buffer. You can do much of this in real time, storing at most only a few lines' worth of video, like the time-base correctors of yore. All you're really looking for most of the time is lowest and highest.
Since there's good tonality in the image, the sensor could also be useful as a smart "compound eye" to look for and follow the brightest object. Outfitted with a wide-angle lens, it could cover a greatly increased field of view, acting as a very nice proximity detector, motion detector, you name it.
-- Gordon
(old) vs. (new)
'Not sure why, exactly (and the Quaker guy looks a bit jaundiced), but I'll take it!
-Phil
@Phil: Very nice work !
Does anyone know which MOSFETs could be used when building my own Propeller Backpack (modifying a Prop dev board)?
On the schematic, n-channel depletion-mode parts are shown!?
Thanks in advance
Christian