In case anyone is trying this and not getting the brightness or saturation levels they were hoping for, attached is a version that lets you adjust the gains of each. I noticed that, once it got dark and I wasn't getting any daylight in the shop, the camera's automatic gain wasn't doing as well, and I had to boost things in the program. Eventually, I'll come up with some automatic gain controls in the program. But for now, it will have to be a manual adjustment.
BTW, here's the kind of VGA display I'm getting with dithering:
If it helps, here is my 128 x 96 pixel, 64-color VGA driver. Perhaps the larger pixels are a better fit for a reduced picture resolution, and it needs only 12 kB of bitmap memory.
Phil- Sorry I haven't posted a success photo yet- I got swamped, and my initial experiments with some cameras (my camcorder with suspect connectors and my grayscale NTSC camera) didn't work out- I'll have to confirm I'm doing everything correctly. A suggestion to ensure first-time success: how about having the Prop generate a video test pattern with one cog, output it to an optional device, and then try to grab it with your capture cog? That would eliminate problems caused by cameras with different video standards (PAL), voltage levels, etc...
Hanno
No dice on the PAL stuff I'm afraid. I'd have to rewrite all the timings for sync and capture.
BTW, the Propeller video output circuit would have to be modified to feed the capture program, because its output network expects a terminated 75-ohm line and drives an unloaded line with too much voltage for the Backpack to sync on. A 220-ohm load resistor between the driving Prop and the Backpack would probably fix the problem.
All,
I'm finishing up some mods that make it easier to control the luma and chroma gains and keep them in bounds. The programs I've posted up to now have some overflow issues.
Attached is the latest update to the program. I've fixed some overflow issues, limited Y, I, and Q to NTSC specs, and added automatic gain control for the luma and chroma.
Here's a photo of my setup. The image on the screen is direct VGA output from the Propeller Backpack proto daughterboard.
I noticed that, after running for a couple hours, the luma and chroma sensitivity both started to decrease, requiring ever larger software gains to maintain constant VGA brightness, with a consequent loss of detail and increasing chroma speckle. This is due to the voltage on the ADC side of the 4.7 uF input cap slowly sinking. To keep this from happening, I added code to recharge it by outputting a high pulse on A16 during sync times. This acts as a weak clamp and apparently provides enough DC restoration to keep the burst troughs above 0 V. It seems to have cured the problem. Attached is the modified code.
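For anyone curious what that looks like in code, here is a minimal Spin sketch of the idea only; the real program does this inside the capture cog's PASM loop, and the hold time below is an arbitrary placeholder:

PRI recharge_cap | t
  ' Rough sketch: during a sync interval, drive the clamp pin (A16) high
  ' to top up the ADC side of the input cap, then float the pin again so
  ' it can't disturb the sigma-delta feedback during active video.
  t := cnt
  dira[16]~~                         ' make P16 an output
  outa[16]~~                         ' pull the cap's ADC side up
  waitcnt(t + clkfreq / 10_000)      ' hold roughly 100 us (placeholder timing)
  outa[16]~                          ' release
  dira[16]~                          ' tri-state again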
Phil: Since you really understand the intricacies of NTSC, is it possible to generate limited color, or at least a mix of RGB, using only 2 pins? I have tried using 1 and 2 pins and existing drivers with some success. Am I correct in thinking that we can get 4 brightness levels including black from 2 pins, and that the RGB color can be superimposed on this? IIRC the second pin in the sequence is mandatory for any color. Basically, what I am asking is: is it possible to generate 4 or 5 colors (a red, a green, a blue, and black and/or white)?
The second pin must contain some critical form of the modulation.
Hi Phil, can I join the line of people requesting a moment of your brilliant genius?!
I have some fast counter chips on the way and I have an idea that it can be possible to store a frame of video into a 512k ram chip and then clock it out and bypass all the video and color limitations of the propeller. Effectively, increase the information on a screen from 32k to 512k.
I think the ram chip is fast enough and I think the information can be stored in the ram chip as phase angle, amplitude etc. Maybe I'm crazy but...
Would it be possible for you to give me a listing of the values for a single line of your video driver? Maybe a .csv comma separated file, or just a .hex or .bin file. I'd like to study the colorburst waveform and the phase and amplitude of the waveform and just ponder how much information would be lost if you stored it into a ram chip.
It is just that, to my simplistic way of thinking, NTSC is bandwidth limited to 6 MHz and you have 30 frames per second, so that is 200k of information per frame, so surely a 512k RAM chip can contain enough information to replicate that signal (2x Nyquist sampling, right?)
Thanks for your consideration, and thanks for giving the gift of sight to all the Propeller-driven robots!
Am I correct in thinking that we can get 4 brightness levels including black from 2 pins, and that the RGB color can be superimposed on this?
One of those brightness levels has to be the sync level and, for color, you need a burst on the backporch and modulation of the luma. I'm not sure what would happen if the burst troughs went to sync level, but my gut feeling is that, with most monitors, it wouldn't matter. Here's an example illustration (not to scale):
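In text form, one way the four levels might be assigned (just a guess at a workable mapping, not the attached illustration):

  level 0 (both pins low)   ->  sync tip
  level 1                   ->  blanking/black, with the burst riding here
  level 2                   ->  mid gray
  level 3 (both pins high)  ->  white

with the chroma superimposed on the gray levels during active video.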
Another option might be to use filtered DUTY modulation for the sync and gray levels and superimpose the chroma on top of that with the other output.
BTW, Eric Ball is the NTSC guru here. Maybe he can weigh in on both queries.
It is just that, to my simplistic way of thinking, NTSC is bandwidth limited to 6 MHz and you have 30 frames per second, so that is 200k of information per frame, ...
To satisfy the Nyquist criterion, you have to sample at twice the bandwidth or faster. More to the point, though, the chroma subcarrier is 3.579545 MHz. To get accurate phase shifts for the chroma, I think your output rate would have to be more like four times that, or about 14.3 Msps.
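To put rough numbers on it (my arithmetic, so check it): 4 x 3.579545 MHz is about 14.32 Msps, and one frame lasts 1/29.97 s, which works out to roughly 478,000 samples per frame. At one byte per sample that's about 467 KB, so a 512K x 8 RAM would just barely hold one complete frame, with nothing left over.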
You're right: that is correct. I was just trying to imagine how or if it could be done with four analog levels. The reason I think it's okay for the burst troughs to reach sync level is that the chroma does that in the Propeller's fully-saturated colors without throwing the monitor out of sync.
BTW, generating NTSC, rather than capturing it, is an interesting topic in its own right that deserves to have a thread of its own.
-Phil
P.S. On your thread "Better VGA DAC resistors" I have posted one pic that may be of interest.
Yup, that looks right, except that C3 connects to P14, not C1-R3. In fact, you could use P14 as both a sigma-delta input and as an output during sync for DC restoration purposes. Also, by using higher-valued resistors in the sigma-delta section, you can increase the effectiveness of the DC restoration, since P12 would have less of an influence on C2's long-term charge characteristics. Also, since you've got the pads to do so, you might as well connect a complement of C3 to Vdd.
Hi Phil,
An idea to reduce required pins. If you only want to capture say 10 frames/second- then you could switch between looking for the sync and grabbing data- with the same pins. That way you need just the 2 ADC pins + the I/Q pins for color.
Hanno
That's an interesting idea. The problem I see is that you want the clamp signal for the sync detector on one side of the input cap, and the video signal on the other side, both to be low impedance, so the cap can be charged to the correct level very quickly at the end of each horizontal line and during sync. Then you have to tri-state the clamp signal so it doesn't affect the charge outside the frontporch and sync zones. But, with the clamp signal line doubling as ADC feedback, you want a high impedance on the input, so the sigma-delta circuit doesn't have too much gain. But that limits the ability to do the necessary clamping, since the time required to do it is lengthened by the higher input impedance. Moreover, you can't tri-state the clamp/feedback line during acquisition, so it will affect the charge on the cap. I wish I could see a way to reconcile the two requirements, but it escapes me at the moment.
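To put a rough number on that conflict (assuming, purely for illustration, a 1k sigma-delta feedback resistor and the 4.7 uF input cap mentioned earlier): clamping through the 1k resistor gives a time constant of about 4.7 ms, whereas driving the cap directly from a pin with a source impedance of a few tens of ohms gives something on the order of 100-200 us. The same cap that can be nudged back toward the correct level over a few sync intervals in the second case barely moves in the first.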
Hi Phil,
My color NTSC camera arrived today and I got my first picture! I wasn't able to get an image for any external resistor values above 100ohm. I think I'm a bit more colorful in real life, but I'm ecstatic about this first result!
PropScope measurement of input into ovl pin closest to vid pins:
Self portrait:
That looks pretty good. Try it with some high-saturation subjects, such as packaged consumer goods and see if the colors come through better.
I've pushed the stack on this project momentarily to pursue the output side of the equation. I'm working on a way to increase the number of colors/shades displayable on a VGA screen without resorting to external memory. This should help the rendering of captured images significantly. At least, that's what I'm hoping.
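Just to illustrate the general idea of trading pixel-to-pixel detail for apparent color depth (this is plain 2x2 ordered dithering, not necessarily Phil's method), a Spin sketch:

PRI dither2x2(x, y, level) : out | t
  ' Map a 6-bit brightness (0..63) onto 4 hardware levels (0..3), letting
  ' the pixel's position decide which way to round.  Averaged over a small
  ' area, the eye sees many more than 4 shades.
  t := lookupz(((y & 1) << 1) + (x & 1) : 0, 8, 12, 4)   ' 2x2 Bayer offsets
  out := (level + t) >> 4 <# 3                            ' quantize and clamp to 0..3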
Hi Phil,
I tweaked the voltage for the camera and resistor to get this image. Red multimeter, blue PropScope and orange RC glider!
I'm using ViewPort to view the result on my pc. That allows me to easily capture screenshots, get pixel values by mousing over a position, see graphs of rows/columns, and perform OpenCV video processing- for example, to find a human face. The screenshot shows the result of the OpenCV face finder; it found the position and size of my face. That data is streamed back to the prop to let the prop do things like steer a robot towards humans or guide industrial equipment to specific colors or shapes.
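As a trivial example of the kind of thing the Prop can do with that streamed data (a toy sketch, not part of ViewPort; the sign convention is assumed), it can turn toward the face in proportion to how far the face sits from the image center:

PRI steer_toward(face_x, img_w) : turn
  ' Returns a turn command in -100..+100 from the face's x position,
  ' positive meaning turn right.
  turn := (face_x - img_w / 2) * 100 / (img_w / 2)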
If you haven't done it already, I would like to improve the update rate to whatever the conduit can support, after I submit my book, hopefully later this week.
Hanno
That looks good. I'm working on converting the Spin RGB rendering code to PASM to (hopefully) get a close-to-real-time update rate. It's about half done.
You know, this video capture could be perfect for holding a quadcopter in a fixed location. Get the quadcopter to a GPS location within a few metres. Capture a picture from a camera looking straight down, then detect whether the picture has moved in a certain direction and use that to get an accurate lock. You don't need or want high res video for that application - it just adds to the processing time.
I believe this is the way an optical mouse works. You might even want to decrease the resolution - 16x16 might be enough.
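For what it's worth, here is a rough Spin sketch of that optical-mouse-style idea: compare the new 16x16 frame against the previous one at a few candidate offsets and keep the offset with the smallest sum of absolute differences. The frame format (16x16 bytes, row-major) and the +/-2 pixel search range are assumptions:

PRI find_shift(p_prev, p_curr) : best | dx, dy, x, y, sad, best_sad
  best_sad := posx
  repeat dy from -2 to 2
    repeat dx from -2 to 2
      sad := 0
      repeat y from 2 to 13                                 ' skip a 2-pixel border
        repeat x from 2 to 13
          sad += ||(byte[p_curr][y * 16 + x] - byte[p_prev][(y + dy) * 16 + x + dx])
      if sad < best_sad
        best_sad := sad
        best := ((dx & $FF) << 8) | (dy & $FF)              ' pack dx,dy as signed bytes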
It is very exciting to see you are working on a close-to-real-time update rate. This is top-notch work, PhiPi!
I have been trying to get this working with an NTSC output instead of VGA, and finally have some success.
Here are my first before and after shots of a video capture using my "pixelator" platform instead of the "Backpack".
This old shopworn "proto board" is going to be used as an oscilloscope to help debug the workings of my last "proto board" that puts all of this together.
Perry
Thanks Phil for all your great work. I thought I would bump this thread.
How is the rendering improvement coming?
One afternoon I got the idea that one could capture a whole screen and then do the rendering to a .bmp file on an SD card.
No display driver needed!
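For what it's worth, here is roughly what that could look like in Spin, assuming a 128 x 96 buffer of 24-bit pixels (bottom row first, B,G,R order) and an SD driver with fsrw-style popen/pwrite/pclose calls; all of those details are assumptions, just a sketch of the idea:

CON
  W        = 128                    ' assumed capture width
  H        = 96                     ' assumed capture height
  ROWBYTES = W * 3                  ' 24-bit rows; 384 is already a multiple of 4

OBJ
  sd : "fsrw"                       ' assumed SD driver (mount the card first)

VAR
  byte hdr[54]

PUB save_bmp(p_rgb)
  ' Build a minimal 54-byte BMP header, then dump the pixel buffer.
  bytefill(@hdr, 0, 54)
  hdr[0] := "B"
  hdr[1] := "M"
  put32(@hdr + 2, 54 + ROWBYTES * H)              ' total file size
  put32(@hdr + 10, 54)                            ' offset to pixel data
  put32(@hdr + 14, 40)                            ' BITMAPINFOHEADER size
  put32(@hdr + 18, W)
  put32(@hdr + 22, H)
  hdr[26] := 1                                    ' planes
  hdr[28] := 24                                   ' bits per pixel
  put32(@hdr + 34, ROWBYTES * H)                  ' image size
  sd.popen(string("capture.bmp"), "w")
  sd.pwrite(@hdr, 54)
  sd.pwrite(p_rgb, ROWBYTES * H)
  sd.pclose

PRI put32(p, v)
  ' Write a long as 4 little-endian bytes (the header fields aren't long-aligned).
  byte[p][0] := v
  byte[p][1] := v >> 8
  byte[p][2] := v >> 16
  byte[p][3] := v >> 24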
Here are a few captured images, converted to PNG format.
Comments?
Looking forward to lots of fun with color computer vision with the prop- thanks Phil!
Hanno
http://forums.parallax.com/showthread.php?122473-VGA-bitmap-driver-with-large-pixels-of-any-color&p=907056#post907056
Andy
I've been thinking of cool things to do with this.
First thing that comes to mind is adding a camera to my BOE bot...
-Phil
Saw this one, which is close, but I'm not sure whether the mounting hole slots extend in far enough.
Look at the attached picture.
I found on my 7'' PAL TVs that one of the problems is that the burst levels and phase are not correct.
Does this reduced circuit diagram look right?
I'm going to see if I can implement it on the Quickstart in that little ADC area...