
View Full Version : Help with composite NTSC video driver?



ericball
03-27-2009, 11:07 PM
This is phase 1 of a composite video driver which uses the VGA mode instead of the baseband mode. Using the VGA mode it should be able to render twice as many colors. Unfortunately, I get varying results depending on what TV I'm using.

The baseband mode generates color via a circular shifter which is tapped to produce 16 phases of the colorburst frequency, or in math terms Chroma = 20IRE * cos( Fc * t + hue(t) ). However, a different way to express this is two sine waves - one in phase with the colorburst (U) and one 90 degrees out of phase (V), or Chroma = U(t) * cos( Fc * t ) + V(t) * sin( Fc * t ). Therefore, in theory, if the video driver generates chroma values at 0, 90, 180, and 270 degrees, or four times the colorburst frequency, then it should be able to generate different colors depending on the values of U & V (and the luma component Y).
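As a quick sketch of the math above (the IRE values here are made up for illustration; this is not the driver code):

```python
import math

def quadrature_samples(Y, U, V):
    """Sample Y + U*cos(Fc*t) + V*sin(Fc*t) at four points per
    chroma cycle (0, 90, 180, 270 degrees), i.e. at 4x Fc."""
    samples = []
    for k in range(4):
        phase = k * math.pi / 2
        samples.append(round(Y + U * math.cos(phase) + V * math.sin(phase), 6))
    return samples

# The four samples come out as Y+U, Y+V, Y-U, Y-V, so choosing the
# four sample values per pixel selects the hue and saturation:
print(quadrature_samples(50, 20, 0))   # [70.0, 50.0, 30.0, 50.0]
print(quadrature_samples(50, 0, 20))   # [50.0, 70.0, 50.0, 30.0]
```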

And that's what this video driver does. It sets the PLLA clock to four times the colorburst frequency (14.318181 MHz, or 910 cycles per line) and clocks out 3 samples per "pixel", with 240 pixels per line and 240 lines per screen (non-interlaced, 262 lines per frame). This test driver puts out a static graphic of 8 vertical stripes of different colors.
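The timing arithmetic in the paragraph above can be sanity-checked like this (just the numbers, not driver code):

```python
FSC = 315e6 / 88            # NTSC colorburst frequency, ~3.579545 MHz
PLLA = 4 * FSC              # sample clock, ~14.318181 MHz
CLOCKS_PER_LINE = 910       # PLLA clocks per scan line

chroma_cycles_per_line = CLOCKS_PER_LINE / 4       # 227.5 (a half cycle left over)
line_rate = PLLA / CLOCKS_PER_LINE                 # ~15734.27 Hz
field_rate = line_rate / 262                       # ~60.05 Hz, non-interlaced
active_clocks = 240 * 3                            # 240 pixels x 3 samples = 720
blanking_clocks = CLOCKS_PER_LINE - active_clocks  # 190 left for sync/porches

print(chroma_cycles_per_line, round(line_rate, 2), round(field_rate, 2))
```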

On my TVs I get a B&W pattern, almost like the TV isn't picking up the colorburst. However, on my capture card I get bands of color, but they change hue. It's also not 100% stable and there's an interference pattern. (If I suppress the colorburst, then it goes to B&W.) On my HDTV it's B&W and then flashes to color. (I suspect that's something with the image processing it does and it's just not happy with the signal.)

Does anyone have any insight or suggestions?

The code is for the Demoboard. I have included the settings for the Hydra and Hybrid; just change which lines are commented.

Bean
03-27-2009, 11:26 PM
Eric,
It sounds like you are not keeping the color phase consistent. All of the colorburst cycles have to be in phase with each other. Otherwise the TV cannot phase lock to it.

Bean.

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
There is a fine line between arrogance and confidence. Make sure you don't cross it...


Microcontrolled
03-30-2009, 10:24 PM
The phase or frequency is not right. See the image for the colorburst signal encoding diagram. The timing must land on the black-and-white portion of the signal, and only hit the color part when the phase swings high or low enough to reach it. That's my conclusion about what could be going wrong.

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Toys are microcontroled.
Robots are microcontroled.
I am microcontrolled.

Phil Pilgrim (PhiPi)
03-30-2009, 11:14 PM
Expanding on Bean's comment, the phase of the chroma clock must remain constant globally, across every line, field, and frame. Because of NTSC's timing, this means it will not be constant relative to the horizontal sync pulses. Since there are 227.5 chroma clock cycles in every scan line, the colorburst, relative to the horizontal sync, will be 180 degrees out of phase on a line-to-line basis. Moreover, since each frame consists of an odd number of lines, the relative frame-to-frame difference will also be 180 degrees.
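This bookkeeping can be checked in a couple of lines (the 525 below is the standard interlaced NTSC frame length, whereas the driver in this thread runs 262-line non-interlaced frames):

```python
# Phase of a free-running chroma clock at the start of line n, measured
# relative to that line's hsync, assuming 227.5 chroma cycles per line.
def burst_phase(line, cycles_per_line=227.5):
    return (line * cycles_per_line * 360) % 360

print([burst_phase(n) for n in range(4)])  # [0.0, 180.0, 0.0, 180.0]
print(burst_phase(525))                    # 180.0 -> frames alternate too
```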

-Phil

Microcontrolled
03-30-2009, 11:22 PM
I believe you are correct, PhiPi.


ericball
03-31-2009, 06:58 AM
I found the bug, and now it works. (More on that after I respond to PhiPi.)


Phil Pilgrim (PhiPi) said...
Expanding on Bean's comment, the phase of the chroma clock must remain constant globally, across every line, field, and frame. Because of NTSC's timing, this means it will not be constant relative to the horizontal sync pulses. Since there are 227.5 chroma clock cycles in every scan line, the colorburst, relative to the horizontal sync, will be 180 degrees out of phase on a line-to-line basis. Moreover, since each frame consists of an odd number of lines, the relative frame-to-frame difference will also be 180 degrees.

Actually, TVs are pretty tolerant. Consider that the Atari 2600 has a line length of 228 chroma cycles, doesn't invert the colorburst phase every line, lacks equalization pulses, and can produce a varying number of lines per non-interlaced frame (depending on the game).

That being said, I couldn't figure out why my signal exhibited all the signs of colorburst phase issues - because my code doesn't have "phase" in the normal sense. Instead it generates 910 pixels per line (i.e. 227.5 * 4) and handles the phase inversion by coding the +20IRE/0IRE/-20IRE/0IRE colorburst directly. The pels are displayed as 3 pixels, adjusted to the colorburst phase. Oh, and it's 262 lines, non-interlaced, which is a valid signal (just ask any 8- or 16-bit game console).


But all of that is water under the bridge. I found the typo/bug - it's the MOV VSCL at doactive. I should have used MOVS so the pixel counter stayed at 1 instead of being reset to 0 (i.e. 256 PLLA clocks per pixel). I think this means there's an interaction between the pixel counter and WAITVID which can be prevented by ensuring PixelClocks is an even multiple of FrameClocks.


Phil Pilgrim (PhiPi)
03-31-2009, 07:27 AM
ericball said...
Actually, TVs are pretty tolerant. Consider that the Atari 2600 has a line length of 228 chroma cycles, doesn't invert the colorburst phase every line

Since it uses a whole number of chroma cycles per line, it doesn't have to invert the colorburst signal for it to stay in phase. TVs can be very tolerant of sync variances, but they're much less tolerant of liberties taken with the colorburst. This is because the burst is so short and the TV's internal phase lock has to have such a long time constant by comparison.

-Phil

ericball
03-31-2009, 07:45 AM
Phil Pilgrim (PhiPi) said...

ericball said...
Actually, TVs are pretty tolerant. Consider that the Atari 2600 has a line length of 228 chroma cycles, doesn't invert the colorburst phase every line

Since it uses a whole number of chroma cycles per line, it doesn't have to invert the colorburst signal for it to stay in phase. TVs can be very tolerant of sync variances, but they're much less tolerant of liberties taken with the colorburst. This is because the burst is so short and the TV's internal phase lock has to have such a long time constant by comparison.

I never thought of it that way, but you are correct. If I change my code to not invert the colorburst, the picture goes B&W with color interference patterns.