There are protocols used by security cameras to transmit 1080p30 and 4K15 over a single coax line, e.g. AHD. Converter boxes to convert AHD to HDMI are available starting at $50. It's questionable whether some of these converter boxes can output HDMI above 1080p.
The specs are probably not available so we would need to reverse engineer it.
The maximum component resolution may be limited by the sample rate of the ADCs. It might be possible to reduce the frame rate while increasing resolution, so the dotclock stays the same.
No, no, no, no, please no! Now that we use LCDs, it should be fine to output 30p, 24p, or even 15p. I'm hoping we can do 720p30 over HDMI.
720p60 signals at 74.25MHz. Half that would be 37.125MHz, which would need 371.25 MHz for HDMI. That would barely work.
Ok then, 720p24 at 29.7MHz. Or 720p15 at 18.5625MHz. I should be able to get that on my P2ES board. I might have to try just to see how many displays will handle it.
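For anyone who wants to check the arithmetic, here's a quick sketch (plain Python, nothing P2-specific) assuming standard 720p timing scaled linearly with frame rate, and a bit clock of 10x the pixel clock for DVI/HDMI:

```python
# Pixel clock and TMDS bit rate for reduced-frame-rate 720p, assuming the
# standard 74.25 MHz 720p60 pixel clock simply scales with fps and the
# bit-banged HDMI sysclock is 10x the pixel clock. Figures only, no spec table.

BASE_PIXEL_CLOCK_60FPS = 74.25e6

for fps in (60, 30, 24, 15):
    pclk = BASE_PIXEL_CLOCK_60FPS * fps / 60
    print(f"720p{fps}: pixel clock {pclk / 1e6:.4f} MHz, "
          f"bit rate {pclk * 10 / 1e6:.3f} MHz")

# 720p60: 74.2500 MHz -> 742.500 MHz
# 720p30: 37.1250 MHz -> 371.250 MHz
# 720p24: 29.7000 MHz -> 297.000 MHz
# 720p15: 18.5625 MHz -> 185.625 MHz
```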
I will give 720p24 HDMI a try.
I remember reading that a 250MHz bit rate (25MHz pixel rate) is the minimum for HDMI. Less than that is not recognized by receivers.
I've just been playing with my cheapo LCD TV with its VGA input and found it is quite fussy about the horizontal scan frequency for some resolutions but not so much for other resolutions. And you know it's not happy by the way it shifts the display area half off screen. It's a tad bizarre because the image looks complete and stable. Even the resolution reported is correct.
1920x1080@60 is particularly sensitive, it specifically wanted 67 kHz. For a while I was thinking that particular mode was broken over VGA until I was getting extreme with another resolution and it went sideways on me as well.
I remember reading that a 250MHz bit rate (25MHz pixel rate) is the minimum for HDMI. Less than that is not recognized by receivers.
It seems bizarrely arbitrary but still I'll be most amused if that isn't rigidly enforced. Like how PC monitors refuse to drop below 30 kHz horizontal scan rate.
True, maybe it's manicured just to look market-driven. I vaguely remember component into a TV being of limited general usefulness. It seemed entirely dedicated to DVD players.
The lowest frequency I can get 720p to work at on my HDTV is 285MHz. That's 23.03fps. My next frequency down is 22.95fps and that doesn't work. So, I think 24fps is the official low-end, with 1fps allowance downward. It's really nice that we can do 720p24, though! Thanks for suggesting that, Saucy. 720p in portrait mode with an 8x8 font could show 90 columns by 160 lines of code. That would be great.
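A quick back-of-envelope on those numbers, assuming the 285 MHz is the bit rate (10x pixel clock) and standard 720p60 timing scaled with frame rate; the portrait text figures are just integer division with a hypothetical 8x8 font:

```python
# 285 MHz bit rate -> pixel clock -> effective frame rate, assuming standard
# 720p totals (74.25 MHz at 60 fps) scaled linearly with fps.
fps = (285e6 / 10) / 74.25e6 * 60
print(f"{fps:.2f} fps")     # 23.03 fps

# 720p rotated to portrait with an 8x8 font: 720 px wide, 1280 px tall.
print(720 // 8, "columns x", 1280 // 8, "lines")   # 90 columns x 160 lines
```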
848x480@60 would suit 16:9 format monitors. I've tested a 32 MHz dot clock mode for VGA output. That might just scrape in at the top end for the HDMI output.
It won't be an issue for straight outputs, or even straight inputs most likely. But timings that rely on both together, like SD cards, could be in a difficult position for reliability. Dunno, maybe it's as simple as allowing enough leeway for the worst case and a slower sysclock just has to suck it.
potatohead has it right, the MPAA/etc. wanted HDMI because it has DRM (HDCP). They don't want hi-res component because it's not protected (DRMed).
Those guys really know how to ruin the fun.
Fun fact: HDCP was/is entirely pointless, too. Pirates were/are much more interested in ripping the actual video files being played. Recording from the output is impractical and loses a lot of quality.
HDCP 1.x is also a terrible encryption scheme that is fairly easy to break. Someone even reverse-engineered the master key.
Or you can just bypass it entirely by purchasing a badly designed HDMI splitter (although since this bug is fairly common, I think at some point they started doing it on purpose), as they didn't really check whether licensed devices violate the spec.
It is also the reason why an HDMI connection sometimes takes f o r e v e r to be established.
And thus we are stuck with HDMI, no HD VGA / YPbPr anymore and HD-SDI never became a home standard.
24fps is the low end. It is supported for direct film frame rates to preserve cinematography.
I have a 1080p 24fps Blu-ray of an old western, "How the West Was Won". The thing is amazing. Matching that up makes a big difference. I bought it to see that 24fps mode in action.
Yeah Chip. Spoilers, aren't they? Well, what they ended up with is cheapo TVs from Asia that will decrypt and then present the raw stream internally. That ended up being as good as the analog hole was.
It sure is a lot less resource intensive.
Cool to see a bit higher resolution will work at the lower frame rates. People will put that to use.
Re: Component usefulness
Game systems, disc players and some computer graphics cards will output component. Anything that does has a larger color space than plain RGB. I have used it with plasma and CRT HD displays for years.
On both plasma and CRT you get a real black and serious contrast, if you want it.
I like component because just one wire will deliver a grey scale monochrome signal from 240p all the way through 1080p. Super lean, and your choice of sweep rates from 15 kHz on up.
Chip: Have you made any assessment yet about whether the new P2 is bug-free enough to proceed with making some production chips? Have you seen any problems that might indicate another spin? From what you've posted I would guess not. I'm asking because I think Ken said that Parallax was unlikely to ramp up any tools effort until you were sure you had a chip that could go into production. Are we at that point now?
Lol, David, is that a gentle nudge for Chip to stay on track?
Maybe. Why should Chip be the only one having fun? We need P2v2 chips for the rest of us too!
Seriously, I think it made sense for Parallax to avoid spending a lot of resources working on tools until they knew for sure that they had a chip they could sell. I'm just asking if Chip thinks we're there yet.
One thing that was not easy to prove in the FPGA was the DIR/OUT timing relationship pertaining to glitch removal on simultaneous transitions. Chip worked with On Semi to fine-tune that area, I believe.
PS: It's not something that would be a show stopper either way, though.
There is other new stuff still to test, e.g. Sinc2/Sinc3, Hann/Tukey windows, SCOPE instruction. I imagine Chip has a list that he is working through in order and we'll have to be patient.
Re HDMI, I think there are 720x480 60 Hz and 720x576 50 Hz modes at 270 MHz, which could give better looking text than 640 pixels/line. Also from memory, the DVI/HDMI spec says minimum clock is a bit less than 250 MHz, but no point trying that as the new P2 works at 297 MHz.
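For reference, both of those SD modes use a 27 MHz pixel clock in CEA-861, so the quick check below just confirms the 270 MHz figure and the extra text columns (assuming a hypothetical 8-pixel-wide font):

```python
# 720x480@59.94 and 720x576@50 both use a 27 MHz pixel clock in CEA-861,
# giving a 270 MHz TMDS bit rate. Text-column comparison assumes an
# 8-pixel-wide font (an assumption, not from the post).
for name, pclk in (("720x480@59.94", 27e6), ("720x576@50", 27e6)):
    print(f"{name}: pixel clock {pclk / 1e6} MHz, bit rate {pclk * 10 / 1e6} MHz")

print("text columns:", 720 // 8, "vs", 640 // 8)   # 90 vs 80
```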
From the DVI v1.0 spec, p.18:
http://www.cs.unc.edu/Research/stc/FAQs/Video/dvi_spec-V1_0.pdf
The minimum frequency supported is specified to allow the link to differentiate between an active low-pixel format link and a power managed state (inactive link). The lowest pixel format required by the DVI specification is 640x480@60 Hz (clock timing of 25.175 MHz). The DVI link can be considered inactive if the T.M.D.S. clock transitions at less than 22.5 MHz for more than one second.
25.0 MHz works of course, but precisely how low one can go I do not know.
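Here's a minimal sketch of that inactive-link rule; only the 22.5 MHz and one-second figures come from the quote, the function and sampling scheme are invented for illustration:

```python
# A sink may treat the link as inactive once the TMDS clock has stayed below
# 22.5 MHz for more than one second (per the DVI 1.0 wording quoted above).
def link_may_be_dropped(clock_hz_samples, sample_period_s):
    """clock_hz_samples: TMDS clock frequency measured once per sample period."""
    time_below = 0.0
    for f in clock_hz_samples:
        time_below = time_below + sample_period_s if f < 22.5e6 else 0.0
        if time_below > 1.0:
            return True
    return False

# 720p24 at a 28.5 MHz TMDS clock stays safely above the threshold...
print(link_may_be_dropped([28.5e6] * 20, 0.1))      # False
# ...while 720p15's 18.5625 MHz clock would let a sink drop the link.
print(link_may_be_dropped([18.5625e6] * 20, 0.1))   # True
```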
Thanks Tony. 23 MHz looks doable then.
It's not hard to see the intentional blindness in the lowest required wording.
I remember reading that a 250MHz bit rate (25MHz pixel rate) is the minimum for HDMI. Less than that is not recognized by receivers.
Doh! Thankfully I decided to sleep instead of code.
The 720p24 is great news! We can advertise "Digital HD Video". I remember that HDMI devices are required to support 640x480x60. That should include audio too.
DVI was designed in the CRT era. Even 60Hz refresh was annoying then. There was probably no practical use for lower pixel rates. Although when HDMI came along it should have been lowered to support 480i without pixel doubling.
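The pixel-doubling point is easy to see with the BT.601 numbers; a rough check rather than anything quoted from either spec:

```python
# Native 480i/576i sampling is 13.5 MHz (BT.601), below the ~25 MHz floor of
# the lowest required DVI format, which is why CEA-861 carries them
# pixel-doubled (1440 wide) at 27 MHz.
DVI_MIN_PIXEL_CLOCK = 25.175e6   # 640x480@60
NATIVE_480I = 13.5e6             # BT.601 luma sample rate

print(NATIVE_480I < DVI_MIN_PIXEL_CLOCK)          # True: too slow to send as-is
print(NATIVE_480I * 2 / 1e6, "MHz when doubled")  # 27.0 MHz, above the floor
```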
I've wanted to try ATSC output. I didn't since the IQ modulator was broken. But the MPEG2 encoding is likely a bigger problem. Except if we are outputting text. If we use a 16x32 font then each character will fill exactly 2 macroblocks. We would just be assembling pre-calculated macroblocks. For 1080p that would be 120x33 characters. Scrolling could be easily done using motion compensation. Note: This is not a suggestion for Chip. We haven't heard about all the ADC upgrades yet.
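Just to sketch how the precomputed-macroblock idea could map out (my own rough take, not anything anyone has committed to; MPEG-2 macroblocks are 16x16, so a 16x32 glyph is two of them stacked):

```python
# Character grid and macroblock addressing for "text as precomputed MPEG-2
# macroblocks" on a 1080p frame. Cell size and the raster-order addressing
# scheme are assumptions for illustration only.
FRAME_W, FRAME_H = 1920, 1080
MB = 16                        # MPEG-2 macroblock is 16x16 pixels
CELL_W, CELL_H = 16, 32        # one glyph = 1 macroblock wide, 2 tall

cols = FRAME_W // CELL_W       # 120 character columns
rows = FRAME_H // CELL_H       # 33 full character rows (1080/32 = 33.75)
mb_per_row = FRAME_W // MB     # 120 macroblocks across

def glyph_macroblocks(row, col):
    """Raster-order indices of the two macroblocks covered by the character
    cell at (row, col); a renderer would copy the glyph's precomputed coded
    macroblocks into these slots."""
    top = (row * 2) * mb_per_row + col
    bottom = (row * 2 + 1) * mb_per_row + col
    return top, bottom

print(cols, "x", rows)             # 120 x 33, matching the figure above
print(glyph_macroblocks(0, 0))     # (0, 120)
```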
https://webstore.ansi.org/standards/ansi/cta7702017-1663326
I tried 720p15, which would signal at 185.625MHz, but it doesn't work on my HDTV.
It's a progressive 24fps scan.
So, the whole image is sent in order, not odd and even lines separated.
Here's 1280x720@24 tweaked to run at 28 MHz VGA out - Tested on my LCD TV. I presume it'll work over HDMI out too.