I'm not really sure why you wouldn't just use VGA at 1080p...
Don't most HDTVs have a VGA input?
I've used both VGA and component input to my TVs and I can't really tell the difference...
But, I can see a difference between both of those and DVI or HDMI. The digital signalling gives you a perfect picture.
That may not justify adding the $10 or so for the encoder chip to many people though.
My old 57" 1080i Toshiba does not have hdmi or vga, it does do 1080i over component.
If, when using VGA on a Prop2 board, a software change alone would let me switch to the VGA-to-3.5mm phone-plug adapter cable, that would be a bonus.
That any pin on the Prop2 can be changed to do anything in software (as long as no external circuit is in the way) is a game changer.
Video plus two audio channels on a TRRS connector can be changed to 2.1 audio with just software. Cool.
Now that you mention it, component is probably a safer route than VGA, since nearly all TVs support it.
Still, I'm looking forward to having a lot of pins free to drive an HDMI encoder with 24-bit color.
Most of the new chips also support sound, so you can have perfect digital sound and picture with one cable.
Back when the color transforms were being tossed about, I mentioned YPbPr to Chip a couple of times. YPbPr, sometimes incorrectly labeled YCbCr, is present on nearly every higher-definition, higher-quality TV. When I compare the HDMI output of a PS3 with its YPbPr output, how close they are is impressive; on my displays, it's near pixel perfect. On standard-definition TVs, it's not always possible to know whether they will take a progressive signal or only an interlaced one. Either way, horizontal resolution on larger sets is usually around 640 pixels, in color, with no artifacts other than those associated with the display itself.
I've found a ton of HDTVs that don't provide VGA / PC inputs. Not sure why, but they don't.
I was with you up to the 640 pixels... I think almost all TVs are now 1920 horizontal pixels...
The difference between HDMI and component depends a lot on what you're looking at...
For a photo, I'll agree it's hard to tell.
But, when showing text I can tell right away.
Most of the early HDTVs (like mine) only support the original standards: 480i, 480p, 1080i. My TV doesn't support 720p. It's pretty obvious why they support 1080i: since it's interlaced, they can use the same old trick they used on PC monitors, halving the video frequency and using only half the actual vertical pixels. This was a common trick, IIRC; they would display 1024x768 interlaced on a monitor that otherwise supported 640x480 as its max resolution. 1080i only uses 540 actual lines of the display (CRT) at a time, which means you only need a 540-line CRT. On the rear-projection TVs there isn't really a CRT limitation, because they use three mono CRTs without a color mask, so there is no grid screen; I think it's more of a self-imposed limitation of slow hardware.
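For anyone who wants the line bookkeeping spelled out, here's a tiny C sketch of the field split described above. It's purely illustrative; the numbers are just the frame/field line counts, not any standard's exact blanking layout.

/* 1080i field bookkeeping: one 1080-line frame is delivered as two
   540-line fields, so a CRT that resolves only 540 lines per pass
   can still accept the signal. */
#include <stdio.h>

int main(void) {
    const int frame_lines = 1080;
    for (int field = 0; field < 2; field++) {
        int lines = frame_lines / 2;   /* 540 lines per field */
        int first = field;             /* field 0: even frame lines, field 1: odd */
        printf("field %d: %d lines (frame lines %d, %d, ..., %d)\n",
               field, lines, first, first + 2, first + 2 * (lines - 1));
    }
    return 0;
}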
For 1080i, it can be obvious at times that you are seeing an interlaced display, but by and large, I don't notice the artifacts when using my PS3 in 1080i.
To be clear, I was writing about ordinary SD television displays. My HD set does the full 1080p via component input. I plan on doing some text with the P2, but I think a lot of how good text looks depends on the video processor in the TV. The last one I had was quite good at lower-resolution signals but fell kind of flat on higher-resolution ones (720+). And that TV died. This one is the opposite: SD-type signals are kind of crappy, but the HD component is really quite good, assuming good cables and such.
HDMI is nice, but we will have come way up the video curve --plenty far for me. Exciting times.
BTW, many sets will display 1080i on component. They just don't talk about it. The various media companies didn't want an analog hole and requested the component be interlaced, even though the device would be capable otherwise.
Funny, they didn't even think about the little "open HDMI" pass through devices you can get for a song.
1080i is, at this moment, the European EBU broadcasting system.
Interlaced (i) is smooth in pans; it softens the image on fast horizontal moves.
Progressive (p) gives a stroboscopic effect in pans.
Static text for i and scrolling text for p are the most difficult test situations.
Truly uncompressed p and i differ a lot at the source, but after compression for distribution/storage...
Real 1080p at 25 Hz needs twice the power of 1080i.
I thought you were referring to the fairly large number of Sony and other rear-projection "HD Ready" sets that only advertised 480i, 480p, and sometimes 720p, but were later found to also be able to process 1080i...
Err... yep! That's what I intended to say, but an "i" got in there instead. Both the ones in the office do, despite their documentation mentioning 1080i as the upper limit.
The real screwballs of the HDTV generation were the Toshiba (or probably any, I had a Toshiba) 4:3 HDTVs. Obviously HDTV is inherently 16:9, so guess what it looks like in 4:3? Fortunately the usable lifespan of that TV was spent using the scan doubler. Once HDTV was available OTA, I had the opportunity to watch a Thanksgiving special with Harry Connick Jr looking coked to the gills (my speculation).
Ironically, it becomes harder and harder to distinguish a good 480p master from an HD master, whether it's 720 [most likely] or 1080. Compression of HD content has made it a bit of a joke IMHO. The original OTA broadcasts utilized the full bandwidth allowed for HD, and they were sharp as all heck (hence the ability to see Harry *real* well).
One nice thing about Chip's analog video output for me is that I can use it to provide both digital and analog outputs on a DVI connector...
People could just use the DVI to VGA adapter to get VGA and if they wanted component, they could use the VGA to component cable...
I don't think anyone designs a board with a large DVI connector anymore.
If they can only do DVI and not HDMI, they do DVI over an HDMI connector instead.
This cable will do:
http://www.ebay.com/itm/For-HDTV-HD-6ft-Gold-24-1-DVI-D-Male-to-Male-HDMI-Cable-/390491565316?pt=US_Video_Cables_Adapters&hash=item5aeb1c6d04
I think a lot of people (myself included) avoid HDMI because the HDMI people will send their lawyers after you if you try to sell something with the letters "HDMI" on it without giving them a barrel of cash first... DVI to HDMI adapters are cheap...
You're right, a little bit too pricey.
Maybe we should go with a DisplayPort connector and let the user buy the "illegal" DisplayPort-to-HDMI M-M cable.
--- HDMI ---
The fees (annual and royalties) for a low-volume HDMI Adopter agreement:
$5k/year fee + flat $1/unit administration fee + variable per-unit royalty (designed to lower the up-front barrier-to-entry cost for customers until they reach higher volumes).
The royalty is device-based, not dependent on the number of ports, chips, or connectors:
$0.15/unit with no HDMI logo, or
$0.05/unit with the HDMI logo (requires compliance testing).
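To make that arithmetic concrete, here's a quick C sketch of the total and per-unit cost. The fee figures come straight from the list above; the 1000-unit annual volume is just a made-up example.

/* HDMI adopter cost sketch. The $5k/year, $1/unit admin fee, and
   $0.15 / $0.05 royalties are from the quoted agreement; the unit
   volume is hypothetical. */
#include <stdio.h>

int main(void) {
    const double annual_fee  = 5000.0;
    const double admin_fee   = 1.00;   /* flat, per unit */
    const double roy_no_logo = 0.15;   /* per unit, no HDMI logo */
    const double roy_logo    = 0.05;   /* per unit, logo + compliance testing */
    const int units = 1000;            /* example annual volume */

    double no_logo = annual_fee + units * (admin_fee + roy_no_logo);
    double logo    = annual_fee + units * (admin_fee + roy_logo);
    printf("no logo:   $%.2f total ($%.2f per unit)\n", no_logo, no_logo / units);
    printf("with logo: $%.2f total ($%.2f per unit)\n", logo, logo / units);
    return 0;
}

At that kind of volume the $5k annual fee dominates; the logo royalty choice only starts to matter once the fixed costs are amortized over much larger runs.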
Analog VGA signaling does make more sense than digital signaling; with a good monitor and a good cable you get a much clearer picture than the digital stuff, and you can push far higher resolutions than the digital signaling standards support. Though that's VGA, not component.
A good monitor and cable will give you a better result at even higher resolutions every time. This so-called High Definition video is a lot lower resolution than what I used to run on my monitors (and still do).
Not on text it doesn't. Analog has its advantages for pictures and video, but it's crappy for high-resolution text. Even with very short cables, significant artifacts can be seen on text with pixel sizes near the intended pixel clock.
What resolution do you use with your monitors?
The artifacts are likely due to signal reflections resulting from impedance mismatch. The monitor is 75 ohms, but the add-on boards are 47 ohms. Using a 28 ohm series resistor from the DAC will up the impedance to the 75 ohms that the monitor likes. This will help things a lot, as well as allow you to use larger RGB coefficients to get more DAC span (ie more resolution). The DACs on the actual Prop2 chip will be 75 ohms, so no impedance matching for video will be necessary.
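For anyone following along, the series resistor is just making the source side sum to the line impedance. Here's a minimal C sketch using the figures from Chip's post (47-ohm board, 75-ohm monitor); the divider note is my gloss on why the larger coefficients become usable.

/* Series matching per the post above: the DAC-side source impedance
   should sum to the 75-ohm line that the monitor terminates. */
#include <stdio.h>

int main(void) {
    const double z_load  = 75.0;   /* monitor termination */
    const double z_board = 47.0;   /* add-on board source impedance */
    double r_series = z_load - z_board;   /* 28 ohms */

    /* A matched source also forms a 2:1 divider with the termination,
       so only half the DAC swing reaches the monitor, which is why
       there's room for larger RGB coefficients / more DAC span. */
    double ratio = z_load / (z_board + r_series + z_load);
    printf("series resistor: %.0f ohms; load sees %.0f%% of DAC swing\n",
           r_series, ratio * 100.0);
    return 0;
}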
Modern flat panel monitors are essentially digital output displays; feeding them a digital signal will give the best possible result.
If you feed an HDTV or new monitor an analog signal, it has to use its ADCs to sample your analog signal. Sharp lines are a problem because there is no reference pixel clock in your analog signal, so the monitor makes its own pixel clock from your sync pulses and tries to lock to the analog input. But this is never perfect, and so sharp edges will be jittery...
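To illustrate why that reconstructed clock matters: the monitor only sees sync edges, so it measures the line rate and multiplies by an assumed total-pixels-per-line figure for the mode it thinks it detected. A C sketch with the standard 1080p60 totals (2200 x 1125); any error in the assumed total shows up as the edge jitter described above.

/* Reconstructing a pixel clock from analog sync: measure the line
   rate from hsync, assume a pixels-per-line total for the detected
   mode, multiply. Totals here are the standard 1080p60 timing. */
#include <stdio.h>

int main(void) {
    const double h_total = 2200.0;  /* 1920 active + horizontal blanking */
    const double v_total = 1125.0;  /* 1080 active + vertical blanking */
    const double refresh = 60.0;    /* Hz, measured from vsync */

    double line_rate   = v_total * refresh;    /* ~67.5 kHz, from hsync */
    double pixel_clock = h_total * line_rate;  /* 148.5 MHz, assumed */
    printf("line rate %.1f kHz -> reconstructed pixel clock %.1f MHz\n",
           line_rate / 1e3, pixel_clock / 1e6);
    return 0;
}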
If you can find me a digital display that supports 2048x1536 at 32 bpp with zero artifacting, then I'll consider using digital displays for everything. For now I keep my high-end analog CRT displays around, since they will do up to 4096x3072 with no noticeable artifacting, and I usually run at 2048x1536 at 16 bpp for most things.
Give us the model name, so we can prove to you that there's no way that hardware can support 4096x3072.
The Apple Retina displays have 2560x1600. For a digital display, they are insane, and currently break a lot of software too. They are not cheap, but oh man! They do precisely what you are writing about.
With reasonable cabling, yeah high end CRT displays do that kind of thing. Native digital display resolutions are at issue. If odd resolutions are in play, the digital signaling isn't going to play out well.
I don't disagree with either of you. Chip is right about reflections, and I was speaking more to cable length and quality issues. For my workbench / desktop, I really like analog signals and am pretty stoked at what the P2 can do; HDMI / DVI lacks the simplicity and quality in that scenario. Agreed. However, digital signaling turns out to be superior over any significant distance, and I have that use case regularly in the form of conference room displays, projectors, etc... A similar one happens in visualization / design reviews, where 4-kilopixel displays are often the norm, with 6 seen in high-end review rooms.
SGI used high end edge blending to merge multiple analog displays. Took a very long time to set that all up, but the result is a 3D wall at 4-6Kpixels. Very cool stuff. The move to digital makes that all a bit easier and it can get pixel perfect over significant cable distances, something I never saw with analog devices at all.
Finally, when we settle in on mainstream use cases, higher resolutions on analog tend to be at or near the display's capability. I can't wait to do some component testing on my plasma TV. A friend has an analog SONY HD set capable of 1080p, and analog component video is amazing on that set; the digital looks harsh by comparison. The only artifacts on that one, given appropriate cable distances and impedance, are shadow mask / convergence issues. (Two things I would be shocked to find absent from those 4K-pixel CRT images you are looking at, David.)
So far, I've seen precise analog signals render very nicely, though not quite perfectly, on the HD sets I've tried. Up-scaling is a software feature too; some sets nail it, others? Meh... Digital signaling has a clear advantage here, and that's the display trend going forward. Detailed text in such a scenario does artifact, because the analog signal isn't matched to the native display resolution; digital signals don't do that unless they too are mismatched, in which case it's a bigger mess than analog often is.
I'm standardizing on 1080p/i working resolutions. My primary laptop has that display resolution, TV, etc... Again, I'm eager to test the P2 and find optimal signals! Good times ahead for everybody.
My only CRT displays are an older SONY TV with component. It's an SD TV and I'm working on a component driver for it today, and a smallish CRT VGA that is good to 1600 pixels or so. It will take higher resolution signals no problem, but those are beneath the CRT mask / convergence factors and artifact considerably.
The point of my post really was to highlight the mainstream use case today: analog inputs aren't generally optimal for it, unless the resolution is something the display can scale and/or long distances aren't in play, which is why people are hinting at HDMI.
For me personally, I'm quite pleased that mess isn't on the P2. An add-on chip can contain the licensing / IP mess, and a very good signal path can deliver P2 graphics to it pixel perfect, once things are optimized.
I'm also stoked over YPbPr signaling and the color space in hardware we've got now! Simple signal level changes can address saturation and tint, and the monochrome (Y) displays just about anything on one pin too.
@Chip, thanks. I'll deffo try the resistor when working on the HDTV.
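For the curious, the color-space math in question is a simple matrix. Here's a hedged C sketch using the standard BT.601 coefficients (the hardware's actual coefficients may differ); scaling Pb/Pr is exactly the kind of simple signal level change that adjusts saturation, and Y alone is the monochrome signal mentioned above.

/* RGB -> YPbPr with the published BT.601 coefficients.
   Inputs are normalized 0..1 RGB; sat scales the two
   color-difference channels. */
#include <stdio.h>

typedef struct { double y, pb, pr; } ypbpr_t;

static ypbpr_t rgb_to_ypbpr(double r, double g, double b, double sat) {
    double y = 0.299 * r + 0.587 * g + 0.114 * b;  /* luma: the Y pin */
    ypbpr_t out = { y, 0.564 * (b - y) * sat, 0.713 * (r - y) * sat };
    return out;
}

int main(void) {
    ypbpr_t c = rgb_to_ypbpr(1.0, 0.5, 0.25, 1.0);  /* an orange-ish pixel */
    printf("Y=%.3f Pb=%.3f Pr=%.3f\n", c.y, c.pb, c.pr);
    return 0;
}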
http://www.dell.com/content/topics/topic.aspx/global/products/monitors/topics/en/monitor_3007wfp?c=us&l=en&s=gen&~section=specs
I would argue that the scaling of the input resolution to the actual display resolution is usually also optimized with a digital signal.
Some displays will trim the area to a multiple and then scale nicely, while others will dither pixel widths; what they do with an analog signal varies too.
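As an illustration of those two scaler behaviors, here's a small C sketch. The crop-to-integer-multiple logic is my own illustrative guess at the "trim to a multiple" approach, not any particular scaler chip's algorithm.

/* Two ways a panel can map a non-native source width:
   crop so one source pixel becomes exactly k panel pixels (sharp),
   or stretch fractionally (dithered / uneven pixel widths). */
#include <stdio.h>

int main(void) {
    const int input  = 1280;  /* example source width */
    const int native = 1920;  /* example panel width */

    int k = (native + input - 1) / input;  /* smallest integer scale factor */
    int shown = native / k;                /* source columns actually used */
    printf("trim: show %d of %d columns at %dx (drop %d)\n",
           shown, input, k, input - shown);
    printf("stretch: %.3fx with uneven pixel widths\n",
           (double)native / input);
    return 0;
}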
Re: High resolution CRT.
I had a pro quality 16:9 CRT VGA that was good to about 2500 pixels. It had a fine mask and it weighed a ton! It was for HD video editing, visualization and CAD. It was a gift due to a scratch in the glass, which I blended down with some CD filler chemical. No way that display would render 3K pixels.
As far as dual connections go, there's good and bad there. They do allow parallelization of large displays, but the software needs to support that. I personally don't need over 2K pixels for anything.
Learn something new every day. : ]