Paranoia: any chance P2 3.3V GPIO will fry VGA monitors by mistake?
rogloh
Posts: 5,786
in Propeller 2
So unlike the typical resistive video DACs people used in the past for generating analog RGB VGA on P1s, it appears the P2 is designed to natively support driving VGA RGB directly from its IO pins. These can be configured as DACs outputting up to 2V through a 75 ohm source resistance, which becomes 1V into the 75 ohm terminated load in the VGA monitor itself.
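Just to spell out the divider math (a quick Python back-of-envelope using the figures above):

    # P2 pin as a 2V DAC behind a 75 ohm source resistance,
    # driving the monitor's 75 ohm termination: a simple voltage divider.
    v_dac = 2.0     # volts, open-circuit DAC level
    r_src = 75.0    # ohms, pin source resistance in DAC mode
    r_load = 75.0   # ohms, VGA input termination
    print(v_dac * r_load / (r_src + r_load))  # -> 1.0 V at the monitor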
One thing I was wondering is whether VGA in general is designed to "safely" accept up to 3.3V on its RGB inputs, in case of any errant condition where 3.3V is driven directly out of P2 GPIO pins attached to a VGA device, instead of the RGB pins' modes being safely set up as 2V DACs with 75 ohm impedance at all times. This type of pin setup problem could quite easily happen during application development: you might select the wrong P2 pin or pin-mode settings in your code, or try out some other P2 code from elsewhere that just drives 3.3V on the IO pins assigned to VGA on your board, before you realise it. This was never an issue for the P1 VGA with its resistive DAC, but perhaps it now might become one for the P2?
I expect the outcome of such a condition will depend entirely on the VGA monitor you have and whether it has sufficient input protection, so that any over-voltages outside the expected 0-1V range (or 0-0.7V perhaps?) would be clipped or otherwise safely accepted without damage. Because VGA came about when 5V was king, and there are other 5V/TTL-level signals for sync and I2C/DDC EEPROM reading also present in the VGA cable, I'd sort of hope VGA was originally designed to cope with shorts that put up to 5V onto the RGB signals and protect against it, but of course I don't know this for sure. Maybe some cheaper or modern-day VGA monitors skimp on this input protection and could be fried if they ever see raw 3.3V right out of the P2 at their inputs? It's only going to be about 44mA through 75 ohms when driven at 3.3V (we are not talking hundreds of mA), but I wonder if that higher voltage might mess up the monitor's ADCs if it's sustained.
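Here's the rough fault-current sketch behind that 44mA figure (Python; it assumes the misconfigured pin behaves as a stiff 3.3V source, ignoring its own output resistance):

    v_fault = 3.3   # volts, pin stuck in plain digital output mode
    r_term = 75.0   # ohms, monitor's RGB input termination
    i_fault = v_fault / r_term       # ~0.044 A = 44 mA
    p_term = v_fault ** 2 / r_term   # ~0.145 W in the termination
    print(f"{i_fault*1000:.0f} mA, {p_term*1000:.0f} mW")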
So if you are paranoid and wanted to guard against this problem in the first place, would it be good design practice to add a little extra output circuitry to P2 VGA PCB designs to limit this possibility, and if so what could it look like? I was thinking perhaps some type of Zener diode to ground, in parallel with the VGA load and clamping a little above 1V, on each RGB signal line might help. But whether that would stress the P2 GPIO pins without additional current-limiting resistors when they inadvertently try to drive 3.3V into these Zeners is not known to me... and adding extra current-limiting resistors to deal with that could mess up the nice 75 ohm impedance. Would the internal P-channel FET resistance of 19 ohms or thereabouts be enough to keep a P2 pin from burning out if its 3.3V-level IO pins get shorted to 1V by such Zener protection, or is something else required?
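As a rough sanity check on that last question (assuming the ~19 ohm P-FET figure above, and an illustrative clamp voltage of 1.1V; both are just assumptions, not measured values):

    v_pin = 3.3     # volts, pin driven high in digital mode
    v_clamp = 1.1   # volts, assumed clamp/Zener voltage (illustrative)
    r_pfet = 19.0   # ohms, quoted internal P-channel FET resistance
    i_clamp = (v_pin - v_clamp) / r_pfet
    print(f"{i_clamp*1000:.0f} mA")  # ~116 mA through the pin into the clamp

That looks like more than I'd be comfortable having a single pin source continuously, which is why the extra series resistance (and its impedance side effects) comes into the question.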
Any thoughts? Is this even worth worrying about?
Comments
Keep in mind P8X32As have been driving 3.3V into VSYNC and HSYNC monitor inputs for many years...
If you do want to clamp, LEDs can have much sharper 'knees' than Zeners.
Yeah, I guess you mean IR LEDs, as I think those start conducting a little above a volt, which would be good. The visible ones start to clamp at higher voltages, like 1.8V for red, which might already be getting too high.
Hopefully this is not an issue; I was just thinking about it while laying out a board, running the raw RGB signal lines directly from the P2 output pins to the VGA connector, and had a small dose of paranoia wondering if it might be a problem. I do know the sync lines safely accept the full TTL range; it's the RGB pins that want 0-1V or so.
If I had a sacrificial monitor lying about, it would be good to feed it 3.3V to see what "whiter than white" actually looks like.
I doubt it's a real issue (5V might be more of a concern...), plus it's unlikely to be a long-duration oops.
Zeners have quite high capacitance, are soft at low voltages, and true Zener action seems to cease below ~1.8V (1V/0.75V 'Zeners' are really just diodes).
A simple diode string could be sufficient, if you really wanted to do something...
Yes, maybe there could/should be additional current-limiting resistance inside the monitor, after the point where the voltage across the 75 ohm resistor is sensed and fed into the rest of the amplification circuitry, in case the voltage is too high.
I think Chip did mention the DACs worked OK down to 2.1V, so that's another possible means of protection.
A simple series R (or even a PTC) in the Vio supply feeding those pins could give primitive short/misconfiguration protection.
Yeah, I'd wondered about that possibility too, jmg. The good thing is that each Vio drives 4 pins, which works out nicely for doing RGBS or just RGB (sync-on-green). It would at least bring things down from 3.3V to 2.1V, but I think I'm probably going to ignore this issue, for now anyway.
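A quick look at what dropping Vio buys in that same fault scenario (same illustrative assumptions as before):

    r_term = 75.0   # ohms, monitor's RGB termination
    for vio in (3.3, 2.1):
        print(f"Vio={vio} V -> {vio / r_term * 1000:.0f} mA into 75 ohms")
    # Vio=3.3 V -> 44 mA; Vio=2.1 V -> 28 mA, with less over-voltage margin too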
By the way, does RGB with a sync-on-green output still work okay with the known bug in the colorspace converter? Has anyone tried it yet on a real P2? I see the sync component requires an addition in the formula for FI, FY and FQ, and wasn't that the part that was broken with the $signed thing? Presumably the separate H sync (or, more practically, a combined H+V sync) method would also work in that case?
Colorspace conversion works fine for things like Y-Pb-Pr. It's just modulation, like NTSC uses for color that doesn't work on the current silicon.
This is a schematic for an IBM 6543 SVGA monitor.
https://elektrotanya.com/ibm_6543svga,g50_sch.pdf/download.html
As you can see, the RGB lines have a capacitor rated at 50V and a 390ohm resistor in series with them.
Here's the schematic. (The site elektrotanya, BTW, is one of the best sites I have ever found for repair manuals for almost ANYTHING. They have a lot of monitor, TV and laptop schematics among others.) https://elektrotanya.com/amstrad_pc14m28lr_pc14m39_y2_svga_monitor.pdf/download.html
This one is very similar to the IBM. On the IBM monitor, there was a vref pin on the IC that was connected to the color lines through a 10K resistor. On the Amstrad it connects to the emitter of a transistor through a 10K resistor. There is also a capacitor in the Amstrad design, of similar value but only rated at 10V, and nothing analogous to the 390 ohm resistor. I like the IBM better.
If we look at the LM1203 in the IBM
http://cdn.goldmine-elec.com/datasheet/A10692.pdf
Figure 6 shows a simplified schematic of a video amplifier. The data sheet does not list a maximum input current, but DOES say the maximum voltage at any pin cannot be more than VCC. This makes sense, because if the voltage is greater than VCC then the collector-base junctions of the input transistors start to conduct, which could lead to bad things very fast. At 12V the input current into an amplifier pin would be 11.4mA through the 1000 ohm resistor and 21.6mA through the 500 ohm resistor, for a total of 33mA (assuming a 0.6V drop across the base-emitter junction of the input transistor, and across the second transistor connected to the emitter of the input transistor).

I've seen devices such as the LM1203 go out from time to time. My assumption was always that it took a static hit, because it's connected to an outside line. All things considered, there seems to be layer upon layer of protection that should keep it safe from anything but something like 50,000 volts from a static hit, or EMP from a lightning strike near a long cable run. (I saw one like that one time. The modem in the PC connected to the phone lines, and the monitor and video card that were connected via a 30' cable, had melted down. They said lightning struck near the building.)
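Redoing that arithmetic with the stated values (one 0.6V junction drop in the 1000 ohm path, two in the 500 ohm path; a quick Python check of the numbers above):

    v_in = 12.0   # volts at the amplifier input pin
    v_be = 0.6    # volts, assumed base-emitter drop per junction
    i_1k = (v_in - v_be) / 1000.0       # one junction drop in this path
    i_500 = (v_in - 2 * v_be) / 500.0   # two junction drops in this path
    print(f"{i_1k*1e3:.1f} mA + {i_500*1e3:.1f} mA = {(i_1k + i_500)*1e3:.1f} mA")
    # -> 11.4 mA + 21.6 mA = 33.0 mA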
Looks like you had plenty of fun back then with your repairs. Those late 80s and 90s were such good times for mucking about with PCs (perhaps life in general) and I miss it now. All those frequent upgrades for your 2x gains, and getting it all to work together. Gone now. But now is a good time for hacking about with electronics and microcontrollers, so I guess that's something.
We fondly remember all the good stuff and forget most of the bad, except for the rare truly bad things. That's why we call them the good old days.
For a while there mucking about with hardware was dying out, then suddenly there was the "maker movement" and it came back with a vengeance.