TV output current requirements
Philldapill
Posts: 1,283
Using a standard TV output adapter (1.1K, 570, and 220 ohm resistors), about how much current is used? I am prototyping a project and need to keep my power requirements pretty low. I will sometimes connect a TV to the device to check certain things, but I don't want the TV object running all the time if it draws power. Since the TV output components form a DAC, some pins will source current while others sink it. I don't want to code in extra features if it isn't needed.
So, how much current does the TV output take(not counting extra cog)?
Comments
Corresponding currents of 0, 5.54, 4.38, 2.62, 5.54, 4.38, 2.62, and 0 mA (the 220 ohm resistor in series with 570||1100, the 570 ohm resistor in series with 220||1100, etc.).
I just calculated it in an Excel sheet, and it seems the average current would be... 3.136mA... WHOA! That's way more than I expected. I would have imagined somewhere in the 100uA range. I guess only a proper REAL measurement will be the final say, but wow. Thanks Mike.
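The same numbers can be checked without a spreadsheet. A minimal sketch, assuming a 3.3 V Propeller driving the standard TV DAC resistors (220, 570, and 1100 ohm) with no external load, where one pin drives high while the other two sit low (so the driven resistor is in series with the other two in parallel):

```python
def parallel(*rs):
    """Equivalent resistance of resistors in parallel."""
    return 1 / sum(1 / r for r in rs)

V = 3.3                # supply voltage, volts (assumed)
R = [220, 570, 1100]   # standard TV DAC resistors, ohms

# Current (mA) when one pin drives high and the other two sink low:
# driven resistor in series with the parallel combination of the rest.
currents_ma = []
for i, r_hi in enumerate(R):
    r_lo = parallel(*(r for j, r in enumerate(R) if j != i))
    currents_ma.append(1000 * V / (r_hi + r_lo))

print([round(i, 2) for i in currents_ma])  # [5.54, 4.38, 2.62]

# Average over the eight states listed above
states = [0] + currents_ma + currents_ma + [0]
print(round(sum(states) / len(states), 3))  # 3.136 mA
```

This matches the 5.54, 4.38, and 2.62 mA figures quoted above and the 3.136 mA average; a real measurement would still be the final say, since actual video spends different amounts of time at each level.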
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Doing the math, it should be about 11.68mA peak, but the average (over a period of time) should be 3.136mA. Still, that's a lot of current just for a signal.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
If you really want to get into the math, calculate a curve of the statistical distribution of brightness, use the power dissipation for each brightness value, then integrate. In my job as an engineer, I don't integrate. I design around worst case unless cost pressures force me to be more precise.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Post Edited (Ken Peterson) : 6/28/2008 4:24:46 AM GMT
That's only for one specific colour (or control level), so the average, unless you're unlucky, would be less than that.