
LEDs - forward voltage & current question

Javalin Posts: 892
edited 2009-11-04 13:54 in General Discussion
Evening all,

A quick one for you experts! If I'm driving an LED with these specs:

Forward current max.: 30mA
Forward voltage max.: 2.5V
Reverse voltage max.: 5V
Do I need to stay inside both the forward current and voltage limits? What does the reverse voltage mean?

This site seems to suggest a 220 ohm resistor will work, but a circuit sim package shows that this allows 25mA and 3.44V across the LED. If I use a 390 ohm resistor, that limits it to 16.5mA and 2.54V.

Many thanks,

James


Comments

  • Mike Green Posts: 23,101
    edited 2009-11-03 19:40
    Yes, you need to stay within all the maximum ratings.

    Reverse voltage is the maximum voltage the LED will withstand without damage when connected in reverse.

    Because the LED is conducting, you'd need unusual circumstances to be able to produce more than 2.5V across the LED in the forward direction. Probably the 2.5V figure is the maximum forward voltage that can be measured across the LED during normal operation. The other two maximum ratings are damage points. More than 30mA through the LED might cause failure of one of the internal leads or metallization on the chip. More than 5V reverse voltage can cause destructive reverse breakdown of the diode junction.

    If your power supply is +5V, a 220 Ohm resistor would allow (5 - 2.5)/220 = 11.4mA of current to flow through the LED.
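
    As a minimal Python sketch of that calculation (the function name here is just illustrative, not from the thread):

        # Series-resistor LED current: I = (Vsupply - Vf) / R (Ohm's law across the resistor)
        def led_current_ma(v_supply, v_forward, r_ohms):
            return (v_supply - v_forward) / r_ohms * 1000.0

        print(led_current_ma(5.0, 2.5, 220))  # -> about 11.4 (mA)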
  • Javalin Posts: 892
    edited 2009-11-03 19:50
    Hi Mike

    Thanks for the reply!

    Why do you do (5 - 2.5) / 220 ?

    James
  • Moskog Posts: 554
    edited 2009-11-03 20:10
    You need 2.5 volts to run the LED. That explains the (5 - 2.5).

    Ohm's law says current = voltage divided by resistance, like this: 2.5 [Volts] / 220 [ohm] = 0.01136 [Amps], or 11.4 mA.

    Edited:

    I usually use this formula to find the right resistor value for a common red LED:

    (Supply voltage - 2) / 0.02

    In your case: (5V - 2V) / 0.02A = 150 ohm. In this case the current is about 20mA, which means a slightly brighter LED than the 11.4mA example above. Go for a resistor with a slightly higher value if you can't find exactly what you need.
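
    As a minimal Python sketch of that rule of thumb (the 2V drop and 20mA target are Moskog's round figures; the function name is illustrative only):

        # Rule-of-thumb series resistor for a common red LED:
        # R = (Vsupply - Vf) / Itarget, with Vf ~ 2V and Itarget ~ 20mA
        def led_resistor_ohms(v_supply, v_forward=2.0, i_target=0.020):
            return (v_supply - v_forward) / i_target

        print(led_resistor_ohms(5.0))  # -> 150.0 (ohms)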

  • Javalin Posts: 892
    edited 2009-11-03 20:20
    OK - I knew about Ohm's law - I was just confused about the subtraction.

    Cheers,

    James
  • Javalin Posts: 892
    edited 2009-11-04 10:07
    OK - thinking some more - why are you subtracting the 2.5 volts from the supply?

    Looking at http://www.allaboutcircuits.com/vol_1/chpt_2/1.html - that's how I've always done it, and in those examples nothing is shown about subtracting a voltage from the supply.

    Is this something peculiar to LEDs or semiconductors?

    James
  • Peter Jakacki Posts: 10,193
    edited 2009-11-04 13:38
    The LED is a light-emitting diode and, like all diodes, there is a forward voltage drop to get the diode junction to conduct; remember it's a junction of an N-type and a P-type semiconductor. With normal diodes that's around 0.6V or so, but with LEDs the forward voltage drop is at least 1.6V, depending upon the semiconductor material (standard LEDs use Gallium-Arsenide).

    The reason you subtract the voltage from the supply is very simple: that voltage drop reduces the available voltage, the same way that inserting a common diode into a 1.5V circuit would result in only having 0.9V or less available (try it). Do the same thing with an LED and no, you won't get a negative voltage, you just won't get anything. This assumes you have a resistor in these circuits across which you measure the voltage. Short out the diode and you read 1.5V across the resistor.
                             |----meter----|
    +V------A[DIODE]K--------RESISTOR------GND

    Now take two batteries in series to get 3V, then measure the voltage after the LED and you will end up with 1.4V or less. Ohm's law says R = V/I, but it's not talking about the battery voltage here; it's talking about the voltage across the resistor, which will be 1.4V or less. So always subtract the forward voltage drop from the supply to work out the voltage that the resistor will see. The forward voltage drop increases slightly with current, so a red LED might have a drop of 1.8V or more when driven hard.
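
    As a minimal Python sketch of the two-battery example (using Peter's 3V supply and a 1.6V red-LED drop):

        # The meter across the resistor reads the supply minus the LED's forward drop.
        v_supply = 3.0     # two 1.5V cells in series
        v_forward = 1.6    # typical red LED forward drop
        print(v_supply - v_forward)  # -> 1.4 (volts), the "1.4V or less" figure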

    I find too that the 20mA figure that's been quoted in LED datasheets since the year dot is what many assume the LED needs. Not so: this is usually the maximum recommended current, at which they also quote the brightness of the LED. Unless you are using cheap surplus LEDs from the 70's, you will find that a mere 3mA is more than sufficient to light up the LED brightly.

    Exceeding the reverse voltage will not magically destroy the device; it's the current which destroys it. Zener diodes, which are used to regulate many a system, are really just diodes with very low reverse breakdown voltages (heavily doped), so as long as you don't pass too much current through them they're fine. Most of the time the reverse breakdown voltage for LEDs is not important.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    *Peter*
  • Javalin Posts: 892
    edited 2009-11-04 13:54
    Peter

    Great explanation! Thanks very much!

    James