We look at the schematic for our first simple circuit, which requires a power supply, a resistor, and an LED.
We then build the same circuit without the resistor and notice that the LED no longer lights up, because too much current was passing through the LED.
Ohm's Law
We use Ohm's law to determine the resistor that makes the LED and the power supply compatible. Ohm's law states:
Voltage = Current * Resistance
or
V = I * R
Since the LED should draw a current of about 20 milliamperes (0.02 A) and drops about 2.2 volts, while our power supply produces 5 volts, the resistor must absorb the remaining 2.8 volts:
R = V / I
R = (5 - 2.2) / 0.02
R = 140 Ohms
If we push the limit and let the LED draw a current of 30 milliamperes, keeping everything else constant,
R = (5 - 2.2) / 0.03
R ≈ 93 Ohms
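The two calculations above can be sketched in Python. The helper function below is hypothetical (not from the original post), but it applies the same formula, R = (V_supply - V_led) / I:

```python
def led_resistor(supply_v: float, led_v: float, led_i: float) -> float:
    """Series resistor for an LED, from Ohm's law: R = V / I,
    where V is the voltage the resistor must absorb."""
    return (supply_v - led_v) / led_i

# Nominal case: 20 mA through the LED -> 140 ohms
print(led_resistor(5.0, 2.2, 0.020))

# Pushing the limit: 30 mA -> about 93 ohms
print(led_resistor(5.0, 2.2, 0.030))
```

The same function works for any supply voltage and LED forward voltage, which is handy when switching between, say, a 5 V and a 3.3 V supply.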
Thus, when we used a 100-ohm resistor in our circuit, we were actually driving the LED with a current above its rated amount, which shortens the life of the LED.