When powering a simple LED circuit (DC power source, LED, resistor), does the supply voltage matter, as long as a correctly calculated current-limiting resistor is used?
In other words, is there (or could there be) anything inherently wrong with powering an LED from 12 V or 24 V, as long as I used the correct resistor, knew the LED's forward voltage and maximum current, and calculated the resistor using something like this, when I could have powered the same LED from a 3.5 V supply knowing the same variables and using the same website?
I'm assuming there is a limit to the maximum voltage to use for the LED here. When I look at the electrical characteristics chart for the CREE XP-G, for example, it shows current as a function of voltage, starting at around 2.5 V @ 0 mA and maxing out at around 3.25 V @ 1500 mA (the maximum current the LED is rated for, as described in the Characteristics table in the same document).
After 3.25 V, the chart depicts the current rising almost vertically.
I'm assuming this relates to my question; I'm just curious how it's all connected. I'm sure it's all basic Ohm's law stuff, but I'd appreciate a clarification of the math at work.
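To be concrete, here is roughly the calculation I mean, sketched in Python. The forward voltage and drive current below are example values I picked for illustration, not the actual XP-G datasheet figures:

```python
# Sketch of the current-limiting resistor math for a single LED.
# v_forward and i_target are assumed example values, not datasheet figures.

def limiting_resistor(v_supply, v_forward, i_target):
    """Ohm's law across the resistor: it drops whatever voltage the LED doesn't."""
    return (v_supply - v_forward) / i_target

def resistor_power(v_supply, v_forward, i_target):
    """Power dissipated in the resistor: P = V_drop * I."""
    return (v_supply - v_forward) * i_target

v_forward = 3.0   # assumed LED forward voltage (V)
i_target = 0.35   # assumed target drive current (A)

for v_supply in (3.5, 12.0, 24.0):
    r = limiting_resistor(v_supply, v_forward, i_target)
    p = resistor_power(v_supply, v_forward, i_target)
    print(f"{v_supply:4.1f} V supply -> R = {r:6.2f} ohm, dissipating {p:.2f} W")
```

Is it just that, with the same target current, a higher supply voltage means a bigger resistor burning off more power, or is there more to it than that?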