Engineers and designers express LED efficiency in terms of "lumens per watt". This figure relates the luminous output (total light output) to the total power dissipation (total power input). An LED lighting system with high energy efficiency provides the maximum luminous output while maintaining minimal overall power dissipation. This translates to a lighting system that produces less overall heat, since modern LED products release the majority of their wasted energy as heat. For example, a light engine providing 40 lumens per watt is more efficient than one providing only 30 lumens per watt. To calculate and compare the energy efficiency of various light engines, the designer must obtain at least two variables. The first is the total light output from the device, expressed in lumens. It is important not to confuse luminous output (also referred to as luminous flux) with luminous intensity, which does not represent the total light produced by the light source. The second variable, total power dissipation, can be calculated by multiplying the circuit source voltage by the total amperage. After obtaining the two required variables, simply divide the luminous output by the total power dissipation to obtain the system's energy efficiency, expressed in lumens per watt.
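The two-step calculation above can be sketched in a few lines of Python. The figures used here (480 lumens, 24 volts, 0.5 amps) are illustrative examples, not measurements from any particular product:

```python
def efficacy_lm_per_w(luminous_flux_lm: float, voltage_v: float, current_a: float) -> float:
    """Luminous efficacy: total light output divided by total power dissipation."""
    power_w = voltage_v * current_a       # total power dissipation in watts
    return luminous_flux_lm / power_w     # lumens per watt

# Example: a light engine emitting 480 lm from a 24 V supply drawing 0.5 A
# dissipates 12 W, so its efficacy is 480 / 12 = 40 lm/W.
print(efficacy_lm_per_w(480, 24.0, 0.5))  # 40.0
```

A 30 lm/W engine delivering the same 480 lumens would dissipate 16 W instead of 12 W, with the extra 4 W released as heat.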
In LED off-grid lighting projects, maximizing system efficiency is especially important. A limited energy source, such as the battery bank in an off-grid system, needs to provide power for sufficient illumination over an extended period. The "amp-hour" rating allows the designer to estimate battery life from the circuit amperage. A less efficient lighting system provides less light, produces more heat, and requires a greater number of individual LED lights to achieve the required luminous output. More LEDs mean an increased current draw and shorter battery life. The primary problem associated with high-power LED systems originates from the thermal energy produced by each LED: this heat actually decreases the LED's luminous output. Therefore, improving LED efficiency always starts with good thermal management.
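The amp-hour estimate works out as a simple division. The sketch below uses a hypothetical 100 Ah battery; real batteries deliver less than their rated capacity depending on discharge rate, depth of discharge, and temperature, so treat this as an upper-bound approximation:

```python
def runtime_hours(battery_ah: float, circuit_current_a: float) -> float:
    """Approximate battery life: amp-hour rating divided by circuit current draw."""
    return battery_ah / circuit_current_a

# A 100 Ah battery feeding a lighting circuit drawing 2 A lasts roughly 50 hours;
# doubling the number of LEDs to a 4 A draw halves that to roughly 25 hours.
print(runtime_hours(100, 2.0))  # 50.0
print(runtime_hours(100, 4.0))  # 25.0
```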
It is important to maintain minimal LED die temperatures, not only to preserve luminous output but also to boost system reliability. The circuit boards on which the high power LEDs are mounted should be physically attached to an external heat sink. In addition to good thermal management, several other methods and techniques help boost overall system efficiency. If possible, minimize all wiring runs and use the thickest conductors available. All electrical wires oppose current to some extent, and this opposition results in unwanted voltage drops. Batteries may provide 24 volts when measured directly at the terminals; however, at the far end of a 100 foot conductor connected to the batteries, the available voltage may decrease to only 23 volts. In this case the resistance of the 100 foot wire leads results in a drop of 1 volt. This energy is converted to heat and released along the entire length of the wire.
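The wiring loss described above is a plain I×R drop. The sketch below assumes 12 AWG copper at a nominal 1.588 ohms per 1000 feet (a standard table value at 20 °C) and a 3 A load; note that a two-conductor run carries current out and back, so a 100 foot run presents 200 feet of wire:

```python
def voltage_drop(current_a: float, ohms_per_1000ft: float, one_way_ft: float) -> float:
    """I*R drop over a two-conductor run (current travels out and back)."""
    loop_resistance_ohms = ohms_per_1000ft * (2 * one_way_ft) / 1000.0
    return current_a * loop_resistance_ohms

# 12 AWG copper (~1.588 ohms / 1000 ft), 100 ft run, 3 A load:
print(round(voltage_drop(3.0, 1.588, 100), 2))  # 0.95
```

At 3 A the drop is roughly 1 volt, consistent with the example above; thinner wire or higher current makes it proportionally worse.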
In a properly designed lighting system, the source voltage should always match the operating voltage of the individual LED light engines. Many modern light engines designed for interior home lighting require 20 - 24 volts DC for operation. To preserve overall system efficiency, it is critical to supply power directly from the primary power source (the batteries). Do not rely on the wiring network connected to an inverter (low-voltage DC to 120 volt AC output). Through the process of power inversion and conversion back down to a lower usable voltage, the accumulated power losses become significant. The additional cost of high-to-low voltage converters will more than likely exceed the cost of wiring the building with dual power buses (high and low voltage). The high voltage wiring supplies power to wall outlets throughout the building, and the low voltage wiring supplies power to the individual LED lighting fixtures.
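The cascaded losses of the inverter path can be illustrated by multiplying stage efficiencies. The percentages below (92% for a direct DC driver, 85% for the inverter, 88% for the step-down converter) are assumed round numbers for illustration, not ratings of any specific hardware:

```python
def delivered_power(source_w: float, *stage_efficiencies: float) -> float:
    """Power remaining after each conversion stage (efficiencies multiply)."""
    for eff in stage_efficiencies:
        source_w *= eff
    return source_w

# Direct 24 V DC bus: a single driver stage at ~92% efficiency.
print(round(delivered_power(100, 0.92), 1))        # 92.0 W reaches the LEDs
# Inverter path: DC -> 120 VAC (~85%), then back down to 24 V DC (~88%).
print(round(delivered_power(100, 0.85, 0.88), 1))  # 74.8 W reaches the LEDs
```

Every added conversion stage compounds the loss, which is why the dual-bus layout that keeps the lighting on a direct low-voltage feed comes out ahead.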