Does LED efficiency increase with lower drive currents?

Relative to arrays of strip LED lighting, I’ve heard people say that distributing current across more strips lowers the current for each strip and increases efficiency. Is this right? If so what type of efficiency are we talking about? I’m interested in increasing luminous efficiency, but I fail to see how diluting current by adding more strips increases efficiency. Wouldn’t each strip be just that much lower in luminous flux, and the sum total would be the same?

Yes, operating a given emitter at reduced drive current levels typically does yield an increase in luminous efficacy (lumens/watt), because LEDs are not strictly linear in this respect. Higher per-strip drive levels promote higher operating temperatures, which, depending on the scenarios one chooses to compare, can cause reductions in efficacy on the order of 5 to 20%.

Question about nominal drive current.

“Nominal” drive current values typically indicate the level at which the other device characteristics are measured and communicated. Manufacturers generally choose this value to reflect an operating point that offers a good balance of competing design tradeoffs under expected operating conditions.

Because different applications value different performance characteristics differently, it’s not necessarily the optimum drive level for all purposes; it’s a suggested starting point. Yes, further reductions in drive level may yield increases in luminous efficacy, but there is a point of diminishing returns.

Sorry, I totally changed the focus of my question (or really added a question) by editing my post completely, but you had already responded. I should have just left it and added a question. That was dumb.

Thanks for clarifying about nominal current.

Just to be clear about efficiency, as I am about to buy a driver and am trying to decide between two different ones.

So, 2 scenarios:
1: total wattage is 400w, current is 1000mA per strip for a total of 10 strips. Voltage is 40v.
2: total wattage is 400w, current is 750mA per strip for a total of 10 strips. Voltage is 53.3v.

Scenario 2 would be more efficient? In other words efficiency comes from lower drive current even if the voltage increases?

Your scenarios don’t make sense if you are using the same LED strip for both of them. The higher the current passing through a strip, the higher the voltage drop across it. Here is how I would try to explain how efficacy works:

Say you run 1000mA through 10 LED strips at a Vf of 40V, and it yields a total of 70,000 Lumens. This would give you an efficacy of 70,000 Lumens / 400W, or 175 Lumens/W. If you ran 500mA through the same LED strips in the same arrangement, the Vf would probably be in the neighborhood of 38.5V, and the output would be roughly 36,000 Lumens (possibly slightly more). Under this scenario, your input power is 38.5V x 5000 mA = 192.5W, and your efficacy is 36,000 Lumens / 192.5W = 187 Lumens/W.

So, your efficacy increases from 175 Lumens/W to 187 Lumens/W, but you are getting barely over half as much total light. If you put 20 LED strips in parallel and ran them all at 500mA, you would get about 72,000 Lumens at 385W (20 strips x 0.500A x 38.5V). This would give you more light with less power, but you would use 2x the number of LED strips to accomplish it.
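If it helps, here is a rough Python sketch of that arithmetic. The Vf and lumen figures are the approximate example values above, not measured data:

```python
# Efficacy = total luminous flux / total electrical input power.
# Example values are the approximate figures quoted above, not measurements.
def efficacy_lm_per_w(vf_volts, current_a, strips, total_lumens):
    power_w = vf_volts * current_a * strips      # electrical input power
    return total_lumens / power_w, power_w

# 10 strips at 1000 mA, Vf ~40 V, ~70,000 lm total
eff_hi, p_hi = efficacy_lm_per_w(40.0, 1.0, 10, 70_000)   # ~175 lm/W at 400 W
# Same 10 strips at 500 mA, Vf ~38.5 V, ~36,000 lm total
eff_lo, p_lo = efficacy_lm_per_w(38.5, 0.5, 10, 36_000)   # ~187 lm/W at 192.5 W

print(f"1000 mA: {p_hi:.1f} W, {eff_hi:.0f} lm/W")
print(f" 500 mA: {p_lo:.1f} W, {eff_lo:.0f} lm/W")
```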

Does that make sense?

Yes, it makes sense. And it opened up a greater understanding. I should be focusing on lumen output instead of wattage. I had forgotten that wattage does not simply translate into amount of light.

I’m using these 2ft strips: https://www.bridgelux.com/sites/default/files/resource_media/DS132%20Bridgelux%20EB%20Series%20Gen3%20Data%20Sheet%2020190617%20Rev%20A.pdf
Which have a typical forward voltage of 19.1V and a nominal current of 700mA, and a typical flux at 25°C of 2490 lumens. So the efficiency works out to 19.1V x 0.700A = 13.37W, and 2490 lm / 13.37W = 186 lm/w.

Now, if I want to find the lumens and wattage of an array of these strips driven by a particular driver how do I do that?

10 of these strips using this driver: meanwell HLG-240H-48 https://www.meanwell.com/Upload/PDF/HLG-240H/HLG-240H-SPEC.PDF

This driver has a total current of 5A and a voltage range of 24-48v. I would like to wire these 10 by putting 2 in series, 5 times over, then connecting those 5 sets of 2 in parallel.

I can find the current per each of the 5 sets by dividing the driver current by 5, which is 5A/5 = 1A. Using the datasheet’s “current vs forward voltage” graph I can see that at 1 amp the voltage is about 19.6v. If I double the voltage: 19.6v * 2 = 39.2v. Each series pair uses 39.2v * 1A = 39.2w; multiply this by 5: 39.2 * 5 = 196w. And the number of lumens is simply 10 * 2490 = 24900. So efficiency is: 24900/196 = 127 lm/w.

Clearly my calculations are incorrect. What am I doing wrong?

What you are missing is the increased lumen output from the increased current. That is given in the Figure 4 graph.

The lumen output will be about 140% of what it is at the test current (700mA), so roughly 3486 lumens per strip. Ten of these will give about 34,860 lumens for an efficacy of 34,860lm / 196w = 178lm/w.
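Here is a rough sketch of that corrected math in Python, assuming the ~140% relative flux figure read off Figure 4 at 1A (an eyeballed value, not a spec):

```python
# Corrected estimate for 10 strips wired as 5 parallel pairs of 2 in series,
# driven at 5 A total. Vf and relative flux are approximate datasheet readings.
strips = 10
pairs_in_parallel = 5
driver_current_a = 5.0
current_per_pair_a = driver_current_a / pairs_in_parallel       # 1.0 A per series pair
vf_per_strip_v = 19.6                # from the "current vs forward voltage" graph at 1 A
power_w = strips * vf_per_strip_v * current_per_pair_a          # 196 W total

flux_at_700ma_lm = 2490              # datasheet typical flux at the 700 mA test current
relative_flux_at_1a = 1.40           # ~140% read from Figure 4 at 1 A (approximate)
total_lumens = strips * flux_at_700ma_lm * relative_flux_at_1a  # ~34,860 lm

print(f"{power_w:.0f} W, {total_lumens:.0f} lm, {total_lumens / power_w:.0f} lm/W")  # ~178 lm/W
```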

In a real application, this won’t quite be true, because light output and Vf both drop slightly at elevated temperatures for a given current (which will be the case here due to self heating). The Vf drop adds slightly to efficiency, but the loss in light output more than makes up for that, so overall efficacy is reduced by about 3.5% if the case temperature rises to 60°C. This comes from Figures 7 and 8 on page 7 of the datasheet.

Thank you for breaking that down for me, very valuable.

It all makes sense now, but I want to be clear about one part. When calculating the total wattage that my strips will use, I take the driver’s max current and divide that by how many strips I have, then take that current and consult the data sheet graph “current vs forward voltage” to get the voltage. Then multiply that voltage by the total driver current. Is this the case regardless of how I wire the array, no matter how complicated a combination of serial and parallel wiring?

For example, if I wanted to wire 24 strips as 3 parallel strings of 8 in series, do I still just divide the driver max current by 24, find the voltage for that, and multiply that voltage by the max current to find the wattage?

The arrangement makes a difference. If you have only three strings of LED “strips” in parallel, then each string will take 1/3 of the current pushed out by the constant current LED driver, regardless of how many “strips” are in the string (assuming you have the same number of strips in each string).

If you wanted each strip to run at 1A, then you need an LED driver which will output 3A, and it would have to produce nearly 160V to guarantee that they turn on, since each strip will require 19.6 volts and there are 8 of them in series. In this case, since each string carries 1A, the power used by each strip is 1A x 19.6V, or 19.6W, and the total power is 24 x 19.6W = 470.4W. You could also figure it out by looking at the total voltage of each string and multiplying that by the current through that string, so 156.8V x 1A = 156.8W, and since there are three strings, the total is 470.4W – same answer.

On the other hand, if you arranged them in 6 parallel strings of 4 strips per string, your LED driver would have to be able to output 6A to give each string 1A (6A / 6 strings = 1A per string). Since you only have 4 strips in each string in this arrangement, the voltage required would be nearly 80V (19.6V x 4 in series = nearly 80V). The total power of all of the strips will be exactly the same doing it this way. Each strip will have 1A passing through it and it will take about 19.6V, for a total of 470.4W.

You can get the same light output a number of different ways. You could put all of them in parallel, but then you would need a supply which outputs 24A at 20V (not available), you could put them all in series and use a 1A, 480V supply (not available, and dangerously high voltage), or any number of options in between these two extremes (2 x 12, 3 x 8, 4 x 6, 6 x 4, 8 x 3, 12 x 2). As long as you can find a driver that can output the appropriate current and voltage for the arrangement you choose, you will get the same light output and use about the same power.
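As a quick illustration, here is a small Python sketch showing that, at the same 1A per strip and roughly 19.6V per strip, the total power works out the same no matter how the 24 strips are split into series strings (the per-strip numbers are the approximate values from above):

```python
# Total power is (strips x Vf x current per strip) regardless of the series/parallel split;
# only the driver's required output voltage and current change with the arrangement.
strips_total = 24
i_per_strip_a = 1.0      # target drive current per strip
vf_per_strip_v = 19.6    # approximate Vf per strip at 1 A

for strips_per_string in (2, 3, 4, 6, 8, 12):
    strings = strips_total // strips_per_string
    driver_current_a = strings * i_per_strip_a              # parallel strings add current
    driver_voltage_v = strips_per_string * vf_per_strip_v   # series strips add voltage
    power_w = driver_current_a * driver_voltage_v
    print(f"{strings:>2} strings of {strips_per_string:>2}: "
          f"{driver_current_a:>4.0f} A @ {driver_voltage_v:>6.1f} V -> {power_w:.1f} W")
# Every arrangement works out to 470.4 W.
```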

So, what, exactly is your goal with these questions? Are you trying to get the most light out of 24 of those strips? Are you trying to figure out how to best arrange them for a particular power supply, or what?

I am trying to figure out if there is an arrangement that gives me the most efficiency.

Or, is it the case that, at the same wattage and same number of strips, any arrangement of drivers with their varying voltage ranges and current outputs, and wiring options (serial and parallel), gives the same exact efficiency?

As with your earlier example on efficiency, and looking at the Bridgelux datasheet graphs, lower current requires lower voltage, which of course yields lower power. But is the only way to increase efficiency at the same wattage to add more strips, or can the efficiency be tweaked by the many configuration options? In other words, am I just chasing my tail at this point?

@tilopa108,

The “most efficiency” is not the same as the “most light”. Which of these do you want?

If you are assuming that an LED power supply will always push the full stated power into the LEDs, that is not the case. They will vary their voltage until the load draws their specified current level, unless the maximum voltage of the supply is reached before it reaches its specified current, at which point most supplies will maintain the maximum voltage at a reduced current.

The highest efficiency (or more precisely, efficacy, in the case of light output) comes from running each LED strip at low current, but you will get less light out of each strip that way. The most light comes from running each strip at high current, but your efficacy will be reduced this way.

So, if you run half the max current through each LED strip you will get just barely over half the lumens compared to running at max current, but it will take less than half the power, so it is more efficient.

If your goal is to get the most light out of a given number of LED strips (which is not going to be the most efficient because you will have to run them at high current to get the most light), state how many strips you want to use, and we’ll find the best LED driver for that purpose.

I understand. To be more precise I don’t simply want the best efficiency, nor do I want the most light. I want a “sweet spot” given the number of strips I want to use and the ballpark power output I want to hit.

The number and type of strips I want to use are Bridgelux EB Gen 3 560 mm x 14. And I would like the power output to be roughly 200 - 225 watts.

Thanks for your help.

Hey thanks for all the help on this forum. I know I’ve been a bit of a PITA noob.

I’ve decided to run EB Gen 3 560 mm x 15 using the HLG-240H-20B, which is 12A with a voltage range of 10v - 20v, and wiring all 15 in parallel.

According to my calculations this would give: 12A/15 = 800mA per strip, at 19.2v. Total wattage = 12A x 19.2v = 230.4w.
And efficiency of: base lumens per strip (2490) x 1.18 (118%) = 2938 lm per strip; 2938 x 15 = 44,073 lm; 44,073 / 230.4w = 191 lm/w.

@tilopa108

That should work pretty well. However, I think you are slightly optimistic on your lm/w estimate. Since you’re driving at more than the test current of 700mA, efficacy should drop slightly rather than increase.

In the graph of “Relative Luminous Flux vs. Current” (Fig. 4), it shows that a 40% increase in relative luminous flux will be seen with an increase of 300mA, so the slope is about 40%/300mA, or 13.3%/100mA. So your actual output per strip will be about 2490 x 1.133 (113.3%) = 2821 lumens. Total output will be 15x that, or 42,318 lumens, and efficacy will be 42,318 lumens / 230.4w = 184lm/w.

Compared to 186lm/w for the 700mA test current shown in the datasheet, that looks about right.
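If you want to reproduce that estimate, here is a small Python sketch, assuming the Figure 4 curve is roughly linear near the 700mA test point with a slope of about 13.3% relative flux per 100mA (my reading of the graph, not a published figure):

```python
# Estimate for 15 strips in parallel on a 12 A constant-current driver.
# Vf and the Figure 4 slope are approximate graph readings, not specs.
strips = 15
driver_current_a = 12.0
i_per_strip_ma = driver_current_a / strips * 1000          # 800 mA per strip
vf_v = 19.2                                                # approx. Vf at ~800 mA
power_w = driver_current_a * vf_v                          # 230.4 W

flux_at_700ma_lm = 2490
slope_per_ma = 0.40 / 300                                  # ~40% flux gain over 300 mA
relative_flux = 1 + slope_per_ma * (i_per_strip_ma - 700)  # ~1.133 at 800 mA
total_lumens = strips * flux_at_700ma_lm * relative_flux   # ~42,300 lm

print(f"{power_w:.1f} W, {total_lumens:.0f} lm, {total_lumens / power_w:.0f} lm/W")  # ~184 lm/W
```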

Yes, my figure of 118% was from just looking at the graph, and, of course, I’m just eyeballing it so it is not very accurate. But I see the way you did it is more accurate. It just looks like the region from 100% to 120% is a steeper curve, but maybe that is just an optical illusion.

My one concern is the wiring size, and potential voltage drop. I was going to use 18AWG solid core, but now that looks like it would not handle the 12A coming from the driver. Could I use a heavier gauge (like 16AWG) from the driver to the first wire nut (using a Wago 5-conductor connector), which will connect the first 3 strips? Not sure how it works: 12A coming out of the driver, with each strip it hits reducing the current in the wire by 800mA? Maybe it is best to just use 16AWG for all of it?

Also, is there any concern about voltage drop with this low voltage high current situation, if I’m only going about 4 feet max wire distance from driver?

Lastly, the B version of the HLG-240H-20B lets me dim. So, if I dim it down, it reduces current and thereby increases efficiency? Obviously it decreases wattage and lumens, but this is a nice feature because the other driver I was considering had better efficiency but a max power of 200w, so this gives me the option of having that same driver in one, so to speak.

Thanks again. Got my digi-key order in my cart, ready to pull the trigger before end of day.

12A is getting toward the upper end of the recommended current for 18AWG, but still within maximum recommendations; from a quick Google search, I found recommendations for max current in the neighborhood of 13A. It will mean you lose some overall system efficiency, roughly 5%.

I’m seeing numbers ranging from 0.0064 to 0.0075 Ohms per foot for 18AWG wire. Assuming 10 feet round trip (5 there and 5 back), power dissipated would be no more than (12A) squared x (0.0075 Ohms/ft x 10 ft) = 10.8w.

16AWG is no more than about 0.00473 Ohms per foot, so that would give you about 6.8w across the same length at 12A.
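If you want to play with the numbers yourself, here is a minimal sketch of that I²R estimate, using the per-foot resistances quoted above and assuming a 10-foot round trip:

```python
# Resistive loss in the supply wiring: P = I^2 * R, with R = ohms/ft * round-trip length.
# Per-foot resistances are the worst-case values quoted above.
def wire_loss_w(current_a, ohms_per_ft, round_trip_ft=10):
    return current_a ** 2 * ohms_per_ft * round_trip_ft

print(f"18 AWG @ 12 A: {wire_loss_w(12, 0.0075):.1f} W")    # ~10.8 W
print(f"16 AWG @ 12 A: {wire_loss_w(12, 0.00473):.1f} W")   # ~6.8 W
```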

Larger wire is definitely preferable. And, going with a higher voltage and putting at least 2 strips in series would definitely help reduce losses in the wires.

Sticking with the 15 strips, you could also wire 3 parallel strings of 5 in series. This would reduce your wire losses to an insignificant level.

Something like the HLG-240H-C2100B or the HLG-240H-C2100A could work for this, as they output anywhere from 59V to 119V at 2.1A. This would drop your max current per strip to 700mA, and thus drop your total luminous flux back to 37,350 lumens, but it would be significantly more efficient, due to a slight efficacy improvement plus almost no wire losses (~0.3w in 10ft of 18AWG).

Both of these supplies are dimmable, the “B” version via PWM, 1-10V external dimmer, or resistance, and the “A” version via an integrated potentiometer.

For a little more money, you could also go with the HLG-320H-C2800A. This one outputs 2.8A maximum, but you can reduce that current via the integrated potentiometer, allowing for a broad range of output current.

Sounds good, I’m going to do that. Also, with the current at 700mA I might not need much of a heatsink. Bridgelux says in their data sheet that at low drive current a heat sink may not be required. In any case, less heat to manage means less efficiency loss in general.

Thanks much for your help, David.

I’d advise against leaning too hard on that assumption. At the power levels in question, thermal management is not a trivial issue. This resource might offer some insights into the process of getting a handle on that topic.