The High Output (HO) strips use double the typical current (1400 mA vs. 700 mA) and as a result draw about double the power: 53.8 W vs. 26.8 W. Bridgelux states the HO strips are “fully engineered devices that provide consistent thermal and optical performance on an engineered mechanical platform.” So, is the only difference between these two strips that the HO has better heat management, like a built-in heat sink, or something? That seems unlikely. Or would I need to use a much better heat sink on the HO to get the stated increase in power and lux?
In the APPLICATION NOTE AN130 document from Bridgelux, page 17, there is relevant information about thermal management for the EB Series Linear, EB Series Square, Vesta Linear, and Vesta Edge modules. Note that they calculate power dissipation over surface area, use the largest possible 80 W value for a square module, compare that to a very compact COB module, and conclude that these modules still do not require heatsinking by design.
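The power-per-area comparison in the app note can be sketched in a few lines. Note the areas below are illustrative assumptions for the sake of the arithmetic, not figures taken from AN130; only the 80 W value comes from the discussion above.

```python
# Rough sketch of the power-density argument: the same wattage spread over a
# large board vs. concentrated in a small COB package.
# Areas are assumed/illustrative, not from the Bridgelux app note.

def power_density(watts, area_cm2):
    """Dissipated power per unit of radiating surface area (W/cm^2)."""
    return watts / area_cm2

# Hypothetical square module: 80 W spread over a ~28 cm x 28 cm board.
square_module = power_density(80, 28 * 28)

# Hypothetical compact COB: the same 80 W in a ~2.5 cm x 2.5 cm package.
cob = power_density(80, 2.5 * 2.5)

print(f"square module: {square_module:.2f} W/cm^2")  # roughly 0.10 W/cm^2
print(f"COB:           {cob:.1f} W/cm^2")            # roughly 12.8 W/cm^2
```

The two-orders-of-magnitude gap in power density is what lets a large-area module get by without a dedicated heat sink while a COB of the same wattage cannot.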
If you read further, there is a practical discussion about testing the product in a real-world setting, since there may be conditions under which the product design alone does not provide adequate thermal management. Read up to the ESD Prevention section, where the discussion ends with the thermocouple test point diagram.
It’s only in an application setting, with some testing and monitoring, that you would determine whether heatsinking is required beyond the initial design. The only other note I found was to be sure that all modules of the same type use the same heatsinking, when needed, to ensure performance stability (light measurement) across the cluster or group.
Thanks. I read the section on Thermal Management, but it does not address my question. I have contacted Bridgelux many times; they never answer their phones or return my voicemails, nor do they respond to my emails. So that is why I’m seeing if your company can help answer this.
Let me restate my question this way. Consider these two different LED strips: the EB Gen 3 and the EB Gen 3 HO (High Output). If we look at the 1120 mm strips for each, they both contain the exact same diode and the exact same number of diodes for the same length of strip. However, the Gen 3 HO has a nominal drive current of 1400 mA, and the Gen 3 has a nominal drive current of 700 mA, both specified at a center case temperature test point of Tc = 25 °C.
So, I could take a regular Gen 3 with nominal current 700 mA and crank up the current to 1400 mA to get the same output power as the HO, but I would need adequate heatsinking to do that at 25 °C. So, is it safe to assume that the HO strips have some better internal heat management that allows them to run at 1400 mA with no better (or any) external heatsink, in the same way that the Gen 3 runs at 700 mA? In other words, if I ran both side by side in the same setting with no heatsink, the HO at 1400 mA and the Gen 3 at 700 mA, would the temperature of the two strips be roughly the same?
That’s a marketing statement that might sound pretty, but it’s completely and utterly devoid of any real meaning.
I’m not so sure about that; the two variants vary slightly in length (1120 vs 1190 mm) and it’d be unusual to see the identical emitter spec’d at nominal drive levels that vary by 2x. Perhaps I missed it, but I don’t see a spec for the actual emitters used anywhere in the respective datasheets.
The Tc=25°C is a test condition, indicating in effect that the measurements it qualifies are taken on a pulsed basis before anything has time to warm up to its actual operating temperature.
When operated at identical drive currents, I’d expect comparable devices from the two series to produce approximately similar light output. Adequate heat sinking is -always- necessary, but the definition of “adequate” varies…
The precise differences between the two product series are not described in the datasheet and I’ve yet to reverse-engineer either. It would not be unreasonable to suspect that some substantive difference between the two in terms of their thermal properties does exist, but that cannot be guaranteed based on the information at hand.
Nope, not even close. The efficacy figures for the two are roughly the same, and the geometry of the two is similar meaning that the ability to transfer heat from the strip into the surrounding air is probably comparable also. Under the conditions described the HO series product would be absorbing roughly 2x the input power of the standard series product, so I’d expect its temperature rise above ambient to be roughly 2x that of the standard series product.
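That “roughly 2x the temperature rise” estimate can be put in rough numbers with the simple steady-state model dT = P × Rth. The power figures are the datasheet values quoted earlier in the thread; the strip-to-ambient thermal resistance below is a made-up placeholder, since neither datasheet publishes one.

```python
# Back-of-envelope temperature-rise comparison. Rth is an assumed placeholder
# value; the power figures are the datasheet numbers quoted in the thread.

def temp_rise(power_w, rth_c_per_w):
    """Steady-state temperature rise above ambient: dT = P * Rth."""
    return power_w * rth_c_per_w

RTH = 1.5      # degC/W, assumed identical for both strips (similar geometry)
P_STD = 26.8   # W, standard EB Gen 3 at 700 mA
P_HO = 53.8    # W, EB Gen 3 HO at 1400 mA

dt_std = temp_rise(P_STD, RTH)
dt_ho = temp_rise(P_HO, RTH)

# With the same Rth, the rise scales directly with power, so the HO strip
# sits roughly twice as far above ambient as the standard strip.
print(f"standard: +{dt_std:.0f} degC, HO: +{dt_ho:.0f} degC")
```

Whatever the true Rth turns out to be in a given installation, the ratio between the two rises stays at the ratio of the input powers, which is the point being made above.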
Probably the most significant difference between the HO and the other versions is the width. The HO series strips are 24mm wide, whereas the other versions are 12.7mm wide or narrower – a huge difference in surface area. They call this out with the letter following the dimension in the part number. I think this alone would probably explain the difference.
Good point; this was an assumption of mine based solely on the fact that they were all referred to as EB Gen 3 on Bridgelux’s website. The Gen 3 and Gen 3 Slim do contain the same emitters.
And given that the Gen 3 and Gen 3 HO have the same width and length and the same number of emitters, is it safe to assume that the emitter size is the same? If so, wouldn’t it be highly likely that they are the exact same emitters? And if they are the same emitters, yet one is run at 1400 mA and the other at 700 mA, then either there is some special heat management in the strip itself (which seems highly unlikely to handle the heat increase from doubling the current, especially given that both have the same PCB thickness), or they are marketing this as High Output when the only way to meet their stated wattage and efficiency numbers is with some really good heatsinks. Which seems crazy. What am I missing here?
In the past year or so, I’ve had one other request for information regarding thermal management for this same group of products. I don’t know if I can find that file, but I don’t think the question or answer was as detailed as this post.
I can try to gather all of this together and send it to Bridgelux via one of our product managers to see if they will offer any explanation. I know the concept of ‘thermally managed by design’ is used here, but customers are also directed to test LED strips in their own applications to determine the need for heatsinking.
Thanks. If you can get an answer from Bridgelux, I would greatly appreciate it. I’ve called virtually every department, and not only do they not answer, but they do not return numerous voicemails and emails.
I haven’t seen these in person. I notice that the image they both use on page 2 of the HO and page 4 of the standard datasheets is identical. Almost makes me wonder whether the HO actually has twice as many diodes and that the images are wrong. Everything else would make sense if that were the case.
It’s possible though not a given, in the same sense that two 12-ounce cans of soda may or may not be the same flavor…
Heavier copper layers maybe, possibly a lower junction-to-case thermal resistance on the emitters. Or possibly nothing; selling the same product with two different labels at two different price points is not an unheard-of marketing tactic…
A decent course on thermodynamics would probably help clear things up for you. A poor substitute that may or may not shed light on related topics can be found here.
In effect, the strip substrate functions as a heat sink, carrying heat away from the emitters and spreading it out across a large surface area to help transfer it into the surrounding air. Whatever you attach the strip to will also function as a heat sink, to some degree. As heat sinks go they’re both pretty crummy, but because the thermal energy is spread out over a large area it’s often good enough to get by.
A 20-pound weight held in the palm of the hand doesn’t generally cause discomfort; setting it on top of one’s fingernail probably will. The area over which that weight is spread is what makes the difference. The same general idea applies here: spreading the LED dies across a board four feet long and an inch wide lets a person get by with less aggressive heat sinking than if those exact same dies were crammed together in an area a bit larger than a postage stamp.
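The fingernail analogy works out numerically, too. The areas below are rough guesses chosen only to illustrate the ratio, not measurements of anything.

```python
# The weight-on-fingernail analogy in numbers. All areas are rough,
# assumed values purely for illustration.

def pressure_psi(pounds, area_in2):
    """Force spread over an area: pressure = F / A (lb/in^2)."""
    return pounds / area_in2

palm = pressure_psi(20, 12)         # ~1.7 psi: 20 lb over a ~12 in^2 palm
fingernail = pressure_psi(20, 0.2)  # ~100 psi: same 20 lb on ~0.2 in^2

# Same idea with heat: identical dies on a four-foot-long, one-inch-wide
# strip vs. an area a bit larger than a postage stamp.
strip_area = 48 * 1.0   # in^2
cob_area = 1.5          # in^2, assumed "postage stamp plus a bit"
print(f"pressure ratio: {fingernail / palm:.0f}x, "
      f"dissipation area ratio: {strip_area / cob_area:.0f}x")
```

The pressure (or heat flux) scales inversely with the area it is spread across, which is why the same total load can be harmless in one configuration and painful in the other.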
The bottom line is that given the same installation, when both are operated at their nominal values the HO series will tend to run significantly hotter than the standard, because they have to get rid of twice as much heat. Is it possible to operate either/both satisfactorily without benefit of something one might consciously recognize as a “heat sink”? Maybe, because the ‘accidental’ heat sink created when fastening the strip to something might be good enough.
Bridgelux can’t simply say “these don’t need a heat sink” however because the laws of physics dictate that they do have to get rid of some heat somewhere, and the conditions of installation are totally out of their control. Hence the advice to validate the thermals of whatever installation one comes up with by checking the device temperature.