This is you though. There was a whole genre of LED growers that would repower their rooms for less than a 10% increase in efficiency pretty regularly. Those relatively small increases have kept the owners of Horticulture Lighting Group in business for nearly a decade. When the year-over-year efficiency gains of LEDs started to slow down, they just started making modules with more LEDs to increase the system efficiency. And they seem to have been pretty successful doing so.
You just have to accept that not everyone thinks the same way, and just because you and I may feel there is a limit to cost vs. performance, not everyone else feels this or will agree on where to draw the line. Hell, there are members of this forum who have replaced perfectly operating 1-2 year old fixtures because a new “better” version of the same light was released.
Chilled Tech had them, but bailed on that design and did a full retool to a passively cooled design. They’re no longer around, but they would’ve been another example of loading modules with chips to have the most efficient light at the time.
To me, you would have to count the energy used by the cooling system in a liquid-cooled light’s efficiency, and that would be a major drawback. It would probably be more reasonable to use fans, and current tech has already made fans unnecessary.
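To put a rough number on that point, here’s a back-of-envelope sketch. Every figure below is invented for illustration (not from any real fixture): the idea is just that system efficacy should divide photon output by light power *plus* cooling power.

```python
# Back-of-envelope: count cooling power in the system's efficacy.
# All numbers below are invented placeholders, not real fixture specs.

def system_efficacy(ppf_umol_s, light_w, cooling_w=0.0):
    """Photon flux per watt drawn by the WHOLE system, cooling included."""
    return ppf_umol_s / (light_w + cooling_w)

passive = system_efficacy(1800, 600)       # passively cooled fixture
liquid = system_efficacy(1850, 600, 60)    # a bit brighter, but add pump/fan power
print(f"passive: {passive:.2f} µmol/J, liquid: {liquid:.2f} µmol/J")
```

With those made-up numbers, the pump and radiator fan more than eat the small gain in output.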
Were the 10% increases due to changing to LED lights and then more efficient LED lights, or something else? I would think the commercial growers who replaced all their lights too often went bankrupt.
Adding LEDs increases the power demand and PPF but I don’t see how it improves system conversion efficiency. Packing them more densely makes it harder to cool them, which works against efficiency.
Evidently, the manufacturers have managed to cool their lights adequately using heat sinks, but at some density, cooling fans and, eventually, water cooling (or liquid nitrogen!) become necessary.
I conclude that commercial LEDs probably require waterproofing, but I suspect the biggest difference is that their manufacturers must have solid, US-based tech support.
My Vivosun lights are well designed and built, but getting tech support means emailing with people in China whose English often requires multiple iterations before they understand the question and I understand their answer. If I were growing commercially, I’d demand tech support that answers a US phone number, speaks English fluently, and knows the product thoroughly. That alone may explain the greater cost of commercial units.
That isn’t really how all of this works. You can fully load a driver with 100 LEDs, or keep it at the same load with 200 or 300 LEDs. In several cases there could be no difference in the power demand of the system. In this scenario, cutting each LED’s current by 50% or more would also reduce the temperature of each LED. Spreading the overall thermal load across more LEDs also increases the mounting surface area to the heatsink, making it more effective. Just about every aspect of increasing chip count while running the same overall power leads to lower temperatures and higher electrical efficiency.
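A rough illustration of the fixed-load point (the driver wattage, forward voltage, and chip counts here are made-up placeholders, not real specs): with the driver’s output fixed, adding chips just divides the same current among more of them.

```python
# Hedged sketch: a fixed driver load spread across more LEDs.
# Voltage and wattage are invented illustrations, not real chip specs.

def per_led_current(driver_watts, forward_v, n_leds):
    """Current through each LED when n_leds evenly share a fixed driver load.

    Assumes all chips sit at roughly the same forward voltage, so the total
    current (driver_watts / forward_v) divides evenly among them.
    """
    return driver_watts / forward_v / n_leds

i_100 = per_led_current(50.0, 2.75, 100)  # 50 W driver, ~2.75 Vf, 100 chips
i_200 = per_led_current(50.0, 2.75, 200)  # same driver, double the chips
print(f"100 chips: {i_100 * 1000:.0f} mA each; 200 chips: {i_200 * 1000:.0f} mA each")
```

Doubling the chip count halves each chip’s current, which is what lowers junction temperature and lifts per-chip efficiency.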
I’ve seen exposed lights sprayed with water and keep running. Not that I would do it. It has more to do with appeasing government officials than anything else.
Chip density, rather. Your observation implies that the ideal LED is a single one that’s the right size and shape for the application. Failing that, bigger is better.
I’m not sure how you concluded that from any of my posts. Even rereading them, I make no claim about the size of the light, as that is irrelevant. See the examples of two different LED modules that are roughly the same size.
This article is old news, and I’m surprised they would even publish it in 2024. I have been coaching people to decrease veg time and make up canopy size by increasing plant count (if possible) pretty much since anyone here would listen to me. There’s even a handful of members here who run nothing but 12/12 for the duration of their grow. It not only lessens the amount of time the lights are running, it gets a similar final harvest weight and can potentially squeeze an additional harvest into a year. All of which maximizes the throughput of a grow while minimizing cost.
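For a sense of the electricity side of that argument, here’s a quick invented-numbers comparison (placeholder wattage and veg length, not anyone’s actual grow): a 5-week veg at 18/6 vs. running 12/12 from the start.

```python
# Invented-numbers comparison of veg-stage lighting energy:
# a 400 W light at 18/6 for 5 weeks vs. the same 5 weeks on 12/12.
WATTS = 400  # placeholder fixture wattage

def veg_kwh(hours_per_day, days):
    """Energy the light burns over the veg period, in kWh."""
    return WATTS * hours_per_day * days / 1000

kwh_18_6 = veg_kwh(18, 35)   # conventional 18/6 veg
kwh_12_12 = veg_kwh(12, 35)  # 12/12 from the start
print(f"18/6: {kwh_18_6} kWh, 12/12: {kwh_12_12} kWh")
```

With those numbers the 12/12 schedule burns a third less energy during veg, before even counting a shortened calendar.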
I think you took “LED” in my statement to mean “LED grow-light,” but I meant “LED.”
If increasing chip density brings the benefits you say, it follows that they’re greatest at 100% chip density. That would maximize the mounting surface area to the heatsink and produce the most PAR from the same amount of power.
100% chip density would require a single, unusually large LED, mounted on a board (and heatsink) that has the same dimensions as the LED. The ideal LED (and LED lighting unit) would have the same shape and surface area as the tent it’s in.
Where did you learn this news? The people doing the research are clearly unaware that they’re duplicating prior work. More likely, they’re using properly controlled experiments to test claims they’ve heard that are based on anecdotal reports. Those claims seem to be correct and their research is validating them.
I guess I did figure you meant fixture. But this whole conversation is irrelevant if you’re talking about different LEDs, as there is only one most efficient LED chip. The electrical efficiency of LED chips and the thermal efficiency of the mounting are two different things. But if by design you are able to run a chip cooler, the electrical efficiency of that chip will increase. This info is typically provided in the datasheets of most LEDs/modules. See the example below, noting the drop in flux at each current given the increase in case temperature.
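To show what those datasheet derating curves look like numerically, here’s a small sketch. The table values are invented placeholders, not from any real datasheet: relative flux normalized to 1.0 at a 25 °C case temperature, falling as the case gets hotter.

```python
# Hedged illustration of datasheet-style thermal derating.
# (Tc in °C, relative flux) pairs below are invented placeholders.
DERATING = [(25, 1.00), (55, 0.95), (85, 0.89), (105, 0.84)]

def relative_flux(tc):
    """Linearly interpolate relative luminous flux at case temperature tc."""
    if tc <= DERATING[0][0]:
        return DERATING[0][1]
    for (t0, f0), (t1, f1) in zip(DERATING, DERATING[1:]):
        if tc <= t1:
            return f0 + (f1 - f0) * (tc - t0) / (t1 - t0)
    return DERATING[-1][1]

# Running the same chip 30 °C cooler recovers several percent of output:
print(f"85 °C: {relative_flux(85):.2f}, 55 °C: {relative_flux(55):.2f}")
```

The exact slope varies by chip, but the direction is always the same: cooler case, more flux per amp.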
You raise a good question asking where I learned this. I’m not sure whether there are earlier studies testing exactly this theory. I know there are studies showing that plant count has no repeatable effect on yield potential when canopy size and shape are the same. From there, common sense tells you that if you can get to the same point in less time, you will use less energy, save money, and potentially fit more harvests into the same amount of time. There have been grow methods (sea of green) in use for decades taking advantage of this. I’m guessing one would have to go back a while looking; I’m not prepared or interested enough to do so, but have at it if you’d like. Imsickkid and Bogleg both used variations of this method, documented here. MattyBear has been growing from seed on 12/12 for a couple of years too, documented in his latest journal. Not that any of this is right for everyone, but nothing in that article is groundbreaking news.
I don’t understand this remark. We’re discussing principles that apply to all LEDs and LED lighting units. I followed your reasoning to the conclusion that a unit containing one giant LED should be the most efficient.
I believe a 12" diameter LED is the largest that could be made presently. We’re a long way from being able to produce a 2’ x 2’, let alone larger LEDs. But square 4" LEDs could be produced using widely available 6" gear and substrates and would be easy to tile into useful sizes and shapes. I wonder if the reason no one makes LEDs that fill entire wafers is the yield would be too low.
By “this method,” do you mean taking seedlings directly to a 12/12 schedule?
For you and a few others, apparently not, but I doubt it’s common knowledge that the veg stage is optional. I wonder if it’s as simple as skipping the 18/6 stage and nothing more. The pilot study reported in the article tested with 10 different lines of nutrients and differing PPFDs – I’d like to know those results. Perhaps the full experiment has been completed by now and there’s a report I can get. I’ll see.
I never tried to conclude that a bigger LED was more efficient, though, so I’m not sure why you’re saying you followed me to this. Let’s go back and look at what I said.
You followed up with
To make it simple, I don’t feel we’re having the same conversation. The most popular horticulture LED is currently 30 mm x 30 mm. If you take 100 of them arranged on a PCB and attach a 10 watt driver, it will be a 10 watt light. If you put 200 of the same chip on the same PCB and attach the same 10 watt driver, it will still be a 10 watt light. Going from the 100-count to the 200-count 10 watt light, each chip will run at half the current, and cooler, due to the lower operating current and the increase in thermal loading area. Both have a positive impact on the amount of light produced for the same power consumed.
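A quick back-of-envelope version of that scenario. The forward voltage and the efficacy figures are invented placeholders (real chips vary); the point is just that the driver fixes total watts while chip count sets per-chip current.

```python
# 100 vs. 200 chips on the same fixed 10 W driver (invented numbers).
DRIVER_W = 10.0
VF = 2.8  # assumed forward voltage per chip, placeholder

def per_chip_amps(n_chips):
    """Each chip's share of the driver's total output current."""
    return DRIVER_W / VF / n_chips

# Placeholder efficacies: the same chip run at half current is a bit
# more efficient (µmol/J), which is the usual low-current behavior.
EFFICACY = {100: 3.0, 200: 3.3}

for n in (100, 200):
    ppf = DRIVER_W * EFFICACY[n]  # total flux = fixed watts x efficacy
    print(f"{n} chips: {per_chip_amps(n) * 1000:.0f} mA each, {ppf:.1f} µmol/s")
```

Same 10 W in both cases; the 200-chip board simply runs each chip softer, and the softer drive is where the extra photons come from.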
No, I meant they used the SoG method. MattyBear uses 12/12 for the entire length of the grow. There are others too; I’m just not sure how many are still active here.
You have to be careful how you say this. Plants from seed will remain in veg regardless of light schedule until they are sexually mature, which is usually about 30 days, give or take. The article seemed to conclude that 8 weeks of veg time was normal, which I don’t necessarily agree with. When I run plants from seed, it’s typically 18/6 for 4-5 weeks and then 12/12. But there are people all over the place with different approaches.
This meant I accepted your reasoning and then followed it to its logical conclusion.
Correct. I reached that conclusion. Do you see an error in it?
We are, but you got distracted by the perception that I was putting words in your mouth.
It’s the gist of what the article says. That’s why I’d like a proper report or journal article that provides the details needed to evaluate the results. I sent Fluence a message through their website asking about the status of the research.
I haven’t, and it seems following the link requires an account? I’m vaguely familiar with DLC certification, though.
I think the size of the LED could be good for certain things, but just because an LED is big doesn’t mean it’s good. The most efficient LEDs available are currently very small.
The site reports the test results they obtain for lighting units that are submitted by manufacturers and compares them with the manufacturers’ specs. The link I gave you goes directly to the horticultural lighting section.
Are you referring to microLEDs? What efficiencies are they getting?