astroastro
Is anybody out there wondering why it is, when power is much cheaper than LEDs, that all these fixtures are not running their '3W' LED dies at 3W? Why not simply up the power of the LED driver and let it fly? The truth of the matter is this: regardless of the specification games going on in the Cree and Bridgelux datasheets, and regardless of the marketing claims of the LED fixture makers, the primary power LED die that nearly every one of these companies makes, and nearly everyone uses, is the 1 mm², 1 W die. When you put this on your aluminum heat sink with a fan and push 350 mA (about 1 W) through it, you will be operating the die at junction temperatures approaching its L70/50,000-hour lifetime rating. That is the truth. Overdriving them somewhat is certainly possible, but you have to pay close attention to the junction temperature, and you must have a well-designed thermal management plan.
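As a rough sanity check on that "350 mA is about 1 W" figure, here is a minimal sketch; the forward voltage below is an assumed typical value for a 1 mm² die, not a number from any particular datasheet:

```python
# Rough electrical power check for a single 1 mm^2 die at its rated current.
# Vf is an assumed typical value (~3.1 V); look up the real forward voltage
# at your drive current and temperature in the manufacturer's datasheet.
I_DRIVE_A = 0.350      # rated drive current, 350 mA
VF_TYPICAL_V = 3.1     # assumed forward voltage at 350 mA

p_electrical_w = I_DRIVE_A * VF_TYPICAL_V
print(f"Electrical power per die: {p_electrical_w:.2f} W")  # ~1.1 W, i.e. the '1 W' class
```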
Driving these dies at 3x the current density (3 W) is virtually impossible in a real-world application, unless you simply don't care about the working life of the LED. In that case, go ahead and burn them up: you will get roughly 2x the light output for roughly 3x the power, a very short life, and greatly accelerated light output degradation.
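To put a number on that diminishing return (the 2x and 3x figures are the ones quoted above, not measured data):

```python
# Relative efficacy when overdriving: ~2x the light for ~3x the power.
# Actual droop depends on the die and on junction temperature.
LIGHT_GAIN = 2.0   # relative luminous flux at overdrive (quoted above)
POWER_GAIN = 3.0   # relative electrical power at overdrive (quoted above)

relative_efficacy = LIGHT_GAIN / POWER_GAIN
print(f"Efficacy at 3x drive: {relative_efficacy:.0%} of the 1 W operating point")  # ~67%
```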
You cannot accurately get the LED junction temperature from an IR thermometer, and I have had a lot of issues and doubts about the results from a thermal imager, à la FLIR or Fluke. We have a Fluke Ti32 here. I still feel most comfortable with the results I get from the old-fashioned calculation based on the thermal resistance of the material stack-up between the junction and ambient.
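For anyone who wants to run that stack-up calculation themselves, here is a minimal sketch of the standard junction-temperature estimate, Tj = Ta + P x (sum of the thermal resistances from junction to ambient). Every resistance value below is a placeholder for illustration, not a figure for any particular LED or heat sink; pull the junction-to-solder-point resistance from the die datasheet and measure or estimate the rest of your own stack.

```python
# Classic junction-temperature estimate from the thermal resistance stack-up:
#   Tj = Ta + P_thermal * (R_jc + R_interface + R_sink_to_ambient)
# All resistance values below are placeholders for illustration only.

def junction_temp_c(t_ambient_c, p_thermal_w, r_stack_c_per_w):
    """Estimate junction temperature from ambient temperature, dissipated heat,
    and the series thermal resistances (deg C/W) between junction and ambient."""
    return t_ambient_c + p_thermal_w * sum(r_stack_c_per_w)

# Assumed values: ~1.1 W electrical per die, with roughly 75% converted to heat.
p_heat_w = 1.1 * 0.75

stack_c_per_w = [
    8.0,   # junction to solder point (from the die datasheet)
    1.5,   # solder point to heat sink (MCPCB + thermal interface material)
    4.0,   # heat sink to ambient (depends heavily on sink size and airflow)
]

tj = junction_temp_c(t_ambient_c=30.0, p_thermal_w=p_heat_w, r_stack_c_per_w=stack_c_per_w)
print(f"Estimated junction temperature: {tj:.1f} C")
```

Run it per die, and remember the heat-sink-to-ambient term rises fast once you pack many dies onto one sink or cut the airflow; that term is where most overdriven fixtures get into trouble.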