I also have been disappointed with LED "bulbs". They get too damn hot in recessed fixtures, but NOT because of the LEDs. It's all because of the shitty power converter embedded deep inside the bulb getting too hot and failing. The industry has necessarily had to design these as retrofits for sockets where incandescent bulbs are supposed to go.
I think that we'll start to see improvements in this area as people move away from the "screw-in bulb" form factor and towards external power converters delivering power to LED fixtures that are just LEDs and their heatsink + mounts.
As an amateur radio operator, I've found LEDs to be the absolute bane of my (and thousands of other city-based hams') receivers. Those same converter modules not only produce ungodly amounts of heat, but also loads of radio-frequency interference, which has been contributing to the rise of the global HF noise floor over the last decade.
I think jandrese is referring to the fact that even a DC current through a plasma generates loads of RF; fluorescent tubes, for example, emit THz radiation!
Every so often, I come back to "I should have DC power on demand in my house!" until I'm reminded that DC power loss over distance can hurt badly, even with 48VDC telecom power.
Perhaps USB-C PD could help? USB power adapters are compact, and PD offers standardized, genuinely useful high-power DC. If the adapters become ubiquitous, you could see "USB-C lightbulbs".
Slight tangent, but "in-building DC" exists and is rapidly gaining popularity, though not via the medium many assumed: Power over Ethernet.
You can buy PoE LED light fixtures now, which are both powered and controlled over Ethernet. You buy a single PoE switch, run some low-voltage Cat6, and you can power an entire floor's worth of lighting.
There's also EEPoE (Energy Efficient PoE), which claims up to 94% efficiency.
Definitely a space to watch, particularly as the cost savings of running low-voltage Ethernet cable compared to high-voltage electrical cable (110V) are substantial.
>Microsemi's exclusive EEPoE technology cuts the power losses on Ethernet cables by 50%, through the utilization of all the copper available on cable when a Microsemi EEPoE PSE IC or Midspan is used. It is 100% compatible with IEEE802.3at, and the savings work with ANY IEEE 802.3at Type 2, Type 1 or IEEE 802.3af compliant PD. In practice, devices that consume 25.5W would consume less than 27.75W, instead of the worst case 30W when a non-EEPoE PSE is employed.
So they use all 8 wires for power instead of just 4, halving the resistance and therefore the I^2R loss on the wire. For the worst-case device, that turns a roughly 18% loss (4.5W against 25.5W delivered) into a roughly 9% one.
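A quick sanity check of those numbers, sketched in Python (the 12.5 ohm worst-case loop resistance and the 600mA Type 2 current limit come from 802.3at; everything else falls out of I^2R):

    # Cable loss for an 802.3at Type 2 PD at worst case.
    PD_POWER_W = 25.5   # what the device itself consumes
    CURRENT_A = 0.600   # Type 2 worst-case current
    LOOP_OHMS = 12.5    # worst-case 100m channel, power on 2 pairs

    for label, r in [("2 pairs", LOOP_OHMS), ("4 pairs (EEPoE)", LOOP_OHMS / 2)]:
        loss = CURRENT_A ** 2 * r   # I^2 R loss in the cable
        print(f"{label}: {loss:.2f}W lost, {PD_POWER_W + loss:.2f}W at the PSE")

    # 2 pairs: 4.50W lost, 30.00W at the PSE
    # 4 pairs (EEPoE): 2.25W lost, 27.75W at the PSE

The 30W and 27.75W figures in the Microsemi blurb drop straight out.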
Definitely watching. I was once in the LED industry when it was still hot; the company was called Zega.
They tried every random thing, including putting a small, real ARM server on the bloody thing for remote control. That was back when I first stumbled on the Espressif people and Teo. They got light-years ahead of us with their all-in-one SoC.
Must be either very-low-current lighting or some pretty beefy Cat6. I think the average Cat6 cable is only 23 or 24 gauge, and that won't carry a lot of current very far.
PoE+ will deliver 30W at the far end of up to 100m of network cabling. Given that those cables radiate out to endpoint devices from a central point, rather than powering multiple devices in a ring/loop topology, the potential to power lots of devices over a wide area is huge.
Biggest problem tends to be accommodating the central power delivery device. Those big PoE switches run hot and loud.
He's talking about running a separate cable for each and every bulb. That's a wasteful solution compared to running all of the lights off a single circuit (or a limited number of circuits), as in a normal building.
Certainly at the household scale this is true. I cringe at the inefficiency of rooftop solar panels, hooked up to inverters, powering transformers that supply phones, lights, and electronics. AC in the house is great for an electric oven, vacuum cleaner, or fridge compressor, but so many modern electronic devices would be happy to run off of 5V or 12V.
Look on the bright side: in terms of real power, the electric oven, vacuum cleaner, AC compressor, and fridge compressor totally swamp the entire load, waste and all, of those 5V devices.
In a perfect world we'd have both AC & DC, and could use whichever was most appropriate. We could feed DC straight from the solar panels to the cell phone. But in terms of what consumes most of the power, in the typical home it's AC loads.
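To put rough numbers on "totally swamp", here's a Python sketch; every wattage and duty cycle below is an assumption, not a measurement:

    # Illustrative daily energy budget for a typical home.
    ac_loads = {                  # name: (watts, hours per day)
        "oven":   (2400, 1.0),
        "AC":     (3000, 3.0),
        "fridge": (150, 8.0),     # compressor duty-cycled
        "vacuum": (1200, 0.2),
    }
    dc_loads = {
        "phone chargers": (15, 3.0),
        "router + AP":    (20, 24.0),
        "laptop":         (60, 4.0),
    }

    daily_kwh = lambda loads: sum(w * h for w, h in loads.values()) / 1000
    print(f"AC loads: {daily_kwh(ac_loads):.1f} kWh/day")  # ~12.8
    print(f"DC loads: {daily_kwh(dc_loads):.1f} kWh/day")  # ~0.8

Even if every DC device ran through an 80%-efficient adapter, the conversion waste would be around 0.2 kWh/day against a double-digit AC total.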
In a perfect world your fridge and vacuum would use BLDC motors and be happy with whatever DC or AC voltage you give them, as my washing machine already does.
>Look on the bright side: in terms of real power, the electric oven, vacuum cleaner, AC compressor, and fridge compressor totally swamp the entire load, waste and all, of those 5V devices.
This isn't an appropriate comparison. Those appliances actually do useful things that need to get done, so the energy isn't just wasted the way it is with power supply inefficiency.
Sure, a 12VDC circuit would be nice. It just wouldn't be nice enough or efficient enough to be worth paying for.
The point is that the power delivery system we have does a good job for 99% of the real power consumed (or whatever the number really is).
I hate waste; my mains-connected smoke detectors (1% efficient) and garage door opener (15W standby) gnaw at my soul. But that's scope for improvement, not an efficiency crisis.
A smoke detector doesn't need much power, so the efficiency of its supply is less of a concern. You have to find a useful compromise between efficiency and use of materials: the additional components needed for higher efficiency have to be produced, which itself uses resources and energy.
Although 15W standby for a garage opener seems excessive.
Those aren't a waste; they are critical life-safety equipment. If their 1% efficient power supply lasts the full 10-year lifespan of a smoke detector, rather than the 6-18 months of a typical LED lightbulb, I'd say that's the appropriate level of reliability.
Yeah, they're important & valuable. However, battery-powered smoke detectors do it with 1% of the power.
If every home in Australia had two smoke alarms, that would mean about 5,600 kW of continuous and largely wasted energy consumption. I don't mean wasted in purpose: smoke alarms are essential. The wasteful part is that 99% of the power going into them is lost converting AC power to DC power.
As far as I'm concerned, that's 99% scope for improvement.
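For what it's worth, the 5,600 kW figure is easy to reproduce; in this sketch both the household count and the per-alarm draw are my assumptions, picked to land on the parent's total:

    # Reconstructing the 5,600 kW estimate.
    homes = 8_000_000        # assumed Australian households
    alarms_per_home = 2
    watts_per_alarm = 0.35   # assumed continuous mains draw per alarm

    total_kw = homes * alarms_per_home * watts_per_alarm / 1000
    print(f"{total_kw:,.0f} kW continuous, "
          f"{total_kw * 8760 / 1e6:.0f} GWh per year")
    # 5,600 kW continuous, 49 GWh per year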
That is an incredible amount of energy and I agree that it is worth trying to reduce, but I’m skeptical of adding numerous DC-DC converters to the chain. In the current state, it’s all passive devices between the turbine at the power plant and the smoke detector on the ceiling. I wouldn’t trade that reliability for power savings lightly.
Grid-scale transformers are very efficient. The tiny pole-mounted ones used in the US and some other countries are probably not so much, but everything north of 50-100 kVA gets into 99+% efficiency.
Just because generator/machine-set transformers and 380/220 kV step transformers require semi-active cooling doesn't mean they're inefficient... it just means they handle a huge amount of power (MWs), so even at very high efficiencies that translates to a lot of heat in absolute terms.
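To make "a lot of heat in absolute terms" concrete (the ratings and efficiencies below are illustrative, not specs of particular units):

    # Tiny loss fractions are still serious heat at grid scale.
    for mva, eff in [(0.05, 0.985),   # pole-mounted distribution
                     (100, 0.995),    # substation
                     (600, 0.997)]:   # generator step-up
        heat_kw = mva * 1000 * (1 - eff)
        print(f"{mva:g} MVA at {eff:.1%}: ~{heat_kw:,.0f} kW of heat")

    # 600 MVA at 99.7% still means ~1,800 kW of heat to get rid of.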
The goal in my thinking was to eliminate a complex or costly power supply at the point of consumption. Higher voltages require a beefier step down (isolation, caps, whatever) and AC requires a rectifier. I wanted to eliminate it all.
Sorry, I was being a little too loose for HN. Also, I'm not an EE but I play one on the Internet :).
Power loss (I^2R loss) over distance is not caused by DC; it's caused by the combination of low voltage and high current. If current usage and distance are both small (as they would be in an LED-lighted house), wiring a house for DC is a perfectly reasonable thing to do.
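A worked example of the "small current, small distance" case (assumed: a 12W fixture at 12V on 15m of 14 AWG copper):

    # I^2 R loss for a modest 12V LED run.
    watts, volts = 12.0, 12.0
    run_m = 15.0
    ohms_per_m = 0.00828                 # 14 AWG copper
    loop_ohms = 2 * run_m * ohms_per_m   # out and back

    amps = watts / volts                 # 1 A
    loss_w = amps ** 2 * loop_ohms
    print(f"{loss_w:.2f}W lost ({loss_w / watts:.1%} of the load)")
    # 0.25W lost (2.1% of the load) -- perfectly livable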
Also note that a 1kW appliance needs 83 amps in a 12V DC system. That is a huge amount of current, and it requires special cabling (which has a not-insignificant voltage drop due to the resistance of the wire) and special connectors.
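Running the same arithmetic at appliance scale shows why (assuming a 10m run of 4 AWG copper, roughly the gauge you'd need for 83A):

    # A 1kW appliance on a 12VDC circuit.
    watts, volts = 1000.0, 12.0
    run_m = 10.0
    ohms_per_m = 0.000815   # 4 AWG copper
    loop_ohms = 2 * run_m * ohms_per_m

    amps = watts / volts    # ~83 A
    drop_v = amps * loop_ohms
    print(f"{amps:.0f}A, {drop_v:.2f}V drop ({drop_v / volts:.0%}), "
          f"{amps * drop_v:.0f}W heating the wall")
    # 83A, 1.36V drop (11%), 113W heating the wall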
Yes please for DC power, especially if it can remove all the power adapters. I think I have more adapters than wall sockets in my house now: phones, tablets, electric toothbrushes, shaver, router & switch, WiFi access point, Raspberry Pi, cameras, laptop, monitor, streaming device, reading light, electric piano, toy train. Good grief.
Isn't that a function of voltage and not modulation? There might be a cost argument; I'm not sure whether solid-state step-down DC-DC converters are cheaper than good old cheap transformers.
>A shock from a DC supply will cause heart fibrillations at lower voltage than AC, because of the way muscles receive signals from nerves
That is the opposite of my understanding. The extra shock hazard of AC vs DC was an argument used against AC power distribution back in the day. The 50V limit exists to prevent shock altogether.
As a counterpoint, I fitted 6 LED lightbulbs in recessed fittings more than five years ago; they're used maybe 10 hours a day and they're all still going strong. It depends on the model, I suppose.
> I think that we'll start to see improvements in this area as people move away from the "screw-in bulb" form factor and towards external power converters delivering power to LED fixtures that are just LEDs and their heatsink + mounts.
These already exist. Nearly every architectural grade fixture has a separate 24VDC driver, the LEDs, and a heatsink. Here's a cutsheet for a Lithonia recessed can as an example:
These run for $175-200 apiece. They list 70% lumen maintenance at 50,000 hours; that's nearly 20 years at 12 hours a day, 220 days a year. Some architectural cans are a bit cheaper, some are a bit more.
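The lifetime arithmetic checks out (trivially, in Python):

    # 70% lumen maintenance point: 50,000 hours.
    hours_per_year = 12 * 220   # 2,640 hours/year
    print(f"{50_000 / hours_per_year:.1f} years")   # 18.9 -- call it 20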