
I wonder if a good solution would be to have a fairly hefty capacitor taking charge from the supply side, and then have the system measure the capacitance to determine how to light the bulb from there. It could check for a capacitance drop to know it is drawing too much power, reduce the output brightness via voltage or PWM, then bring it back up as the capacitor refills. You would definitely get fluctuations, but increasing the capacitor size would absorb most of them and also allow a system-agnostic approach: it wouldn't matter whether the supply side uses PWM, sine cutting, or some other dimming method. A nice side benefit would be a slight tail in the light output when you turn it off, extending the light a fraction of a second as the capacitor drains.
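The control loop described above could be sketched roughly like this (a hypothetical illustration: the voltage thresholds, the linear ramp, and the function name are all invented here, and a real driver would read the capacitor voltage from an ADC and filter the result to hide flicker):

```python
# Hypothetical reservoir-capacitor throttling: map the measured
# capacitor voltage to a PWM duty cycle, dimming as the cap drains
# and ramping back up as it refills. All thresholds are made up.

V_FULL = 48.0   # capacitor voltage when fully charged (V), assumed
V_LOW = 30.0    # below this, fully dimmed to let the cap recover

def duty_from_cap_voltage(v_cap: float) -> float:
    """Return a PWM duty cycle in [0, 1] for a given cap voltage."""
    if v_cap >= V_FULL:
        return 1.0
    if v_cap <= V_LOW:
        return 0.0
    # Linear ramp between the two thresholds; a real implementation
    # would low-pass filter this so the brightness changes smoothly.
    return (v_cap - V_LOW) / (V_FULL - V_LOW)
```

The "slight tail" at power-off falls out naturally: as the capacitor drains through `V_FULL` down to `V_LOW`, the duty cycle ramps down rather than cutting instantly.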


You'd need to be discharging the capacitor somehow; I suppose they meant to measure the charge. Even then, measuring the cut-off of the sine is not hard. It's all retrofit design: LEDs into incandescent bulb form factors, LED drivers behind sine cutters, all to preserve the existing lighting fixtures.
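To see why measuring the sine cut-off is easy: a phase-cut dimmer simply holds the output at zero for part of each half-cycle, so counting how many samples are above a small threshold gives the conduction fraction directly (a simulated sketch; the sample count, threshold, and function name are illustrative, not from any real driver):

```python
import math

def conduction_fraction(samples, threshold=0.05):
    """Estimate the fraction of a half-cycle the dimmer conducts,
    given normalized |voltage| samples over one half-cycle."""
    conducting = sum(1 for s in samples if abs(s) > threshold)
    return conducting / len(samples)

# Simulate a leading-edge phase cut at 90 degrees: the first half of
# the half-cycle is chopped to zero by the dimmer's triac.
N = 1000
half_cycle = [math.sin(math.pi * i / N) for i in range(N)]
cut = [0.0 if i < N // 2 else v for i, v in enumerate(half_cycle)]
# conduction_fraction(cut) comes out near 0.5, i.e. roughly half
# the half-cycle is conducting, which the driver maps to brightness.
```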

I have some near-ceiling lights that allow dimming via a remote control (infrared or 433 MHz) or by fast on/off switching (they remember their state afterwards).

There are other ways to communicate, e.g. using the zero crossing: the dimmer can send whatever signal to the lights downstream, which read it and set their brightness.
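One way such a scheme could work (a hypothetical protocol invented for illustration, not any real standard): the dimmer drops a tiny notch near a zero crossing to signal a '1' bit and leaves the crossing clean for a '0', so eight consecutive half-cycles carry one brightness byte:

```python
# Hypothetical zero-cross signaling: one bit per half-cycle,
# MSB first, eight half-cycles per brightness byte (0-255).

def encode_brightness(level: int) -> list[int]:
    """Turn a 0-255 brightness level into per-half-cycle bits."""
    if not 0 <= level <= 255:
        raise ValueError("brightness must fit in one byte")
    return [(level >> i) & 1 for i in range(7, -1, -1)]

def decode_brightness(bits: list[int]) -> int:
    """Reassemble the byte on the lamp side, MSB first."""
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value
```

At 50 Hz mains (100 half-cycles per second) this invented scheme would carry a full brightness update in 80 ms, which is plenty for a manual dimmer knob.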



