It's a raw bus? It's like an operating system's idea of a network stack being to hand every application raw Ethernet access. It's an obvious concurrency disaster once more than one application tries to access it, the data transmitted and the control handed over are entirely opaque to the operating system (god knows what monitor vendors sneak over DDC), and most importantly, all of these are absurdly low-level implementation details that are subject to change.
For all you know, they don't even need DDC beyond the initial EDID setup, multiplex the I2C hardware for something else afterwards, and can't physically provide this interface anymore.
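To give a sense of how low-level this is: a DDC/CI "set brightness" is just a handful of bytes written to I2C address 0x37 with an XOR checksum, and nothing arbitrates who gets to write them. A rough sketch of that on a Linux box with i2c-dev and the smbus2 package (the bus number is an assumption; your monitor could sit on any /dev/i2c-N):

    # Rough sketch of a raw DDC/CI "Set VCP Feature" write (brightness = VCP 0x10)
    # over Linux i2c-dev using the smbus2 package.
    from smbus2 import SMBus, i2c_msg

    DDC_ADDR = 0x37          # 7-bit I2C address monitors listen on for DDC/CI
    VCP_BRIGHTNESS = 0x10    # standard VCP code for luminance/brightness
    value = 60               # target brightness, 0-100 on most panels

    payload = [0x51, 0x84, 0x03, VCP_BRIGHTNESS, value >> 8, value & 0xFF]
    checksum = 0x6E          # destination address byte seeds the XOR checksum
    for b in payload:
        checksum ^= b

    with SMBus(1) as bus:    # /dev/i2c-1 is purely an assumption
        bus.i2c_rdwr(i2c_msg.write(DDC_ADDR, payload + [checksum]))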
If you have root access, you should be allowed to have raw device access. The problem seems to be that Apple no longer believes in users owning their computers, if it ever did.
Daily reminder that Jobs wanted one of their early computers (Lisa? Mac? Can't remember) to be bolted shut, until he was persuaded otherwise. Similar story with Apple II and expansion slots.
It's in their nature. Apple was bound to invent the iPhone.
That it works at all is a major improvement, and the Raspberry Pi method is worth having as well; I've been working on something very similar since I got my M1 Mini and found ddcctl and Lunar unable to work there, but got little further than building the light sensor before other projects took priority - I expect to finish this one around the equinox.
My work laptop, an Intel Mac, displays to and controls brightness on the same monitor, but I've been using ddcctl with a trivial wrapper script much more than Lunar of late. (Kinda feel a little bad about that, what with the author of Lunar having taken the time to give me tech support on a weird corner case here on HN a while back.) Still going to buy a license and, if I can find a way, set up a recurring donation. This kind of work deserves support.
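The wrapper really is trivial - a sketch of the same idea in Python, assuming display index 1 and ddcctl's usual -d/-b flags (not my exact script, just the shape of it):

    #!/usr/bin/env python3
    # Minimal sketch of a ddcctl wrapper: `brightness.py 70` sets the first
    # external display to 70%. Display index 1 and the -d/-b flags reflect a
    # typical ddcctl invocation; adjust for your own monitor layout.
    import subprocess
    import sys

    def set_brightness(value, display=1):
        value = max(0, min(100, value))   # clamp to the 0-100 range DDC expects
        subprocess.run(["ddcctl", "-d", str(display), "-b", str(value)], check=True)

    if __name__ == "__main__":
        set_brightness(int(sys.argv[1]))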
I completely understand when users go with something simpler like a CLI or MonitorControl. I do the same myself when I don’t really need the features of a complex app.
By the way, I’m not sure if I understood correctly, but if you need an external ambient light sensor, Lunar supports that out of the box now: https://lunar.fyi/sensor
Currently I'm working with a sensor homebrewed from a reverse-biased LED with a Darlington-pair amplifier - not what anyone would call accurate, but precise enough to calibrate reliably over the range of light values to be found in my office as long as I keep direct sunlight off the transistors.
Between that, a Pi, ddcutil, and the currently unoccupied DisplayPort input on my monitor, I'm hoping to brew up something that'll serve well enough - not terribly optimistic on that given that the monitor seems to maintain per-input brightness settings, but worth a try at least. (Also, if it does work, I can add an encoder with a built-in button as a manual override.)
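Roughly the loop I have in mind on the Pi side, with read_light() as a placeholder for whatever the sensor ends up exposing, and ddcutil's setvcp on VCP code 0x10 for brightness (the mapping curve is a stand-in until I can calibrate):

    # Poll the light sensor, map it to a brightness percentage, and push the
    # result to the monitor over DDC with ddcutil. read_light() is a placeholder
    # for the homebrew sensor; VCP code 0x10 is the standard brightness feature.
    import subprocess
    import time

    def read_light():
        """Placeholder: return ambient light as a value in 0.0-1.0."""
        raise NotImplementedError

    def light_to_brightness(light):
        # Stand-in mapping; the real curve gets calibrated against the office.
        return max(10, min(100, int(light * 100)))

    last = None
    while True:
        target = light_to_brightness(read_light())
        if target != last:   # avoid hammering the monitor's OSD controller
            subprocess.run(["ddcutil", "setvcp", "10", str(target)], check=True)
            last = target
        time.sleep(30)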
On the other hand, it seems likely that, by the time that approach fails me, Lunar on the Mac mini will be able to do DDC via HDMI - I thought it'd take a year at least after the M1 arch came out for anyone even to get as far as you already have, but clearly I failed to reckon with your dedication to the effort!
> reverse-biased LED with a Darlington-pair amplifier
That's clever! I knew LEDs can be used as sensors, but I never had the time to try it.
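If I ever do, the classic no-extra-hardware version of the trick is apparently to reverse-bias the LED to charge its junction capacitance, then time how long the photocurrent takes to discharge it - shorter time means more light. A rough, untested sketch on a Pi (BCM pin numbers are placeholders, and this skips the Darlington stage described above):

    # Bare LED-as-light-sensor sketch for a Raspberry Pi: charge the reverse-
    # biased LED, float the cathode, and time the discharge via photocurrent.
    import time
    import RPi.GPIO as GPIO

    ANODE, CATHODE = 23, 24      # placeholder BCM pin numbers
    GPIO.setmode(GPIO.BCM)
    GPIO.setwarnings(False)      # pins get reconfigured on every reading

    def read_led_light():
        # Reverse-bias the LED: anode low, cathode high charges the junction.
        GPIO.setup(ANODE, GPIO.OUT, initial=GPIO.LOW)
        GPIO.setup(CATHODE, GPIO.OUT, initial=GPIO.HIGH)
        time.sleep(0.001)
        # Float the cathode and time the discharge; brighter light drains the
        # charge faster, so a shorter time means more light.
        GPIO.setup(CATHODE, GPIO.IN, pull_up_down=GPIO.PUD_OFF)
        start = time.monotonic()
        while GPIO.input(CATHODE) and time.monotonic() - start < 1.0:
            pass
        return time.monotonic() - start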
Yeah, I don't know about DDC via the Mac mini HDMI. Weird things are happening with that port.
Some users report it doesn't work at all, some say their monitor crashes when DDC is sent through the port. One user even had the weird issue where sending DDC through the USB-C port to his Thunderbolt monitor caused the message to also be sent to the monitor connected via HDMI.
I'm trying to find a solution, but these seem more like bugs in Apple's implementation of the video driver, and we'll have to wait for those to get fixed in Monterey.
Why didn't they abstract it, or create a stripped-down/safe userland option to replace it? If I'm relying on a Mac for work, I can't have them removing essential features from my computer in a simple upgrade. Maybe macOS needs semantic versioning, or at least some level of communication with the end user about compatibility.
This was never an intended feature. If I understand the article correctly, they were using an undocumented ("private") API which happened to stop working.
Every API is undocumented on macOS; what do you want them to do? How are you supposed to discern between zombie XNU code and Good LTS Apple Compliant code?