First, can this not be reduced to an automation and compute problem? A new version of a library ships with a bug fix or security fix, so you rebuild a bunch of packages. Where's the issue? Statically linked binaries can be produced from dynamic libraries, so "rebuild a bunch of packages" can be further reduced to "re-link a bunch of packages". Bandwidth consumption can be optimized by building locally and validating against a reproducible build transparency log (a sketch of that step follows).
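As a minimal sketch of that validation step, assuming a hypothetical log endpoint (LOG_URL) and response shape; real transparency logs such as Sigstore's Rekor expose different APIs, so treat the names and fields here as placeholders:

    import hashlib
    import json
    import urllib.request

    # Hypothetical lookup endpoint and JSON response shape (assumptions,
    # not any real log's API).
    LOG_URL = "https://transparency.example.org/lookup?name={name}&version={version}"

    def local_digest(path: str) -> str:
        """SHA-256 of the locally rebuilt (or re-linked) artifact."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify_against_log(path: str, name: str, version: str) -> bool:
        """Compare the local build's digest to the digest recorded in the log.

        If the build is reproducible, the digests match and the local binary
        can be trusted without downloading the published one, which is where
        the bandwidth saving comes from.
        """
        url = LOG_URL.format(name=name, version=version)
        with urllib.request.urlopen(url) as resp:
            recorded = json.load(resp)["sha256"]  # assumed response field
        return local_digest(path) == recorded

    if __name__ == "__main__":
        ok = verify_against_log("./out/myapp", "myapp", "1.2.4")
        print("reproducible build verified" if ok else "digest mismatch: do not trust")

The point is only that the check is cheap: hashing a local artifact and comparing one digest, rather than shipping rebuilt binaries around.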
Second, shouldn't the majority of the time spent rebuilding packages be dedicated to testing each application against the new library? Would you just skip that step if you used dynamic libraries?