Digital and film single long exposures are functionally identical: for very dim objects you still need long exposures, because the amount of light reaching the sensor is very low. You can't stack images if the object is too dim to register in the shorter exposures you're stacking.
Astronomical sensors are typically 12-bit, so anything roughly 4000 times brighter than the faintest signal will saturate the sensor and cannot be subtracted out afterwards. And of course you have perfect solutions to Poisson noise, bleeding pixels, and the increased dead time caused by shorter readout intervals as well, right? We are done here, but not for the reasons you think. You did not win this discussion.
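To make the saturation point concrete, here's a minimal sketch (my own toy numbers, not anyone's real pipeline): a 12-bit ADC tops out at 2**12 - 1 = 4095 counts, so a source about 4000 times brighter than a faint target simply clips, and no subtraction can recover the lost value.

```python
import numpy as np

# A 12-bit sensor records integer counts from 0 to 2**12 - 1 = 4095.
FULL_WELL = 2**12 - 1  # illustrative ADU ceiling, not a real sensor spec

# Simulated per-exposure counts: a faint target at 2 ADU next to a
# source 4000x brighter (8000 ADU, well past the 12-bit ceiling).
scene = np.array([2.0, 8000.0])

# The sensor clips anything above its ceiling. The bright pixel
# saturates at 4095, so its true value is unrecoverable afterwards.
recorded = np.clip(scene, 0, FULL_WELL)
print(recorded)  # the faint pixel survives; the bright one is clipped
```

The point of the sketch is only the arithmetic: dynamic range on a 12-bit device is about 4096:1 before you even account for noise floor.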
A simple median filter will take care of it. Have you seen what an unprocessed image from the Hubble Space Telescope looks like? It's riddled with cosmic ray tracks, and they all get filtered out.
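The median-stack trick is easy to demonstrate with synthetic data (a toy simulation I made up for illustration, not HST's actual pipeline): because a cosmic ray almost never hits the same pixel in a majority of registered frames, a per-pixel median across the stack rejects the spikes cleanly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 9 registered exposures of one field: a constant faint
# signal (10 ADU) plus Gaussian read noise, with a handful of
# random cosmic-ray hits injected into each frame.
n_frames, h, w = 9, 64, 64
frames = rng.normal(loc=10.0, scale=2.0, size=(n_frames, h, w))
for f in frames:
    ys = rng.integers(0, h, size=5)
    xs = rng.integers(0, w, size=5)
    f[ys, xs] += 1000.0  # bright cosmic-ray spikes

# Per-pixel median across the stack: a spike would need to land on
# the same pixel in 5 of 9 frames to survive, which essentially
# never happens, so the combined image stays near the true signal.
clean = np.median(frames, axis=0)
print(clean.max())  # nowhere near the 1000-ADU spikes
```

Real pipelines use fancier rejection (sigma clipping, CR-split comparisons), but the median alone already kills isolated hits.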
There are any number of ways to correct observations for satellite trails, not least because their appearance and duration are both transient and entirely predictable in any given field of view, so affected frames can be gated out of the exposure series. Cosmic rays don't even give us the predictability break. (To be fair, the rabbit hole goes deeper than the usual cases where a median filter alone will save you, but the general problem is far from intractable.)
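Gating a trail out of the series can be as simple as masking the contaminated pixels before combining. A rough sketch, assuming you already have a per-frame boolean mask from a predicted pass (the function name and mask layout here are my own invention):

```python
import numpy as np

def stack_without_trails(frames, trail_mask):
    """Median-combine registered exposures while ignoring pixels
    flagged as satellite trails.

    frames     : float array of shape (n_frames, h, w)
    trail_mask : bool array, same shape; True where a frame is
                 contaminated (e.g. along a predicted trail).
    Hypothetical helper for illustration, not a real pipeline API.
    """
    # Masked entries are excluded from the per-pixel median, so a
    # trail in one frame never reaches the combined image.
    masked = np.ma.masked_array(frames, mask=trail_mask)
    return np.ma.median(masked, axis=0).filled(np.nan)
```

Pixels masked in every single frame come out as NaN, which is the honest answer: you have no uncontaminated data there and need another exposure.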
But never mind, we've been told that digital imaging is "functionally identical to film," and we're "losing the discussion."