
This doesn't sound too bad, and functions that are not changing could, in theory, also be unified, although a single change in one of the transitively called functions might force you to keep multiple versions.


It's bad enough to make packages quadratic.


I'm not really familiar with packaging problems, and obviously quadratic is worse than linear, but is having no duplicates a realistic alternative?

(Disclaimer: I'm mostly guessing here, but am I missing something important?)

AFAIU the choice here also considers how easily things will build and link, and whether your package manager/build tool will need a SAT-solver to figure out which version of each library to use, and even then you can still run into unsatisfiable restrictions (`Could not resolve dependencies`) if libraries are not adequately updated/maintained.
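To make the "unsatisfiable restrictions" case concrete, here is a toy sketch (package names, versions, and constraints are all made up): under a one-version-per-library rule, resolution is a constraint-satisfaction search, and a single unmaintained dependency pinning an old version can make the whole thing unsatisfiable.

```python
from itertools import product

# Toy dependency graph: "app" needs log>=2, but its unmaintained dep
# "net" pins log<2. With a one-version-per-library rule there is no
# assignment that satisfies both, and the resolver reports the
# equivalent of `Could not resolve dependencies`.

constraints = {
    "app": {"log": lambda v: v >= 2},
    "net": {"log": lambda v: v < 2},
}
available = {"log": [1, 2]}

def resolve_single_version():
    # Brute-force the SAT-like search: try every single choice of
    # one version per library and check all constraints against it.
    for choice in product(*available.values()):
        assignment = dict(zip(available.keys(), choice))
        if all(check(assignment[lib])
               for reqs in constraints.values()
               for lib, check in reqs.items()):
            return assignment
    return None  # unsatisfiable

print(resolve_single_version())  # None: app and net conflict on log
```

Real resolvers are far smarter than this brute-force loop, but the failure mode is the same: with a single-version policy, two conflicting constraints anywhere in the graph are fatal.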

It seems that by allowing duplication you "only" pay for the libraries that are not updated (including all their deps). That means you are trading computer resources (disk, maybe CPU) for human time spent updating and debugging, which might be a really good deal, especially if you only end up with duplicates in the cases where you lack the human time to keep all deps updated.

One could argue that the cost of maintaining the libraries can only be deferred, so there's no benefit in deferring it. But the time saved by users who, given enough disk space, don't need a library to be updated before they can use it is probably what led Rust and NPM to just duplicate dependencies.
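Under the duplication policy the same toy graph from above resolves trivially: each dependent just gets the newest version that satisfies its own constraint, and conflicting requirements never block resolution (this is a hand-rolled sketch, not how Cargo or npm actually work internally).

```python
# Toy graph again: "app" needs log>=2, "net" pins log<2.
constraints = {
    "app": {"log": lambda v: v >= 2},
    "net": {"log": lambda v: v < 2},
}
available = {"log": [1, 2]}

def resolve_with_duplication():
    # Per-dependent resolution: each (package, library) edge is
    # satisfied independently, so two copies of a library can coexist.
    plan = {}
    for pkg, reqs in constraints.items():
        for lib, check in reqs.items():
            ok = [v for v in available[lib] if check(v)]
            if not ok:
                raise RuntimeError(f"{pkg}: no version of {lib} fits")
            plan[(pkg, lib)] = max(ok)  # pick newest satisfying version
    return plan

print(resolve_with_duplication())
# {('app', 'log'): 2, ('net', 'log'): 1} -- two copies of log on disk
```

The disk cost is visible right in the output: `log` appears twice, once per major version, which is exactly the quadratic-in-the-worst-case blowup the parent comment is worried about.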




