
It does work, in practice, and has for years. Every package in the Debian (or Ubuntu, or FreeBSD, etc.) package repos depends only on other packages in those repos and on the base OS. It works fine.

Packages being months behind the latest version is a feature, not a bug — it means things will only rarely be randomly changing under your feet, with the exception of security fixes.



> Packages being months behind the latest version is a feature, not a bug

I have struggled to convince devs and management of this in the past few years. Everyone's gone crazy about the latest and greatest set of features, with no respect for stability, maintainability, etc.


To be fair, upgrading from very old libraries to very new ones because you hit a bug that has since been fixed can be a long process... but I have never worked on a project that stayed on new libraries continuously, so... not sure how it works the other way.


> Packages being months behind the latest version is a feature, not a bug — it means things will only rarely be randomly changing under your feet, with the exception of security fixes.

If things are "randomly changing" when you update, that means the upstream package isn't following semantic versioning practices and, more importantly, isn't preserving backwards compatibility across releases. That's a mark of bad software development practices, and it also isn't solved by an arbitrary wait period: that just means that months after the breaking change lands, you finally notice and complain, but most maintainers are going to (rightfully) ignore you by that point.

I'd make the same argument about general software quality: with new features often come new bugs, so by delaying, you can delay the introduction of those bugs. However, bugs are easier to fix the sooner they're found, so again, a quick turn-around improves things, even if it sometimes causes short-term pain.

This is one thing where NPM is definitely miles ahead of APT. With APT, you get a single version of each package, so it's all or nothing. With NPM, you can specify `1.1.x`, so even if version 1.2 or 2.0 comes out, you stay on the stable old one. The closest thing that seems to happen with Linux packaging is that on a major version bump (with backwards-incompatible changes), a new package name is created with a "2" on the end to signal the incompatibility -- how is that anything but a hacky workaround for not having proper versioning support?
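As a rough sketch (the package names and version numbers here are just illustrative), an npm version range in package.json looks something like:

    {
      "dependencies": {
        "some-lib": "1.1.x",
        "other-lib": "^2.3.0"
      }
    }

Here "1.1.x" accepts any 1.1 patch release but never 1.2 or 2.0, and "^2.3.0" accepts anything from 2.3.0 up to (but not including) 3.0.0.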


> With APT, you get a single version of each package, so it's all or nothing.

It's definitely not used as often as it is with npm, but you can append =version to the package name to apt install a particular version, or use apt pinning for more complex setups.
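A minimal sketch (the package name and version string are placeholders; what's actually installable depends on your configured repos):

    # install a specific available version of a package
    sudo apt install nginx=1.18.0-6ubuntu14

    # optionally hold it there so routine upgrades don't move it
    sudo apt-mark hold nginx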


The point your parent is making is that with npm, you can install one version of a package per project you're working on, instead of one version for the whole system. Especially if you're working on multiple projects, or multiple versions of the same project, this is required.
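Roughly, per project (the project names and versions here are made up):

    cd ~/work/legacy-app
    npm install lodash@3.10.1    # this project stays on the old major version

    cd ~/work/new-app
    npm install lodash@4.17.21   # this one tracks the current release

Each project gets its own copy under its own node_modules, so the two versions never conflict.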


But it doesn't work for libraries people will be using in development. Often you are waiting for a handful of packages to introduce specific features or bug fixes and need them the moment they are available. NPM isn't user space, it's dev space. Timeliness is the maxim.


I’m not a JS developer so maybe I’m missing something, but how do you use a library during development and not use it in production?

In the C++ world, I can’t imagine a situation where you would need to depend on, for example, libjpg while developing, but not need to read JPEG files in prod/end-user-space.


There are a slew of dev-only dependencies in JS-land. Packages that run local servers for hot-reloading during development. Test runners, linters, TypeScript compilers, SCSS compilers, and so on. None of those things need to be included with the bundled product.

Check out Electron or React starter apps for an example; their boilerplates should have dozens of them.
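For instance, a typical package.json separates these into a devDependencies section (the packages and version ranges below are just illustrative):

    {
      "dependencies": {
        "react": "^17.0.2"
      },
      "devDependencies": {
        "typescript": "^4.5.0",
        "eslint": "^8.0.0",
        "jest": "^27.0.0"
      }
    }

Installing with `npm install --production` (or just shipping the built bundle) leaves everything under devDependencies out.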


Since another user explained that point, I would also add that in the "move fast and break things" environment of the web, for better or for worse, responsive and automated library updates are often desirable, or even required: real money can be on the line if something suddenly goes wrong somewhere in the stack.


Web development is one of my tasks, and thankfully we don't move fast and break things; rather, we rely on proven and established development stacks like JEE, Spring, ASP.NET and VanillaJS.


There are some reasonably stable packages in the npm ecosystem; it's just pretty much the Wild West out there. It really depends on what you're building. It's pretty hard to get a new SPA started these days without a development environment that uses npm, especially if you're running lean.

The problem is keeping your dependency tree reasonable. Even pulling in a couple of packages might lead to hundreds of dependencies. And even if you have the sense not to use an external package for something as simple as left-padding a string, someone somewhere in that dependency tree might not feel the same way. And that's all it takes to bring a chunk of the web down.
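For reference, left-padding is a one-liner in modern JavaScript, which is what makes pulling in a package for it such a bad trade:

    // pad the string "42" on the left with zeros to a width of 5
    "42".padStart(5, "0");  // -> "00042"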

All of these problems, of course, are due to JS not having a mature ecosystem or standard library. People shouldn't need to reinvent the wheel every hour, nor should they be pulling in modules of less than a dozen or two lines of code. That, and the constant misguided financial incentive to deploy ever more complex functionality over HTTP, are what lead to this aggressive push for cutting-edge tools.


In the C++ world, you almost certainly don't need headers in prod.


Yes, true. What’s your point? You still need libjpg to be either statically linked into your executable or present on the target system.


headers are not libraries
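To make the distinction concrete (package names vary by distribution; these are Debian-style examples):

    # needed at build time: headers plus the linker symlink
    sudo apt install libjpeg-dev

    # needed at run time: just the shared library
    sudo apt install libjpeg62-turbo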



