
All python packaging challenges are solved. The lesson learned is that there is not a single solution for all problems. Getting more strings attached to VC-funded companies and leaning on their infrastructure is a high risk for any FOSS community.


Well I started with pip because it's what I was told to use. But it was slow and had footguns. And then I started using virtualenv, but that only solved part of the problem. So I switched to conda, which sometimes worked but wrecked my shell profile and often led to things mysteriously using the wrong version of a package. So someone told me to use pipenv, which was great until it was abandoned and picked up by someone who routinely broke the latest published version. So someone told me to use poetry, but it became unusably slow. So I switched back to pip with the built-in venv, but now I have the same problems I had before, with fewer features. So I switched to uv, because it actually worked. But the dependency I need is built and packaged differently for different operating systems and flavor of GPU, and now my coworkers can't get the project to install on their laptops.

I'm so glad all the Python packaging challenges are "solved"


I started with "sudo apt install python" a long time ago and this installed python2. This was during the decades-long transition from python2 to python3, so half the programs didn't work so I installed python3 via "sudo apt install python3". Of course now I had to switch between python2 and python3 depending on the program I wanted to run, that's why Debian/Ubuntu had "sudo update-alternatives --config python" for managing the symlink for "python" to either python2 or python3. But shortly after that, python3-based applications also didn't want to start with python3, because apt installed python3.4, but Python developers want to use the latest new features offered by python3.5 . Luckily, Debian/Ubuntu provided python3.5 in their backports/updates repositories. So for a couple of weeks things sort of worked, but then python3.7 was released, which definitely was too fresh for being offered in the OS distribution repositories, but thanks to the deadsnakes PPA, I could obtain a fourth-party build by fiddling with some PPA commands or adding some entries of debatable provenance to /etc/apt/lists.conf. So now I could get python3.7 via "sudo apt install python3.7". All went well again. Until some time later when I updated Home Assistant to its latest monthly release, which broke my installation, because the Home Assistant devs love the latest python3.8 features. And because python3.8 wasn't provided anymore in the deadsnakes PPA for my Ubuntu version, I had to look for a new alternative. Building python from source never worked, but thank heavens there is this new thing called pyenv (cf. pyenv), and with some luck as well as spending a weekend for understanding the differences between pyenv, pyvenv, venv, virtualenv (a.k.a. python-virtualenv), and pyenv-virtualenv, Home Assistant started up again.

This wall of text is an abridged account of my installing-python-on-Linux experience.

There is also my installing-python-on-Windows experience, which includes: official installer (exe or msi?) from python.org; some Windows-provided system application python, installable by setting a checkbox in Windows's system properties; NuGet, winget, Microsoft Store Python; WSL, WSL2; anaconda, conda, miniconda; WinPython...


I understand this is meant as caricature, but for local development, tools like mise or asdf are really something I've never looked back from. For containers it's either a versioned Docker image or compiling it yourself.


The problem for me, a non-Python developer, is that I just don't know what to do, ever, to run an existing script or program.

It seems every project out there uses a different package manager, a different version of python, a different config file to set all of that up.

Most of the time, I just have a random .py file somewhere. Sometimes it's a full project that I can look at and find out what package manager it's using. Sometimes it has instructions, most of the time not. _That's_ the situation I struggle with.

Do I just run ./script.py? python script.py? python3 script.py? python3.12 script.py? When inevitably I miss some dependencies, do I just pip install? python pip install? pipx install?

As a developer I'm sure that you just set it up and forget about it. And once things work, they probably keep working for you. But man, it really reflects negatively upon Python itself for me. I don't hate the language, but I sure hate the experience.


I believe what is missing is a way of distributing apps. You face similar issues if you get the C++ source of a random program - there are quite a few build systems in use! However, the compiled program can often just be zipped and shipped, somehow.


The C/C++ ecosystem is a bit more sane, but requires more expertise to fix. As long as you figure out the build process, usually you can rely on the distro packages. For Node and Rust, people really like to use the latest version and not the LTS one for their software.


I'm not in the business of selling Python programs, but pyinstaller --onefile exists. It's imperfect, but I'm surprised it hasn't seen more uptake.
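For anyone curious, the basic invocation is short (a rough sketch; script.py and the output name are placeholders):

    pip install pyinstaller
    pyinstaller --onefile script.py
    # the self-contained executable lands in dist/script (dist\script.exe on Windows)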


Uv solves this (with some new standards). ./script.py will now install the python version, create a venv, and install dependencies (very quickly) if they don’t exist already.

    #!/usr/bin/env -S uv run --script
    # /// script
    # requires-python = ">=3.12"
    # dependencies = [
    #     "ffmpeg-normalize",
    # ]
    # ///
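Make the file executable and run it directly; uv fetches a matching interpreter and the declared dependencies on first run (a minimal usage sketch, assuming the snippet above is saved as script.py):

    chmod +x script.py
    ./script.py    # first run creates the cached environment, later runs reuse it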


I think mise can support uv too as its backend: https://mise.jdx.dev/mise-cookbook/python.html#mise-uv

Also, I understand mise, but I personally just prefer using uv for Python and bun for TypeScript, both of which can run any version of Python / Node(-compatible?) runtime.

I still like the project though, but I tried to install elixir using it and it was a mess man.


Your comment shows the sad state of software quality these days. Rust is the same: move fast and break things. And lately Mesa has started to suffer from the same disease. These days you basically need the same build environment as the one on the developer's machine or the build will fail.


I was trying to install Stable Diffusion just yesterday. They use Conda, so I installed it and tried to follow the instructions. First, the yaml file they provided was not valid. Following the commands to install packages explicitly failed because my Rust toolchain was old, so I updated it, only for some other Rust dependency to fail to build; it didn't even compile. Such a shit show.


Bad software quality is when you update your software frequently.

Instead, we should always choose the common denominator of the most obsolete software platform imaginable. If there is an OS that has not been maintained for several decades, then that is the baseline we should strive to support.

Using an operating system with old libraries and language runtimes is not a personal preference with the consequences of restricting oneself to older software versions, no, it is a commandment and must not be questioned.


Please no, I have to deal with old (but still supported) RHEL versions, this is definitely not the way to go.

You have to use ancient C++ standard versions, deal with bugs in libraries that have been fixed years ago, lose out on all kinds of useful improvements or you end up with retrofitting a modern toolchain on an old system (but you still have to deal with an old glibc).

It’s hell. Just make the tooling/QA good enough so that everyone can run on the latest stable OS not too long after it’s released.


I think I have a similar experience in some ways, but building python from source should work on Linux in my experience. On a Debian-ish system I'd expect that apt installing build-essential and the libraries you need would be enough. I've done it with some pain on Red Hat-ish distros, which have tended to ship with python versions older than I'm used to. (I guess it's better these days..?)


I started at about the same time you did, and I've never seen an instance of software expecting a Python version newer than what is in Debian stable. It happens all the time for Nodejs, Go, or Rust though.


I felt like python packaging was more or less fine, right up until pip started to warn me that I couldn't globally install packages anymore. So I need to make a billion venvs to install the same ML and plotting libraries and dependencies that I don't want in a requirements.txt for the project.

I just want packaging to fuck off and leave me alone. Changes here are always bad, because they're changes.


I'd otherwise agree but this problem seems unique to Python. I don't have problems like this with npm or composer or rubygems. Or at least very infrequently. It's almost every time I need to update dependencies or install on a new machine that the Python ecosystem decides I'm not worthy.


I think pip made some poor design choices very early, but pip stuck around for a long time and people kept using it. Of course things got out of control, and then people kept inventing new package managers until uv came along. I don't know enough about Python to understand how people could live with that for so long.


Every big Python repo has a Dockerfile, which is much less common in JS.


Honestly, until uv I thought this was the only sane way to package a Python app. It's still the only sane way, and now I use uv in the Dockerfile, which is honestly more complicated than their docs (or reason) would suggest.
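For reference, the shape I ended up with is roughly this; treat it as a sketch rather than the official recipe (the image tag matches the one mentioned elsewhere in this thread, and "myapp" is a placeholder module name):

    FROM ghcr.io/astral-sh/uv:python3.12-bookworm
    WORKDIR /app
    # install dependencies first so this layer is cached between code changes
    COPY pyproject.toml uv.lock ./
    RUN uv sync --frozen --no-install-project --no-dev
    # then copy the source and install the project itself
    COPY . .
    RUN uv sync --frozen --no-dev
    CMD ["uv", "run", "python", "-m", "myapp"]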


like what?


> pip started to warn me that I couldn't globally install packages anymore

Yeah, I had that on my work computer. I just created a venv and source it in my .bashrc.
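For anyone who wants to copy that, it's just two lines (the venv path is arbitrary):

    python3 -m venv ~/.default-venv
    echo 'source ~/.default-venv/bin/activate' >> ~/.bashrc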


Hahaha that is an awesome middle finger to pip :-)


Is it? I would have thought this is still in line with their goals. Your individual projects' venvs will still be fully separate.


No, they're saying that they have one venv they use for everything (i.e., it's basically a working "pip install --user").

I think it's a good thing that pip finally decided to refuse overwriting distro package paths (the alternative is far worse) but making "pip install --user" not work as well doesn't make sense.


You can turn that off and allow global packages again if you want.

Or install it with the OS package manager or something similar.
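Concretely, something like this (the package names are just examples; the pip flag opts out of the PEP 668 "externally managed environment" guard):

    # one-off override of the externally-managed-environment check
    pip install --user --break-system-packages requests
    # or the distro package instead
    sudo apt install python3-requests
    # or pipx for command-line applications
    pipx install httpie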


You assume the OS package manager I happen to be using even has packages for some of the libraries I want to use.


Or for that matter, that the ones they do have are compatible with packages that come from other places. I've seen language libraries get restructured when OS packagers got hold of them. That wasn't pretty.


Nothing about their post indicated they assumed that.

They offered two options, so you can go do the other one if it doesn't work for you.


I've walked the same rocky path and have the bleeding feet to show for it! My problem is that now my packaging/environment mental model is so muddled I frequently mix up the commands...


What's wrong with just using virtualenv? I never used anything else, and I never felt the need to. Maybe it's not as shiny as the other tools, but it just works.


The problem is you can do whatever you want in it, and then have no way of reproducing that.

pyproject.toml tries to fix it, poetry kept wanting to use their own wonky names for the tags, I'm not sure why.

Once that is standardized, venvs should be cattle and not pets. That is all that is needed. uv makes that fast by hardlinking in the libraries and telling you the obvious (that venvs should be cattle and not pets).

This fight was poetry's to lose.


I think poetry “lost” because they had to build ahead of the PEPs that standardized their way of doing things. I don’t think uv could exist without the work the poetry people put in, it served as a pretty powerful demonstration of the fact that better python packaging was possible.


There’s nothing wrong with just using virtualenv. I too have used virtualenv plus pip (and sometimes pyenv) for the longest time without issue.

However, uv is the first alternative that has tempted me to switch. uv offers performance improvements over pip and handles pyenv use cases as well. I’ll be paying attention to pyx to see how it pans out.
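For what it's worth, the uv equivalents of the pyenv + virtualenv + pip combo look roughly like this (the package is just an example):

    uv python install 3.12     # what I used pyenv for
    uv venv --python 3.12      # what I used virtualenv for
    uv pip install requests    # what I used pip for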


Nothing is inherently wrong with virtualenv. All these tools make virtual environments and offer some way to manage them. But virtualenv doesn't solve the problem of dependency management.


> But the dependency I need is built and packaged differently for different operating systems and flavor of GPU, and now my coworkers can't get the project to install on their laptops.

This is why containers are great IMO.

It's meant to solve the problem of "well it works on my machine"


Yeah but it's also nice not having to rely on a container for stuff to work at all, especially when you're doing development.


While I can dev in a container, I don't enjoy the experience nearly as much due to the current state of tooling re: remote debuggers.

I do ship containers as a top level .exe equivalent.


Even the way you import packages is kinda wack


Coming from the PHP ecosystem, this kind of package manager problem feels crazy.

Maybe Python and js people should just use composer too.


It is hilarious how Composer went from "haha, PHP has a package manager now, bless their hearts" to "why can't it work like Composer?"

I don't think PHP is perfect by any means but I don't dread using Composer, even in contexts like WP plugins where it is definitely not a "native" technology.


The JS packaging ecosystem is fairly mature compared to Python's. The install and dependency issues are far fewer.


You forgot the wheels and eggs


You can have my `easy_install` when you pry it from my cold dead fingers.


We actually built a platform that eliminates all these steps; you can now reproduce GitHub repos with zero manual config in 60% of cases. Check https://x.com/KeploreAI for more info. We just launched it and are waiting for our first users to be astonished :), let me know if you have any questions


Man I used python sparingly over the years and I still had to deal with all those package manager changes. Worse than the JS bundling almost?


No, the JS bundling changes are almost monthly and it's impossible to know which one I should use and what will be broken because of my choice.


Other than a brief stint with yarn (v1) I've never had any of these troubles with node+npm


> All python packaging challenges are solved.

This comes across as uninformed at best and ignorant at worst. Python still doesn't have a reliable way to handle native dependencies across different platforms. pip and setuptools cannot be the be-all and end-all of this packaging ecosystem, nor should they be.


„across different platforms“

First things first:

Import path, os

I love Python, the Zen of it, and you really need to accept the fact that there are conventions, quite a lot of them, and that bash or shell scripts are where the magic happens, like environment variables, if you know how to secure your app.

Even the self thing finally makes sense after years of bewilderment (“Wait: not even Java is that brutal to its users.”)

Lately stumbled over poetry after really getting the gist out of venv and pip.

Still hesitant, because Windows doesn’t play a role.


Try doing CUDA stuff. It's a chemical fire. And the money that solving it would bring in could fund arbitrary largesse towards OSS in perpetuity.


I see VC money as an artificial force propping up a project. It is not bad per se, but VC money is not a constant and it leaves a big drop at the end. If there is a big enough community that has grown around the project, that drop might be okay.


I share your concern, but I have saved so much time with uv already that I figure I'll ride it till the VC enshittification kills the host.

Hopefully by that point the community is centralized enough to move in one direction.


I've been heartened by the progress that opentofu has made, so I think if it gets enough momentum it could survive the inevitable money grab


I agree, now I just use uv and forget about it. It does use up a fair bit of disk, but disk is cheap and the bootstrapping time reduction makes working with python a pleasure again


I recently did the same at work: just converted all our pip stuff to use uv pip, but otherwise no changes to the venv/requirements.txt workflow, and everything just got much faster - it's a no-brainer.
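The conversion really was mechanical; the same commands, just prefixed with uv (a sketch of what our install step became, same requirements.txt as before):

    uv venv
    uv pip install -r requirements.txt
    # or, to make the venv match the file exactly:
    uv pip sync requirements.txt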

But the increased resource usage is real. Now around 10% of our builds get OOM killed because the build container isn't provisioned big enough to handle uv's excessive memory usage. I've considered reducing the number of available threads to try throttle the non-deterministic allocation behavior, but that would presumably make it slower too, so instead we just click the re-run job button. Even with that manual intervention 10% of the time, it is so much faster than pip it's worth it.


Please open an issue with some details about the memory usage. We're happy to investigate and feedback on how it's working in production is always helpful.

(I work on uv)


Last time I looked into this I found this unresolved issue, which is pretty much the same thing: https://github.com/astral-sh/uv/issues/7004

We run on-prem k8s and do the pip install stage in a 2CPU/4GB Gitlab runner, which feels like it should be sufficient for the uv:python3.12-bookworm image. We have about 100 deps that aside from numpy/pandas/pyarrow are pretty lightweight. No GPU stuff. I tried 2CPU/8GB runners but it still OOMed occasionally so didn't seem worth using up those resources for the normal case. I don't know enough about the uv internals to understand why it's so expensive, but it feels counter-intuitive because the whole venv is "only" around 500MB.


Thanks that's helpful.

Did you try reducing the concurrency limit?


Couldn't agree more, and the `uv run executable.sh` pattern, where the file contains a shebang, imports, and then Python, is just magical.


Is that much different than the python inline script format?

https://peps.python.org/pep-0723/


I've been dealing with python vs debian for the last three hours and am deeply angry with the ecosystem. Solved it is not.

Debian decided you should use venv for everything. But when packages are installed in a venv, random cmake nonsense does not find them. There are apt-get level packages; some things find those, others do not. Names are not consistent. There's a thing called pipx, which my console recommended, for much the same experience. Also the vestiges of 2 vs 3 are still kicking around in the form of refusing to find a package based on the number being present or absent.

Whatever c++headerparser might be, I'm left very sure that hacking python out of the build tree and leaving it on the trash heap of history is the proper thing to do.


From what I hear, uv is the "solved" part and venv by hand is the old way.


These tools together solve a fraction of the problem. The other parts of the problem are interfacing with classic C and C++ libraries and handling different hardware and different OSes. It is not even funny how tricky it is to use the same GPU/CUDA versions but with different CPU architectures, and hopefully most people don't need to be exposed to it. Sometimes parts of the stack depend on a different version of a C++ library than other parts of the stack. Or some require different kernel modules or CUDA driver settings. But I would be happy if there was a standardized way to at least link to the same C++ libraries, hopefully with the same ABI, across different clusters or different OS versions. Python is so far from solved…


We do something similar with a mix of C++, Fortran, and CUDA dependencies. Using https://cibuildwheel.pypa.io/en/stable/ and the repair functionality to restrict the manylinux to something sensible seems to have handled the complexity - while we mostly use Debian and Ubuntu the Arch folks seemed OK too.
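In case it helps anyone, these are roughly the knobs involved; the build selector and image below are illustrative rather than our exact settings:

    # build CPython 3.12 manylinux wheels against a newer-but-sensible baseline
    CIBW_BUILD="cp312-*" \
    CIBW_MANYLINUX_X86_64_IMAGE=manylinux_2_28 \
      pipx run cibuildwheel --platform linux --output-dir wheelhouse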

We want to share the wheels more smoothly though and have been looking at pyx for that. Seems promising.


Why don't you use pixi?


Pixi inherited some of the bad designs from conda, and conda clearly hadn't been close to solving python packaging either and was digging itself into a black hole over time.


uv is venv + insanely fast pip. I’ve used it every day for 5+ months and I still stare in amazement every time I use it. It’s probably the most joy I’ve ever gotten out of technology.


Installing packages is the most joy you've ever gotten outta tech?

Not a project you built, or something you're proud of? Installing packages?


I know, it's like everyone's lost their mind, right?


Nope. You just haven't wrestled with python packages for long enough to appreciate the change.


(zzzeek wrote sqlalchemy, alembic, mako and probably more)


Oh, that explains a lot!

He creates the software and leaves it for others to install. No wonder he does not appreciate what uv does. ;)


Yup I'm a newb


pip is the default still


uv truly is great, and I mean it's open source and we can always fork it, just as Valkey forked Redis.

And also, if you mean that pyx might be hosted on uv: I think the discussion can go towards pyx being made open source, but honestly I'm pretty sure someone will look at pyx and create a pyx-API-compliant hosted server. I'm still curious how pyx works and what it actually does.


No. This is the only thing that python still doesn’t have just working. Otherwise there would be no excitement for anything new in this space.


If Python packaging problems are solved, why is Python known for having the worst tooling ecosystem of any "modern" language?


+1


sorry, I guess you're new here? Here, try this Kool Aid. I think it will help you fit in. oh don't mind that "MongoDB" logo on the glass that's old




