
>So just manual discipline? It works (most of the time), but in my experience there's a "discipline budget"; every little niggle you have to worry about manually saps your ability to think about the actual business problem.

>...but getting forced to update while you're in the middle of working on a feature...

I feel like trying to work on more than one project in the same session would require more such discipline.

>So how do you ensure that? pip dependency resolution is nondeterministic, dependency versions aren't locked by default and even if you lock the versions of your immediate dependencies, the versions of your transitive dependencies are still unlocked.

Ah, so this is really about lock files. I primarily develop libraries; if something breaks this way, I want to find out about it as soon as possible, so that I can advertise correct dependency ranges to my downstream.

The requirements.txt approach does, of course, allow you to list transitive dependencies explicitly, and pin everything. It's not a proper lock file (in the sense that it says nothing about supply chains, hashes etc.) but it does mean you get predictable versions of everything from PyPI (assuming your platform doesn't somehow change).
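A minimal sketch of that pin-everything approach, assuming nothing beyond a working pip: `pip freeze` emits every installed distribution, transitive ones included, as exact `==` pins.

```shell
# Sketch: capture exact versions of everything installed, transitive
# dependencies included. `--all` also includes pip/setuptools themselves.
python3 -m pip freeze --all > requirements-pinned.txt
# Each line is a hard pin (e.g. "pip==24.0"); reinstalling from this file
# gives you predictable versions of everything from PyPI.
grep '==' requirements-pinned.txt | head -n 3
```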

If I needed proper lock files, then I would take an approach that involves them, yes. Fortunately, it looks like I'd be able to take advantage of the PEP 751 standard if and when I need that.

>Putting pip inside Python was dumb and is another pitfall uv avoids/fixes.

Agreed completely! (Of course, I was only using a venv so that I could have a separate, parallel version of Pip for testing.) Rather, the Pip bootstrapping system (which you can now skip entirely, thanks to the `--python` hack) is dumb, along with all the other nonsense it's enabled: other programs trying to use Pip programmatically without a proper API, and without declaring it as a dependency; the Pip team going so long without even as functional a solution as `--python`; and lots of people coming to think that Python venv creation has to be much slower than it really is.
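For illustration, a sketch of what the `--python` hack looks like in practice (this assumes pip >= 22.3, the release that added the flag; `demo-venv` and the package name are placeholders):

```shell
# Create a venv *without* bootstrapping its own copy of pip into it
# (skipping ensurepip is also what makes venv creation fast):
python3 -m venv --without-pip demo-venv
demo-venv/bin/python -c 'import sys; print(sys.prefix)'
# A single outer pip can then manage that venv (needs network, so shown
# as a comment rather than run here):
#   pip --python demo-venv/bin/python install somepackage
```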

I'll be fixing this with Paper, too, of course.



> I feel like trying to work on more than one project in the same session would require more such discipline.

We all know that multitasking reduces productivity. But business often demands it (hopefully while being conscious of what it's costing).

You also don't have to be working in the "same session" to trip yourself up this way - "this terminal tab still has the venv from what I was working on yesterday/last week" is a way I've had it happen.
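For what it's worth, one cheap mitigation is making it a reflex to check which interpreter the shell will actually use before installing anything (nothing here is project-specific):

```shell
# Which python will this terminal tab actually use right now?
command -v python3
python3 -c 'import sys; print(sys.prefix)'   # points inside a venv if one is active
echo "${VIRTUAL_ENV:-no venv active}"
```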

> I primarily develop libraries; if something breaks this way, I want to find out about it as soon as possible, so that I can advertise correct dependency ranges to my downstream.

If you want to find out as soon as possible, better to have a systematic way of finding out (e.g. a daily "edge build") than pick up new dependencies essentially at random.

> The requirements.txt approach does, of course, allow you to list transitive dependencies explicitly, and pin everything.

It allows you to, but it doesn't make it easy or natural. Especially if you're making a library, you probably don't want to list all your transitive dependencies or pin exact versions in the requirements.txt you publish. So you end up with something like two different requirements.txt files: a frozen one for development, and an unfrozen one for release or for when you need to add or change dependencies, with the frozen one regenerated every so often. None of which is impossible, but it's all tedious and error-prone and there's no real standardisation (so e.g. even if you come up with a good workflow for your project, will your IDE understand it?).
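The two-file dance described above, sketched with hypothetical file names and a hypothetical package:

```shell
# requirements.txt: loose ranges; the file you edit and publish.
printf 'somelib>=1.0,<2\n' > requirements.txt
# After installing into a clean venv from requirements.txt, regenerate the
# frozen development file, pinning exact versions of everything installed:
python3 -m pip freeze --all > requirements-dev.txt
# Development installs then use the frozen file (needs network, shown as comment):
#   pip install -r requirements-dev.txt
grep -c '==' requirements-dev.txt
```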

> Fortunately, it looks like I'd be able to take advantage of the PEP 751 standard if and when I need that.

That's a standard written in response to the rise of uv, that still hasn't been agreed to, much less implemented, much less turned on by default (and unfortunately, most of the time when you realise you need a lock file, you need the one your tool would have generated on its first run, not the one it would generate now - so an optional lock file is of limited effectiveness). I don't think it justifies a "python packaging has never been a problem" stance - quite the opposite: it's an acknowledgement that pre-uv python packaging really was as broken as many of us were saying.


>None of which is impossible, but it's all tedious and error-prone and there's no real standardisation (so e.g. even if you come up with a good workflow for your project, will your IDE understand it?).

I mean, my "IDE" is Vim, and I'm not even a Vim power-user or anything.

People gravitate towards tools according to their needs and preferences. My own needs are simple, and my aesthetic sense is such that I strongly prefer to use many small tools instead of an opinionated, over-arching workflow tool. Getting into the details probably isn't productive any further from here.

>That's a standard written in response to the rise of uv

I know it looks this way given the timing, but I really don't think that's accurate. Python packaging discussion moves slowly and people have been talking about lock files for a long time. PEP 751 has seen multiple iterations, and it's not the first attempt, either. When uv first appeared, a lot of important people were taken completely by surprise; they hadn't heard of the project at all. My impression is that the Astral team liked it just fine that way, too. But it's not as if someone like Brett Cannon had an epiphany from seeing uv's approach. Poetry has been doing its own lock files for years.

>so an optional lock file is of limited effectiveness

The problem is that you aren't going to just get everyone to do everything "professionally". Python is where it is because of the low barrier to entry. A quite large fraction of Python programmers likely still don't even know what pyproject.toml is.

>I don't think it justifies a "python packaging has never been a problem" stance

That's certainly not my stance and I don't think it's the other guy's stance. I just shy away from heavyweight solutions on principle. Simple is better than complex, and all that. And I end up noticing problems that others don't, this way.


> People gravitate towards tools according to their needs and preferences.

Up to a point, but people are also nudged, not always consciously, by the reality of what tools exist in their ecosystem. The fact that Python makes "heavy" tools difficult to write and use is a significant factor in what many Python developers think is just a personal preference, IME. (I'd also argue that if you want to use lots of small tools you actually have more need for a standard format for your dependencies and your lockfile, since all the tools need to understand it).

> The problem is that you aren't going to just get everyone to do everything "professionally". Python is where it is because of the low barrier to entry. A quite large fraction of Python programmers likely still don't even know what pyproject.toml is.

Yes and no. I agree that many Python programmers aren't going to change the defaults and may not even know where their tool config file is. Any approach that requires extra effort from the user is not going to succeed. That's exactly why I think lockfiles need to be on by default, which is not something that has to make things harder for users (e.g. npm is a similarly beginner-first ecosystem but they have lockfiles and I've never seen it cited as something that makes it harder to get started or anything like that).



