AlphaSite's comments (Hacker News)

I appreciate everything they’ve done but the group which maintains Pip and the package index is categorically incapable of shipping anything at a good velocity.

It’s entirely volunteer based so I don’t blame them, but the reality is that it’s holding back the ecosystem.

I suspect it’s also a misalignment of interests. No one there really invests in improving UX.


> the group which maintains Pip and the package index is categorically incapable of shipping anything at a good velocity.

> It’s entirely volunteer based so I don’t blame them

It's not just that they're volunteers; it's the legacy codebase they're stuck with, and the use cases that people will expect them to continue supporting.

> I suspect it’s also a misalignment of interests. No one there really invests in improving UX.

"Invest" is the operative word here. When I read discussions in the community around tools like pip, a common theme is that the developers don't consider themselves competent to redesign the UX, and there is no money from anywhere to hire someone who would be. The PSF operates on an annual budget on the order of $4 million, and a big chunk of that is taken up by PyCon, supporting programs like PyLadies, generic marketing efforts, etc. Meanwhile, total bandwidth use at PyPI has crossed into the exabyte range (it was ~600 petabytes in 2023 and growing rapidly). They would be completely screwed without Fastly's incredible in-kind donation.


Indeed, they broke a few features in the last few years and made the excuse "we can't support them, we're volunteers." Well, how about stop breaking things that worked for a decade? That would take less effort.

They had time to force "--break-system-packages" on us though, something no one asked for.


> how about stop breaking things that worked for a decade?

They aren't doing this.

> They had time to force "--break-system-packages" on us though, something no one asked for.

The maintainers of several Linux distros asked for it very explicitly, and cooperated to design the feature. The rationale is extensively documented in the proposal (https://peps.python.org/pep-0668/). This is especially important for distros where the system package manager is itself implemented in Python, since corrupting the system Python environment could produce a state that is effectively unrecoverable (at least without detailed Python-specific know-how).
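For reference, the mechanism PEP 668 defines is a marker file named `EXTERNALLY-MANAGED` that the distro drops into the interpreter's stdlib directory; pip refuses to install into that environment unless you pass `--break-system-packages`. A rough sketch of what a distro might ship (the wording is illustrative, not any particular distro's actual text):

```ini
[externally-managed]
Error=This Python installation is managed by the system package
 manager. To install Python packages, use your distro's packages
 (e.g. apt install python3-xyz) or create a virtual environment
 with python3 -m venv.
```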


Oh really?

- https://github.com/pypa/packaging/issues/774

- https://github.com/pypa/setuptools/issues/3548

- https://github.com/pypa/pip/issues/7953

I relied on those for a decade, maybe two.

> something no one asked for

I was being facetious; sure, someone asked for it, but it was a pretty poor solution. This has never "corrupted" anything for me, it's rare (it hasn't happened to me in the last 15 years), and it's simple to fix when you know what you're doing.

Not everyone can simply fix it, so a better solution would be to isolate the system Python, allow more than one installation, etc. Distros already do this to some extent.


The employee isn’t allowed to pay for it. It has to be paid for by the employer (except premium processing and visa-stamping fees, around $2-3k).

That’s for people who already have H1Bs, this is the company trying to keep them long term by getting them a green card. The whole EB green card system is a bit of a mess.

This matters because an H-1B has a hard six-year time limit.


I think Apple calls them NPUs and Broadcom calls them XPUs. Given they’re basically the number 2 and 3 accelerator manufacturers, one of those probably works.

I’d say where it’s more important is when you need to manage database performance. This lets you design an API that’s pleasant for users and well normalised internally, while also performing well.

Usually optimising for normalisation and performance leads to a poor API that’s hard for users to use and hard to evolve, since you’re so tightly coupled to your external representation.


I mean, Docker is orthogonal to a package manager. It makes deployment easier, but none of the other things package managers do are relevant to it.


I mean, in the Bay Area, where SFHs have massively ballooned in price, condos have remained steady for a long time. It works. Maybe not everyone gets an SFH, but that’s fine and not the point. Everyone can own a home.


Can you just shell out to Homebrew for unsupported cases? I don’t imagine the overhead of Ruby will be that high compared to compiling the code.


If it doesn’t have those features then why would I even use it at all? Remote build and caching are the entire reason I’d even think about it.


Usually the venv and import lines are enough.


How do you determine where the venv is? AFAIK, uv run in script mode creates the venv in some random temporary directory.


I don’t know of a convenient way of doing it, but a clumsy way of doing it is to run this in your script:

    import os

    print(os.environ['VIRTUAL_ENV'] + '/bin/python')
Then, e.g. in VS Code, you bring up the command palette, run Python: Select Interpreter, and enter the result.


uv v0.6.10 has just been released with a more convenient way of doing this:

    uv python find --script foo.py
https://github.com/astral-sh/uv/releases/tag/0.6.10

https://docs.astral.sh/uv/reference/cli/#uv-python-find--scr...

