I felt like Python packaging was more or less fine, right up until pip started warning me that I couldn't globally install packages anymore. So now I need to make a billion venvs to install the same ML and plotting libraries and dependencies that I don't want in a requirements.txt for the project.
I just want packaging to fuck off and leave me alone. Changes here are always bad, because they're changes.
I'd otherwise agree but this problem seems unique to Python. I don't have problems like this with npm or composer or rubygems. Or at least very infrequently. It's almost every time I need to update dependencies or install on a new machine that the Python ecosystem decides I'm not worthy.
I think pip made some poor design choices very early, but it stuck around for a long time and people kept using it. Of course things got out of control, and then people kept inventing new package managers until uv came along. I don't know enough about Python to understand how people could live with that for so long.
Honestly, until uv I thought this was the only sane way to package a Python app. Now it's still the only sane way, and I'll use uv in the Dockerfile, which is honestly more complicated than their docs or reason would suggest.
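Roughly, a minimal sketch of the kind of Dockerfile I mean, following uv's documented copy-the-binary-from-their-image pattern; the image tags, project layout, and the `myapp` module name are placeholders, not anything prescribed:

```dockerfile
FROM python:3.12-slim

# Copy the uv binary in from the official uv image (pin whatever version you actually use).
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

WORKDIR /app

# Install dependencies from the lockfile first so this layer caches across code changes.
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-install-project --no-dev

# Then copy the application itself and install it into the same environment.
COPY . .
RUN uv sync --frozen --no-dev

# uv puts the environment in /app/.venv; run straight out of it.
ENV PATH="/app/.venv/bin:$PATH"
CMD ["python", "-m", "myapp"]
```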
No, they're saying that they have one venv they use for everything (i.e., it's basically a working "pip install --user").
I think it's a good thing that pip finally decided to refuse to overwrite distro package paths (the alternative is far worse), but making "pip install --user" stop working as well doesn't make sense.
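One way to approximate the old "pip install --user" workflow today is a single long-lived venv whose bin/ sits on your PATH. Just a sketch; the location and package list are arbitrary:

```sh
# One venv for everyday tooling, not tied to any project.
python3 -m venv ~/.venvs/default

# Install the usual libraries into it once.
~/.venvs/default/bin/pip install numpy pandas matplotlib

# Put it on PATH (e.g. in ~/.bashrc) so plain `python` and scripts pick it up.
export PATH="$HOME/.venvs/default/bin:$PATH"
```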
Or, for that matter, that the ones they do have are compatible with packages that come from other places. I've seen language libraries get restructured when OS packagers got hold of them. That wasn't pretty.