Hacker News

Python 2.0 was released in October 2000. Since then, the Python ecosystem has witnessed several significant shifts in expectations about how software is built and delivered: from Slackware-style source builds to vendor packages to containers to uv simply downloading a standalone binary archive. Add to that the deadsnakes PPA and venvs, plus the ongoing awkwardness about whether pip should be writing stuff into /usr/local or ~/.local or somewhere else.

All of this happened alongside the rise of GitHub and free CI builders, it becoming trivial to depend on lots of other packages of unknown provenance, and stdlib modules being completely sidelined by third-party packages like requests.

It’s really only in the last ten years or so that there’s been clarity about what a build backend vs. frontend is, what a lock file is, and how workspace management fits into the whole picture. Distutils and setuptools are in there too.

Basically, Python’s packaging has been a mess for a long time, but uv getting almost everything right all of a sudden isn’t an accident; it’s an abrupt gelling of ideas that have been in progress for two decades.



> the deadsnakes ppa

Please don't use this. Any secondary installation of Python on Ubuntu needs careful placement so it doesn't interfere with the system interpreter. Meanwhile, it's easy to build from source on Ubuntu, and you can easily control the destination (by setting a prefix when you ./configure, and using make altinstall) to keep it out of Apt's way.
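A rough sketch of that build, assuming a recent Ubuntu; the version number and prefix here are placeholders to adjust to taste:

```shell
# Placeholder version and install location -- adjust as needed.
PY_VERSION=3.12.8
PREFIX="$HOME/opt/python-$PY_VERSION"

# Pull in build dependencies first, or optional stdlib modules
# (ssl, lzma, sqlite3, readline, ...) get silently skipped:
sudo apt build-dep python3

curl -O "https://www.python.org/ftp/python/$PY_VERSION/Python-$PY_VERSION.tgz"
tar xzf "Python-$PY_VERSION.tgz"
cd "Python-$PY_VERSION"

# --prefix keeps the install entirely inside $PREFIX, away from Apt's files.
./configure --prefix="$PREFIX"
make -j"$(nproc)"

# altinstall (unlike install) does not create the bare `python3` name,
# so Apt's interpreter stays untouched:
make altinstall
```

The key pair is the prefix plus altinstall: together they guarantee nothing lands in /usr or shadows the system `python3`.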

> and venvs, plus the ongoing awkwardness about whether pip should be writing stuff into usr/local or ~/.local or something else.

There is not really any ambiguity here anymore. You just use venvs now, which should have been the rule ever since 3.3 introduced them. If you need a package in the system environment, use an Apt package for that. If there isn't an Apt package for what you want, it shouldn't live in the system environment, and it also shouldn't live in your "user" site-packages — because that can still cause problems for system tools written in Python, including Apt itself.

You only need to think about venvs as the destination, and venvs are easy to understand (and are also fundamental to how uv works). Start with https://chriswarrick.com/blog/2018/09/04/python-virtual-envi... .
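The whole workflow is small enough to show in a few lines; "demo" is a placeholder name here:

```shell
# A venv is just a directory with its own interpreter shim and site-packages.
python3 -m venv ~/.venvs/demo

# Activation is optional -- invoking the venv's executables directly works,
# and sys.prefix confirms you're inside the venv, not the system install:
~/.venvs/demo/bin/python -c 'import sys; print(sys.prefix)'

# Packages then install into the venv, never into the system environment:
#   ~/.venvs/demo/bin/pip install requests
```

Nothing here touches /usr or ~/.local, which is exactly the point.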

> It’s really only in the last ten years or so that there’s been the clarity of what is a build backend vs frontend

Well, no; it's in that time that the idea of separating a backend and a frontend emerged. Before that, the assumption was that Setuptools could just do everything. But it really couldn't, and it also led to people distributing source packages for pure-Python projects, so installation did a ton of ultimately useless work. And now that Setuptools is supposed to be focused on providing a build backend, most of it is dead code in that workflow, but they still can't get rid of it for backwards-compatibility reasons.
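Concretely, the split shows up in pyproject.toml: the frontend (pip, uv, build) reads this table and invokes whatever backend it names. A minimal declaration using Setuptools as the backend looks like:

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"
```

Any PEP 517 frontend can then build the project without knowing anything else about how the backend works.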

(Incidentally, uv's provided backend only supports pure Python — they're currently recommending heavyweight tools like maturin and scikit-build-core if you need to compile something. Although in principle you can use Setuptools if you want.)


> Meanwhile, it's easy to build from source on Ubuntu and you can easily control its destination this way

A word of warning: I spent a lot of years working off of built-from-source Python on Ubuntu, and every once in a while I'd hit really awkward issues downstream of not realizing I was missing some dev library when I built Python, so some random standard-library module was just missing for me.

I think it's all generally good, but it's real easy to miss the optional-module stuff.
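One way to catch this early is a quick import smoke test right after `make altinstall` — run it with the freshly built interpreter (shown here with plain `python3` for illustration; the module list is a non-exhaustive guess at the usual suspects):

```shell
# Try importing the stdlib modules that depend on optional system libraries;
# anything that failed to build at compile time shows up as missing here.
python3 - <<'EOF'
optional = ["ssl", "sqlite3", "zlib", "bz2", "lzma", "readline",
            "ctypes", "curses", "_tkinter"]
missing = []
for name in optional:
    try:
        __import__(name)
    except ImportError:
        missing.append(name)
print("missing optional modules:", ", ".join(missing) or "none")
EOF
```

If ssl or zlib show up as missing, pip itself won't work, so those two are worth checking before anything else.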


> You just … now

Yes, the point of my post wasn’t to give current best practice counsel but rather to illustrate how much that counsel has changed over the years as the needs and desires of the maintainers, distro people, developers, and broader community have evolved.



