
I’ll bite - I could care less about speed, that feels like a talking point I see often repeated despite other package managers not being particularly slow. Maybe there’s some workload I’m missing that this is more important for?

I’ve tried uv a couple places where it’s been forced on me, and it didn’t work for whatever reason. I know that’s anecdotal and I’m sure it mostly works, but it was obviously off-putting. For better or worse I know how to use conda, and despite having no special attachment to it, slightly faster with a whole different set of rough edges is not at all compelling.

I have a feeling this is some kind of Rust fan thing and that’s where the push comes from, to try and insinuate it into more people’s workflows.

I’d like to hear a real reason I would ever migrate to it, and honestly if there isn’t one, am super annoyed about having it forced on me.



The place where speed really matters is in virtual environment management.

uv uses some very neat tricks involving hard links such that if you start a new uv-managed virtual environment and install packages into it that you've used previously, the packages are hard-linked in from uv's cache. This means the new environment becomes usable almost instantly and you don't end up wasting filesystem space on a bunch of duplicate files.

This means it's no longer expensive to have dozens, hundreds or even thousands of environments on a machine. This is fantastic for people like myself who work on a lot of different projects at once.
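You can verify the deduplication yourself with something like this sketch (assuming Linux, uv's default cache location of ~/.cache/uv, and requests standing in for any installed package):

  # Pick any installed module file in a uv-created venv...
  f=.venv/lib/python3.12/site-packages/requests/__init__.py
  # ...a hard-link count above 1 means the file is shared, not duplicated
  stat -c %h "$f"
  # and its twin lives in the uv cache (same inode, one copy on disk)
  find ~/.cache/uv -samefile "$f"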

Then you can use "uv run" to run Python code in a brand new temporary environment that gets created on demand within milliseconds of you launching it.
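For instance (the package here is purely illustrative):

  # Ephemeral environment with an extra dependency, built on the fly
  uv run --with rich python -c "import rich; print(rich.__version__)"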

I wrote a Bash script the other day that lets me do this in any Python project directory that includes a setup.py or pyproject.toml file:

  uv-test -p 3.11
That will run pytest with Python 3.11 (or 3.12/3.13/3.14/whatever version you like) against the current project, in a fresh isolated environment, without any risk of conflicting with anything else. And it's fast - the overhead of that environment setup is negligible.

Which means I can test any code I like against different Python versions without any extra steps.

https://til.simonwillison.net/python/uv-tests
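The script itself is roughly this shape - a sketch only, with simplified flag handling (the real version is in the TIL above):

  #!/usr/bin/env bash
  # uv-test: run pytest against the current project in a throwaway env
  set -euo pipefail
  version="3.13"
  if [ "${1:-}" = "-p" ]; then
    version="$2"
    shift 2
  fi
  # --isolated keeps the run separate from any existing .venv;
  # --with-editable . installs the current project, --with pytest adds the runner
  uv run --python "$version" --isolated --with pytest --with-editable . pytest "$@"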


Ooooh that's a neat one. I really like the hard links.

On my machine, there are like 100s if not thousands of venvs.

I simply have all of them under ~/.python_venvs/<project_name>/

Does that mean that no matter how many projects I install pytorch and tensorflow and huggingface and all the heavy machinery into, they'll only be stored once on disk as long as the versions are identical?

If that's the case, then I can drop my pip habit and move to uv.

This is something that has always bugged me about virtual environments in almost all the package managers.


"Does that mean, no matter how many projects I install pytorch and tensoflow and huggingface and all the heavy machinery, they'll be counted only once as long as they're unique?"

I think so, based on my understanding of how this all works. You may end up with different copies for different Python versions, but it should still save you a ton of space.


Just an update for whoever ends up on this comment.

This feature works as long as the venv uv creates and the uv cache (in your home directory by default, or wherever you've configured it) are on the same filesystem.

In my case the drive where the venv was created was NTFS, while the home directory holding the uv cache was ext4.

Hard linking therefore couldn't work, and uv printed a warning hinting at the problem.
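If you hit this, two knobs may help (the cache path below is illustrative):

  # Put the uv cache on the same filesystem as your venvs...
  export UV_CACHE_DIR=/mnt/projects/.uv-cache
  # ...or accept the duplication and silence the warning
  export UV_LINK_MODE=copy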


Conda doesn't do lock files. If you look into it, the best you can do is freeze your entire environment (a sketch of that freeze step follows the list below). Aside from being an entirely manual process, with everything that manual processes bring, this approach has a few specific problems:

1. If you edit any dependency, you have to re-solve the environment from scratch. There is no way to update just one dependency.

2. Conda "lock" files are just the hashes of the all the packages you happened to get, and that means they're non-portable. If you move from x86 to ARM, or Mac to Linux, or CPU to GPU, you have to throw everything out and resolve.

Point (2) has an additional hidden cost: unless you go massively out of your way, all your platforms can end up on different versions. Because solving every environment is a manual process, it's unlikely you're taking the time to run through 6+ platform combinations all at once. So if different users solve the environments on different days from the same human-readable environment file, there's no reason to expect them to be in sync. The versions will slowly diverge over time and you'll start to see breakage.
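For reference, the "freeze" in question looks something like this (a sketch; the file name is arbitrary):

  # Export the solved environment, pinned to exact builds (platform-specific)
  conda env export > environment.lock.yml
  # Recreating it only works on the same OS/arch it was solved on
  conda env create -f environment.lock.yml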

P.S. if you do want a "uv for Conda packages", see Pixi [1], which has a lot of the benefits of uv (e.g., lock files) but works out of the box with Conda's package ecosystem.

[1]: https://pixi.sh/latest/
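To give a flavour, the basic pixi flow is something like this (the package name is purely illustrative):

  pixi init my-proj && cd my-proj
  pixi add numpy                     # records the dep and updates pixi.lock
  pixi run python -c "import numpy"  # runs inside the managed environment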


> I have a feeling this is some kind of Rust fan thing and that’s where the push comes from, to try and insinuate it into more people’s workflows.

When I first started using uv, I did not know what language it was written in; it was a good tool which worked far better than its predecessors (and I used pdm/pipenv/pyenv/etc. pretty heavily and in non-basic ways). I still don’t particularly care if it’s written in Rust or Brainfuck, it works well. Rust is just a way to get to “don’t bootstrap Python environments in Python or shell”.

> I’ve tried uv a couple places where it’s been forced on me, and it didn’t work for whatever reason.

I’m curious what issues you encountered. Were these bugs/failures of uv, issues using it in a specific environment, or workflow patterns that it didn’t support? Or something else entirely?


You've never waited 10 minutes for conda to solve your environment, only for it to tell you it's unsolvable?


I have, but that takes me back many years, to some obscure situations I’ve been in. For my day-to-day work I can’t think of the last time I encountered it; it’s been years, and I regularly set up new environments. That’s why I’m curious about the workflows where it matters.


I dunno, 2020? Since then I switched to mamba, then poetry, and now uv. I have spent way too much time fighting Python's package managers, and with uv I finally don't have to. YMMV.


I’ve been trying uv lately to replace my normal workflow of selecting a Python with pyenv for the shell, then making a venv, then installing a bunch of default packages (pandas, Jupyter, etc.). So far the only benefit is that I can use just one tool for what used to take three (pyenv, venv, pip). I don’t _hate_ it…but it really isn’t much of an improvement.
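For comparison, the one-tool version of that workflow is roughly this (version and packages just examples):

  uv python install 3.12           # replaces pyenv
  uv venv --python 3.12            # replaces python -m venv
  uv pip install pandas jupyter    # replaces pip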


uv is comparable to npm. All your deps get auto-tracked in a file. There are other things that do this, but pip isn't one of them, and I vaguely remember the others being less convenient.
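The npm-like flow looks roughly like this (a sketch; the package is just an example):

  uv init my-project && cd my-project    # creates pyproject.toml
  uv add requests                        # records the dep, updates uv.lock
  uv run python -c "import requests"     # runs against the locked env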

The speed usually doesn't matter, but one time I did have to use it to automatically figure out compatible deps in a preexisting project, because the pip equivalent with backtracking was taking forever with the CPU pegged at 100%.


What tooling do you use?


Everyone downvoting you and disagreeing - don’t listen to them! I’m here to tell you that there is a massive conspiracy and everyone is in on it. Commenters on HN get paid every time someone downloads a Rust tool, that’s why they’re trying to convince you to use uv. It’s definitely not because they used it and found it worked well for them.

> could care less

I think “couldn’t care less” works better.


Being forced to use a tool you don't want to use sucks, no matter how awesome that tool may or may not actually be. *conda and uv have roughly the same goals, which means they're quite similar. For me, the speed of uv really does set it apart. For Python programs with lots of dependencies, it's enough faster that I found it worth climbing its learning curve. (ChatGPT makes that curve rather flat.) "pip install -r requirements.txt" went from a coffee break to me watching uv create the venv. But okay, speed gains aren't going to convince you.

Both of them manage venvs, but where the venv goes (by default) makes a difference, imo. Conda defaults to a user-level directory, e.g. ~/.conda/envs/my-venv. uv prefers a .venv dir in the project's folder. It's small, but it means per-project venvs are slightly more ergonomic with uv. Whereas with conda, because venvs are shared under your home directory, it's easy to get lazy once you have a working venv: you reuse that good working venv across multiple programs, and then it breaks when one program needs its dependencies updated, and now it's broken for all of them. Naturally that would never happen to a skilled conda operator, so I'll just say per-project uv venv creation and recreation flows that tiny bit smoother, because I can just run "rm -rf .venv" and not worry about breaking other things.

One annoyance I have with uv is that it really wants to use the latest version of Python it knows about, and sometimes that version is too new for a program or one of its dependencies, so the program won't run. Running "uv venv --python 3.12" instead of "uv venv" isn't onerous, but it's annoying enough to mention. (pyproject.toml lets projects specify version requirements, but they're not always right.) Arguably that's a Python issue and not uv's, but as users, we just want things to work, dammit. That's always the first thing I look for when things don't work.
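One mitigation, assuming you're happy pinning per project (uv reads a .python-version file):

  uv python pin 3.12   # writes .python-version in the project dir
  uv venv              # now picks 3.12 without needing the flag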

As mentioned, with uv the project venv lives in .venv inside the project's directory, which lets "uv run program.py" cheat: if you're in the project directory, "uv run" will automatically use the project's .venv. Who amongst us hasn't forgotten to "source .venv/bin/activate" and been confused when things "suddenly" stopped working?

As for uv being pushed to promote Rust: I'm sure there's a non-zero number of people for whom that's true, but personally, since Rust makes it harder for me to contribute to uv, it's actually a point against it. Sometimes I wonder how fast it would be if it were written in Python using the same algorithms, but run under PyPy.

Anyway, I wouldn't say any of that's revolutionary. Programs exist to translate between the different project file types (requirements.txt/environment.yml/pyproject.toml), so if you're already comfortable with conda, don't want to use uv, and aren't administering any shared systems, I'd just stick the command to generate environment.yml from pyproject.toml on a cheat sheet somewhere.
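For the requirements.txt direction at least, that translation is built into uv itself (a sketch):

  # Resolve pyproject.toml's dependencies into a pinned requirements.txt
  uv pip compile pyproject.toml -o requirements.txt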

---

One bug I ran into with one of the condas (I forget which) is that it called out to pip under the hood in interactive mode; pip got stuck waiting for user input, and conda just sat there waiting for input that would never come. Forums were filled with reports of users letting it run for hours or even days. I fixed that, but it soured me on *conda, unfortunately.


Tangential: if you're stuck in the condaverse I would *loudly* recommend checking out pixi. Pixi is to conda as uv is to setuptools.


Does uv actually have a replacement for setuptools yet?



