I think it's also from trying to keep to the old paradigm of "libraries are installed and managed globally, potentially as linkable object files."
The languages of today gain their improvements from three rules:
1. Nothing should be global, but if it is it's only a cache (and caches are safe to delete since they're only used as a performance optimization)
2. You have to have extremely explicit artifact versioning, which means everything needs checksums, which means mostly reproducible builds
3. The "blessed way" is to distribute the source (or a mostly-source dist) and compile things in; the happy path is not distributing pre-computed binaries
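Rule 2 in practice means a lockfile that pins exact versions to content checksums, the way `Cargo.lock`, `package-lock.json`, and `go.sum` do. A minimal sketch of the idea (the lockfile format and package names here are made up for illustration):

```python
import hashlib

# Hypothetical lockfile: exact versions pinned to content checksums.
LOCKFILE = {
    ("left-pad", "1.3.0"):
        "sha256:" + hashlib.sha256(b"left-pad-1.3.0 source tarball").hexdigest(),
}

def verify(name: str, version: str, artifact: bytes) -> bool:
    """Refuse to install anything whose bytes don't match the pinned checksum."""
    want = LOCKFILE[(name, version)]
    got = "sha256:" + hashlib.sha256(artifact).hexdigest()
    return got == want

assert verify("left-pad", "1.3.0", b"left-pad-1.3.0 source tarball")
assert not verify("left-pad", "1.3.0", b"tampered bytes")
```

Once every artifact is identified by its checksum rather than by where it happens to be installed, "which version do I have?" stops being a question about global filesystem state.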
Now, everything I just said above is also wrong in many respects, or there's an escape hatch for breaking any and all of the rules I outlined, but in general, everything's built to adhere to those three rules nowadays. And what's crazy is that for many decades, those three rules were considered absolutely impossible, or anti-patterns, or annoying, or wasteful, etc. (not without reason, but still, we couldn't do it). That's what made package managers and package management so awful. That's why it was even possible to break things with `sudo pip install` vs `apt install`.
Now that we've abandoned the old ways in e.g. JS/Rust/Go and adopted the three rules, all kinds of delightful side effects fall out. Tools that rebuild a full dependency tree on disk in the project directory are now the norm (it's done automatically! No annoying bits! No special flags! No manual venv!). Getting serious about checksums for artifacts means we can do proper versioning, which means we can safely do aggressive caching of dependencies across different projects, which means we don't have to _actually_ keep 20 copies of every dependency, one for each repo. It all comes from the slow, distributed Gentoo/FreeBSD-ification of everything, and it's great!
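The "aggressive caching without 20 copies" trick is a content-addressed store: artifacts live under their own checksum, so identical bytes are stored once and every project links to the shared copy. This is roughly what pnpm's store and Cargo's registry cache do; the sketch below just demonstrates the idea, with made-up artifact contents:

```python
import hashlib
import os
import tempfile

# Sketch of a content-addressed cache: each artifact is stored under its
# own checksum, so identical bytes are stored exactly once.
cache_dir = tempfile.mkdtemp()

def cache_put(artifact: bytes) -> str:
    """Store an artifact keyed by its hash; return the shared on-disk path."""
    key = hashlib.sha256(artifact).hexdigest()
    path = os.path.join(cache_dir, key)
    if not os.path.exists(path):  # identical bytes already cached? skip the write
        with open(path, "wb") as f:
            f.write(artifact)
    return path

a = cache_put(b"lodash-4.17.21 tarball")
b = cache_put(b"lodash-4.17.21 tarball")  # a second project, same dependency
assert a == b                             # one on-disk copy serves both
```

This is also why rule 1 holds: the store is global but pure cache, and deleting it loses nothing except the time to re-fetch.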
If, and only if, you have actual reproducible builds, you can distribute pre-compiled binaries as a cache optimization. That can allow for speedups without necessarily compromising security. It's also a prerequisite for a lot of "supply chain" security processes which are becoming increasingly desirable.
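The verification step this enables can be sketched as follows: if the build is a pure function of pinned inputs, anyone can recompute it and compare digests, so a downloaded binary becomes verifiable rather than trusted. The `deterministic_build` function here is a stand-in for a real reproducible toolchain, not an actual build system:

```python
import hashlib

# Stand-in for a reproducible toolchain: same pinned inputs -> same bytes out.
def deterministic_build(source: bytes) -> bytes:
    return b"\x7fELF" + hashlib.sha256(source).digest()

pinned_source = b"project v1.2.3 source tree"

local_binary = deterministic_build(pinned_source)       # built on my machine
downloaded_binary = deterministic_build(pinned_source)  # e.g. fetched from CI

# Accept the pre-built binary only if it matches our own rebuild bit-for-bit.
assert hashlib.sha256(downloaded_binary).digest() == \
       hashlib.sha256(local_binary).digest()
```

Without reproducibility this check is impossible, which is why the pre-built binary can only ever be a trusted blob instead of a cache entry.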