alt.hn

1/12/2025 at 8:06:49 PM

Uv's killer feature is making ad-hoc environments easy

https://valatka.dev/2025/01/12/on-killer-uv-feature.html

by astronautas

1/12/2025 at 8:54:02 PM

I really like uv, and it's the first package manager for a while where I haven't felt like it's a minor improvement on what I'm using but ultimately something better will come out a year or two later. I'd love if we standardized on it as a community as the de facto default, especially for new folks coming in. I personally now recommend it to nearly everyone, instead of the "welllll I use poetry but pyenv works or you could use conda too"

by nharada

1/12/2025 at 8:55:37 PM

I never used anything other than pip. I never felt the need to use anything other than pip (with virtualenv). Am I missing anything?

by poincaredisk

1/12/2025 at 9:21:17 PM

Couple of things.

- pip doesn't handle your Python executable, just your Python dependencies. So if you want/need to swap between Python versions (3.11 to 3.12 for example), it doesn't give you anything. Generally people use an additional tool such as pyenv to manage this. Tools like uv and Poetry do this as well as handling dependencies

- pip doesn't resolve dependencies of dependencies. pip will only respect version pinning for dependencies you explicitly specify. So for example, say I am using pandas and I pin it to version X. If a dependency of pandas (say, numpy) isn't pinned as well, the underlying version of numpy can still change when I reinstall dependencies. I've had many issues where my environment stopped working despite none of my specified dependencies changing, because underlying dependencies introduced breaking changes. To get around this with pip you would need an additional tool like pip-tools, which allows you to pin all dependencies, explicit and nested, to a lock file for true reproducibility. uv and poetry do this out of the box.

- Tool usage. Say there is a python package you want to use across many environments without installing in the environments themselves (such as a linting tool like ruff). With pip, you need to install another tool like pipx to install something that can be used across environments. uv can do this out of the box.

Plus there is a whole host of jobs that tools like uv and poetry aim to assist with that pip doesn't, namely project creation and management. You can use uv to create a new Python project scaffolding for applications or python modules in a way that conforms with PEP standards with a single command. It also supports workspaces of multiple projects that have separate functionality but require dependencies to be in sync.

You can accomplish a lot/all of this using pip with additional tooling, but it's a lot more work. And not all use cases will require these.
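
As a rough sketch, the uv side of those workflows looks something like the following (the project name is made up; the commands are uv's CLI as I understand it):

  $ uv python install 3.12      # download and manage a Python version
  $ uv init myproject           # scaffold a new project (pyproject.toml etc.)
  $ uv add pandas               # add a dependency and update the lock file
  $ uv tool run ruff check .    # run a tool without installing it into the project env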

by NeutralCrane

1/12/2025 at 11:33:07 PM

Yes, generally people already use an additional tool for managing their Python executables, like their operating system's package manager:

  $> sudo apt-get install python3.10 python3.11 python3.12
And then it's simple to create and use version-specific virtual environments:

  $> python3.11 -m venv .venv3.11
  $> source .venv3.11/bin/activate
  $> pip install -r requirements.txt
You are incorrect about needing to use an additional tool to install a "global" tool like `ruff`; `pip` does this by default when you're not using a virtual environment. In fact, this behavior is made more difficult by tools like `uv` or `pipx` if they're trying to manage Python executables as well as dependencies.

by driggs

1/12/2025 at 11:56:10 PM

> sudo apt-get install python3.10 python3.11 python3.12

This assumes the Python version you need is available from your package manager's repo. This won't work if you want a Python version either newer or older than what is available.

> You are incorrect about needing to use an additional tool to install a "global" tool like `ruff`; `pip` does this by default when you're not using a virtual environment.

True, but it's not best practice to do that because while the tool gets installed globally, it is not necessarily linked to a specific python version, and so it's extremely brittle.

And it gets even more complex if you need different tools that have different Python version requirements.

by athrun

1/13/2025 at 12:46:26 AM

>This assumes the Python version you need is available from your package manager's repo. This won't work if you want a Python version either newer or older than what is available.

And of course you could be working with multiple distros and versions of the same distro; production and dev might be different environments, and there are tons of other concerns. You need something that just works across all of them.

by coldtea

1/13/2025 at 1:04:21 AM

Surely you just use Docker for production, right?

by nightpool

1/13/2025 at 5:53:17 AM

You almost need to use Docker for deploying Python because the tooling is so bad that it's otherwise very difficult to get a reproducible environment. For many other languages the tooling works well enough that there's relatively little advantage to be had from Docker (although you can of course still use it).

by nicoburns

1/13/2025 at 6:04:28 AM

And how do you know everything is ok when you build your new docker image?

by wombatpm

1/13/2025 at 5:36:16 AM

>> You are incorrect about needing to use an additional tool to install a "global" tool like `ruff`; `pip` does this by default when you're not using a virtual environment.

>True, but it's not best practice to do that because while the tool gets installed globally, it is not necessarily linked to a specific python version, and so it's extremely brittle.

"Globally" means installed with sudo. These are installed into the user folder under ~/.local/ and called a user install by pip.

I wouldn't call it "extremely brittle" either. It works fine until you upgrade to a new version of python, in which case you install the package again. Happens once a year perhaps.

The good part of this is that unused cruft will get left behind and then you can delete old folders in ~/.local/lib/python3.? etc. I've been doing this over a decade without issue.

by mixmastamyk

1/13/2025 at 11:07:19 AM

> "Globally" means installed with sudo. These are installed into the user folder under ~/.local/ and called a user install by pip.

> It works fine until you upgrade to a new version of python, in which case you install the package again.

Debian/Ubuntu doesn't want you to do either, and tells you you'll break your system if you force it (the override flag is literally named "--break-system-packages"). Hell, if you're doing it with `sudo`, they're probably right - messing with the default Python installation (such as trying to upgrade it) is the quickest way to brick your Debian/Ubuntu box.

Incredibly annoying when your large project happens to use pip to install both libraries for the Python part, and tools like CMake and Conan, meaning you can't just put it all in a venv.

by TeMPOraL

1/13/2025 at 5:51:26 PM

Not Debian specific. The braindead option was added by pip to scare off newbies.

No one with the most basic of sysad skills is “bricked” by having to uninstall a library. Again have not experienced a conflict in over 15 years.

Use the system package manager, or build them yourself, for tools like cmake.

by mixmastamyk

1/13/2025 at 8:17:06 PM

Uninstalling a library - no. But I specifically mentioned trying to upgrade system Python, which is a quick way to break e.g. apt.

by TeMPOraL

1/14/2025 at 3:29:19 AM

Ok, getting it now. I said upgrade python, and you thought I meant upgrade the system python in conflict with the distro. But that's not really what I meant. To clarify... I almost never touch the system python, but I upgrade the distro often. Almost every Ubuntu/Mint release has a new system Python version these days.

So upgrade to new distro release, it has a new Python. Then pip install --user your user tools, twine, httpie, ruff, etc. Takes a few moments, perhaps once a year.
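
That is, roughly:

  $ pip install --user twine httpie ruff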

I do the same on Fedora, which I've been using more lately.

by mixmastamyk

1/13/2025 at 9:05:52 AM

Nah, pip is still brittle here because it uses one package resolution context to install all your global tools. So if there is a dependency clash you are out of luck.

So that's why pipx was required, or now, UV.

by orra

1/13/2025 at 5:44:13 PM

Not happened in the last fifteen years, never used pipx. See my other replies.

by mixmastamyk

1/13/2025 at 8:27:30 AM

> It works fine until you upgrade to a new version of python, in which case you install the package again.

Or you install a second global tool that depends on an incompatible version of a library.

by lmm

1/13/2025 at 5:43:02 PM

Never happened, and exceedingly unlikely to because your user-wide tools should be few.

by mixmastamyk

1/14/2025 at 1:23:28 AM

> exceedingly unlikely to because your user-wide tools should be few.

Why "should"? I think it's the other way around - Python culture has shied away from user-wide tools because it's known that they cause problems if you have more than a handful of them, and so e.g. Python profilers remain very underdeveloped.

by lmm

1/14/2025 at 3:17:58 AM

There are simply few, I don't shy away from them. Other than tools replaced by ruff, httpie, twine, ptpython, yt-dlp, and my own tools I don't need anything else. Most "user" tools are provided by the system package manager.

All the other project-specific things go in venvs where they belong.

This is all a non-issue despite constant "end of the world" folks who never learned sysadmin and are terrified of an error.

If libraries conflict, uninstall them, and put them in a venv. Why do all the work up front? I haven't had to do that in so long I forget how long it was. Early this century.

by mixmastamyk

1/14/2025 at 4:00:26 AM

> This is all a non-issue despite constant "end of the world" folks who never learned sysadmin and are terrified of an error.

It's not a non-issue. Yes it's not a showstopper, but it's a niggling drag on productivity. As someone who's used to the JVM but currently having to work in Python, everything to do with package management is just harder and more awkward than it needs to be (and every so often you just get stuck and have to rebuild a venv or what have you) and the quality of tooling is significantly worse as a result. And uv looks like the first of the zillions of Python package management tools to actually do the obvious correct thing and not just keep shooting yourself in the foot.

by lmm

1/14/2025 at 5:01:32 PM

It’s not a drag if you ignore it and it doesn’t happen even once a decade.

Still I’m looking forward to uv because I’ve lost faith in pypa. They break things on purpose and then say they have no resources to fix it. Well they had the resources to break it.

But this doesn’t have much to do with installing tools into ~/.local.

by mixmastamyk

1/13/2025 at 3:10:48 AM

> pip doesn't resolve dependencies of dependencies.

This is simply incorrect. In fact the reason it gets stuck on resolution sometimes is exactly because it resolved transitive dependencies and found that they were mutually incompatible.

Here's an example which will also help illustrate the rest of my reply. I make a venv for Python 3.8, and set up a new project with a deliberately poorly-thought-out pyproject.toml:

  [project]
  name="example"
  version="0.1.0"
  dependencies=["pandas==2.0.3", "numpy==1.17.3"]
I've specified the oldest version of Numpy that has a manylinux wheel for Python 3.8 and the newest version of Pandas similarly. These are both acceptable for the venv separately, but mutually incompatible on purpose.

When I try to `pip install -e .` in the venv, Pip happily explains (granted the first line is a bit strange):

  ERROR: Cannot install example and example==0.1.0 because these package versions have conflicting dependencies.

  The conflict is caused by:
      example 0.1.0 depends on numpy==1.17.3
      pandas 2.0.3 depends on numpy>=1.20.3; python_version < "3.10"

  To fix this you could try to:
  1. loosen the range of package versions you've specified
  2. remove package versions to allow pip to attempt to solve the dependency conflict
If I change the Numpy pin to 1.20.3, that's the version that gets installed. (`python-dateutil`, `pytz`, `six` and `tzdata` are also installed.) If I remove the Numpy requirement completely and start over, Numpy 1.24.4 is installed instead - the latest version compatible with Pandas' transitive specification of the dependency. Similarly if I unpin Pandas and ask for any version - Pip will try to install the latest version it can, and it turns out that the latest Pandas version that declares compatibility with 3.8, indeed allows for fetching 3.8-compatible dependencies. (Good job not breaking it, Pandas maintainers! Although usually this is trivial, because your dependencies are also actively maintained.)

> pip will only respect version pinning for dependencies you explicitly specify. So for example, say I am using pandas and I pin it to version X. If a dependency of pandas (say, numpy) isn't pinned as well, the underlying version of numpy can still change when I reinstall dependencies.

Well, sure; Pip can't respect a version pin that doesn't exist anywhere in your project. If the specific version of Pandas you want says that it's okay with a range of Numpy versions, then of course Pip has freedom to choose one of those versions. If that matters, you explicitly specify it. Other programs like uv can't fix this. They can only choose different resolution strategies, such as "don't update the transitive dependency if the environment already contains a compatible version", versus "try to use the most recent versions of everything that meet the specified compatibility requirements".

> To get around this with pip you would need an additional tool like pip-tools, which allows you to pin all dependencies, explicit and nested, to a lock file for true reproducibility.

No, you just use Pip's options to determine what's already in the environment (`pip list`, `pip freeze` etc.) and pin everything that needs pinning (whether with a Pip requirements file or with `pyproject.toml`). Nothing prevents you from listing your transitive dependencies in e.g. the [project.dependencies] of your pyproject.toml, and if you pin them, Pip will take that constraint into consideration. Lock files are for when you need to care about alternate package sources, checking hashes etc.; or for when you want an explicit representation of your dependency graph in metadata for the sake of other tooling.

> This assumes the Python version you need is available from your package manager's repo. This won't work if you want a Python version either newer or older than what is available.

I have built versions 3.5 through 3.13 inclusive from source and have them installed in /opt and the binaries symlinked in /usr/local/bin. It's not difficult at all.
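
Each build is roughly the usual dance (the version and flags here are just illustrative):

  $ ./configure --prefix=/opt/python3.13 --enable-optimizations
  $ make -j4
  $ sudo make install
  $ sudo ln -s /opt/python3.13/bin/python3.13 /usr/local/bin/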

> True, but it's not best practice to do that because while the tool gets installed globally, it is not necessarily linked to a specific python version, and so it's extremely brittle.

What brittleness are you talking about? There's no reason why the tool needs to run in the same environment as the code it's operating on. You can install it in its own virtual environment, too. Since tools generally are applications, I use Pipx for this (which really just wraps a bit of environment management around Pip). It works great; for example I always have the standard build-frontend `build` (as `pyproject-build`) and the uploader `twine` available. They run from a guaranteed-compatible Python.

And they'd still run from a compatible Python if they were installed for the system Python, too. (I just, you know, don't want to do that because the system Python is the system package manager's responsibility.) The separate environments don't matter because the tool's code and the operated-on project's code don't even need to run at the same time, let alone in the same process. In fact, it would make no sense to be running the code while actively trying to build or upload it.

> And it gets even more complex if you need different tools that have different Python version requirements.

No, you just let each tool have the virtual environment it requires. And you can update them in-place in those environments, too.

by zahlman

1/13/2025 at 9:40:26 AM

> This is simply incorrect. In fact the reason it gets stuck on resolution sometimes is exactly because it resolved transitive dependencies and found that they were mutually incompatible.

The confusion might be that this used to be a problem with pip. It looks like this changed around 2020, but before then pip would happily install broken versions. Looking it up, this change of resolution happened in a minor release.

by IanCal

1/13/2025 at 10:52:11 AM

You have it exactly, except that Pip 20.3 isn't a "minor release" - since mid-2018, Pip has used quarterly calver, so that's just "the last release made in 2020". (I think there was some attempt at resolving package versions before that, it just didn't work adequately.)

by zahlman

1/14/2025 at 10:17:54 AM

Ah thank you for the correction, that makes sense - it seemed very odd for a minor version release.

I think a lot of people probably have strong memories of all the nonsense that earlier pip versions resulted in, I know I do. I didn't realise this was a more solved problem now, as the absence of an infrequent issue is hard to notice.

by IanCal

1/13/2025 at 4:31:54 AM

> Well, sure; Pip can't respect a version pin that doesn't exist anywhere in your project. If the specific version of Pandas you want says that it's okay with a range of Numpy versions, then of course Pip has freedom to choose one of those versions. If that matters, you explicitly specify it

Nearly every other language solves this better than this. What you're suggesting breaks down on large projects.

by jshen

1/13/2025 at 5:05:54 AM

>Nearly every other language solves this better than this.

"Nearly every other language" determines the exact version of a library to use for you, when multiple versions would work, without you providing any input with which to make the decision?

If you mean "I have had a more pleasant UX with the equivalent tasks in several other programming languages", that's justifiable and common, but not at all the same.

>What your suggesting breaks down on large projects.

Pinned transitive dependencies are the only meaningful data in a lockfile, unless you have to explicitly protect against supply chain attacks (i.e. use a private package source and/or verify hashes).

by zahlman

1/13/2025 at 8:59:02 AM

IMHO the clear separation between lockfile and deps in other package managers was a direct consequence of people being confused about what requirements.txt should be. It can be both and could be for ages (pip freeze) but the defaults were not conducive to clear separation. If we started with lockfile.txt and dependencies.txt, the world may have looked different. Alas.

by baq

1/13/2025 at 5:37:43 PM

The thing is, the distinction is purely semantic - Pip doesn't care. If you tell it all the exact versions of everything to install, it will still try to "solve" that - i.e., it will verify that what you've specified is mutually compatible, and check whether you left any dependencies out.

by zahlman

1/13/2025 at 2:51:42 PM

What's your process for ensuring all members of a large team are using the same versions of libraries in a non trivial python codebase?

by jshen

1/13/2025 at 5:31:18 PM

If all you need to do is ensure everyone's on the same versions of the libraries - if you aren't concerned with your supply chain, and you can accept that members of your team are on different platforms and thus getting different wheels for the same version, and you don't have platform-specific dependency requirements - then pinned transitive dependencies are all the metadata you need. pyproject.toml isn't generally intended for this, unless what you're developing is purely an application that shouldn't ever be depended on by anyone else or sharing an environment with anything but its own dependencies. But it would work. The requirements.txt approach also works.

If you do have platform-specific dependency requirements, then you can't actually use the same versions of libraries, by definition. But you can e.g. specify those requirements abstractly, see what the installer produces on your platform, and produce a concrete requirement-set for others on platforms sufficiently similar to yours.

(I don't know offhand if any build backends out there will translate abstract dependencies from an sdist into concrete ones in a platform-specific wheel. Might be a nice feature for application devs.)

Of course there are people and organizations that have use cases for "real" lockfiles that list provenance and file hashes, and record metadata about the dependency graph, or whatever. But that's about more than just keeping a team in sync.

by zahlman

1/14/2025 at 1:47:38 AM

So you are proposing to manually manage all transitive dependencies?

by jshen

1/13/2025 at 2:00:12 AM

It’s like a whole post of all the things you’re not supposed to do with Python, nice.

by achileas

1/13/2025 at 12:00:33 AM

most developers I know do not use the system version of python. We use an older version at work so that we can maximize what will work for customers and don't try to stay on the bleeding edge. I imagine others do want newer versions for features, hence people find products like UV useful

by EasyMark

1/13/2025 at 2:42:26 AM

That assumes that you are using a specific version of a specific Linux distribution that happens to ship specific versions of Python that you are currently targeting. That's a big assumption. uv solves this.

by diath

1/13/2025 at 12:24:31 AM

(I've just learned about uv, and it looks like I have to pick it up since it performs very well.)

I just use pipx. Install guides suggest it, and it is only one character different from pip.

With Nix, it is very easy to run multiple versions of the same software. The path will always be the same, meaning you can depend on versions. This is nice glue for pipx.

My pet peeve with Python and Vim is all these different package managers. Every once in a while a new one is out and I don't know if it will gain momentum. For example, I use Plug now in Vim but notice documentation often refers to different alternatives these days. With Python it is pip, poetry, pip search no longer working, pipx, and now uv (I probably forgot some things).

by Fnoord

1/13/2025 at 3:20:56 AM

Pipx is a tool for users to install finished applications. It isn't intended for installing libraries for further development, and you have to hack around it to make that work. (This does gain you a little bit over using Pip directly.)

I just keep separate compiled-from-source versions of Python in a known, logical place; I can trivially create venvs from those directly and have Pip install into them, and pass `--python` to `pipx install`.

>With Python it is pip, poetry, pip search no longer working, pipx, and now uv (I probably forgot some things).

Of this list, only Poetry and Uv are package managers. Pip is, by design, only an installer, and Pipx only adds a bit of environment management to that. A proper package manager also helps you keep track of what you've installed, and either produces some sort of external lock file and/or maintains dependency listings in `pyproject.toml`. But both Poetry and Uv go further beyond that as well, aiming to help with the rest of the development workflow (such as building your package for upload to PyPI).

If you like Pipx, you might be interested in some tips in my recent blog post (https://zahlman.github.io/posts/2025/01/07/python-packaging-...). In particular, if you do need to install libraries, you can expose Pipx's internal copy of Pip for arbitrary use instead of just for updating the venvs that Pipx created.

by zahlman

1/13/2025 at 9:47:22 AM

I also tend to use the OS package manager to install other binary dependencies. Pip does the rest perfectly well.

by graemep

1/13/2025 at 3:05:18 PM

Yeah, venv is really the best way to manage Python environments. In my experience other tools like Conda often create more headaches than they solve.

Sure, venv doesn't manage Python versions, but it's not that difficult to install the version you need system-wide and point your env to it. Multiple Python versions can coexist in your system without overriding the default one. On Ubuntu, the deadsnakes PPA is pretty useful if you need an old Python version that's not in the official repos.
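
For example, a rough sketch of the deadsnakes route (3.9 is just an example version):

  $ sudo add-apt-repository ppa:deadsnakes/ppa
  $ sudo apt install python3.9 python3.9-venv
  $ python3.9 -m venv .venv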

In the rare case where you need better isolation (like if you have one fussy package that depends on specific system libs, looking at you tensorflow), Docker containers are the next best option.

by Gazoche

1/12/2025 at 10:48:56 PM

Sometimes I feel like my up vote doesn't adequately express my gratitude.

I appreciate how thorough this was.

by kiddico

1/12/2025 at 11:05:04 PM

Oh wow, it actually can handle the Python executable? I didn't know that, that's great! Although it's in the article as well, it didn't click until you said it, thanks!

by stavros

1/13/2025 at 4:56:07 AM

I would avoid using this feature! It downloads a compiled portable python binary from some random github project, not from the PSF. That very same github project recommends against using their binary, as the compilation flags are set for portability over performance. See https://gregoryszorc.com/docs/python-build-standalone/main/

by meitham

1/13/2025 at 12:28:04 PM

https://github.com/astral-sh/python-build-standalone is by the same people as uv, so it's hardly random. The releases there include ones with profile-guided optimisation and link time optimisation [1], which are used by default for some platforms and Python versions (and work seems underway to make them usable for all [2]). I don't see any recommendation against using their binaries or mention of optimising for portability at the cost of performance on the page you link or the pages linked from it that I've looked at.

[1] https://github.com/astral-sh/uv/blob/main/crates/uv-python/d... (search for pgo)

[2] https://github.com/astral-sh/uv/issues/8015

by mkl

1/13/2025 at 7:01:10 PM

It's not from some random github project, it's from a trusted member of the open source community. Same as other libraries you use and install.

It was used by rye before rye and uv sort of merged and is used by pipx and hatch and mise (and bazel rules_python) https://x.com/charliermarsh/status/1864042688279908459

My understanding is that the problem is that the PSF doesn't publish portable python binaries (I don't think they even publish any binaries for linux). Luckily there's some work being done on a PEP for similar functionality from an official source, but that will likely take several years. Gregory has praised the attempt and made suggestions based on his experience. https://discuss.python.org/t/pep-711-pybi-a-standard-format-...

Apparently he had less spare time for open source, and since Astral had been helping with a lot of the maintenance work on the project, he happily transferred ownership over to them in December.

https://gregoryszorc.com/blog/2024/12/03/transferring-python... https://astral.sh/blog/python-build-standalone

by rat87

1/13/2025 at 8:31:29 PM

That makes sense, thanks for sharing these details.

by meitham

1/13/2025 at 3:24:21 AM

I still don't understand why people want separate tooling to "handle the Python executable". All you need to do is have one base installation of each version you want, and then make your venv by running the standard library venv for that Python (e.g. `python3.x -m venv .venv`).

by zahlman

1/13/2025 at 3:25:58 AM

> All you need to do is have one base installation of each version you want

Because of this ^

by stavros

1/13/2025 at 5:02:55 AM

But any tool you use for the task would do that anyway (or set them up temporarily and throw them away). Python on Windows has a standard Windows-friendly installer, and compiling from source on Linux is the standard few calls to `./configure` and `make` that you'd have with anything else; it runs quite smoothly and you only have to do it once.

by zahlman

1/13/2025 at 5:11:47 AM

I need to tell you a secret... I'm a long-time Linux user (since Mandrake!)

Also, I don't have a c compiler installed.

by _ZeD_

1/13/2025 at 5:32:10 AM

Really? I was told Mint was supposed to be the kiddie-pool version of Linux, but it gave me GCC and a bunch of common dependencies anyway.

(By my understanding, `pyenv install` will expect to be able to run a compiler to build a downloaded Python source tarball. Uv uses prebuilt versions from https://github.com/astral-sh/python-build-standalone ; there is work being done in the Python community on a standard for packaging such builds, similarly to wheels, so that you can just use that instead of compiling it yourself. But Python comes out of an old culture where users expect to do that sort of thing.)

by zahlman

1/13/2025 at 12:23:32 PM

In Debian, the build-essential package is only a recommended dependency of pip. Pyenv obviously wouldn't work without it.

by chupasaurus

1/13/2025 at 5:56:58 AM

Having to manually install python versions and create venvs is pretty painful compared to say the Rust tooling where you install rustup once, and then it will automatically choose the correct Rust version for each project based on what that project has configured.

UV seems like it provides a lot of that convenience for python.

by nicoburns

1/13/2025 at 3:40:53 AM

I'm glad to let uv handle that for me. It does a pretty good job at it!

by gtaylor

1/13/2025 at 7:06:13 PM

Lots of reasons. You may want many people to have the same point release. They have early builds without needing to compile from source, and free-threading (nogil) builds. I think they might even have PGO builds. Not to mention that not all distro releases will have the right python release. Also people want the same tool to handle the python version, venv creation, and requirement installation.

by rat87

1/14/2025 at 2:28:36 AM

>Also people want the same tool to handle both python version and venv creation and requirement installation

This is the part I don't understand. Why should it be the same tool? What advantage does that give over having separate tools?

by zahlman

1/14/2025 at 6:05:34 AM

Because it's easier. Because it fits together nicer and more consistently. Also because UV is well written and written in Rust so all the parts are fast. You can recreate a venv from scratch for every run.

Also as silly as it is I actually have a hard time remembering the venv syntax each time.

uv run after a checkout with a lock file and a .python-version file downloads the right python version, creates a venv, and then installs the packages. No more needing throwaway venvs to get a clean pip freeze for requirements. And I don't want to compile python; even with something helping me compile and keep track of compiles like pyenv, a lot can go wrong.

And that assumes an individual project run by someone who understands python packaging. uv run, possibly in a wrapper script, will do those things for my team, who don't get packaging as well as I do. Just check in changes and next time they uv run it, it updates stuff for them.
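
Roughly, assuming the checkout already has a uv.lock and a .python-version (main.py is a placeholder):

  $ cat .python-version
  3.12
  $ uv run main.py    # fetches Python 3.12 if needed, creates the venv, syncs the lock file, runs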

by rat87

1/14/2025 at 6:58:17 AM

I guess I will never really understand the aesthetic preferences of the majority. But.

>Because it's easier. Because it fits together nicer and more consistently. Also because UV is well written and written in Rust so all the parts are fast. You can recreate a venv from scratch for every run.

This is the biggest thing I try to push back on whenever uv comes up. There is good evidence that "written in Rust" has quite little to do with the performance, at least when it comes to creating a venv.

On my 10-year-old machine, creating a venv directly with the standard library venv module takes about 0.05 seconds. What takes 3.2 more seconds on top of that is bootstrapping Pip into it.

Which is strange, in that using Pip to install Pip into an empty venv only takes about 1.7 seconds.

Which is still strange, in that using Pip's internal package-installation logic (which one of the devs factored out as a separate project) to unpack and copy the files to the right places, make the script wrappers etc. takes only about 0.2 seconds, and pre-compiling the Python code to .pyc with the standard library `compileall` module takes only about 0.9 seconds more.

The bottleneck for `compileall`, as far as I can tell, is still the actual bytecode compilation - which is implemented in C. I don't know if uv implemented its own bytecode compilation or just skips it, but it's not going to beat that.

Of course, well thought-out caching would mean it can just copy the .pyc files (or hard-link etc.) from cache when repeatedly using a package in multiple environments.

by zahlman

1/12/2025 at 9:08:16 PM

pip's resolving algorithm is not sound. If your Python projects are really simple it seems to work but as your projects get more complex the failure rate creeps up over time. You might

   pip install
something and have it fail and then go back to zero and restart and have it work but at some point that will fail. conda has a correct resolving algorithm but the packages are out of date and add about as many quality problems as they fix.

I worked at a place where the engineering manager was absolutely exasperated with the problems we were having with building and deploying AI/ML software in Python. I had figured out pretty much all the problems after about nine months and had developed a 'wheelhouse' procedure for building our system reliably, but it was too late.

Not long after I sketched out a system that was a lot like uv but it was written in Python and thus had problems with maintaining its own stable Python environment (e.g. poetry seems to trash itself every six months or so.)

Writing uv in Rust was genius because it eliminates that problem of the system having a stable surface to stand on instead of pipping itself into oblivion, never mind that it is much faster than my system would have been. (My system had the extra feature that it used http range requests to extract the metadata from wheel files before pypi started letting you download the metadata directly.)

I didn't go forward with developing it because I argued with a lot of people who, like you, thought it was "the perfect being the enemy of the good" when it was really "the incorrect being the enemy of the correct." I'd worked on plenty of projects where I was right about the technology and wrong about the politics and I am so happy that uv has saved the Python community from itself.

by PaulHoule

1/12/2025 at 9:21:35 PM

May I introduce you to our lord and saviour, Nix and its most holy child nixpkgs! With only a small tithing of your sanity and ability to interop with any other dependency management you can free yourself of all dependency woes forever!

[] For various broad* definitions of forever.

[*] Like, really, really broad**

[**] Maybe a week if you're lucky

by MadnessASAP

1/12/2025 at 10:07:34 PM

Except python builders in nixpkgs are really brain damaged because of the way they inject the search path, which for example breaks if you try to execute a separate python interpreter assuming the same library environment...

by p_l

1/12/2025 at 10:48:59 PM

Within the holy church of Nix the sect of Python is a troubled one; it can however be tamed into use via vast tomes of scripture. Sadly these tomes can only be written by those who have truly given their mind and body over to the almighty Nix.

by MadnessASAP

1/12/2025 at 11:26:06 PM

It's not as bad as Common Lisp support which stinks to high heavens of someone not learning the lessons of the Common-Lisp-Controller fiasco

by p_l

1/13/2025 at 1:41:12 AM

Lisp is of the old gods, only the most brave of Nix brethren dare tread upon their parenthesised ways.

by MadnessASAP

1/12/2025 at 10:45:02 PM

Nix is really the best experience I've had with Python package management but only if all the dependencies are already in nixpkgs. If you want to quickly try something off github it's usually a pain in the ass.

by chpatrick

1/13/2025 at 12:48:09 AM

>May I introduce you to our lord and saviour, Nix and it's most holy child nixpkgs!

In this case, instead of working with Python, you change how you manage everything!

by coldtea

1/12/2025 at 9:54:51 PM

The Nix of Python, conda, was already mentioned.

> add about as many quality problems as they fix

by benatkin

1/12/2025 at 10:50:47 PM

I used to have 1 problem, then I used Nix to fix it, now I have 'Error: infinite recursion' problems.

by MadnessASAP

1/12/2025 at 10:00:26 PM

Ugh, I hate writing this but that's where docker and microservices come to the rescue. It's a pain in the butt and inefficient to run but if you don't care about the overhead (and if you do care, why are you still using Python?), it works.

by morkalork

1/12/2025 at 10:24:52 PM

My experience was that docker was a tool data scientists would use to speedrun the process of finding broken Pythons. For instance we'd inexplicably find a Python had Hungarian as the default charset, etc.

The formula was

   Docker - Discipline = Chaos
   Docker + Discipline = Order
but

   - Docker + Discipline = Order
If you can write a Dockerfile to install something you can write a bash script. Circa 2006 I was running web servers on both Linux and Windows with hundreds of web sites on them with various databases, etc. It really was as simple then as "configure a filesystem path" and "configure a database connection" and I had scripts that could create a site in 30 seconds or so.

Sure today you might have five or six different databases for a site but it's not that different in my mind. Having way too many different versions of things installed is a vice, not a virtue.

by PaulHoule

1/13/2025 at 2:09:06 AM

> If you can write a Dockerfile to install something you can write a bash script.

Docker is great for making sure that magic bash script that brings the system up actually works again on someone else’s computer or after a big upgrade on your dev machine or whatever.

So many custom build scripts I’ve run into over the years have some kind of unstated dependency on the initial system they were written on, or explicit dependencies on something tricky to install, and as such are really annoying to diagnose later on, especially if they make significant system changes.

Docker is strictly better than a folder full of bash scripts and a Readme.txt. I would have loved having it when I had to operate servers like that with tons of websites running on them. So much nicer to be able to manage dependency upgrades per-site rather than server-wide, invariably causing something to quietly break on one of 200 sites.

by macNchz

1/13/2025 at 9:04:18 AM

Unspoken libc dependencies are my favorite. Granted, you need to wait a few years after launching the project to feel that pain, but once you’re there, the experience is… unforgettable.

Second best are OpenSSL dependencies. I sincerely hope I won’t have to deal with that again.

by baq

1/12/2025 at 11:25:11 PM

Unfortunately sometimes you get to host things not written by you, or which have existed for a long time, and thus there's a lot of history involved that prevents things from being nice and tidy.

My first production use of kubernetes started out because we put the entirety of what we had to migrate to new hosting into a spreadsheet, with columns for various parts of the stack used by the websites, and figured we would go insane trying to pack it up - or we would lose the contract because we would be as expensive as the last company.

Could we package it nicely without docker? Yes, but the effort to package it in docker was smaller than packaging it in a way where it wouldn't conflict on a single host, because the simple script becomes way harder when you need to handle multiple versions of the same package, something that most distros do not support at all (these days I think we could have done it with NixOS, but that's a different kettle of deranged fishes).

And then the complexity of managing the stack was quickly made easier by turning each site into separate artifact (docker container) handled by k8s manifests (especially when it came to dealing with about 1000 domains across those apps).

So, theoretically discipline is enough, practical world is much dirtier though.

by p_l

1/13/2025 at 12:24:48 PM

> If you can write a Dockerfile to install something you can write a bash script.

The trick isn't installing things, it's uninstalling them. Docker container is isolated in ways your bash script equivalent is not - particularly when first developing it, when you're bound to make an occasional mistake.

by TeMPOraL

1/13/2025 at 12:50:48 AM

>For instance we'd inexplicably find a Python had Hungarian as the default charset, etc.

Sounds quite explicable: Docker image created by Hungarian devs perhaps?

by coldtea

1/13/2025 at 3:36:35 AM

My understanding is that UTF-8 is the world's charset and that reasonable Hungarians would use that (e.g. I sure don't use us-ascii or iso-latin-1 if I can at all help it. I mean my "better half" reads 中文 so I don't have to and having it all in UTF-8 makes it easy) The other mystery is how the data sci's found it.

by PaulHoule

1/13/2025 at 12:32:19 PM

IIRC there was some widely used image with many derivatives that redefined locale (the one in Docker Library used POSIX since forever).

by chupasaurus

1/13/2025 at 12:49:50 AM

>and if you do care, why are you still using Python?

Because I get other advantages of it. Giving in to overhead on one layer, doesn't mean I'm willing to give it up everywhere.

by coldtea

1/13/2025 at 4:16:43 AM

Docker will make it work, but is a heavy solution as it will happily take up GB of your disk. uv is a more efficient and elegant option.

by kussenverboten

1/13/2025 at 4:40:21 AM

Yes, another sound reason to use microservices. /s

by fulafel

1/13/2025 at 4:20:02 AM

> You might `pip install` something and have it fail and then go back to zero and restart and have it work but at some point that will fail.

Can you give a concrete example, starting from a fresh venv, that causes a failure that shouldn't happen?

> but it was written in Python and thus had problems with maintaining its own stable Python environment

All it has to do is create an environment for itself upon installation which is compatible with its own code, and be written with the capability of installing into other environments (which basically just requires knowing what version of Python it uses and the appropriate paths - the platform and ABI can be assumed to match the tool, because it's running on the same machine).

This is fundamentally what uv is doing, implicitly, by not needing a Python environment to run.

But it's also what the tool I'm developing, Paper, is going to do explicitly.

What's more, you can simulate it just fine with Pip. Of course, that doesn't solve the issues you had with Pip, but it demonstrates that "maintaining its own stable Python environment" is just not a problem.

>Writing uv in Rust was genius because it eliminates that problem of the system having a stable surface to stand on instead of pipping itself into oblivion, never mind that it is much faster than my system would have been.

From what I can tell, the speed difference mainly comes from algorithms, caching etc. Pip is just slow above and beyond anything Python forces on it.

An example. On my system, creating a new venv from scratch with Pip included (which loads Pip from within its own vendored wheel, which then runs in order to bootstrap itself into the venv) takes just over 3 seconds. Making a new venv without Pip, then asking a separate copy of Pip to install an already downloaded Pip wheel would be about 1.7 seconds. But making that venv and using the actual internal installation logic of Pip (which has been extracted by Pip developer Pradyun Gedam as https://github.com/pypa/installer ) would take about 0.25 seconds. (There's no command-line API for this; in my test environment I just put the `installer` code side by side with a driver script, which is copied from my development work on Paper.) It presumably could be faster still.

I honestly have no idea what Pip is doing the rest of that time. It only needs to unzip an archive and move some files around and perform trivial edits to others.

> (My system had the extra feature that it used http range requests to extract the metadata from wheel files before pypi started letting you download the metadata directly.)

Pip has had this feature for a long time (and it's still there - I think to support legacy projects without wheels, because I think the JSON API won't be able to provide the data since PyPI doesn't build the source packages). It's why the PyPI server supports range requests in the first place.

> I'd worked on plenty of projects where I was right about the technology and wrong about the politics and I am so happy that uv has saved the Python community from itself.

The community's politics are indeed awful. But Rust (or any other language outside of Python) is not needed to solve the problem.

by zahlman

1/13/2025 at 5:45:51 PM

It occurs to me later: `installer` isn't compiling the .py files to .pyc, which probably accounts for the time difference. This can normally be done on demand (or suppressed entirely) but Pip wants to do it up front. Bleh. "Installing" from already-unpacked files would still be much faster.

by zahlman

1/13/2025 at 1:10:53 AM

Respectively, yes. The ability to create venvs so fast that it becomes a silent operation that the end user never thinks about anymore. The dependency management and installation is lightning quick. It deals with all of the python versioning

and I think a killer feature is the ability to inline dependencies in your Python source code, then use: uv tool run <scriptname>

Your script code would look like:

  #!/usr/bin/env -S uv run --script
  # /// script
  # requires-python = ">=3.12"
  # dependencies = [
  #   "...",
  #   "..."
  # ]
  # ///
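To run it (assuming the file is named myscript.py):

  $ chmod +x myscript.py
  $ ./myscript.py          # or: uv run myscript.py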

Then uv will make a new venv, install the dependencies, and execute the script faster than you think. The first run is a bit slower due to downloads and etc, but the second and subsequent runs are a bunch of internal symlink shuffling.

It is really interesting. You should at least take a look at a YT or something. I think you will be impressed.

Good luck!

by ppierald

1/13/2025 at 4:38:51 AM

>Respectively, yes. The ability to create venvs so fast, that it becomes a silent operation that the end user never thinks about anymore.

I might just blow your mind here:

  $ time python -m venv with-pip

  real 0m3.248s
  user 0m3.016s
  sys 0m0.219s
  $ time python -m venv --without-pip without-pip

  real 0m0.054s
  user 0m0.046s
  sys 0m0.009s
The thing that actually takes time is installing Pip into the venv. I already have local demonstrations that this installation can be an order of magnitude faster in native Python. But it's also completely unnecessary to do that:

  $ source without-pip/bin/activate
  (without-pip) $ ~/.local/bin/pip --python `which python` install package-installation-test
  Collecting package-installation-test
    Using cached package_installation_test-1.0.0-py3-none-any.whl.metadata (3.1 kB)
  Using cached package_installation_test-1.0.0-py3-none-any.whl (3.1 kB)
  Installing collected packages: package-installation-test
  Successfully installed package-installation-test-1.0.0
I have wrappers for this, of course (and I'm explicitly showing the path to a separate Pip that's already on my path for demonstration purposes).

> a killer feature is the ability to inline dependencies in your Python source code, then use: uv tool run <scriptname>

Yes, Uv implements PEP 723 "Inline Script Metadata" (https://peps.python.org/pep-0723/) - originally the idea of Paul Moore from the Pip dev team, whose competing PEP 722 lost out (see https://discuss.python.org/t/_/29905). He's been talking about a feature like this for quite a while, although I can't easily find the older discussion. He seems to consider it out of scope for Pip, but it's also available in Pipx as of version 1.4.2 (https://pipx.pypa.io/stable/CHANGELOG/).

> The first run is a bit slower due to downloads and etc, but the second and subsequent runs are a bunch of internal symlink shuffling.

Part of why Pip is slow at this is because it insists on checking PyPI for newer versions even if it has something cached, and because its internal cache is designed to simulate an Internet connection and go through all the usual metadata parsing etc. instead of just storing the wheels directly. But it's also just slow at actually installing packages when it already has the wheel.

In principle, nothing prevents a Python program from doing caching sensibly and from shuffling symlinks around.

by zahlman

1/13/2025 at 6:10:34 AM

It's not the "runtime" that's slow for me with pip, but all the steps needed. My biggest gripe with python is you need to basically be an expert in different tools to get a random project running. Uv solves this. Just uv run the script and it works.

I don't care if pip technically can do something. The fact that I explicitly have to mess around with venvs and the stuff is already enough mental overhead that I disregard it.

I'm a python programmer at my job, and I've hated the tooling for years. Uv is the first time I actually like working with python.

by matsemann

1/13/2025 at 10:47:59 AM

None of GP is about what Pip can technically do. It's about what a better tool still written in Python could do.

The problems you're describing, or seeing solved with uv, don't seem to be about a problem with the design of virtual environments. (Uv still uses them.) They're about not having the paradigm of making a venv transiently, as part of the code invocation; or they're about not having a built-in automation of a common sequence of steps. But you can do that just as well with a couple lines of Bash.

I'm not writing any of this to praise the standard tooling. I'm doing it because the criticisms I see most commonly are inaccurate. In particular, I'm doing it to push back against the idea that a non-Python language is required to make functional Python tooling. There isn't a good conceptual reason for that.

by zahlman

1/13/2025 at 11:32:56 AM

It may not be required, but it has the virtue of existing. Now that it does, is it a problem that it's not written in Python? Especially given that they've chosen to take on managing the interpreter as well: being in a compiled language does mean that it doesn't have the bootstrap problem of needing an already functional Python installation that they need to avoid breaking.

by regularfry

1/13/2025 at 11:24:33 AM

Why does it matter if it's written in python or not? I want the best tooling, don't care how it's made.

by matsemann

1/13/2025 at 5:17:56 PM

You are free to evaluate tooling by your own standards.

But it commonly comes across that people think it can't be written in Python if it's to have XYZ features, and by and large they're wrong, and I'm trying to point that out. In particular, people commonly seem to think that e.g. Pip needs to be in the same environment to work, and that's just not true. There's a system in place that defaults to copying Pip into every environment so that you can `python -m pip`, but this is wasteful and unnecessary. (Pip is designed to run under the install environment's Python, but this is a hacky implementation detail. It really just needs to know the destination paths and the target Python version.)

It also happens that I care about disk footprint quite a bit more than most people. Maybe because I still remember the computers I grew up with.

by zahlman

1/12/2025 at 9:01:51 PM

If you switch to uv, you’ll have fewer excuses to take coffee breaks while waiting for pip to do its thing. :)

by amluto

1/12/2025 at 9:01:13 PM

Pip only has requirements.txt and doesn't have lockfiles, so you can't guarantee that the bugs you're seeing on your system are the same as the bugs on your production system.

by mplewis

1/12/2025 at 9:06:15 PM

I’ve always worked around that by having a requirements.base.txt and a requirements.txt for the locked versions. Obviously pip doesn’t do that for you but it’s not hard to manage yourself.

Having said that, I’m going to give uv a shot because I hear so many good things about it.

by aidos

1/12/2025 at 11:09:15 PM

With pip the best practice is to have a requirements.txt with direct requirements (strictly or loosely pinned), and a separate constraints.txt file [1] with strictly pinned versions of all direct- and sub-dependencies (basically the output of `pip freeze`). The latter works like a lock file.
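
A minimal sketch of that workflow:

  $ pip freeze > constraints.txt
  $ pip install -r requirements.txt -c constraints.txt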

[1] https://pip.pypa.io/en/stable/user_guide/#constraints-files

by selcuka

1/13/2025 at 5:57:29 AM

For direct dependencies you're better off using the `pyproject.toml` (and you can plausibly use it to pin everything if you're developing an application). It's project metadata that you'll need anyway for building your project, and the "editable wheel" hack allows Pip to use that information to set up an environment for your code (via `pip install -e .`).

by zahlman

1/12/2025 at 9:12:04 PM

I’m grouchy because I finally got religion on poetry a few years ago, but the hype on uv is good enough that I’ll have to give it a shot.

by mikepurvis

1/12/2025 at 9:40:29 PM

I freaking love Poetry. It was a huge breath of fresh air after years of pip and a short detour with Pipenv. If uv stopped existing I’d go back to Poetry.

But having tasted the sweet nectar of uv goodness, I’m onboard the bandwagon.

by kstrauser

1/13/2025 at 12:38:01 AM

This works until you need to upgrade something: pip might upgrade to a broken set of dependencies. Or if you run on a different OS and the dependencies are different there (because of env markers), your requirements file won't capture that. There are a lot of gotchas that pip can't fix.

by remram

1/13/2025 at 6:03:12 AM

> pip might upgrade to a broken set of dependencies.

I'm only aware of examples where it's the fault of the packages - i.e. they specify dependency version ranges that don't actually work for them (or stop working for them when a new version of the dependency is released). No tool can do anything about that on the user's end.

> Or if you run on a different OS and the dependencies are different there (because of env markers), your requirements file won't capture that. There are a lot of gotchas that pip can't fix.

The requirements.txt format is literally just command-line arguments to Pip, which means you can in fact specify the env markers you need there. They're part of the https://peps.python.org/pep-0508/ syntax which you can use on the Pip command line. Demo:

  $ pip install 'numpy;python_version<="2.7"'
  Ignoring numpy: markers 'python_version <= "2.7"' don't match your environment
> There are a lot of gotchas that pip can't fix.

There are a lot of serious problems with Pip - I just don't think these are among them.

by zahlman

1/13/2025 at 2:07:02 PM

You can specify markers in the requirements file you write, not in the frozen requirements from 'pip freeze'. Because it's just a list of what's installed on your machine.

by remram

1/13/2025 at 5:12:19 PM

Running 'pip freeze' creates a plain text file. You can edit it to contain anything that would have been in "the requirements file you write". "Your requirements file" may or may not capture what it needs to, depending on how you created it. But Pip supports it. (And so does the `pyproject.toml` specification.)
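
For instance, a frozen line can be hand-edited to carry a marker (package and versions here are purely illustrative):

  numpy==1.26.4; sys_platform != "win32"
  numpy==1.26.3; sys_platform == "win32"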

by zahlman

1/13/2025 at 11:43:26 PM

You will need another tool to write a lock file that actually locks dependencies for more environments than your own. I don't know what you're trying to say. Pip does not support writing it.

Sure, I guess if you have one Pip will "support" reading it.

by remram

1/12/2025 at 9:46:10 PM

The requirements.txt file is the lockfile. Anyways, this whole obsession with locked deps or "lockfiles" is such an anti-pattern, I have no idea why we went there as an industry. Probably as a result of some of the newer stuff that is classified as "hipster-tech" such as docker and javascript.

by zo1

1/12/2025 at 10:13:07 PM

Just because you don't understand it, it's ok to call it an "anti-pattern"?

Reproducibility is important in many contexts, especially CI, which is why in Node.js world you literally do "npm ci" that installs exact versions for you.

If you haven't found it necessary, it's because you haven't run into situations where not doing this causes trouble, like a lot of trouble.

by n144q

1/12/2025 at 10:34:21 PM

Just because someone has a different perspective than you doesn't mean they don't "understand".

Lockfiles are an anti-pattern if you're developing a library rather than an application, because you can't push your transitive requirements onto the users of your library.

by driggs

1/12/2025 at 10:56:14 PM

If you're developing a library, and you have a requirement for what's normally a transitive dependency, it should be specified as a top-level dependency.

by Uvix

1/12/2025 at 11:37:57 PM

The point is that if I'm writing a library and I specify `requests == 1.2.3`, then what are you going to do in your application if you need both my library and `requests == 1.2.4`?

This is why libraries should not use lockfiles, they should be written to safely use as wide a range of dependencies' versions as possible.

It's the developers of an application who should use a lockfile to lock transitive dependencies.

by driggs

1/13/2025 at 1:58:09 AM

The lock file is for developers of the library, not consumers. Consumers just use the library’s dependency specification and then resolve their own dependency closure and then generate a lock file for that. If you, as a library developer, want to test against multiple versions of your dependencies, there are other tools for that. It doesn’t make lock files a bad idea in general.

by phinnaeus

1/13/2025 at 6:16:52 AM

As another library developer, of course I want to test against multiple versions. Or more accurately, I don't want to prevent my users from using different versions prematurely. My default expectation is that my code will work with a wide range of those versions, and if it doesn't I'll know - because I like to pay attention to other libraries' deprecations, just as I'd hope for my users to pay attention to mine.

Lockfiles aren't helpful to me here because the entire point is not to be dependent upon specific versions. I actively want to go through the cycle of updating my development environment on a whim, finding that everything breaks, doing the research etc. - because that's how I find out what my version requirements actually are, so that I can properly record them in my own project metadata. And if it turns out that my requirements are narrow, that's a cue to rethink how I use the dependency, so that I can broaden them.

If I had a working environment and didn't want to risk breaking it right at the moment, I could just not upgrade it.

If my requirements were complex enough to motivate explicitly testing against a matrix of dependency versions, using one of those "other tools", I'd do that instead. But neither way do I see any real gain, as a library developer, from a lock file.
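
(If I did go that route, the setup is small - a minimal sketch with tox, one of those "other tools", assuming pytest-based tests:)

  [tox]
  envlist = py310, py311, py312, py313

  [testenv]
  deps = pytest
  commands = pytest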

by zahlman

1/13/2025 at 7:28:40 AM

>If I had a working environment and didn't want to risk breaking it right at the moment, I could just not upgrade it.

The point of a lockfile is to only upgrade when you want to upgrade. I hope you understand that.

by imtringued

1/13/2025 at 9:20:56 AM

Why do I need a special file in order to not do something?

by zahlman

1/13/2025 at 12:02:37 AM

That’s not the perspective that OP was sharing, though.

by orf

1/13/2025 at 12:42:26 AM

You literally phrased it as "I have no idea why". You can't be upset if someone feels you don't understand why.

by remram

1/13/2025 at 2:20:01 AM

"I have no idea why" the industry went there. One can understand a technology or a design pattern yet think it's completely idiotic. (low-hanging fruit: JavaScript, containers, etc.)

by Pannoniae

1/12/2025 at 10:58:22 PM

I’m pretty sure it was a sarcasm.

by wiseowise

1/12/2025 at 11:54:37 PM

"pip freeze" generates a lockfile.

by polski-g

1/12/2025 at 11:59:37 PM

No, that generates a list of currently installed packages.

That’s very much not a lock file, even if it is possible to abuse it as such.

by orf

1/13/2025 at 6:07:35 AM

A list of currently installed packages in the current environment, with their exact versions. This is only the actually needed packages, with their transitive dependencies, unless you've left something behind from earlier in development. If you're keeping abstract dependencies up to date in `pyproject.toml` (which you need to do anyway to build and release the project), you can straightforwardly re-create the environment from that list and freeze whatever solution you get (after testing).
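
A minimal sketch of that workflow (the test command is just an example):

  $ python -m venv .venv && source .venv/bin/activate
  $ pip install -e .                # abstract dependencies come from pyproject.toml
  $ pytest                          # check that the resolved versions actually work
  $ pip freeze --exclude-editable > requirements.txt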

by zahlman

1/13/2025 at 9:33:49 AM

Doesn’t account for differences in platforms or Python versions, and doesn’t contain resolved dependency hashes.

So it’s a “lockfile” in the strictest, most useless definition: only works on the exact same Python version, on my machine, assuming no dependencies have published new packages.

by orf

1/13/2025 at 10:12:02 AM

Look, I'm not trying to sell this as a full solution - I'm just trying to establish that a lot of people really don't need a full solution.

>only works on the exact same Python version

It works on any Python version that all of the dependencies work on. But also it can be worked around with environment markers, if you really can support multiple Python versions but need a different set of dependencies for each.

In practical cases you don't need anything like a full (python-version x dependency) version matrix. For example, many projects want to use `tomllib` from the standard library in Python 3.11, but don't want to drop support for earlier Python because everything else still works fine with the same dependency packages for all currently supported Python versions. So they follow the steps in the tomli README (https://github.com/hukkin/tomli?tab=readme-ov-file#building-...).
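
Roughly, the pattern from that README is a conditional dependency plus a small import shim (a sketch - see the README for the exact form):

  # pyproject.toml
  dependencies = ["tomli >= 1.1.0 ; python_version < '3.11'"]

  # in the code
  import sys
  if sys.version_info >= (3, 11):
      import tomllib
  else:
      import tomli as tomllib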

>on my machine

(Elsewhere in the thread, people were trying to sell me on lock files for library development, to use specifically on my machine while releasing code that doesn't pin the dependencies.)

If my code works with a given version of a dependency on my machine, with the wheel pre-built for my machine, there is no good reason why my code wouldn't work on your machine with the same version of the dependency, with the analogous wheel - assuming it exists in the first place. It was built from the same codebase. (If not, you're stuck building from source, or may be completely out of luck. A lockfile can't fix that; you can't specify a build artifact that doesn't exist.)

This is also only relevant for projects that include non-Python code that requires a build step, of course.

>assuming no dependencies have published new packages.

PyPI doesn't allow you to replace the package for the same version. That's why there are packages up there with `.post0` etc. suffixes on their version numbers. But yes, there are users who require this sort of thing, which is why PEP 751 is in the works.

by zahlman

1/13/2025 at 12:40:28 PM

So many misunderstandings here :/ I can’t muster the energy to correct them past these two obvious ones

> It works on any Python version that all of the dependencies work on

No, it doesn’t. It’s not a lockfile: it’s a snapshot of the dependencies you have installed.

The dependencies you have installed depend on the Python version and your OS. The obvious case would be requiring a Linux-only dependency on… Linux, or a package only required on Python <=3.10 while you’re on 3.11.

> PyPI doesn't allow you to replace the package for the same version

Yes and no. You can continue to upload new wheels (or a sdist) long after a package version is initially released.

by orf

1/13/2025 at 5:08:09 PM

>So many misunderstandings here :/

I've spent most of the last two years making myself an expert on the topic of Python packaging. You can see this through the rest of the thread.

>No, it doesn’t. It’s not a lockfile: it’s a snapshot of the dependencies you have installed.

Yes, it does. It's a snapshot of the dependencies that you have installed. For each of those dependencies, there is some set of Python versions it supports. Collectively, the packages will work on the intersection of those sets of Python versions. (Because, for those Python versions, it will be possible to obtain working copies of each dependency at the specified version number.)

Which is what I said.

> The dependencies you have installed depend on the Python version and your OS. The obvious case would be requiring a Linux-only dependency on… Linux, or a package only required on Python <=3.10 while you’re on 3.11.

A huge number of packages are pure Python, work on a wide range of Python versions, and have no OS dependency. In general, packages may have such restrictions, but do not necessarily. I know this because I've seen my own code working on a wide range of Python versions without making any particular effort to ensure that. It's policy for many popular packages to ensure they support all Python versions currently supported by the core Python dev team.

Looking beyond pure Python - if I depend on `numpy==2.2.1` (the most recent version at time of writing), that supports Python 3.10 through 3.13. As long as my other dependencies (and the code itself) don't impose further restrictions, the package will install on any of those Python versions. If you install my project on a different operating system, you may get a different wheel for version 2.2.1 of NumPy (the one that's appropriate for your system), but the code will still work. Because I tested it with version 2.2.1 of NumPy on my machine, and version 2.2.1 of Numpy on your machine (compiled for your machine) provides the same interface to my Python code, with the same semantics.

I'm not providing you with the wheel, so it doesn't matter that the wheel I install wouldn't work for you. I'm providing you(r copy of Pip) with the package name and version number; Pip takes care of the rest.

>You can continue to upload new wheels (or a sdist) long after a package version is initially released.

Sure, but that doesn't harm compatibility. In fact, I would be doing it specifically to improve compatibility. It wouldn't change what Pip chooses for your system, unless it's a better match for your system than previously available.

by zahlman

1/13/2025 at 5:17:40 PM

Holy hell dude, you don’t need to write a novel for every reply. It’s not a lockfile because it’s a snapshot of what you have installed. End of.

It doesn’t handle environment markers nor is it reproducible. Given any non-trivial set of dependencies and/or more than 1 platform, it will lead to confusing issues.

Those confusing issues are the reason for lock files to exist, and the reason they are not just “the output of pip freeze”.

But you know this, given your two years of extensive expert study. Which I see very little evidence of.

by orf

1/13/2025 at 6:03:30 PM

>It’s not a lockfile because it’s a snapshot of what you have installed.

I didn't say it was. I said that it solves the problems that many people mistakenly think they need a lockfile for.

(To be clear: did you notice that I am not the person who originally said "'pip freeze' generates a lockfile."?)

>It doesn’t handle environment markers nor is it reproducible.

You can write environment markers in it (of course you won't get them from `pip freeze`) and Pip will respect them. And there are plenty of cases where no environment markers are applicable anyway.

It's perfectly reproducible insofar as you get the exact specified version of every dependency, including transitive dependencies.

>Given any non-trivial set of dependencies and/or more than 1 platform, it will lead to confusing issues.

Given more than 1 platform, with differences that actually matter (i.e. not pure-Python dependencies), you cannot use a lockfile, unless you specify to build everything from source. Because otherwise a lockfile would specify wheels as exact files with their hashes that were pre-built for one platform and will not work on the others.

Anyway, feel free to show a minimal reproducible example of the confusion you have in mind.

by zahlman

1/13/2025 at 6:27:36 PM

> Given more than 1 platform, with differences that actually matter (i.e. not pure-Python dependencies), you cannot use a lockfile, unless you specify to build everything from source. Because otherwise a lockfile would specify wheels as exact files with their hashes that were pre-built for one platform and will not work on the others.

What is more likely:

1. Using a lockfile means you cannot use wheels and have to build from source

2. You don’t know what you’re talking about

(When deciding, keep in mind that every single lockfile consuming and producing tool works fine with wheels)

by orf

1/13/2025 at 2:57:23 AM

Pip is sort of broken here because it encourages confusion between requirements and lock files. In other languages with package managers, you generally specify your requirements with ranges and get a lock file with exact versions of those and any transitive dependencies, letting you easily recreate a known working environment. The only way to do that in pip is to make a *new* venv, install, then pip freeze. I think the pip-tools package is supposed to help, but it's a separate tool (functionality which uv also includes). Also, putting stuff in pyproject.toml feels more solid than requirements files: it allows options to be set on requirements (like installing one package only from your company's private python package index mirror while installing the others from the global python package index), and it allows dev dependencies and other optional dependency groups without multiple requirements files and having to update locks on each of those files.

It also automatically recreates venvs if you delete them. And it automatically updates packages when you run something with `uv run file.py` (useful when somebody may have updated the requirements in git). It also lets you install self-contained python tools (installed in a virtualenv and linked to ~/.local/bin, which is added to your path), replacing pipx. And it installs self-contained python builds, letting you more easily pick a python version and specify it in a .python-version file for your project (replacing pyenv, and usually much nicer because pyenv compiles them locally).

Uv also makes it easier to explore - say, starting an ipython shell with 2 extra libraries:

  uv run --with ipython --with colorful --with https ipython

It caches downloads. Of course the HTTP download itself isn't faster, but they're exploring things to speed that part up, and since it's written in rust, local stuff (like deleting and recreating a venv with cached packages) tends to be blazing fast.
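
A rough sketch of that workflow (project and package names are just examples):

  $ uv init myproj && cd myproj   # scaffolds pyproject.toml
  $ uv python pin 3.12            # writes .python-version
  $ uv add requests               # resolves and updates uv.lock
  $ uv run main.py                # whatever script you have; creates/syncs the .venv automatically
  $ uv tool install ruff          # self-contained tool, like pipx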

by rat87

1/12/2025 at 9:05:19 PM

I am not a python developer, but sometimes I use python projects. This puts me in a position where I need to get stuff working while knowing almost nothing about how python package management works.

Also I don’t recognise errors and I don’t know which python versions generally work well with what.

I've had it happen so often with pip that I'd have something set up just fine. Let's say some stable diffusion ui. Then some other month I want to experiment with something like airbyte. Can't get it working at all. Then some days later I think, let's generate an image. Only to find out that with pip installing all sorts of stuff for airbyte, I've messed up my stable diffusion install somehow.

Uv clicked right away for me and I don’t have any of these issues.

Was I using pip and asdf incorrectly before? Probably. Was it worth learning how to do it properly in the previous way? Nope. So uv is really great for me.

by mosselman

1/12/2025 at 10:27:30 PM

This is not just a pip problem. I had the problem with anaconda a few years ago where upgrading the built-in editor (spyder?) pulled versions of packages which broke my ML code, or made dependencies impossible to reconcile. It was a mess, wasting hours of time. Since then I use one pip venv for each project and just never update dependencies.

by hyeonwho4

1/13/2025 at 5:48:03 AM

Spyder isn't built-in; IDLE comes with Python (unless you get it via Debian, at least), but is not separately upgradable (as the underlying `idlelib` is part of the standard library).

If upgrading Spyder broke your environment, that's presumably because you were using the same environment that Spyder itself was in. (Spyder is also implemented in Python, as the name suggests.) However, IDEs for Python also like to try to do environment management for you (which may conflict with other tools you want to use specifically for the purpose). That's one of the reasons I just edit my code in Vim.

If updating dependencies breaks your code, it's ultimately the fault of the dependency (and their maintainers will in turn blame you for not paying attention to their deprecation warnings).

by zahlman

1/13/2025 at 7:32:12 PM

Thanks. I understand this a lot more now that I've learned about venvs, and I'm between VScode and emacs for editing. No longer would I install an editor which depends on the same environment as the code I want to run.

As for Spyder, it is included in the default Windows install of Anaconda (and linked to by the default Anaconda Navigator). As a new user doing package management via the GUI, it was not clear at all that Spyder was sharing dependencies with my project until things started breaking.

Anaconda was also half-baked in other ways: it broke if the Windows username contains UTF-8 characters, so I ended up creating a new Windows user just for that ML work. PITA.

by hyeonwho4

1/13/2025 at 6:15:59 AM

You're all over the thread defending the standard python tools, which is fine, it works for you. But the number of times you've had to write that something is natively supported already or that people are just using it wrong speaks volumes about why people prefer uv: it just works without having to learn loads of stuff.

by matsemann

1/12/2025 at 10:35:29 PM

My life got a lot easier since I adopted the habit of making a shell script, using buildah and podman, that wrapped every python, rust, or golang project I wanted to dabble with.

It's so simple!

Create a image with the dependencies, then `podman run` it.

by loxias

1/13/2025 at 1:35:15 PM

I'm fairly minimalist when it comes to tooling: venv, pip, and pip-tools. I've started to use uv recently because it resolves packages significantly faster than pip/pip-tools. It will generate a "requirements.txt" with 30 packages in a few seconds rather than a minute or two.

by cpburns2009

1/12/2025 at 9:04:11 PM

Well, for one you can't actually package or add a local requirement (for example, a vendored package) to the usual pip requirements.txt (or with pyproject.toml, or any other standard way) afaik.

I saw a discourse reply that cited some sort of possible security issue, but that was basically it, and that means the only way to get that functionality is to not use pip. It's really not a lot of major stuff, just a lot of little paper cuts that make it a lot easier to just use something else once your project gets to a certain size.

by mardifoufs

1/12/2025 at 9:45:25 PM

Sure you can.

It's in their example for how to use requirements.txt: https://pip.pypa.io/en/stable/reference/requirements-file-fo...

Maybe there's some concrete example you have in mind though?

by BeefWellington

1/12/2025 at 11:23:21 PM

I don't think so, though maybe I didn't explain myself correctly. You can link to a relative package wheel I think, but not to a package repo. So if you have a repo, with your main package in ./src, and you vendor or need a package from another subfolder (let's say ./vendored/freetype) , you can't actually do it in a way that won't break the moment you share your package. You can't put ./vendored/freetype in your requirements.txt, it just fails.

That means you either need to use pypi or do an extremely messy hack that involves adding the vendored package as a sub package to your main source, and then do some importlib black magic to make sure that everything uses said package.

https://github.com/pypa/pip/issues/6658

https://discuss.python.org/t/what-is-the-correct-interpretat...

by mardifoufs

1/13/2025 at 2:22:40 AM

In this scenario, reading between the lines, the vendor is not providing a public / published package but does provide the source as like a tarball?

I have yet to run into that particular case where the vendor didn't supply their own repo in favour of just providing the source directly. However I do use what are essentially vendor-supplied packages (distant teams in the org) and in those cases I just point at their GitLab/GitHub repo directly. Even for some components within my own team we do it this way.

by BeefWellington

1/13/2025 at 4:30:50 AM

It's more for either monorepos or in my case, to fix packages that have bugs but that I can't fix upstream.

So for me, in my specific case, the freetype-py repo has a rather big issue with Unicode paths (it will crash the app if the path is in Unicode).

There's a PR but it hasn't and probably won't get merged for dubious reasons.

The easy choice, the one that actually is the most viable, is to pull the repo with the patch applied, temporarily add it to my ./vendored folder and just ideally change the requirements.txt with no further changes (or need to create a new pypi package). But it's basically impossible since I just can't use relative paths like that.

Again it's rather niche but that's just one of the many problems I keep encountering. packaging anything with CUDA is still far worse, for example.

by mardifoufs

1/13/2025 at 6:23:17 AM

>The easy choice, the one that actually is the most viable, is to pull the repo with the patch applied, temporarily add it to my ./vendored folder and just ideally change the requirements.txt with no further changes (or need to create a new pypi package). But it's basically impossible since I just can't use relative paths like that.

You can use the repo's setup to build a wheel, then tell pip to install from that wheel directly (in requirements.txt, give the actual path/name of the wheel file instead of an abstract dependency name). You need a build frontend for this - `pip wheel` will work and that's more or less why it exists; but it's not really what Pip is designed for overall - https://build.pypa.io/en/stable/ is the vanilla offering.
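
A minimal sketch of that approach (the wheel filename below is hypothetical - use whatever actually gets built):

  $ pip wheel ./vendored/freetype -w ./wheels --no-deps
  # then, in requirements.txt, point at the built file instead of a name:
  ./wheels/freetype_py-2.5.1-py3-none-any.whl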

by zahlman

1/12/2025 at 9:03:20 PM

Yeah, it unifies the whole env experience with the package installation experience. No more forgetting to activate virtualenv first. No more pip installing into the wrong virtual env or accidentally borrowing from the system packages. It's way easier to specify which version of python to use. Everything is version controlled including python version and variant like cpython, pypy, etc. It's also REALLY REALLY fast.

by markerz

1/12/2025 at 9:53:01 PM

Performance and correctness mostly.

by benreesman

1/13/2025 at 4:42:51 AM

I was in your boat too. Been using Python since 2000 and pretty satisfied with venv and pip.

However, the speed alone is reason enough to switch. Try it once and you will be sold.

by __mharrison__

1/12/2025 at 9:03:45 PM

Also you can set the python version for that project. It will download whatever version you need and just use it.

by 2wrist

1/12/2025 at 9:18:00 PM

in my view, depending on your workflow you might have been missing out on pyenv in the past but not really if you feel comfortable self-managing your venvs.

now though, yes unequivocally you are missing out.

by whimsicalism

1/13/2025 at 7:24:32 PM

Yeah, I switched from pip to uv. uv seems like it's almost the perfect solution for me.

it does virtualenv, it does pyenv, it does pip, so all that's managed in one place.

it's much faster than pip.

it's like 80% of my workflow now.

by ayjay_t

1/12/2025 at 9:41:57 PM

Much of the Python ecosystem blatantly violates semantic versioning. Most new tooling is designed to work around the bugs introduced by this.

by o11c

1/12/2025 at 10:08:00 PM

To be fair, Python itself doesn’t follow SemVer. Not in a “they break things they shouldn’t,” but in a “they never claim to be using SemVer.”

by sgarland

1/13/2025 at 6:26:27 AM

Relevant: https://iscinumpy.dev/post/bound-version-constraints/ Semver is hard; you never know what will break at least one of your users (see Hyrum's law), but on the other hand, clear backwards-compatibility breaks will often not affect a large fraction of users - if they preemptively declare that they won't support your next version, they may prevent Pip from finding a set of versions that would actually work just fine.

by zahlman

1/13/2025 at 12:31:24 AM

Cool story bro.

I've used pip, pyenv, poetry, all are broken in one way or another, and have blind spots they don't serve.

If your needs are simple (not mixing Python versions, simple dependencies, not packaging, etc) you can do it with pip, or even with tarballs and make install.

by coldtea

1/12/2025 at 9:02:04 PM

Pip doesn't resolve dependencies for you. On small projects that can be ok, but if you're working on something medium to large, or you're working on it with other people you can quickly get yourself into a sticky situation where your environment isn't easily reproducible.

Using uv means your project will have well defined dependencies.

by remus

1/12/2025 at 9:06:46 PM

Oh wow it doesn’t? What DOES it do then?

As I commented here just now I never got pip. This explains it.

by mosselman

1/12/2025 at 9:48:04 PM

The guy doesn't know what he's talking about as pip certainly has dependency resolution. Rather get your python or tech info from a non-flame-war infested thread full of anti-pip and anti-python folk.

by zo1

1/12/2025 at 8:57:34 PM

What is the deal with uv's ownership policy? I heard it might be VC backed. To my mind, that means killing pip and finding some kind of subscription revenue source which makes me uneasy.

The only way to justify VC money is a plot to take over the ecosystem and then profit off of a dominant position. (e.g. the Uber model)

I've heard a little bit about UV's technical achievements, which are impressive, but technical progress isn't the only metric.

by tehjoker

1/13/2025 at 12:30:31 AM

It’s dual MIT and Apache licensed. Worst case, if there’s a rug pull, fork it.

by feznyng

1/13/2025 at 3:13:49 AM

Is that the entire story? If so the VCs are pretty dumb. If they kill pip, that means the people who were maintaining it disperse and forking it won't restore the ecosystem that was there before.

by tehjoker

1/13/2025 at 8:39:04 PM

Yarn and pnpm didn’t kill npm. I don’t see how pip could ever be fully supplanted by uv. And if it is, that would probably mean the python foundation is stewarding it.

by feznyng

1/12/2025 at 10:44:09 PM

This:

> I haven't felt like it's a minor improvement on what I'm using

means that this:

> I'd love if we standardized on it as a community as the de facto default

…probably shouldn’t happen. The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.

It would be like replacing the python repl with the current version of ipython. I’d say the same thing, that it isn’t a minor improvement. While I almost always use ipython now, I’m glad it’s a separate thing.

by benatkin

1/13/2025 at 1:18:06 AM

> The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.

The problem is that in the python ecosystem there really isn't a default de facto standard yet at all. It's supposed to be pip, but enough people dislike pip that it's hard as a newcomer to know if it's actually the standard or not.

The nice thing about putting something like this on a pedestal is that maybe it could actually become a standard, even if the standard should be simple and get out of the way. Better to have a standard that's a bit over the top than no standard at all.

by lolinder

1/13/2025 at 1:30:27 AM

It feels even more standard than it used to, with python -m pip and python -m venv making it so it can be used with a virtualenv even if only python or python3 is in your path.

by benatkin

1/13/2025 at 6:39:48 AM

Just for the record, `venv` has been in since Python 3.3, although it didn't bootstrap Pip into the environment until 3.4 and wasn't the officially blessed way of doing things until 3.5 (which was still over 9 years ago).

Pip isn't part of the standard library; the standard library `ensurepip` (called from `venv`) includes a vendored wheel for Pip (https://github.com/python/cpython/tree/main/Lib/ensurepip/_b...), and imports Pip from within that wheel to bootstrap it. (Wheels are zip files, and Python natively knows how to import code from a zip file, so this just involves some `sys.path` manipulation. The overall process is a bit complex, but it's all in https://github.com/python/cpython/blob/main/Lib/ensurepip/__... .)

This is why you get prompted to upgrade Pip all the time in new virtual environments (unless you preempt that with `--upgrade-deps`). The core development team still wants to keep packaging at arms length. This also allows Pip to be developed, released and versioned independently.
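
(For example, the upgrade prompt can be skipped at creation time - this is standard `venv` behaviour, nothing tool-specific:)

  $ python -m venv --upgrade-deps .venv
  $ .venv/bin/python -m pip --version   # the vendored wheel has been bootstrapped into a normal install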

by zahlman

1/13/2025 at 1:38:13 AM

Oh, it's certainly more standard than it used to be, and maybe it's on the way to being fully standard. But it definitely hasn't arrived in the spot that npm, cargo, hex, bundler, and similar have in their respective ecosystems.

Npm is a pretty good example of what pip should be. Npm has had to compete with other package managers for a long time but has remained the standard simply because it actually has all the basic features that people expect out of a package manager. So other package managers can spin up using the npm registry providing slightly better experiences in certain ways, but npm covers the basics.

Pip really does not even cover the basics, hence the perpetual search for a better default.

by lolinder

1/13/2025 at 6:41:35 AM

Pip intentionally and by design does not cover package management. It covers package installation - which is more complex for Python than for other languages because of the expectation of being able to (try to) build code in other languages, at install time.

by zahlman

1/13/2025 at 5:14:49 AM

As it happens, the Python REPL was just replaced a few months ago!

…Not with IPython. But with an implementation written in Python instead of C, originating from the PyPy project, that supports fancier features like multi-line editing and syntax highlighting. See PEP 762.

I was apprehensive when I heard about it, but then I had the chance to use it and it was a very nice experience.

by comex

1/13/2025 at 6:15:07 AM

Ooh, nice. I think recently the times I've used the latest version of python it's been with ipython, so I didn't notice. Going to check it out! It might be easier to make a custom repl now.

by benatkin

1/12/2025 at 11:24:00 PM

> The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.

This to me is unachievable. Perfection is impossible. On the way there, if the community and developers coalesced around a single tool, then maybe we can start heading down the road to perfection.

by greazy

1/12/2025 at 11:31:43 PM

I mean, stays out of the way for simple uses.

When I first learned Python, typing python and seeing >>> and having it evaluate what I typed as if it appeared in a file was a good experience.

Now that I use python a lot, ipython is more out of the way to me than the built-in python repl is, because it lets me focus on what I'm working on rather than on the limitations of a tool.

by benatkin

1/12/2025 at 10:31:55 PM

+1 uv now also supports system installation of python with the --default --preview flags. This probably allows me to replace mise (rtx) and go uv full time for python development. With other languages, I go back to mise.
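
If I remember the invocation right, it's something along the lines of:

  uv python install 3.13 --default --preview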

by mrbonner

1/14/2025 at 10:35:21 AM

I use mise with uv for automatic activation of venvs when I cd into a directory containing one (alongside a mise.toml). Do you tackle this in some other manner?

by tomtom1337

1/14/2025 at 5:30:41 PM

Did you mean using mise to install uv? I never thought of this before. Do you have a reference somewhere of how it works? Thx

by mrbonner

1/13/2025 at 2:13:05 AM

(I will likely base a blog post in my packaging series off this comment later.)

What people seem to miss about Pip is that it's by design, not a package manager. It's a package installer, only. Of course it doesn't handle the environment setup for you; it's not intended for that. And of course it doesn't keep track of what you've installed, or make lock files, or update your `pyproject.toml`, or...

What it does do is offer a hideously complex set of options for installing everything under the sun, from everywhere under the sun. (And that complexity has led to long-standing, seemingly unfixable issues, and there are a lot of design decisions made that I think are questionable at best.)

Ideas like "welllll I use poetry but pyenv works or you could use conda too" are incoherent. They're for different purposes and different users, with varying bits of overlap. The reason people are unsatisfied is because any given tool might be missing one of the specific things they want, unless it's really all-in-one like Uv seems like it intends to be eventually.

But once you have a truly all-in-one tool, you notice how little of it you're using, and how big it is, and how useless it feels to have to put the tool name at the start of every command, and all the specific little things you don't like about its implementation of whatever individual parts. Not to mention the feeling of "vendor lock-in". Never mind that I didn't pay money for it; I still don't want to feel stuck with, say, your build back-end just because I'm using your lock-file updater.

In short, I don't want a "package manager".

I want a solid foundation (better than Pip) that handles installing (not managing) applications and packages, into either a specified virtual environment or a new one created (not managed, except to make it easy to determine the location, so other tools can manage it) for the purpose. In other words, something that fully covers the needs of users (making it possible to run the code), while providing only the bare minimum on top of that for developers - so that other developer tools can cooperate with that. And then I want specialized tools for all the individual things developers need to do.

The specialized tools I want for my own work all exist, and the tools others want mostly exist too. Twine uploads stuff to PyPI; `build` is a fine build front-end; Setuptools would do everything I need on the back-end (despite my annoyances with it). I don't need a lockfile-driven workflow and don't readily think in those terms. I use pytest from the command line for testing and I don't want a "package manager" nor "workflow tool" to wrap that for me. If I needed any wrapping there I could do a few lines of shell script myself. If anything, the problem with these tools is doing too much, rather than too little.

The base I want doesn't exist yet, so I've started making it myself. Pipx is a big step in the right direction, but it has some arbitrary limitations (I discuss these and some workarounds in my recent blog post https://zahlman.github.io/posts/2025/01/07/python-packaging-... ) and it's built on Pip so it inherits those faults and is that much bigger. Uv is even bigger still for the compiled binary, and I would only be using the installation parts.

by zahlman

1/13/2025 at 7:42:48 AM

Virtualenv should have never existed in the first place. So you claiming that UV or whatever tool is doing too much, sounds to me like you're arguing based on "traditionalist" or "conservative" reasons rather than doing any technical thinking here.

Node.js's replacement for virtualenv is literally just a folder named "node_modules". Meanwhile python has an entire tool with strange idiosyncrasies that you have to pay attention to, otherwise pip does the wrong thing by default.

It is as if python is pretending to be a special snowflake where installing libraries into a folder is this super hyper mega overcomplicated thing that necessitates a whole dedicated tool just to manage, when in reality in other programming languages nobody is really thinking about the fact that the libraries end up in their build folders. It just works.

So again you're pretending that this is such a big deal that it needs a whole other tool, when the problem in question is so trivial that another tool is adding mental overhead with regard to the microscopic problem at hand.

by imtringued

1/13/2025 at 6:29:20 PM

> Node.js's replacement for virtualenv is literally just a folder named "node_modules".

Node_modules doesn't support an isolated node interpreter distinct from what may be installed elsewhere on the machine. Third party tools are available that do that for node, but node_modules alone addresses a subset of the issues that venvs solve.

OTOH, it's arguably undesirable that there isn't a convenient way in Python to do just what node_modules does without the extra stuff that venvs do, because there are a lot of use cases where that kind of solution would be sufficient and lower overhead.

by dragonwriter

1/13/2025 at 9:56:17 AM

Uv doing things I'm not interested in has absolutely nothing to do with the design of virtualenvs. But virtualenvs are easy enough to work with; they absolutely don't "necessitate a whole dedicated tool just to manage" unless you count the `activate` script that they come with.

But also, Uv uses them anyway. Because that's the standard. And if the Python standard were to use a folder like node_modules, Uv would follow suit, and so would I. And Uv would still be doing package management and workflow tasks that I'm completely uninterested in, and presenting an "every command is prefixed with the tool suite name" UI that I don't like.

There was a proposal for Python to use a folder much like node_modules: see https://discuss.python.org/t/pep-582-python-local-packages-d... . It went through years of discussion and many problems with the idea were uncovered. One big issue is that installing a Python package can install more than just the importable code; the other stuff has to be put somewhere sensible.

>So again you're pretending that this is such a big deal that it needs a whole other tool

I make no such claim. The tool I want is not for managing virtual environments. It's for replacing Pip, and offering a modicum of convenience on top of that. Any time you install a package, no matter whether you use Pip or uv or anything else, you have to choose (even if implicitly) where it will be installed. Might as well offer the option of setting up a new location, as long as we have a system where setup is necessary.

by zahlman

1/13/2025 at 6:31:03 PM

> But virtualenvs are easy enough to work with; they absolutely don't "necessitate a whole dedicated tool just to manage" unless you count the `activate` script that they come with.

venv is a dedicated tool.

virtualenv is a different dedicated tool.

by dragonwriter

1/13/2025 at 7:25:25 AM

Do you feel that Npm, mix, cargo went the wrong way, doing too much? It seems like their respective communities _love_ the standard tooling and all that it does. Or is Python fundamentally different?

by botanical76

1/13/2025 at 9:39:11 AM

Python is fundamentally used in different ways, and in particular is much more often used in conjunction with code in other languages that has to be compiled separately and specially interfaced with. It also has a longer history, and a userbase that's interested in very different ways of using Python than the "development ecosystem" model where you explicitly create a project with the aim of adding it to the same package store where you got your dependencies from. Different users have wildly different needs, and tons of the things workflow tools like uv/Poetry/PDM/Hatch/Flit do are completely irrelevant to lots of them. Tons of users don't want to "manage dependencies"; they want to have a single environment containing every package that any of their projects ever uses (and a shoulder to cry on if a version conflict ever arises from that). Tons of users don't want to make a "project" and they especially don't want to set up their Python code as a separate, reusable thing, isolated from the data they're working on with it right now. Tons of users think they know better than some silly "Pip" tool about exactly where each of their precious .py files ought to go on your hard drive. Tons of developers want their program to look and feel like a stand-alone, independent application, that puts itself in C:\Program Files and doesn't expect users to know what Python is. People more imaginative than I could probably continue this train of thought for quite a bit.

For many of the individual tasks, there are small Unix-philosophy tools that work great. Why is typing `poetry publish` better than typing `twine upload`? (And it would be just `twine`, except that there's a `register` command that PyPI doesn't need in the first place, and a `check` command that's only to make sure that other tools did their job properly - and PyPI will do server-side checks anyway.) Why is typing `poetry run ...` better than activating the venv with the script it provided itself, and then doing the `...` part normally?
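
To be concrete, the small-tools workflow I have in mind is just (a sketch):

  $ python -m build                       # builds the sdist and wheel into ./dist
  $ twine upload dist/*                   # publishes them
  $ source .venv/bin/activate && pytest   # run the tests, no wrapper needed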

An all-in-one tool works when people agree on the scope of "all", and you can therefore offer just one of them and not get complaints.

by zahlman

1/12/2025 at 9:26:02 PM

Heck, you can get even cleaner than that by using uv’s support for PEP 723’s inline script dependencies:

  # /// script
  # requires-python = ">=3.12"
  # dependencies = [
  #     "pandas",
  # ]
  # ///
h/t https://simonwillison.net/2024/Dec/19/one-shot-python-tools/
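
For instance, a complete script using this might look like the following (contents purely illustrative) - save it as demo.py and run `uv run demo.py`, with no venv or install step needed:

  # /// script
  # requires-python = ">=3.12"
  # dependencies = [
  #     "pandas",
  # ]
  # ///
  import pandas as pd

  df = pd.DataFrame({"n": [1, 2, 3]})
  print(df.describe())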

by riwsky

1/14/2025 at 10:10:27 PM

I've started a repo with some of these scripts, the most recent one being my favorite: a wrapper for Microsoft AutoGen's very recent Magentic-1, a generalist LLM-Multi-Agent-System. It can use the python code, the CLI, a browser (Playwright) and the file system to complete tasks.

A simple example a came across is having to rename some files:

1. you just open the shell in the location you want

2. and run this command:

uv run https://raw.githubusercontent.com/SimonB97/MOS/main/AITaskRu... "check the contents of the .md files in the working dir and structure them in folders"

There's a link to Magentic-1 docs and further info in the repo: https://github.com/SimonB97/MOS/tree/main/AITaskRunner (plus two other simple scripts).

by sbene970

1/12/2025 at 10:09:07 PM

I don't understand how things like this get approved into PEPs.

by aeurielesn

1/12/2025 at 10:27:59 PM

Seems like a great way to write self documenting code which can be optionally used by your python runtime.

by Karupan

1/12/2025 at 10:13:00 PM

As in, you think this shouldn't be possible or you think it should be written differently?

by zanie

1/12/2025 at 10:38:38 PM

The PEP page is really good at explaining the status of the proposal, a summary of the discussion to date, and then links to the actual detailed discussion (in Discourse) about it:

https://peps.python.org/pep-0723/

by epistasis

1/12/2025 at 11:39:15 PM

I see this was accepted (I think?); is the implementation available in a released python version? I don't see an "as of" version on the pep page, nor do lite google searches reveal any official python docs of the feature.

by noitpmeder

1/13/2025 at 12:19:56 AM

It's not a Python-the-language feature; it's for packaging. So no language version is relevant. It's just there for any tool that wants to use it. uv, an IDE, or anything else that manages virtual environments would be the ones who implement it, independent of python versions.

by jonesetc

1/13/2025 at 3:23:57 AM

This is a specification for Python packaging, which is tooling separate from Python releases (for better or worse, IMHO worse but the BDFL disagrees). There's a box below the Table of Contents of the PEP that points here:

https://packaging.python.org/en/latest/specifications/inline...

by epistasis

1/13/2025 at 5:22:37 AM

It's helpful as a way to publish minimal reproductions of bugs and issues in bug reports (compared to "please clone my repo" which has so many layers of friction involved).

I would want distributed projects to do things properly, but as a way to shorthand a lot of futzing about? It's excellent

by rtpg

1/12/2025 at 11:04:54 PM

And people were laughing at PHP comments configuring the framework, right?

by misiek08

1/12/2025 at 11:54:49 PM

Python was always the late born twin brother of PHP with better hair and teeth, but the same eyes that peered straight into the depths of the abyss.

by throwup238

1/13/2025 at 12:57:17 PM

Python was first released in 1991, and PHP was first released in 1995.

by mkl

1/13/2025 at 1:31:05 AM

Why come types?

by franktankbank

1/13/2025 at 2:18:04 AM

That’s when each language reached sexual maturity but one went on to get a girlfriend and the other discovered internet porn.

by throwup238

1/13/2025 at 5:50:55 AM

I don't know which is which in this story.

by 8n4vidtmkvmk

1/13/2025 at 2:59:22 AM

with that expected use case of the uv script run command, it effectively makes those comments executable

python's wheels are falling off at an ever faster and faster rate

by blibble

1/13/2025 at 5:02:19 AM

Because of a feature that solves the problem of one off scripts being difficult the moment you need a 3rd party library?

A more Pythonic way of doing this might be __pyproject__ but that has the tiiiiny snag of needing to execute the file to figure out its deps. I would have loved if __name__ == "pyproject" but while neat and tidy it is admittedly super confusing for beginners, has a "react hooks" style gotcha where you can't use any deps in that block, and you can't use top level imports. The comment was really the easiest way.

by Spivak

1/12/2025 at 10:32:07 PM

I don't think this IS a PEP, I believe it is simply something the uv tool supports and as far as Python is concerned it is just a comment.

by linsomniac

1/12/2025 at 10:48:49 PM

https://peps.python.org/pep-0723/

by mkl

1/13/2025 at 1:32:34 PM

Thank you for the pointer, I had searched for it but couldn't find it. (edit: Glad I spent the downvotes to get educated :-)

by linsomniac

1/12/2025 at 10:49:29 PM

No, this is a language standard now (see PEP 723)

by zanie

1/13/2025 at 1:36:44 AM

Is it possible for my IDE (vscode) to support this? Currently my IDE screams at me for using unknown packages and I have no type hinting, intellisense, etc.

by shlomo_z

1/13/2025 at 10:08:23 AM

With your python plugin you should be able to choose .venv/bin/python as your interpreter after you've run `uv sync` and everything should resolve

by Zizizizz

1/15/2025 at 3:05:09 AM

But if I am using inline metadata to declare the dependencies, uv doesn't tell me where the venv is. And there is no `uv sync` command for single scripts as far as I can tell.

by shlomo_z

1/13/2025 at 4:14:56 PM

So it doesn't work with ad-hoc venvs, does it? Might still be valuable for simply running scripts, but I'm not sure venv-less works for dev

by randomlurking

1/13/2025 at 2:14:02 AM

So it's like a shebang for dependencies. Cool.

by __MatrixMan__

1/12/2025 at 9:04:44 PM

As a NodeJS developer it's still kind of shocking to me that Python still hasn't resolved this mess. Node isn't perfect, and dealing with different versions of Node is annoying, but at least there's none of this "worry about modifying global environment" stuff.

by stevage

1/12/2025 at 9:11:00 PM

Caveat: I'm a node outsider, only forced to interact with it

But there are a shocking number of install instructions that offer $(npm i -g) and if one is using Homebrew or nvm or a similar "user writable" node distribution, it won't prompt for sudo password and will cheerfully mangle the "origin" node_modules

So, it's the same story as with python: yes, but only if the user is disciplined

Now ruby drives me fucking bananas because it doesn't seem to have either concept: virtualenvs nor ./ruby_modules

by mdaniel

1/12/2025 at 11:00:16 PM

It's worth noting that Node allows two packages to have the same dependency at different versions, which means that `npm i -g` is typically a lot safer than a global `pip install`, because each package will essentially create its own dependency tree, isolated from other packages. In practice, NPM has a deduplication process that makes this more complicated, and so you can run into issues (although I believe other package managers can handle this better), but I rarely run into issues with this.

That said, I agree that `npm i -g` is a poor system package manager, and you should typically be using Homebrew or whatever package manager makes the most sense on your system. On the other hand, `npx` is a good alternative if you just want to run a command quickly to try it out or something like that.

by MrJohz

1/13/2025 at 7:20:16 AM

>It's worth noting that Node allows two packages to have the same dependency at different versions

Yes. It does this because JavaScript enables it - the default import syntax uses a file path.

Python's default import syntax uses symbolic names. That allows you to do fun things like split a package across the filesystem, import from a zip file, and write custom importers, but doesn't offer a clean way to specify the version you want to import. So Pip doesn't try to install multiple versions either (which saves it the hassle of trying to namespace them). You could set up a system to make it work, but it'd be difficult and incredibly ugly.

Some other language ecosystems don't have this problem because the import is resolved at compile time instead.

by zahlman

1/15/2025 at 4:06:33 PM

This is incorrect on several points.

Firstly, Node's `require` syntax predates the modern JS `import` syntax. It is related to some attempts at the time to create tools that could act like a module system for the browser (in particular RequireJS), but it is distinct in that a lot of the rules about how modules would be resolved were specifically designed to make sense with NodeJS and the `node_modules` system.

Secondly, although path imports are used for local imports, Node's `require` and `import` syntax both use symbolic names to refer to third-party packages (as well as built-in packages). If you have a module called `lodash`, you would write something like `import "lodash"`, and Node will resolve the name "lodash" to the correct location. This behaviour is in principle the same as Python's — the only change is the resolution logic, which is Node-specific, and not set by the Javascript language at all.

The part of NodeJS that _does_ enable this behaviour is the scoped module installation. Conceptually, NPM installs modules in this structure:

    * index.js (your code goes here)
    * node_modules (top level third-party modules go here)
        * react
            * index.js (react source code)
            * node_modules (react's dependencies go here)
                * lodash/index.js
        * vuejs
            * index.js (vuejs source code)
            * node_modules (vue's dependencies go here)
                * lodash/index.js
        * lodash
            * index.js
Importantly, you can see that each dependency gets their own set of dependencies. When `node_modules/react/index.js` imports lodash, that will resolve to the copy of lodash installed within the react folder in the dependency tree. That way, React, VueJS, and the top-level package can all have their own, different versions of lodash.

Of course in practice, you usually don't want lots of versions of the same library floating around, so NPM also has a deduping system that attempts to combine compatible modules (e.g. if the version ranges for React, Vue, and Lodash overlap, a version will be chosen that works for all modules). In addition, there are various ways of removing excess copies of modules — by default, NPM flattens the tree in a way that allows multiple modules to "see" the same imported module, and tools like PNPM use symlinks to remove duplicated modules. And you mention custom importers in Python, but I believe Yarn uses them already in NodeJS to do the fun things you talk about like importing from zip files that are vendored in a repository.

All of the above would be possible in Python as well, without changing the syntax or the semantics of the `import` statement at all. However, it would probably break a lot of the ecosystem assumptions about where modules live and how they're packaged. And it's not necessarily required to fix the core Python packaging issues, so I don't think people in Python-land are investing much time in exploring this issue.

Basically, no, there is no fundamental reason why Python and Node have to have fundamentally different import systems. The two languages process imports in a very similar way, with the main difference being whether packages are imported in a nested way (Node) or as a flattened directory (Python).

by MrJohz

1/12/2025 at 11:30:53 PM

Because you don't need virtualenvs or ruby_modules. You can have however many versions of the same gem installed; it's simply referenced by a Gemfile, so for Ruby version X you are guaranteed one copy of gem version Y and no duplicates.

This whole installing the same dependencies a million times across different projects in Python and Node land is completely insane to me. Ruby has had the only sane package manager for years. Cargo too, but only because they copied Ruby.

Node has littered my computer with useless files. Python's venvs eat up a lot of space unnecessarily too.

by fny

1/13/2025 at 7:23:44 AM

In principle, venvs could hard-link the files from a common source, as long as the filesystem supports that. I'm planning to experiment with this for Paper. It's also possible to use .pth files (https://docs.python.org/3/library/site.html) to add additional folders to the current environment at startup. (I've heard some whispers that this causes a performance hit, but I haven't noticed. Python module imports are cached anyway.) Symlinks should work, too. (But I'm pretty sure Windows shortcuts would not. No idea about junctions.)
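
A .pth file itself is trivial: a plain-text file dropped into site-packages listing extra directories, one per line - e.g. a hypothetical `shared.pth` containing:

  /home/me/shared-packages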

by zahlman

1/13/2025 at 3:56:28 PM

I wish the ecosystem made heavier use of .zip packages, which would gravely help with the logistics of your hardlink plan, in addition to slimming down 300MB worth of source code. The bad news is that (AIUI) the code needs to be prepared for use from a zipped package, and thus retroactively just zipping them up will break things at runtime

Take for example:

  $ du -hs $HOMEBREW_PREFIX/Cellar/ansible/11.1.0_1/libexec/lib/python3.13/site-packages/* | gsort --human
  103M /usr/local/Cellar/ansible/11.1.0_1/libexec/lib/python3.13/site-packages/botocore
  226M /usr/local/Cellar/ansible/11.1.0_1/libexec/lib/python3.13/site-packages/ansible_collections

by mdaniel

1/13/2025 at 4:52:18 PM

Wheels are zip files with a different extension. And the Python runtime can import Python code from them - that's crucial to the Pip bootstrapping process, in fact.
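
(The zip-import trick itself is tiny - a sketch, with a hypothetical wheel filename:)

  import sys
  sys.path.insert(0, "pip-24.3.1-py3-none-any.whl")  # a wheel is a zip file
  import pip  # imported straight out of the archive, without unpacking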

But packages that stay zipped aren't a thing any more - because it's the installer that gets to choose how to install the package, and Pip just doesn't do that. Pip unpacks the wheel mainly because many packages are written with the assumption that the code will be unpacked. For example, you can't `open` a relative path to a data file that you include with your code, if the code remains in a zip file. Back in the day when eggs were still a common distribution format, it used to be common to set metadata to say that it's safe to leave the package zipped up. But it turned out to be rather difficult to actually be sure that was true.

The wheel is also allowed to contain a folder with files that are supposed to go into other specified directories (specific ones described by the `sysconfig` standard library https://docs.python.org/3/library/sysconfig.html ), and Pip may also read some "entry point" metadata and create executable wrappers, and leave additional metadata to say things like "this package was installed with Pip". The full installation process for wheels as designed is described at https://packaging.python.org/en/latest/specifications/binary... . (Especially see the #is-it-possible-to-import-python-code-directly-from-a-wheel-file section at the end.)

The unzipped package doesn't just take up extra space due to unzipping, but also because of the .pyc cache files. A recent Pip wheel is 1.8 MiB packed and 15.2 MiB unpacked on my system; that's about 5.8 apparent MiB .py, 6.5 MiB .pyc, 2.1 MiB from wasted space in 4KiB disk blocks, and [s].8 MiB from a couple hundred folders.[/s] Sorry, most of that last bit is actually stub executables that Pip uses on Windows to make its wrappers "real" executables, because Windows treats .exe files specially.

by zahlman

1/13/2025 at 11:44:18 AM

"They copied ruby" is a little unfair. From memory it was some of the same individuals.

by regularfry

1/13/2025 at 3:37:34 PM

You're right about this: apparently the same guy who wrote Bundler wrote Cargo.

by fny

1/12/2025 at 10:02:01 PM

Ruby has a number of solutions for this - rvm (the oldest, but less popular these days), rbenv (probably the most popular), chruby/gem_home (lightweight) or asdf (my personal choice as I can use the same tool for lots of languages). All of those tools install to locations that shouldn't need root.

by andrewmcdonough

1/12/2025 at 10:35:36 PM

Yes, I am aware of all of those, although I couldn't offhand tell anyone the difference in tradeoffs between them. But I consider having to install a fresh copy of the whole distribution a grave antipattern. I'm aware that nvm and pyenv default to it and I don't like that

I did notice how Homebrew sets env GEM_HOME=<Cellar>/libexec GEM_PATH=<Cellar>/libexec (e.g. <https://github.com/Homebrew/homebrew-core/blob/9f056db169d5f...>) but, similar to my node experience, since I am a ruby outsider I don't totally grok what isolation that provides

by mdaniel

1/13/2025 at 11:43:21 AM

Mainline ruby doesn't have them built in, but tools to support virtualenvs are around. They're pretty trivial to write: https://github.com/regularfry/gemsh/blob/master/bin/gemsh

As long as you're in the ruby-install/chruby ecosystem and managed to avoid the RVM mess then the tooling is so simple that it doesn't really get any attention. I've worked exclusively with virtualenvs in ruby for years.

by regularfry

1/13/2025 at 5:54:10 AM

FWIW, you can usually just drop the `-g` and it'll install into `node_modules/.bin` instead, so it stays local to your project. You can run it straight out of there (by typing the path) or do `npm run <pkg>` which I think temporarily modifies $PATH to make it work.

by 8n4vidtmkvmk

1/13/2025 at 5:58:09 AM

The `npx` command (which comes bundled with any nodejs install) is the way to do that these days.

by nicoburns

1/14/2025 at 5:44:28 AM

`npx` doesn't update package.json/package.lock though, right? So you might get a different version of the package once in a while. If it's an executable you depend on for your project, it makes sense to version it IMO.

by 8n4vidtmkvmk

1/14/2025 at 5:54:09 AM

You can do a (local) install using `npm install` and then execute the binary using `npx`. npx will also try to fetch the binary over the network if you don't have it installed, which is questionable behaviour in my opinion, but you can just cancel this if it starts doing it.

by nicoburns

1/12/2025 at 9:23:35 PM

Python has been cleaning up a number of really lethal problems like:

(i) wrongly configured character encodings (suppose you incorporated somebody else's library that does a "print" and the input data contains some invalid characters that wind up getting printed; that "print" could crash a model trainer script that runs for three days if error handling is set wrong and you couldn't change it when the script was running, at most you could make the script start another python with different command line arguments)

(ii) site-packages; all your data scientist has to do is

   pip install --user
the wrong package and they've trashed all of their virtualenvs, all of their condas, etc. Over time the defaults have changed so pythons aren't looking into the site-packages directories, but I wasted a long time figuring out why a team of data scientists couldn't get anything to work reliably

(iii) "python" built into Linux by default. People expected Python to "just work" but it doesn't "just work" when people start installing stuff with pip because you might be working on one thing that needs one package and another thing that needs another package and you could trash everything you're doing with python in the process of trying to fix it.

Unfortunately python has attracted a lot of sloppy programmers who think virtualenv is too much work and that it's totally normal for everything to be broken all the time. The average data scientist doesn't get excited when it crumbles and breaks, but you can't just call up some flakes to fix it. [1]

[1] https://www.youtube.com/watch?v=tiQPkfS2Q84

by PaulHoule

1/12/2025 at 10:50:49 PM

> Python has been cleaning up a number of really lethal problems like

I wish they would stick to semantic versioning tho.

I have used two projects that got stuck in incompatible changes in the 3.x Python.

That is a fatal problem for Python. If a change in a minor version makes things stop working, it is very hard to recommend the system. A lot of work has gone down the drain, by this Python user, trying to work around that

by worik

1/13/2025 at 5:26:57 AM

I assume these breaking changes are in the stdlib and not in the python interpreter (the language), right?

There was previous discussions about uncoupling the stdlib (python libraries) from the release and have them being released independently, but I can’t remember why that died off

by meitham

1/13/2025 at 7:50:07 AM

> I assume these breaking changes are in the stdlib and not in the python interpreter (the language), right?

That's the usual case, but it can definitely also happen because of the language - see https://stackoverflow.com/questions/51337939 .

> There was previous discussions about uncoupling the stdlib (python libraries) from the release and have them being released independently, but I can’t remember why that died off

This sort of thing is mainly a social problem.

by zahlman

1/13/2025 at 4:18:54 PM

> I assume these breaking changes are in the stdlib

Well yes, but sometimes the features provided by the stdlib can feel pretty 'core'. I remember that how dataclasses work, for example, changed and broke a lot of our code when we upgraded from 3.8 to 3.10.

There have also been breaking changes in the C API. We have one project at work that is stuck at 3.11 since a third party dependency won't build on 3.12 without a rewrite.

by dagw

1/12/2025 at 10:05:58 PM

I don’t exactly remember the situation but a user created a python module named error.py.

Then in their main code they imported said error.py, but unfortunately the numpy library also has an error.py. So the user was getting very funky behavior.

by whatever1

1/13/2025 at 7:32:54 AM

Yep, happens all the time with the standard library. Nowadays, third-party libraries have this issue much less, because they can use relative imports for everything except their dependencies. But the Python standard library, for historical reasons, isn't organized into a package, so it can't do that.

Here's a fun one (this was improved in 3.11, but some other names like `traceback.py` can still reproduce a similar problem):

  /tmp$ touch token.py
  /tmp$ py3.10
  Python 3.10.14 (main, Jun 24 2024, 03:37:47) [GCC 11.4.0] on linux
  Type "help", "copyright", "credits" or "license" for more information.
  >>> help()
  Traceback (most recent call last):
    File "<stdin>", line 1, in <module>
    File "/opt/python/standard/lib/python3.10/_sitebuiltins.py", line 102, in __call__
      import pydoc
    File "/opt/python/standard/lib/python3.10/pydoc.py", line 62, in <module>
      import inspect
    File "/opt/python/standard/lib/python3.10/inspect.py", line 43, in <module>
      import linecache
    File "/opt/python/standard/lib/python3.10/linecache.py", line 11, in <module>
      import tokenize
    File "/opt/python/standard/lib/python3.10/tokenize.py", line 36, in <module>
      from token import EXACT_TOKEN_TYPES
  ImportError: cannot import name 'EXACT_TOKEN_TYPES' from 'token' (/tmp/token.py)
Related Stack Overflow Q&A (featuring an answer from me): https://stackoverflow.com/questions/36250353

by zahlman

1/12/2025 at 10:11:34 PM

... it's tricky. In Java there's a cultural expectation that you name a package like

  package organization.dns.name.this.and.that;
but real scalability in a module system requires that somebody else packages things up as

  package this.and.that;
and you can make the system look at a particular wheel/jar/whatever and make it visible with a prefix you specify like

  package their.this.and.that;
Programmers seem to hate rigorous namespace systems though. My first year programming Java (before JDK 1.0) the web site that properly documented how to use Java packages was at NASA and you still had people writing Java classes that were in the default package.

by PaulHoule

1/12/2025 at 10:45:43 PM

But let's all be real here: the ability of __init__.py to do FUCKING ANYTHING IT WANTS is insanity made manifest

I am kind of iffy on golang's import (. "some/package/for/side-effects") but at least it cannot suddenly mutate GOPATH[0]="/home/jimmy/lol/u/fucked" as one seems to be able to do on the regular with python

I am acutely aware that is (programmer|package|organization|culture)-dependent but the very idea that one can do that drives us rigorous people stark-raving mad

by mdaniel

1/13/2025 at 7:45:48 AM

It took me a while to realize that you do mean __init__.py running when a package imports.

But, you know, top-level code (which can do anything) runs when you import any Python module (the first time), and Python code doesn't have to be in a package to get imported. (The standard library depends on this.)

by zahlman

1/12/2025 at 11:31:04 PM

> Programmers seem to hate rigorous namespace systems though

Pretty much a nothing burger in Rust, so I disagree that programmers necessarily hate the concept. Maybe others haven't done a good job with the UX?

by vlovich123

1/13/2025 at 3:58:51 PM

That. Forcing people to do it right from day one helps a lot. Also Rust attracts a programmer who is willing to accept some pain up front to save pain later, if your motto is "give me convenience or give me death" you might stick with C or Python.

If you give people an "easy way out" it is very hard to compel them to do it more rigorously. Look at the history of C++ namespaces as well as the non-acceptance of various "Modula" languages and Ada back in the 1980s.

by PaulHoule

1/12/2025 at 9:56:40 PM

Half the time something breaks in a javascript repo or project, every single damn javascript expert in the team/company tells me to troubleshoot using the below sequence, as if throwing spaghetti on a wall with no idea what's wrong.

Run npm install

Delete node_modules and wait 30 minutes because it takes forever to delete 500MB worth of 2 million files.

Do an npm install again (or yarn install or that third one that popped up recently?)

Uninstall/Upgrade npm (or is it Node? No wait, npx I think. Oh well, used to be node + npm, now it's something different.)

Then do steps 1 to 3 again, just in case.

Hmm, maybe it's the lockfile? Delete it, one of the juniors pushed their version to the repo without compiling maybe. (Someone forgot to add it to the gitignore file?)

Okay, do steps 1 to 3 again, that might have fixed it.

If you've gotten here, you are royally screwed and should try the next javascript expert, he might have seen your error before.

So no, I'm a bit snarky here, but the JS ecosystem is a clustermess of chaos and should rather fix its own stuff first. I have none of the above issues with Python, a proper IDE, and out-of-the-box pip.

by zo1

1/13/2025 at 2:01:26 AM

So you’re not experiencing exactly this with pip/etc? I hit this “just rebuild this 10GB venv” scenario like twice a day while learning ML. Maybe it’s just ML, but then regular node projects don’t have complex build-step / version-clash deps either.

by wruza

1/13/2025 at 6:47:23 AM

I think it's something unique to python's ML ecosystem, to be honest. There is a lot of up-in-the-air about how to handle models, binaries and all of that in a contained package, and that results in quite a few hand-rolled solutions, some of which encroach on the package manager's territory, plus of course drivers and windows.

I've worked on/with some fairly large/complex python projects, and they almost never have any packaging issues that aren't just obvious errors by users. Yes, every once in a while we have to be explicit about a dependency because some dependent project isn't very strict with their versioning policy and their API layers.

by zo1

1/13/2025 at 9:39:04 AM

I've not used python professionally for years - but I have had to do this maybe once in many years of usage. Seen it like once more in my team(s). A rounding error.

I've seen someone having to do this in node like once every month, no matter which year, no matter which project or company.

by wink

1/12/2025 at 10:22:11 PM

The pain is real. Most of the issues are navigable, but often take careful thought versus some canned recipe. npm or yarn in large projects can be a nightmare. starting with pnpm makes it a dream. Sometimes migrating to pnpm can be rough, because projects that work may rely on incorrect, transitive, undeclared deps actually resolving. Anyway, starting from pnpm generally resolves this sort of chaos.

Most package managers are developed.

Pnpm is engineered.

It’s one of the few projects I donate to on GitHub

by cdaringe

1/12/2025 at 11:13:43 PM

What kind of amateurs are you working with? I’m not a Node.js dev and even I know about npm ci command.

by wiseowise

1/12/2025 at 11:24:28 PM

Sounds like a tale from a decade ago, people now use things like pnpm and tsx.

by mirekrusin

1/13/2025 at 3:01:46 AM

Environment and dependency management in JS-land is even worse.

Similar problems with runtime version management (need to use nvm for sanity, using built-in OS package managers seems to consistently result in tears).

More package managers and interactions (corepack, npm, pnpm, yarn, bun).

Bad package interop (ESM vs CJS vs UMD).

More runtimes (Node, Deno, Bun, Edge).

Then compound this all with the fact that JS doesn't have a comprehensive stdlib so your average project has literally 1000s of dependencies.

by fastball

1/13/2025 at 3:59:19 AM

Valid criticisms, but the "standard" choices all work well. Nvm is the de facto standard for node version management, npm is a totally satisfactory package manager, node is the standard runtime that those other runtimes try to be compatible with, etc.

Will also note that in my years of js experience I've hardly ever run into module incompatibilities. It's definitely gnarly when it happens, but wouldn't consider this to be the same category of problem as the confusion of setting up python.

Hopefully uv can convince me that python's environment/dependency management can be easier than JavaScript's. Currently they both feel bad in their own way, and I likely prefer js out of familiarity.

by nosefurhairdo

1/13/2025 at 11:53:33 AM

> I've hardly ever run into module incompatibilities

I'm not totally sure what you're referring to, but I've definitely had a number of issues along the lines of:

- I have to use import, not require, because of some constraint of the project I'm working in

- the module I'm importing absolutely needs to be required, not imported

I really don't have any kind of understanding of what the fundamental issues are, just a very painful transition point from the pre-ESM world to post.

by stevage

1/13/2025 at 7:17:09 PM

I was referring to cjs vs esm (require vs import, respectively). I suspect I was late enough to the game (and focused on browser js) such that I've mostly only used esm.

I will note that node has embraced esm, and nowadays it's frowned upon to only publish cjs packages so this problem is shrinking every day. Also cool is some of the newer runtimes support esm/cjs interop.

by nosefurhairdo

1/13/2025 at 11:51:09 AM

>Similar problems with runtime version management (need to use nvm for sanity, using built-in OS package managers seems to consistently result in tears).

In practice I find this a nuisance but a small one. I wish there had been a convention that lets the correct version of Node run without me manually having to switch between them.

> More package managers and interactions (corepack, npm, pnpm, yarn, bun).

But they all work on the same package.json and node_modules/ principle, afaik. In funky situations, incompatibilities might emerge, but they are interchangeable for the average user. (Well, I don't know about corepack.)

> Bad package interop (ESM vs CJS vs UMD).

That is a whole separate disaster, which doesn't really impact consuming packages. But it does make packaging them pretty nasty.

> More runtimes (Node, Deno, Bun, Edge).

I don't know what Edge is. Deno is different enough to not really be in the same game. I find it hard to see the existence of Bun as problematic: it has been a bit of a godsend for me, it has an amazing ability to "just work" and punch through Typescript configuration issues that choke TypeScript. And it's fast.

> Then compound this all with the fact that JS doesn't have a comprehensive stdlib so your average project has literally 1000s of dependencies.

I guess I don't have a lot of reference points for this one. The 1000s of dependencies is certainly true though.

by stevage

1/13/2025 at 9:55:56 PM

> I wish there had been a convention that lets the correct version of Node run without me manually having to switch between them.

For what it's worth, I think .tool-versions is slowly starting to creep into this space.

Mise (https://mise.jdx.dev/dev-tools/) and ASDF (https://asdf-vm.com/) both support it.

Big reason I prefer ASDF to nvm/rvm/etc right now is that it just automatically adjusts my versions when I cd into a project directory.

by horsawlarway

1/12/2025 at 9:52:46 PM

I've only recently started with uv, but this is one thing it seems to solve nicely. I've tried to get into the mindset of only using uv for python stuff - and hence I haven't installed python using homebrew, only uv.

You basically need to just remember to never call python directly. Instead use uv run and uv pip install. That ensures you're always using the uv installed python and/or a venv.

Python based tools where you may want a global install (say ruff) can be installed using uv tool
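
Roughly, that workflow looks like this (commands from memory; exact flags may differ between uv versions):

  uv python install 3.12      # uv-managed interpreter, no Homebrew python needed
  uv venv --python 3.12       # project venv built from that interpreter
  uv pip install pandas
  uv run python -c "import pandas; print(pandas.__version__)"
  uv tool install ruff        # global tool, kept isolated from projects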

by RobinL

1/12/2025 at 11:22:36 PM

> Python based tools where you may want a global install (say ruff) can be installed using uv tool

uv itself is the only Python tool I install globally now, and it's a self-contained binary that doesn't rely on Python. ruff is also self-contained, but I install tools like ruff (and Python itself) into each project's virtual environment using uv. This has nice benefits. For example, automated tests that include linting with ruff do not suddenly fail because the system-wide ruff was updated to a version that changes rules (or different versions are on different machines). Version pinning gets applied to tooling just as it does to packages. I can then upgrade tools when I know it's a good time to deal with potentially breaking changes. And one project doesn't hold back the rest. Once things are working, they work on all machines that use the same project repo.
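
(Concretely that's just something like the following; version number made up:)

  uv add --dev ruff==0.8.4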

If I want to use Python based tools outside of projects, I now do little shell scripts. For example, my /usr/local/bin/wormhole looks like this:

  #!/bin/sh
  uvx \
      --quiet \
      --prerelease disallow \
      --python-preference only-managed \
      --from magic-wormhole \
      wormhole "$@"

by rented_mule

1/13/2025 at 7:52:18 AM

>You basically need to just remember to never call python directly. Instead use uv run and uv pip install.

I don't understand why people would rather do this part specfically, rather than activate a venv.

by zahlman

1/12/2025 at 9:31:33 PM

Because node.js isn't a dependency of the Operating system.

Also we don't have a left pad scale dependency ecosystem that makes version conflicts such a pressing issue.

by TZubiri

1/13/2025 at 1:53:46 AM

Oh, tell us the OS can't venv itself a separate python root and keep itself away from whatever the user invents to manage deps. This is a non-explanation appealing to authority, while it's clearly just a mess lacking any thought. It just works like this.

> we don't have a left pad scale dependency ecosystem that makes version conflicts such a pressing issue

TensorFlow.

by wruza

1/13/2025 at 3:23:03 AM

We have virtual envs and package isolation, but it's usually bloated and third-party, and it doesn't make for a good, robust OS base; it's more of an app layer. See flatpak, snapcraft.

"Compares left pad with ml library backing the hottest AI companies of the cycle"

by TZubiri

1/13/2025 at 3:08:51 PM

Bloated is an emotion, it's not a technical term. The related, most commonly known technical term is "DLL hell". And I absolutely love to work with flatpak, because not once in my job have I thought about it, apart from looking up what folder to back up.

Left pad was ten years ago. Today is 2025-01-13 and we are still discussing how good yet another python PM presumably is. Even a hottest ML library can only do so much in this situation.

by wruza

1/13/2025 at 7:56:23 AM

>Oh, tell us OS can’t venv itself a separate python root and keep itself away from what user invents to manage deps.

I've had this thought too. But it's not without its downsides. Users would have to know about and activate that venv in order to, say, play with system-provided GTK bindings. And it doesn't solve the problem that the user may manage dependencies for more than one project, that don't play nice with each other. If everything has its own venv, then what is the "real" environment even for?

by zahlman

1/12/2025 at 10:11:40 PM

This. IME, JS devs rarely have much experience with an OS, let alone Linux, and forget that Python literally runs parts of the OS. You can’t just break it, because people might have critical scripts that depend on the current behavior.

by sgarland

1/13/2025 at 2:28:46 AM

I think it makes sense given that people using python to write applications are a minority of python users. It's mostly students, scientists, people with the word "analyst" in their title, etc. Perhaps this goes poorly in practice, but these users ostensibly have somebody else to lean on re: setting up their environments, and those people aren't developers either.

I have to imagine that the python maintainers listen for what the community needs and hear a thousand voices asking for a hundred different packaging strategies, and a million voices asking for the same language features. I can forgive them for prioritizing things the way they have.

by __MatrixMan__

1/13/2025 at 2:33:19 AM

I'm not sure I understand your point. Managing dependencies is easy in node. It seems to be harder in Python. What priority is being supported here?

by stevage

1/12/2025 at 9:19:57 PM

Hot take: pnpm is the best dx, of all p/l dep toolchains, for devs who are operating regularly in many projects.

Get me the deps this project needs, get them fast, get them correctly, all with minimum hoops.

Cargo and deno toolchains are pretty good too.

Opam, gleam, mvn/gradle, stack, npm/yarn, nix even, pip/poetry/whatever-python-malarkey, go, composer, …what other stuff have I used in the past 12 months… c/c++ doesn't really have a first-class std other than global sys deps (so I'll refer back to nix or os package managers).

Getting the stuff you need where you need it is always doable. Some toolchains are just above and beyond, batteries included, ready for productivity.

by cdaringe

1/12/2025 at 10:29:02 PM

Have you used bun? It's also great. Super fast

by krashidov

1/12/2025 at 9:37:17 PM

pnpm is the best for monorepos. I've tried yarn workspaces and npm's idea of it and nothing comes close to the DX of pnpm

by theogravity

1/13/2025 at 1:15:33 AM

What actually, as an end user, about pnpm is better than Yarn? I've never found an advantage with pnpm in all the times I've tried it. They seem very 1:1 to me, but Yarn edges it out thanks to it having a plugin system and its ability to automatically pull `@types/` packages when needed.

by paularmstrong

1/13/2025 at 1:57:14 AM

> automatically pull `@types/` packages when needed

Wait, what? Since when?

by wruza

1/12/2025 at 10:55:47 PM

I swear I'm not trolling: what do you not like about modern golang's dep management (e.g. go.mod and go.sum)?

I agree that the old days of "there are 15 dep managers, good luck" were high chaos. And those who do cutesy shit like using "replace" in their go.mod[1] are sus, but as far as dx goes, $(go get) caching by default in $XDG_CACHE_DIR and using $GOPROXY is great, I think

1: https://github.com/opentofu/opentofu/blob/v1.9.0/go.mod#L271

by mdaniel

1/13/2025 at 2:29:13 AM

To be fair your specific example is due to… well forking terraform due to hashicorp licensing changes.

by geethree

1/13/2025 at 2:46:14 AM

hcl is still MPLv2 https://github.com/hashicorp/hcl/blob/v2.23.0/LICENSE and my complaint is that the .go file has one import path but the compiler is going to secretly use a fork, versus updating the import path like a sane person. The only way anyone would know to check for why the compiled code behaves differently is to know that trickery was possible

And that's not even getting into this horseshit: https://github.com/opentofu/hcl/blob/v2.20.1/go.mod#L1 which apparently allows one to declare a repo's _import_ path to be different from the url used to fetch it

I have a similar complaint about how in the world anyone would know how "gopkg.in/yaml.v3" secretly resolved to view-source:https://gopkg.in/yaml.v3?go-get=1

by mdaniel

1/13/2025 at 7:14:56 AM

When was the last time you saw a NodeJS package that expects to be able to compile Fortran code at installation time?

If you want Numpy (one of the most popular Python packages) on a system that doesn't have a pre-built wheel, you'll need to do that. Which is why there are, by my count, 54 different pre-built wheels for Numpy 2.2.1.

And that's just the actual installation process. Package management isn't solved because people don't even agree on what that entails.

The only way you avoid "worry about modifying the global environment" is to have non-global environments. But the Python world is full of people who refuse to understand that concept. People would rather type `pip install suspicious-package --break-system-packages` than learn what a venv is. (And they'll sometimes do it with `sudo`, too, because of a cargo-cult belief that this somehow magically fixes things - spoilers: it's typically because the root user has different environment variables.)

Which is why this thread happened on the Python forums https://discuss.python.org/t/the-most-popular-advice-on-the-... , and part of why the corresponding Stack Overflow question https://stackoverflow.com/questions/75608323 has 1.4 million views. Even though it's about an error message that was carefully crafted by the Debian team to tell you what to do instead.

by zahlman

1/12/2025 at 9:27:28 PM

It is kind of solved, but not default.

This makes a big difference. There is also the social problem of the Python community having too many loud opinions to settle on a good, robust default solution.

But same has now happened for Node with npm, yarn and pnpm.

by miohtama

1/12/2025 at 9:54:35 PM

I wouldn't really say it's that black and white. It was only recently that many large libraries and tools recommended starting with "npm i -g ...". Of course you could avoid it if you knew better, but the same is true for Python.

by Etheryte

1/13/2025 at 6:09:29 PM

How has NodeJS solved it? There are tons of version managers for Node.

by trallnag

1/13/2025 at 10:21:31 AM

Node hasn't solved this mess because it doesn't have the same mess.

It has a super limited compiled extensions ecosystem, plugin ecosystem and is not used as a system language in mac and linux.

And of course node is much more recent and the community less diverse.

tldr: node is playing in easy mode.

by BiteCode_dev

1/13/2025 at 1:58:58 AM

I usually stay away far FAR from shiny new tools but I've been experimenting with uv and I really like it. I'm a bit bummed that it's not written in Python but other than that, it does what it says on the tin.

I never liked pyenv because I really don't see the point/benefit of building every new version of Python you want to use. There's a reason I don't run Gentoo or Arch anymore. I'm very happy that uv grabs pre-compiled binaries and just uses those.

So far I have used it to replace poetry (which is great btw) in one of my projects. It was pretty straightforward, but the project was also fairly trivial/typical.

I can't fully replace pipx with it because 'uv tool' currently assumes every Python package only has one executable. Lots of things I work with have multiple, such as Ansible and Jupyterlab. There's a bug open about it and the workarounds are not terrible, but it'd be nice if they are able to fix that soon.

by bityard

1/13/2025 at 5:20:22 AM

uv is great, but downloading and installing the base python interpreter is not a good feature, as it doesn't fetch that from the PSF but from a project on GitHub; that very same project says it is compiled for portability over performance, see https://gregoryszorc.com/docs/python-build-standalone/main/

by meitham

1/13/2025 at 7:01:25 AM

>that very same project says this is compiled for portability over performance

Realistically, the options on Linux are the uv way, the pyenv way (download and compile on demand, making sure users have compile-time dependencies installed as part of installing your tool), and letting users download and compile it themself (which is actually very easy for Python, at least on my distro). Compiling Python is not especially fast (around a full minute on my 4-core, 10-year-old machine), although I've experienced much worse in my lifetime. Maybe you can get alternate python versions directly from your distro or a PPA, but not in a way that a cross-distro tool can feasibly automate.

On Windows the only realistic option is the official installer.

by zahlman

1/13/2025 at 8:04:59 AM

Yes, which is why it's silly to do it. Developers (not "users"!) need to learn how to install Python on their system. I honestly don't know how someone can call themselves a Python developer if they can't even install the interpreter!

by globular-toast

1/13/2025 at 9:19:13 AM

The Python community has the attitude that everyone needs to be welcomed; I mostly agree, and I don't see the point in fussing about what people want to call themselves. Everyone starts somewhere, and people try really hard to help (one random example from 2022: https://discuss.python.org/t/python-appears-in-cmd/15858)

But overall, computer literacy is really on the decline these days. Nowadays before you can teach programming in any traditional sense, you may have to teach the concept of a file system, then a command line...

by zahlman

1/13/2025 at 12:04:21 PM

I for one enjoy the convenience of automatically installing python versions. Yes I know how to do it manually. Yes it is possible to install multiple versions. But that does not mean I want to do it every time, just to test how things behave in different python versions. For that, it's also okay if it does not install the most performant version of the interpreter.

by bennofs

1/13/2025 at 5:49:20 PM

>Yes it is possible to install multiple versions. But that does not mean I want to do it every time, just to test how things behave in different python versions

You only have to do it once per version with this approach. Then you can create venvs from that base, and it's basically instantaneous if you do it `--without-pip`.
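
E.g. (interpreter path here is illustrative):

  /opt/python/3.12/bin/python3.12 -m venv --without-pip .venv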

by zahlman

1/13/2025 at 1:12:06 PM

Sure. We've had system package managers for decades. I install a major version once a year and it gets automatically upgraded to the latest patch version by my system package manager, just like everything else.

by globular-toast

1/13/2025 at 4:18:19 PM

But PSF doesn't distribute binary builds, so what's the alternative?

by bityard

1/13/2025 at 11:35:40 AM

that it does it automatically is weird

by brainzap

1/13/2025 at 4:24:33 PM

I've only noticed after my corporate firewall stopped me!

by meitham

1/12/2025 at 9:06:28 PM

There's so many more!

1. `uvx --from git+https://github.com/httpie/cli httpie`

2. uv in a shebang: https://simonwillison.net/2024/Aug/21/usrbinenv-uv-run/

by emiller88

1/12/2025 at 9:22:13 PM

Yes! since that Simon Willison article, I've slowly been easing all my scripts into just using a uv shebang, and it rocks! I've deleted all sorts of .venvs and whatnot. really useful

by FergusArgyll

1/12/2025 at 9:12:06 PM

The uv shebang is definitely the killer feature for me, especially with so much of the AI ecosystem tied up in Python. Before, writing Python scripts was a lot more painful requiring either a global scripts venv and shell scripts to bootstrap them, or a venv per script.

I’m sure it was already possible with shebangs and venv before, but uv really brings the whole experience together for me so I can write python scripts as freely as bash ones.

by throwup238

1/12/2025 at 11:08:47 PM

Super neat re Willison article.. would something like this work under powershell though?!

by dingdingdang

1/12/2025 at 8:49:36 PM

The activation of the virtualenv is unnecessary (one can execute pip/python directly from it), and the configuring of your local pyenv interpreter is also unnecessary, it can create a virtual environment with one directly:

  pyenv virtualenv python3.12 .venv
  .venv/bin/python -m pip install pandas
  .venv/bin/python
Not quite one command, but a bit more streamlined; I guess.

by supakeen

1/12/2025 at 9:20:48 PM

Note that in general calling the venv python directly vs activating the venv are not equivalent.

E.g. if the thing you run invokes python itself, it will use the system python, not the venv one in the first case.
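
A minimal illustration of the difference (paths hypothetical), running .venv/bin/python directly without activating:

  # PATH was never changed, so a child that invokes plain "python3" can
  # resolve to the system interpreter instead of the venv one
  import subprocess, sys
  print(sys.executable)  # .venv/bin/python
  subprocess.run(["python3", "-c", "import sys; print(sys.executable)"])
  # likely prints /usr/bin/python3, because the venv's bin/ was never put on PATH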

by BeeOnRope

1/13/2025 at 5:33:31 AM

Surely if you want to invoke python you call sys.executable; otherwise, if your subprocess doesn't inherit PATH, nothing will work, with or without uv

by meitham

1/15/2025 at 3:44:38 PM

I don't think that's "sure" at all. For one thing, only Python code directly calling Python has that option in the first place, often there is another layer of indirection, e.g., Python code which executes a shell script, which itself invokes Python, etc.

IME it is common to see a process tree with multiple invocations of Python in an ancestor relationship with other processes in between.

by BeeOnRope

1/13/2025 at 8:04:33 AM

In rare cases, programs might also care about the VIRTUAL_ENV environment variable set by the activate script, and activation may also temporarily clear out any existing PYTHONHOME (a rarely used override for the location of the standard library). But yes, in general you can just run the executable directly.

by zahlman

1/12/2025 at 8:51:05 PM

Indeed, you're right ;).

by astronautas

1/12/2025 at 8:44:27 PM

Uv also bundles uvx command so you can run Python scripts without installing them manually:

uvx --from 'huggingface_hub[cli]' huggingface-cli

by lukax

1/12/2025 at 8:46:58 PM

Neat!

by astronautas

1/13/2025 at 5:35:44 AM

Ok, this must be a dumb question answered by the manual, but I still haven't got my hands on uv, so: but does it solve the opposite? I mean, I pretty much never want any "ad-hoc" environments, but I always end up with my .venv becoming an ad-hoc environment, because I install stuff while experimenting, not bothering to patch requirements.txt, pyproject.toml or anything of the sort. In fact, now I usually don't even bother typing pip install, PyCharm does it for me.

This is of course bad practice. What I would like instead is what PHP's composer does: installing stuff automatically changes pyproject.toml (or whatever the standard will be with uv), automatically freezes the versions, and then it is on git diff to tell me what I did last night; I'll remove a couple of lines from that file, run composer install, and it will remove packages not explicitly added to my config from the environment. Does this finally get easy to achieve with uv?

by krick

1/13/2025 at 7:02:05 AM

I think it does! uv add [0] adds a dependency to your pyproject.toml, as well as your environment.

If you change your pyproject.toml file manually, uv sync [1] will update your environment accordingly.

[0]: https://docs.astral.sh/uv/guides/projects/#managing-dependen... [1]: https://docs.astral.sh/uv/reference/cli/#uv-sync
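
i.e., roughly:

  uv add requests      # updates pyproject.toml and uv.lock, installs into .venv
  uv remove requests
  uv sync              # make the environment match pyproject.toml / uv.lock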

by mk12345

1/13/2025 at 7:59:52 AM

If I read [1] correctly, it seems it checks against the lockfile, not pyproject.toml. So it seems like it won't help if I change pyproject.toml manually. Which is a big inconvenience, if so.

Whatever, I think I'll try it for myself later today. It's long overdue.

by krick

1/14/2025 at 11:37:10 AM

Most uv commands will (unless otherwise instructed like e.g. with --frozen) by default update your lockfile to be in sync with your pyproject.toml.

by hobofan

1/13/2025 at 6:54:19 AM

>installing stuff automatically changes pyproject.toml (or whatever the standard will be with uv)

pyproject.toml represents an inter-project standard and Charlie Marsh has committed to sticking with it, along with cooperating with future Python packaging PEPs. But while you can list transitive dependencies, specify exact versions etc. in pyproject.toml, it's not specifically designed as a lockfile - i.e., pyproject.toml is meant for abstract dependencies, where an installer figures out transitively what's needed to support them and decides on exact versions to install.

The current work for specifying a lockfile standard is https://peps.python.org/pep-0751/ . As someone else pointed out, uv currently already uses a proprietary lockfile, but there has been community interest in trying to standardize this - it just has been hard to find agreement on exactly what it needs to contain. (In the past there have been proposals to expand the `pyproject.toml` spec to include other information that lockfiles often contain for other languages, such as hashes and supply-chain information. Some people are extremely against this, however.)

As far as I know, uv isn't going to do things like analyzing your codebase to determine that you no longer need a certain dependency that's currently in your environment and remove it (from the environment, lock file or `pyproject.toml`). You'll still be on the hook for figuring out abstractly what your project needs, and this is important if you want to share your code with others.

by zahlman

1/13/2025 at 7:48:48 AM

> uv isn't going to do things like analyzing your codebase

Sure, that's not what I meant (unless we call pyproject.toml a part of your codebase, which it kinda is, but that's probably not what you meant).

In fact, as far as I can tell from your answer, Python does move in the direction I'd like it to move, but it's unclear by how far it will miss and if how uv handles it is ergonomical.

As I've said, I think PHP's composer does a very good job here, and to clarify, this is how it works. There are 2 files: composer.json (≈pyproject.toml) and composer.lock (≈ PEP751) (also json). The former is kinda editable by hand, the latter you ideally never really touch. However, for the most part composer is smart enough that it edits both files for you (with some exceptions, of course), so every time I run `composer require your/awesomelib` it

1) checks the constraints in these files

2) finds latest appropriate version of your/awesomelib (5.0.14) and all its dependencies

3) writes "your/awesomelib": "^5.0"

4) writes "your/awesomelib": "5.0.14" and all its dependencies to composer.lock (with hashsums, commit ids and such)

It is a good practice to keep both inside of version control, so when I say "git diff tells me what I did last night" it means that I'll also see what I installed. If (as usual) most of it is some useless trash, I'll manually remove "your/awesomelib" from composer.json, run `composer install` and it will remove it and all its (now unneeded) dependencies. As the result, I never need to worry about bookkeeping, since composer does it for me, I just run `composer require <stuff>` and it does the rest (except for cases when <stuff> is a proprietary repo on company's gitlab and such, then I'll need slightly more manual work).

That is, what I hope to see in Python one day (10 years later than every other lang did it) is declarative package management, except I don't want to have to modify pyproject.toml manually, I want my package manager do it for me, because it saves me 30 seconds of my life every time I install something. Which accumulates to a lot.

by krick

1/13/2025 at 10:23:07 AM

I believe you're searching for `uv sync`: https://docs.astral.sh/uv/getting-started/features/#projects

With this, you can manage the dependency list via `uv add/remove` (or the `pyproject.toml` directly), and run `uv sync` to add/remove any dependencies to the managed virtual env.

Edit: adding about uv add/uv remove
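
To map it onto the composer workflow above (snippet illustrative, not exact uv output): pyproject.toml plays the role of composer.json and uv.lock plays the role of composer.lock. `uv add your-awesomelib` leaves something like this in pyproject.toml and pins the exact resolved version (plus hashes) in uv.lock:

  [project]
  name = "example"
  version = "0.1.0"
  requires-python = ">=3.12"
  dependencies = [
      "your-awesomelib>=5.0",
  ]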

by rochacon

1/13/2025 at 5:50:35 AM

I'm not an expert, but as far as I can tell UV allows you to do this without feeling so guilty (it handles multiple versions of Python and libraries AFAIK quite well).

by wisty

1/13/2025 at 6:04:10 AM

I'm waiting for this issue to be done: Add an option to store virtual environments in a centralized location outside projects https://github.com/astral-sh/uv/issues/1495

I have used virtualenvwrapper before and it was very convenient to have all virtual environments stored in one place, like ~/.cache/virtualenvs.

The .venv in the project directory is annoying because when you copy the folder somewhere you start copying gigabytes of junk. Some tools like rsync can't handle CACHEDIR.TAG (but you can use --exclude .venv)

by tandav

1/13/2025 at 1:39:12 AM

Why can't python just adopt something like yarn/pnpm and effing stop patch-copying its binaries into a specific path? And pick up site_packages from where it left it last time? Wtf. How hard is it to just pick a python-modules directory and a python-project.json and sync it into correctness by symlink/mklink-ing missing folders from a package cache in there in a few seconds?

Every time I have to reorganize or upgrade my AI repos, it's yet another 50GB of writes to my poor ssd. Half of it is torch, the other half is auto-downloaded models that I cannot stop, because then they look "downloaded" and you never know how to resume them or even find where they are, because python logging culture is just barbaric.

by wruza

1/12/2025 at 11:37:19 PM

Ridiculous post:

The author says that a normal route would be:

   - Take the proper route:

   - Create a virtual environment

   - pip install pandas

   - Activate the virtual environment

   - Run python
Basically, out of the box, when you create a virtualenv it is immediately activated. And you would obviously need to have it activated before doing a pip install...

In addition, in my opinion this is the thing that would suck about uv: having different functions tied to a single tool execution.

It is a breeze to be able to activate a venv and be done with it, being able to run your program multiple times in one go, even with crashes, being able to install more dependencies, test it in a REPL, ...

by greatgib

1/13/2025 at 8:49:33 AM

Hey, I actually made a silly mistake in my post, indeed you first activate the environment and then install stuff in it. Fixed!

I disagree that it is activated immediately though; at least for me, with venv I always have to activate it explicitly.

by astronautas

1/13/2025 at 1:11:15 AM

You can still use traditional venvs with UV though, if you want.

by hamandcheese

1/13/2025 at 5:21:43 AM

Uh, but then you don't really need uv, right?

by krick

1/13/2025 at 6:32:58 PM

No, but it is insanely faster than pip.

by hamandcheese

1/13/2025 at 2:02:30 AM

Python package management has always seemed like crazyland to me. I've settled on Anaconda as I've experimented with all the ML packages over the years, so I'd be interested to learn why uv, and also what/when are good times to use venv/pip/conda/uv/poetry/whatever else has come up.

NeutralCrane has a really helpful comment below[0], would love to have a more thorough post on everything!

[0]https://news.ycombinator.com/item?id=42677048

by forgingahead

1/13/2025 at 1:03:06 PM

If you use conda, and can use conda for what you need to do, use conda w/ conda-forge. It has a much better story for libraries with binary dependencies, whereas PyPI (which `uv` uses) is basically full of static libraries that someone else compiled and promises to work.

Note, I use PyPI for most of my day-to-day work, so I say this with love!

by agoose77

1/13/2025 at 4:34:06 AM

been using conda for years with multiple projects, each of which has numerous environments (for different versions). fairly large complex environments with cuda, tf, jax, etc. has always worked well, and my biggest complaint - the sluggish resolver - largely addressed with the mamba resolver. packages not available on conda-forge can be installed into the conda env with pip. maybe I'm missing something but it's not clear to me what advantage uv would provide over conda.

by insane_dreamer

1/13/2025 at 6:36:01 AM

It is very difficult for most conda users to maintain conda environments. They use the same env for nearly all their work, don’t understand the hierarchical nature of conda envs, don’t know which one they’re installing into, install stuff with pip without recording it in their env file, etc. The worst local environment messes i’ve ever seen always involve conda.

It can be used effectively, but does not make it easy to do so.

uv makes itself harder to misuse

by claytonjy

1/13/2025 at 5:11:40 PM

When using the Maven build tool with Java, the downloaded artifacts always have the version number in the filename (artifact-version.jar), and that means there can be multiple versions stored in parallel, cached globally and cherry-picked without any ambiguity. The first time I used Node and Python, I was shocked that the version number is not part of any downloaded artifacts. Versioning the dependencies is such a fundamental need that having it as part of the artifact file itself seems like common sense to me. Can anyone please explain why the Python/Node build tools do not follow that?

by guru4consulting

1/16/2025 at 6:16:13 AM

Version number is part of every wheel and sdist filename.
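
e.g., illustrative filenames:

  pandas-2.2.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  pandas-2.2.3.tar.gz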

by mixmastamyk

1/12/2025 at 9:22:54 PM

What would be interesting is if you could do something similar for IPython/Jupyter Notebooks: while front-ends like JupyterLab and VS Code Notebooks do let you select a .venv if present in the workspace, it's annoying to have to set one up and build one for every project.
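
(For what it's worth, uv can already paper over part of this: something along these lines runs a notebook server against a project's environment without permanently adding Jupyter to it, if I recall the flag correctly:)

  uv run --with jupyter jupyter lab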

by minimaxir

1/13/2025 at 1:38:17 AM

For anyone that used rye, it's worth noting that the creator of rye recommends using uv. Also, rye is going to be continually updated to just interface with uv until rye can be entirely replaced by uv.

by cyrialize

1/13/2025 at 7:22:35 AM

I believe they are from the same author, Charlie Marsh / Astral

by andelink

1/13/2025 at 10:24:47 AM

No, Armin created Rye, then gave it to Astral.

by BiteCode_dev

1/13/2025 at 2:52:50 AM

I want to like uv, but unfortunately there's some kind of technical distinction between a Python "package manager" and a "build system". Uv doesn't include a "build system", but encourages you to use some other one. The net result is that external dependencies don't build the same as on Poetry, don't work, and uv points the finger at some other dependency.

I do hope the situation changes one day. Python packaging is such a mess, but Poetry is good enough and actually works, so I'll stick with it for now.

by CGamesPlay

1/13/2025 at 8:35:17 AM

It's not "some sort of technical distinction". Package managers are for keeping track of which pieces of code you need in your project's environment. Build systems are for... building the code, so that it can actually be used in an environment.

Usually, you can directly make a pre-built wheel, and then an installer like Pip or uv can just unpack that into the environment. If it needs to be built on the user's machine, then you offer an sdist, which specifies its build backend. The installer will act as a build frontend, by downloading and setting up the specified backend and asking it to make a wheel from the sdist, then installing the wheel.
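
Concretely, an sdist names its backend in its pyproject.toml with a table like this (backend here chosen just as an example):

  [build-system]
  requires = ["hatchling"]
  build-backend = "hatchling.build"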

Poetry's build backend (`poetry.masonry`) doesn't build your external dependencies unless a) you obtain an sdist and b) the sdist says to use that backend. And in these cases, it doesn't matter what tools you're using. Your installer (which could be Pip, which is not a package manager in any meaningful sense) can work with `poetry.masonry` just fine.

If you can give a much more specific, simple, reproducible example of a problem you encountered with external dependencies and uv, I'll be happy to try to help.

by zahlman

1/13/2025 at 2:53:40 PM

Maybe the docs are misleading? Seems that if I want my package to be installed, I need to pick a build system, regardless of whether I am using any native code. https://docs.astral.sh/uv/concepts/projects/init/#packaged-a...

> Package managers are for keeping track of which pieces of code you need in your project's environment. Build systems are for... building the code, so that it can actually be used in an environment.

This probably means something to the developers of the package managers and build systems, but to me, as a Python developer who wants to be able to publish a pure Python CLI program to PyPI, it seems like a distinction without a difference.

by CGamesPlay

1/13/2025 at 4:27:40 PM

>Seems that if I want my package to be installed, I need to pick a build system, regardless of if I am using any native code.

If you want to distribute it to be installable by others, yes. Except that at least for now, installers will assume Setuptools by default if you don't mention anything in your `pyproject.toml`.

>but to me, as a Python developer who wants to be able to publish a pure Python CLI program to PyPI,

If you're making pure Python projects, the actual build process is trivial and every build system will do just fine. But again, the build system you choose builds your code, not your external dependencies. Whatever you put in `pyproject.toml` here has nothing to do with the packages that you install. It has to do with other people installing your package (and you taking steps to make that easier). So "external dependencies don't build the same as on Poetry" makes no sense in this context. If you need to build external dependencies (i.e. they don't come pre-built for your system), they will automatically be built with the build system that they choose.

>it seems like a distinction without a difference.

Let me try again: When you use a package manager, it's so that you can keep track of which code from other people you're using. When you choose a build system and mention it in `pyproject.toml`, it's so that other people can use your code. (For your pure Python project, you will normally run the build system locally - https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-... . But the overall packaging ecosystem is designed so that you can push part of that work onto the end user, when it makes sense to do so.)

by zahlman

1/13/2025 at 4:28:39 PM

> Uv doesn't include a "build system", but encourages you to use some other one.

Personally I consider this one of uv's greatest strengths. The inflexibility and brittleness of Poetry's build system is what made me give up on poetry entirely. Had poetry made it easy to plug in a different build system I might never have tried uv.

by dagw

1/12/2025 at 8:46:52 PM

OK, I'm convinced. I just installed uv. Thanks for sharing!

by tasn

1/12/2025 at 8:47:37 PM

Ditto. This is pretty cool!

by smallmancontrov

1/12/2025 at 9:43:25 PM

Sometimes, only a specific wheel is available (e.g. on Nvidia's Jetson platform where versions are dictated by the vendor).

Can uv work with that?

by amelius

1/12/2025 at 11:11:23 PM

Small ordering mistake in the „right route” - you should first activate the virtual environment you just created and then install pandas.

by misiek08

1/13/2025 at 8:21:35 AM

I really thought this would mention uv script deps (standardized by some PEP) together with a `#!/usr/bin/env -S uv run` shebang line which automatically install the deps on execution.

Has been super useful to write single-file tools/scripts which LLMs can understand and run easily.

by ErikBjare

1/12/2025 at 9:56:28 PM

What's the point if you have other binary dependencies?

Use Nix for Python version as well as other bin deps, and virtualenv + pip-tools for correct package dependency resolution.

Waiting 4s for pip-tools instead of 1ms for uv doesn't change much if you only run it once a month.

by mgd020

1/12/2025 at 8:58:03 PM

I love this, the biggest problem I have right now with python scripts is distributing my single file utility scripts (random ops scripts).

I wish there was a way to either shebang something like this or build a wheel that has the full venv inside.

by faizshah

1/12/2025 at 9:01:45 PM

There's a shebang now. As of PEP 723 you can declare dependencies in a comment at the top of a single-file script that a package manager can choose to read and resolve.

uv has support for it: https://docs.astral.sh/uv/guides/scripts/#running-a-script-w... (which only helps if your team is all in on uv, but maybe they are)

by easton

1/12/2025 at 9:36:08 PM

How does that work with the shebang?

by amelius

1/13/2025 at 12:48:41 AM

  #!/usr/bin/env -S uv run
  # /// script
  # requires-python = ">=3.10"
  # dependencies = [
  #     "click>8",
  #     "rich",
  # ]
  # ///
and that's it
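
Then (file name made up; assumes uv is on PATH):

  chmod +x my_script.py
  ./my_script.py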

by wizzard0

1/13/2025 at 9:41:43 AM

Cool. I wasn't sure it would skip the first comment line.

by amelius

1/12/2025 at 9:14:16 PM

Do other package managers support this yet?

by miohtama

1/13/2025 at 8:37:52 AM

Pipx isn't in any meaningful sense a package manager (although it can manage environments to a limited extent), but the `pipx run` command supports this PEP as of version 1.4.2.

by zahlman

1/12/2025 at 9:07:40 PM

https://peps.python.org/pep-0723/ is at the very least related. It's a way of specifying the metadata in the script, allowing other tools to do the right thing. One of the use cases is:

> A user facing CLI that is capable of executing scripts. If we take Hatch as an example, the interface would be simply hatch run /path/to/script.py [args] and Hatch will manage the environment for that script. Such tools could be used as shebang lines on non-Windows systems e.g. #!/usr/bin/env hatch run

https://micro.webology.dev/2024/08/21/uv-updates-and.html shows an example with uv:

> With this new feature, I can now instruct users to run uv run main.py without explaining what a venv or virtualenv is, plus a long list of requirements that need to be passed to pip install.

That ends:

> PEP 723 also opens the door to turning a one-file Python script into a runnable Docker image that doesn’t even need Python on the machine or opens the door for Beeware and Briefcase to build standalone apps.

by eesmith

1/12/2025 at 9:42:33 PM

You mean like pyinstaller https://pyinstaller.org that takes your python and makes a standalone, self-extracting or onedir archive to convert your ops script plus dependencies into something you can just distribute like a binary?

by mianos

1/13/2025 at 8:39:56 AM

This makes sense when you need to provision Python itself to the end user, not just third-party libraries.

by zahlman

1/13/2025 at 2:11:07 AM

I used to have pyenv, asdf or mise to manage python versions (never use conda unless I need DL lib like pytorch). Now just uv is enough.

by vietvu

1/12/2025 at 10:33:54 PM

I do like uv and hope to try it soon but I don't get the point of the article.

Pyenv + poetry already gives you the ability to "pull in local dependencies". Yes, you have to create a virtual environment and it's not "ad-hoc".

But if you're going to pull in a bunch of libraries, WHY would you want to invoke python and all your work dependencies on a one liner? Isn't it much better and easier to just spell-out the dependencies in a pyproject.toml? How "ad-hoc" are we talking here?

by crispyambulance

1/13/2025 at 6:44:45 AM

Yes! I love verbosity. It gives me job security. I’m tired of these tools making my job easier.

Previously I could allocate a whole week to setup initial scaffold for the project. Also more tools - more failure points, so I can flex on stupid juniors how smart I am. Now I can’t even go to pee with how fast and easy this freaking uv is. WTF.

by wiseowise

1/13/2025 at 1:04:56 PM

Well I guess I am not smart enough to dump multiple dependencies + python, densely, on one line to spin everything up so I can do “ad-hoc” computing without just spelling them out in a file. Sorry, but that just isn’t a killer feature for me and it doesn’t seem like a big deal anyway.

I do like the idea of getting rid of pyenv though. And since poetry has failed to become as widespread as I hoped, maybe uv has a better shot?

by crispyambulance

1/12/2025 at 10:14:11 PM

That's a killer-app type of feature. However, it says ad-hoc, so you probably can't get back to that setup easily.

by m3kw9

1/12/2025 at 9:28:50 PM

I honestly really hate the venv ergonomics but uv does still depend on it as the golden path if you don't use the --with flags (in my understanding). Is there a way to do a clean break with just the new --script inline dependencies, or is that wrong/suboptimal?

by laidoffamazon

1/12/2025 at 10:23:21 PM

You can definitely do that — it's just sub-optimal when you have multiple files that share dependencies.
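If you do want that clean break, uv can even maintain the inline block for you (a sketch; I believe `uv add --script` landed in the 0.4.x releases, so check your version; the filename is illustrative):

  uv add --script my_script.py requests   # writes/updates the PEP 723 metadata block in the file
  uv run my_script.py                     # builds the ephemeral environment and runs it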

by zanie

1/12/2025 at 10:13:53 PM

But I still need pip to install uv, right? Or, alternatively, download it using a one-liner.

by sonium

1/13/2025 at 3:11:22 AM

cargo install

by dontdieych

1/12/2025 at 9:32:09 PM

One useful uv alias I use is uvsys='uv pip install --system'

So I can just do uvsys {package} for a quick and dirty global install. I'm so used to pip install being global by default that making this shorthand makes things a bit easier.
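Spelled out, the alias and its use look like this (httpx here is just an example package):

  alias uvsys='uv pip install --system'
  uvsys httpx   # quick-and-dirty install into the system interpreter, no venv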

by aizk

1/13/2025 at 6:46:54 AM

this sounds like it’s asking for trouble! very easy to mess up your whole system

i would highly recommend only using --system in a docker container or similar

by claytonjy

1/12/2025 at 9:46:27 PM

Can you also specify which version of pandas to use?

by valcron1000

1/12/2025 at 10:33:09 PM

Of course!

  uv run -q --with pandas==2.1.4 python -c "import pandas; print(pandas.__version__)"
  2.1.4

by zanie

1/12/2025 at 9:08:31 PM

uv does not have (nor do they plan to add) support for conda, and that is a deal-breaker.

by curiousgal

1/12/2025 at 9:25:32 PM

I can't see why anyone is using Conda in 2025. In 2018, yeah, the pip route (now uv) was hard, and you could get a "just works" experience installing Tensorflow + NVIDIA on Conda. By 2023 it was the other way around, and it still is.

by PaulHoule

1/12/2025 at 9:29:19 PM

Well, when you're building Python packages that have non-Python dependencies and a big chunk of your users are on Windows, conda is the only option, even in 2025 :)

Examples include quant libraries, in-house APIs/tools, etc.

by curiousgal

1/12/2025 at 9:49:31 PM

Circa 2018, I figured out how to pack up the CUDA libraries inside conda for Windows, so I could have different conda environments with different versions of CUDA. That was essential back then: if you had a model written with a certain version of Tensorflow, you had to have a matching CUDA, and if you used NVIDIA's we-need-your-email-address installers you could only have one version of CUDA installed at a time.

Worked great except for conda making the terrible mistake of compressing package files with bzip2 which took forever to decompress for huge packages.

I see no reason you can't install any kind of non-Python thing that a Python system wants with uv, because a wheel is just a ZIP file: so long as it doesn't need to be installed in a particular place, you can just unpack it and go.

by PaulHoule

1/13/2025 at 8:45:38 AM

For that matter, you can install arbitrary content from a wheel with Pip.

The problem is all the things that do need to be installed in a particular place. Linux seems to have a lot of those, especially if they're build-time dependencies for something else. Hence the need for, and slow development of, https://peps.python.org/pep-0725/ (relevant background, though I'm sure you know this sort of stuff: https://pypackaging-native.github.io/ ).

by zahlman

1/12/2025 at 9:40:23 PM

Conda worked for me in the past, but at some point I was getting inexplicable segfaults from Python scripts. I switched back to just pip and everything worked fine again. And installation was much faster.

by amelius

1/12/2025 at 9:49:58 PM

That was basically my experience. At one time conda made my life easier, eventually it made it impossible.

by PaulHoule

1/12/2025 at 10:45:51 PM

I’m on Windows and I categorically refuse to install Conda. It’s not necessary.

by forrestthewoods

1/12/2025 at 9:28:21 PM

Why would it be a deal breaker? uv would replace conda. And I hope it does. Conda has been such a headache for me when I've used it in the past. If the Python (particularly ML/academic community) could move on from conda it would be a great thing.

by NeutralCrane

1/12/2025 at 9:52:40 PM

uv can’t replace conda, any more than it can replace apt or nix.

Conda packages general binary packages, not just python packages. uv is just python packages.

by jph00

1/12/2025 at 11:06:08 PM

Python packages (distributed via the Python package index, PyPI) can also be general binary packages. Try pip install cmake, for example.
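A quick way to see that in action (assuming network access and a writable environment):

  pip install cmake
  cmake --version   # this cmake binary ships inside the wheel, not from your OS package manager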

by woodson

1/13/2025 at 1:04:13 PM

Yes, but there are restrictions; for one thing, it's not trivial to share binary dependencies between Python packages; conda just handles that.

by agoose77

1/12/2025 at 9:48:36 PM

Pixi might be something worth looking into, if you want a conda equivalent of uv.

by Zaloog

1/13/2025 at 12:18:43 AM

Fun fact: pixi uses uv as a library to install pypi packages!

by emmanueloga_

1/13/2025 at 1:22:10 PM

Interesting, will check it out.

by astronautas

1/12/2025 at 11:19:22 PM

pixi seems fine, but it also is just using mamba on the backend so you might as well continue to use miniforge

by cd4plus

1/12/2025 at 9:22:01 PM

That doesn't make sense, respectfully.

by throwaway314155

1/12/2025 at 8:52:09 PM

I've replaced the linkbait title with an attempt at saying what the feature is. If there's a more accurate wording, we can change it again.

by dang

1/12/2025 at 10:22:03 PM

I don't feel strongly, but as a uv author, I found "local dependencies" misleading. It's more like "uv's killer feature is making ad-hoc environments easy".

When we talk about local dependencies in the Python packaging ecosystem, it's usually adding some package on your file system to your environment. The existing title made me think this would be about the `[tool.uv.sources]` feature.

Really, it's about how we create environments on-demand and make it trivial to add packages to your environment or try other Python versions without mutating state.

by zanie

1/12/2025 at 10:58:16 PM

Sorry dang, didn't know the practice + got a bit emotional haha. I agree with the remark above; my message is rather about how easy it is to run python scripts with dependencies (without mutating state).

by astronautas

1/13/2025 at 12:18:20 AM

No worries!

by dang

1/13/2025 at 12:18:07 AM

Happy to take correction from an author! I've switched the wording above.

by dang

1/13/2025 at 12:31:13 AM

Thanks!

by zanie

1/12/2025 at 8:54:36 PM

uh, thanks I guess.

by astronautas

1/12/2025 at 8:58:31 PM

How about "The uv feature that intrigues me most"?

by astronautas

1/12/2025 at 9:24:04 PM

That's still clickbait, although less tropey. The common definition of clickbait is intentionally omitting information in a way that incentivizes the user to click.

by minimaxir

1/12/2025 at 9:17:30 PM

Still clickbait if you don't say what the feature is.

by airstrike

1/12/2025 at 9:55:22 PM

[flagged]

by secondcoming

1/13/2025 at 9:06:06 AM

>you need a virtual environment for some reason

You have always needed one, practically speaking. Python isn't designed to have multiple versions of the same library in the same runtime environment. A virtual environment is just a separate place to put the packages you need for the current project, so that they're isolated from other packages, and thus you don't get version conflicts. This includes the system packages. If you want to play with, say, the latest version of Requests, and you try sudo installing that in a system environment, and it happens that the latest version of Requests breaks Apt (which is written in Python), you're in for a bad time.

The new warning is because even user-level installations can mess with system scripts, when those scripts are run without sudo. Also, Apt has no real way to know about or understand anything you do with Pip, so that interferes with Apt's actual package management.

>installing packages [with] sudo doesn't make them available to other users

If you use sudo to install packages for the system Python, then yes they absolutely are available to all users. But you don't see them in virtual environments by default (you can change this) because the default is to ignore the system installation's `site-packages` completely (including user-level installations).
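(For reference, the "you can change this" bit is a flag on the standard library's venv module:)

  python3 -m venv --system-site-packages .venv   # this venv can also see the system/user site-packages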

> on ubuntu it seems pip has been replaced with 'python-*' debian packages

None of this is new, and it doesn't even remotely "replace" Pip. You're just facing a little more pressure to actually use the system package manager when installing packages for your system, since that can actually manage packages, and integrate them with the rest of your system (the non-Python parts). The Debian packages are specifically vetted and tested for this purpose and may include Canonical's own patches that you won't get from PyPI. On the other hand, PyPI provides vastly more different packages.

When you install in a virtual environment, you'll generally use Pip to do it (unless you use uv etc.). Because the environment is specifically created to be isolated from your system, so that Apt doesn't have to care.

Please see https://stackoverflow.com/questions/75608323 for details. It wasn't a snap decision; see https://discuss.python.org/t/pep-668-marking-python-base-env... for context. Arch implements analogous protections, too, for the same reasons (https://www.youtube.com/watch?v=35PQrzG0rG4). I recall Fedora having similar plans but I didn't hear about it being implemented yet.

by zahlman

1/12/2025 at 8:42:57 PM

wow, they've re-invented a tiny bit of Nix, purely legend!

by instig007

1/12/2025 at 8:47:14 PM

A few months ago I saw someone hacking the linker to get mundane package management working in Nix. It was bubbling up to the top of my "to try" list and that bumped it back down. It'll be good eventually, I'm sure.

by smallmancontrov

1/13/2025 at 4:08:32 AM

> It'll be good eventually, I'm sure.

not with this attitude of getting scared of things by watching someone doing something, for sure

by instig007

1/17/2025 at 4:32:00 PM

I have fixed enough (dynamic) linker issues for this lifetime and probably several more. I think I'll let the kids take these. Godspeed.

by smallmancontrov

1/12/2025 at 9:25:12 PM

That you can use without having 4 PhDs. It's pretty good. You should try it sometime when you're done fully ingesting algebraic topology theory or whatever the fuck Nix requires to know just to install figlet.

by lucsky

1/12/2025 at 11:32:54 PM

Try flox [0]. It's an imperative frontend for Nix that I've been using. I don't know how to use nix-shell/flakes or whatever it is they do now, but flox makes it easy to just install stuff.

[0]: https://flox.dev/

by Evidlo

1/13/2025 at 4:05:40 AM

> You should try it sometime when you're done fully ingesting algebraic topology theory or whatever the fuck Nix requires to know

aka how to say that you've never really tried learning Nix without saying it directly.

by instig007

1/12/2025 at 9:34:11 PM

Oh come on, it's not that hard even for packaging stuff (let alone usage). Quite trivial compared to the leetcode grind, I'd say.

by 331c8c71

1/12/2025 at 10:15:08 PM

You say that, but there seems to be a vast chasm between "can solve LC hards" and "can administer an OS," even though the latter is generally not at all abstract, extremely well-documented, and almost certainly has associated man pages.

by sgarland

1/12/2025 at 9:37:52 PM

Everybody will reinvent Nix if they are in software engineering long enough ...

by amelius

1/12/2025 at 10:00:43 PM

If only... The majority will use whatever is shoved up their a##, be it Docker or anything else.

by 331c8c71

1/12/2025 at 10:44:38 PM

Nix people are more annoying than Rust Defense Force.

I use Windows, and not WSL. Nix does literally nothing for me.

by forrestthewoods

1/13/2025 at 4:06:42 AM

[flagged]

by instig007

1/13/2025 at 4:11:34 AM

There’s vastly more Windows devs than you realize. Vastly.

by forrestthewoods