PEP517 Build-system dependencies should be pinned for deterministic builds from source code #15
Comments
Note that we are talking about the isolated build environment; this is of course fully separate from the environment the package is actually being installed into.
Are you actually seeing a breaking change here? I can definitely see putting a major version ceiling on dependencies I know could be problematic, but I want to avoid managing dependencies going forward. This is especially true for things like Cython, where a new release could fix bugs for future Python versions or just general issues with gcc that pop up. These two scenarios are actually real for Python 3.11 and GCC 12, which only just happened. If you need reproducible builds then I recommend you pin the build deps to what you need, or even better, build a wheel specific to your platform and install that.
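For anyone following along, a minimal sketch of that workaround, assuming a setuptools/Cython-based backend; the version specifiers below are illustrative placeholders, not versions krb5 actually requires:

```sh
# Pre-install known-good build dependencies into the environment yourself
# (the pins here are examples only), then tell pip not to create an
# isolated build environment so the build uses exactly what you installed.
pip install 'setuptools==65.1.1' 'wheel==0.37.1' 'cython<3.0'
pip install --no-build-isolation krb5
```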
Yes, this is a real-life case, slightly obscure and in principle nothing to do with `krb5` itself. The trigger was a new setuptools release. Presumably our environment previously and accidentally worked with setuptools < 65.2.0, and we could build fine.

I guess the thing I am trying to discuss here is how we expect isolated builds to work. Prior to isolated builds we would be relying on whatever setuptools was already installed in the environment; "isolated builds" are just a beefed-up version of that, except the build dependencies are resolved afresh each time. Note also that in many situations the build environment is not something the end user can easily control.

Also, in the same way that wheels are built for specific versions of Python, I would also expect supported Python versions to be explicitly declared (and tested) with regards to the build system.
BTW I struggled to Google for other cases where non-deterministic build backends were reported as an issue. I guess isolated builds are relatively new and broken build dependencies rare.
For reference, the Python environment I am talking about is the Cloudfoundry Python buildpack, now fixed here (I think): https://github.com/cloudfoundry/python-buildpack/blob/master/CHANGELOG#L5
Let me know what you're thinking on this, @jborean93.
Sorry, I've been on holidays so never got back round to this. I think I can be convinced to do a better pin on setuptools, but not pinning Cython was done on purpose, as Cython introduces newer features to fix things like the problems I've just had with Python 3.11 and GCC 12, which would otherwise have required a new release. Even then, pinning setuptools doesn't fully fix this problem, as I could have pinned a release to a version that didn't work in your environment, meaning you would either still have had to disable build isolation or fix the problem with setuptools. Granted, that would have had to happen when you actually bumped the version of krb5 you were installing.

If you have an internal environment then you would be far better off building the wheel yourself once and distributing that. This brings a few benefits:
I cannot do this on PyPI, unfortunately, because I cannot statically link the krb5 libs without causing problems for other modules loaded in the same process that try to link against the system krb5 libs. But that doesn't stop you from distributing it from your own warehouse if your nodes are the same.
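As a rough sketch of that build-once approach (the paths are placeholders, and this assumes the build host is representative of the target nodes):

```sh
# On one representative build host: build the wheel once from the sdist.
pip wheel --wheel-dir ./wheels krb5

# On the application nodes: install only from the pre-built wheels and
# never fall back to building from source.
pip install --no-index --find-links ./wheels krb5
```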
No problem at all.
Exactly this. That's what I am hoping to achieve for a given fixed version of `krb5`.

Understood about pre-building our own wheels. However, in practice that may not be trivial. I don't know too much about the details of building wheels for non-pure Python packages, but given the diversity in our platforms I am not sure whether a single (Linux) wheel would work everywhere.

The main thing I am trying to discover here is, if isolated builds are meant to solve such build-time dependencies, how can this be made robust? It really should do the same as building wheels offline, right?
One more thought: this is happening on a Cloudfoundry platform. Surely that's a sufficiently large platform to warrant some thought on how this is supposed to work. The whole idea with Cloudfoundry buildpacks is that app environments are built on the fly at the point of deployment.
That's fair. I can't publish wheels for Linux on PyPI due to the policies there; I would have to statically link the krb5 lib, which causes havoc if you also need to load the system library or try to load another module that also loads krb5. I was unsure about your environment and whether it was homogeneous enough that you could have your own custom wheel.
I'm not familiar with Cloudfoundry and haven't used it before, but if they had an environment problem where setuptools would fail, then the problem arguably doesn't lie with the libraries being used but with the environment. What would have happened if I had pinned setuptools to a version that happened not to work in your environment?

Strictly speaking, I agree with you that I should probably pin these versions to ensure that an sdist stays the same over its lifecycle, but the realist in me is concerned that in the future I might get to a situation where I cannot pin to just one version and might need two different ones, say for Python 3.7 and Python 3.12. I am also concerned about the maintenance burden of manually maintaining these dependencies and updating them over time, or even dealing with requests saying someone can't use this specific version for a release and needs it to be something else. Because of this I'm going to have to say that I will continue on the current course I am on today. This gives me flexibility as a maintainer, flexibility for users of the sdist in their build environments, and better compatibility with future versions or bugfixes that become available without requiring a new release of this library.

As a slight mitigation you can add a constraints file to your environment to either skip or pin versions of the build dependencies.

I know this may not be the answer you are looking for and I do appreciate the time you've put into this issue. If this type of problem continues to appear in the future I am more than happy to look into it again, as I was really on the fence about this.
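A sketch of that constraints-file mitigation; whether pip applies it inside the isolated build environment depends on the pip version, and the pins below are examples only:

```sh
# Write a constraints file pinning (or capping) the build dependencies.
cat > constraints.txt <<'EOF'
setuptools==65.1.1
Cython<3.0
EOF

# Recent pip versions also honour PIP_CONSTRAINT when resolving the
# build dependencies for an isolated PEP 517 build.
PIP_CONSTRAINT=$PWD/constraints.txt pip install krb5
```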
The constraints tip is actually quite useful. Many thanks, @jborean93.
The migration to PEP 517 is a good move. However, there are some edge cases whereby a `pip install` from source code might fail.

Currently the build backend is configured like this:
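It is roughly of this shape (the exact specifiers in krb5's `pyproject.toml` may differ from release to release); the key point is that the requirements are open-ended ranges rather than exact pins:

```toml
[build-system]
requires = [
    "setuptools >= 61.0.0",        # any newer setuptools release is accepted at build time
    "Cython >= 0.29.29, < 3.0.0",  # any newer 0.29.x Cython release is accepted at build time
]
build-backend = "setuptools.build_meta"
```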
This means, for example, that any new release of `setuptools` could cause a `pip install` failure from source code when it's trying to build a wheel. This is not only possible after a major version update of setuptools; there are cases whereby other minor/micro changes could break the build, for example depending on the exact Python installation.
For discussion: should the build system's dependencies be fully pinned at each release of `krb5`, to ensure that for a given source distribution a binary distribution can always be rebuilt exactly as was tested at the point of the release?
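For concreteness, the fully pinned variant being proposed would look something like this; the version numbers are placeholders for whatever each krb5 release was actually tested with:

```toml
[build-system]
requires = [
    "setuptools == 65.1.1",  # placeholder: the exact version tested for this release
    "Cython == 0.29.32",     # placeholder: the exact version tested for this release
]
build-backend = "setuptools.build_meta"
```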