
pyerrors does not work with the upcoming numpy 2 release #231

Closed
fjosw opened this issue Apr 1, 2024 · 8 comments · Fixed by #239

Comments

@fjosw
Owner

fjosw commented Apr 1, 2024

pyerrors raises multiple errors in combination with the numpy 2 release candidate.

After the relevant changes have made it to PyPI, we will probably have to bump our minimum dependency versions and potentially drop support for Python 3.8, as it does not work with numpy 2 (its EOL is 31 Oct 2024).

@s-kuberski
Collaborator

Thanks a lot for anticipating this and for fixing the problems in the dependencies! I am fine with dropping python 3.8 in the future.

@fjosw
Owner Author

fjosw commented Jun 17, 2024

NumPy 2.0 has now been released. With autograd no longer being maintained, we might have to think about alternatives if my proposed changes are not included in a new release (let's see). Things that come to my mind:

  • Fork autograd and maintain a numpy2 compatible version ourselves.
  • Migrate to JAX.
  • Include relevant code for automatic differentiation in our own code base.
  • ...

Any opinions @s-kuberski @jkuhl-uni @JanNeuendorf @PiaLJP ?

@JanNeuendorf
Contributor

Maybe we should first write down all the "features" of autograd that are currently being used. What is the actual functionality that needs to be replaced?

@fjosw
Owner Author

fjosw commented Jun 17, 2024

Autograd is mainly needed for automatic differentiation and error propagation in fits, roots, integration, and linear algebra functions. It conveniently provides the relevant wrappers for most numpy and scipy functions.

  • Forking the project should be fairly easy, as I already proposed the relevant changes in Fix numpy v2 breaking changes HIPS/autograd#618, but that of course adds maintenance overhead for us.
  • JAX is partially a successor to autograd, so the API is fairly similar; porting the code is a bit of work but should be doable. Downsides:
    • JAX is a research code and might change significantly.
    • The binaries are much larger (the jax + jaxlib wheels are > 60 MB on my machine vs 50 kB for autograd), and JAX can do much more than we actually need.
    • I evaluated performance for our specific use case a while ago and JAX was actually slower, as it is mainly optimized for GPUs/TPUs (this might have changed in the meantime).
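To make the "fairly similar API" point concrete, here is a sketch of the same kind of toy derivative in JAX. This is an assumption about how a port would look, not tested against pyerrors:

```python
# Hedged sketch: a toy derivative in JAX. A port from autograd would
# largely amount to swapping the import lines, since function bodies
# written against the numpy-style API stay mostly unchanged.
import jax
import jax.numpy as jnp


def f(x):
    return jnp.sin(x) ** 2


df = jax.grad(f)  # same grad-of-a-function interface as autograd.grad
value = df(1.0)   # analytically sin(2.0), computed in float32 by default
```

One porting caveat: JAX computes in float32 unless 64-bit mode is enabled via `jax.config.update("jax_enable_x64", True)`, which matters for error-propagation precision.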

There is no urgency yet, but to keep pyerrors up and running with future python versions we will need to find a solution at some point.

@jkuhl-uni
Collaborator

Hey,
overall, I agree with Fabian on this matter. Those are also the options I see.
I read through your PR on the autograd repo. I think that forking would be the most elegant option here, as the functionality for other users of autograd could potentially be preserved. In a perfect world, I would even go so far as to make an autograd2 with numpy>=2.0.0 as a requirement. However, as you mentioned, the maintenance of the code for upcoming releases of scipy and numpy could be significant (even though I am sure we would not be the only ones working on it and others would help out). This does not seem practical for us, but I'd like to have a look into the autograd codebase to assess that.

If this is not assessed as a viable option by the others, I'd advocate for migrating the relevant code into pyerrors. JAX seems to have a rather different focus and includes autograd-style differentiation only as a side feature, if I read its description right.

Last, I want to point out that autograd and JAX are probably not the only options, so we could also have a look at other libraries for automatic differentiation.

@jkuhl-uni
Collaborator

Hey,
just FYI: while we were talking yesterday, someone else already forked the repo and incorporated @fjosw's changes. See here: mfschubert/autograd#1.

@s-kuberski
Collaborator

Hi,
thanks a lot for having this in mind and for your collection of suggestions. The easiest option would of course be to hope for someone to take care of your pull request. I'd propose to wait a bit and monitor the situation; it does not seem hopeless.

If, however, autograd won't be maintained anymore and we'll run into problems in the long run, we of course have to find a solution. In principle, I'd be inclined to test JAX, but I see the problem that a change could potentially break many workflows. This might be the best solution in the long run, but only if we have the feeling that we could gain anything else from the change (performance, flexibility, ...). Maintaining our own fork of autograd might be less stressful...

Let's wait a bit longer.

@fjosw
Owner Author

fjosw commented Aug 22, 2024

We managed to create a new autograd release that supports numpy 2 (https://github.com/HIPS/autograd/releases/tag/v1.7.0). I'm preparing a new pyerrors release that also supports numpy 2 and drops Python 3.8 support (#239).
