
Conversation


@Hashbrownsss commented Aug 4, 2025

What does this PR do?

This PR addresses the ModuleNotFoundError for the einops library by adding it as a core dependency.

It fixes issue #39811, where certain parts of the transformers library, specifically in the flash attention code, were trying to import einops without it being a declared dependency. This bug prevented users from successfully running models that rely on this library.

By adding einops to setup.py, this PR ensures that the library is automatically installed for all users, making the transformers package more robust and reliable.
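For context, declaring a package as a core dependency versus an optional extra differs only in where it is listed in setup.py. The sketch below is generic and hypothetical: the real transformers setup.py builds these lists from a shared dependency table, so this is not its literal layout.

```python
# Hypothetical setup.py excerpt, not the actual transformers layout.
from setuptools import setup

setup(
    name="transformers",
    # Core dependencies: installed for every user of the package.
    install_requires=[
        "einops",  # needed so the flash attention code can import it
        # ... other core dependencies
    ],
    # Optional extras: installed only via `pip install transformers[dev-torch]`.
    extras_require={
        "dev-torch": [
            "torch",
            # dependencies needed only for torch development
        ],
    },
)
```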

Fixes #39811

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case. This PR fixes issue Missing einops dependency causing ModuleNotFoundError #39811.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.

    This change is a dependency fix and does not require a documentation update.

  • Did you write any new necessary tests?

    This change is a dependency fix and does not require a new test. The existing test suite was used to confirm that the einops dependency is now correctly installed.

Who can review?

@ArthurZucker
@ivarflakstad
@iforgetmyname

Collaborator

@ArthurZucker left a comment


Thanks! Though I think this should be added to the npu extra instead!

@Hashbrownsss force-pushed the fix-einops-dependency branch from cd473fb to 8c11f1b on August 5, 2025 at 08:33
@Hashbrownsss
Author

Hey! Thanks for the quick feedback. I couldn't find a dedicated npu extra, though, so I've updated the PR to add einops to dev-torch instead, as it seems like the most apt place for the dependency. I hope that works!

@vasqu
Contributor

vasqu commented Aug 7, 2025

It seems like the einops dependency is only due to the un-/padding functions. It would be best to just rewrite them with normal torch and/or use the fa3 equivalents in the fa modeling utils file - no need for the einops dependency.
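The un-/padding helpers in question use einops only to flatten the batch and sequence dimensions and select the non-padded positions, which plain torch can do with reshape and index selection. Below is a minimal torch-only sketch of what such helpers might look like; the function names, signatures, and return values here are illustrative, not the actual ones in the fa modeling utils file.

```python
import torch

def unpad_input(hidden_states, attention_mask):
    """Drop padded positions. hidden_states: (batch, seqlen, dim);
    attention_mask: (batch, seqlen) with 1 for real tokens, 0 for padding."""
    seqlens = attention_mask.sum(dim=-1, dtype=torch.int32)
    # Flat indices of the non-padded positions across the whole batch.
    indices = torch.nonzero(attention_mask.flatten(), as_tuple=False).flatten()
    # Cumulative sequence lengths, prepended with 0 (flash-attn-style layout).
    cu_seqlens = torch.nn.functional.pad(
        torch.cumsum(seqlens, dim=0, dtype=torch.int32), (1, 0)
    )
    max_seqlen = int(seqlens.max())
    # einops' rearrange(x, "b s ... -> (b s) ...") is just a reshape.
    flat = hidden_states.reshape(-1, *hidden_states.shape[2:])
    return flat[indices], indices, cu_seqlens, max_seqlen

def pad_input(unpadded, indices, batch, seqlen):
    """Inverse of unpad_input: scatter tokens back, zero-filling the padding."""
    out = torch.zeros(
        batch * seqlen, *unpadded.shape[1:],
        dtype=unpadded.dtype, device=unpadded.device,
    )
    out[indices] = unpadded
    return out.reshape(batch, seqlen, *unpadded.shape[1:])
```

Round-tripping through these two functions recovers the original tensor at the non-padded positions and zeros elsewhere, which is all the flash attention path needs.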

@vasqu
Contributor

vasqu commented Aug 7, 2025

#40002 will probably remove the einops dependency, but I will keep you updated if not.

@Hashbrownsss
Author

That sounds good! I'll close this PR and keep an eye on #40002.

