
Conversation

@LeiWang1999 (Member) commented Aug 8, 2025

We require flash-attn only for correctness tests.

As the title says: update flash-attn to 2.5.8, which may be a stable version with respect to the `undefined symbol: _ZN3c104cuda9SetDeviceEi` error described in Dao-AILab/flash-attention#966.
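
For reference, the mangled name demangles to c10::cuda::SetDevice(int), a torch C++ API, which points at an ABI mismatch between the flash-attn wheel and the installed torch. Below is a minimal diagnostic sketch (hypothetical, not part of this PR) that surfaces the failure at import time:

```python
# Hypothetical diagnostic for the CI failure above (not part of this PR).
# `_ZN3c104cuda9SetDeviceEi` demangles (e.g. via c++filt) to
# c10::cuda::SetDevice(int): the flash-attn wheel was compiled against a
# different torch/c10 ABI than the torch present in the environment.
import torch

print("torch:", torch.__version__, "CUDA:", torch.version.cuda)

try:
    import flash_attn  # raises ImportError: undefined symbol: _ZN3c104cuda9SetDeviceEi
    print("flash-attn:", flash_attn.__version__)
except ImportError as exc:
    print("flash-attn import failed:", exc)
```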

github-actions bot commented Aug 8, 2025

👋 Hi! Thank you for contributing to the TileLang project.

Please remember to run bash format.sh in the root directory of the project to ensure your changes are properly linted and formatted. This will help ensure your contribution passes the format check.

We appreciate you taking this step! Our team will review your contribution, and we look forward to your awesome work!

🚀

gemini-code-assist bot left a comment

Summary of Changes

Hello @LeiWang1999, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

I've made a targeted update to our test environment dependencies. The primary goal of this change is to address a specific runtime error related to flash-attn that has been impacting our CI stability. By pinning the flash-attn version to 2.5.8, I anticipate resolving this issue and ensuring our test suite runs more reliably.

Highlights

  • Dependency Update: I've updated the flash-attn dependency in requirements-test.txt to use version 2.5.8 specifically; previously, any version less than or equal to 2.2.0 was allowed (see the verification sketch after these highlights).
  • CI Stability / Bug Fix: This specific version update aims to resolve a known undefined symbol error (_ZN3c104cuda9SetDeviceEi) that has been observed with flash-attn, which should improve the stability of our continuous integration (CI) tests.
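
To confirm in CI that the pin actually took effect, a quick check along these lines could be added (a sketch, not part of this PR; it assumes importlib.metadata resolves the distribution under the name flash-attn):

```python
# Sketch: verify the environment matches the pin in requirements-test.txt.
# Assumes the distribution name "flash-attn" resolves via importlib.metadata
# (recent Pythons normalize "flash-attn" vs. "flash_attn").
from importlib.metadata import version

installed = version("flash-attn")
assert installed == "2.5.8", f"expected flash-attn 2.5.8, got {installed}"
```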
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e., a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.

| Feature | Command | Description |
| --- | --- | --- |
| Code Review | /gemini review | Performs a code review for the current pull request in its current state. |
| Pull Request Summary | /gemini summary | Provides a summary of the current pull request in its current state. |
| Comment | @gemini-code-assist | Responds in comments when explicitly tagged, both in issue comments and review comments. |
| Help | /gemini help | Displays a list of available commands. |

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments or fill out our survey to provide feedback.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double-check it and use code with caution.

gemini-code-assist bot left a comment

Code Review

This pull request updates the flash-attn dependency in requirements-test.txt to version 2.5.8, pinning it to resolve a CI issue. The change is sound. I've added one suggestion to include an inline comment explaining why this specific version is pinned, which will improve the long-term maintainability of this dependency file.

attrs
decorator
-flash-attn<=2.2.0
+flash-attn==2.5.8
gemini-code-assist bot commented on the diff (severity: medium):

For better maintainability, it's a good practice to add a comment explaining why a dependency is pinned to a specific version. This is especially helpful when it's to resolve a specific issue, as it provides context for future developers who might need to update this dependency. You can reference the issue mentioned in the pull request description.

flash-attn==2.5.8  # Pinned to fix https://github.com/Dao-AILab/flash-attention/issues/966

@LeiWang1999 changed the title from "[CI] Update flash-attn version in requirements-test.txt into 2.5.8" to "[CI] Remove Flash Attention dependency, since we require only for correctness tests" Aug 8, 2025
@LeiWang1999 changed the title from "[CI] Remove Flash Attention dependency, since we require only for correctness tests" to "[CI] Remove Flash Attention dependency" Aug 8, 2025
@LeiWang1999 merged commit 87aae29 into tile-ai:main Aug 8, 2025
3 checks passed
RubiaCx pushed a commit to RubiaCx/tilelang that referenced this pull request Nov 24, 2025
* Update flash-attn version in requirements-test.txt from <=2.2.0 to ==2.5.8

* lint fix

* Remove unused dependencies from requirements-test.txt

* Update import path for padding functions in example MHA forward variable length script

* Refactor code formatting in bert_padding.py for improved readability
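
For context, the last two commit messages suggest an import-path change roughly like the following sketch (module paths are assumptions based on the commit messages, not verified against the tilelang tree):

```python
# Sketch of the dependency removal described in the commits above.
# Before: the example imported the padding helpers through flash-attn,
# which pulls in its compiled extension at import time.
# from flash_attn.bert_padding import pad_input, unpad_input

# After: the same helpers come from a vendored bert_padding.py that ships
# alongside the example, so flash-attn is no longer required.
from bert_padding import pad_input, unpad_input
```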