
[JIT] No module named 'flashinfer.jit.aot_config' when install from source #756

Closed
ByronHsu opened this issue Jan 27, 2025 · 1 comment
ByronHsu commented Jan 27, 2025

Steps to reproduce:

  1. Start from a clean container built from the Dockerfile.
  2. Follow the Installation guide to install in JIT mode from source:

     pip install --no-build-isolation --verbose --editable .

  3. Importing flashinfer then fails with the following error:
$ python
Python 3.12.8 (main, Jan 14 2025, 22:49:14) [Clang 19.1.6 ] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import flashinfer
/workspaces/flashinfer/.venv/lib/python3.12/site-packages/torch/_subclasses/functional_tensor.py:295: UserWarning: Failed to initialize NumPy: No module named 'numpy' (Triggered internally at ../torch/csrc/utils/tensor_numpy.cpp:84.)
  cpu = _conversion_method_template(device=torch.device("cpu"))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/workspaces/flashinfer/flashinfer/__init__.py", line 17, in <module>
    from .activation import gelu_and_mul as gelu_and_mul
  File "/workspaces/flashinfer/flashinfer/activation.py", line 21, in <module>
    from .jit import gen_act_and_mul_module, has_prebuilt_ops, load_cuda_ops    # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspaces/flashinfer/flashinfer/jit/__init__.py", line 20, in <module>
    from .aot_config import prebuilt_ops_uri as prebuilt_ops_uri
ModuleNotFoundError: No module named 'flashinfer.jit.aot_config'
>>> 

I found that aot_config.py is only generated when AOT is enabled. I ran the install command for AOT, and aot_config.py was indeed generated:

TORCH_CUDA_ARCH_LIST="9.0" FLASHINFER_ENABLE_AOT=1 pip install --no-build-isolation --verbose --editable .

After that, importing flashinfer worked. However, shouldn't a JIT install from source also work?

ByronHsu changed the title from [JIT Editable Installation] No module named 'flashinfer.jit.aot_config' to [JIT Install from Source] No module named 'flashinfer.jit.aot_config' on Jan 27, 2025
ByronHsu changed the title from [JIT Install from Source] No module named 'flashinfer.jit.aot_config' to [JIT] No module named 'flashinfer.jit.aot_config' when install from source on Jan 27, 2025
yzh119 commented Jan 27, 2025

This should have been fixed in #757.

yzh119 added a commit that referenced this issue Jan 27, 2025
This pull request includes changes to the `flashinfer/jit/__init__.py`
file to improve the import structure and handle the `prebuilt_ops_uri`
import conditionally.

Improvements to import structure:

* Removed the unconditional import of `prebuilt_ops_uri` from
`aot_config` and added it conditionally within the try-except block to
handle cases where `_kernels` or `_kernels_sm90` are not available.
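The fix described above can be sketched as follows. This is a minimal illustration of the conditional-import pattern, not the actual flashinfer code: the helper name `load_prebuilt_ops` and the exact module paths are assumptions for the example.

```python
import importlib

def load_prebuilt_ops(pkg: str = "flashinfer"):
    """Try the AOT import path; fall back gracefully for JIT-only installs.

    aot_config.py only exists when the package was built with
    FLASHINFER_ENABLE_AOT=1, so importing it unconditionally breaks
    JIT installs. Guarding it with the same try-except that probes for
    the compiled kernels avoids the ModuleNotFoundError.
    """
    try:
        # Both modules are only present after an AOT build.
        importlib.import_module(f"{pkg}._kernels")
        aot_config = importlib.import_module(f"{pkg}.jit.aot_config")
        return True, aot_config.prebuilt_ops_uri
    except ImportError:
        # JIT-only install: no prebuilt ops, kernels are compiled on demand.
        return False, set()
```

With this pattern, a JIT install simply reports no prebuilt ops instead of crashing at import time.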

cc @ByronHsu
yzh119 closed this as completed on Jan 27, 2025