
Add llvm flag #547

Open

wants to merge 3 commits into base: triton-mlir
Conversation

@zhanglx13 zhanglx13 commented Mar 28, 2024

This PR allows us to set llvm backend flags, e.g. -mllvm --print-after-all, using an env var TRITON_LLVM_FLAG.

Example usage:

export TRITON_LLVM_FLAG=--amdgpu-enable-max-ilp-scheduling-strategy=1
python perf-kernels/06-fused-attention-fwd-transV.py

Don't forget to unset the flag when you no longer need it, e.g. by running export TRITON_LLVM_FLAG= (or unset TRITON_LLVM_FLAG).
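As a rough sketch of how such an env var might be consumed, the value can be read and split shell-style into individual flags before being handed to the LLVM backend. The helper name below is hypothetical; the actual hook in this PR lives in the compiler backend:

```python
import os
import shlex

def get_llvm_flags():
    # Hypothetical helper: read TRITON_LLVM_FLAG and tokenize it the
    # way a shell would, so a value like "-mllvm --print-after-all"
    # becomes ["-mllvm", "--print-after-all"].
    raw = os.environ.get("TRITON_LLVM_FLAG", "")
    return shlex.split(raw)

# Example: simulate the env var being set as in the usage above.
os.environ["TRITON_LLVM_FLAG"] = "-mllvm --print-after-all"
print(get_llvm_flags())  # ['-mllvm', '--print-after-all']
```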

@zhanglx13 zhanglx13 requested review from vgokhale and jtang10 March 28, 2024 01:56
@zhanglx13 zhanglx13 changed the base branch from main to triton-mlir March 28, 2024 01:56

scxiao commented Apr 13, 2024

Can we rename it to TRITON_LLVM_FLAG? Basically, I think it would be better for Triton env vars to have the TRITON_ prefix.


scxiao commented Apr 15, 2024

> Can we rename it to TRITON_LLVM_FLAG? Basically, I think it would be better for Triton env vars to have the TRITON_ prefix.

Thanks. Another question: can we use multiple flags together? If so, what is the format for that?

@zhanglx13 (Author)

> Can we rename it to TRITON_LLVM_FLAG? Basically, I think it would be better for Triton env vars to have the TRITON_ prefix.
>
> Thanks. Another question: can we use multiple flags together? If so, what is the format for that?

I think so; the value is processed as a string. You can do something like:
TRITON_LLVM_FLAG="--flag1 --flag2"
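To illustrate the multi-flag case: since the whole value is one string, shell-style splitting recovers each flag. The flag names here are placeholders, and shlex-based splitting is an assumption about how the value is tokenized:

```python
import os
import shlex

# A single TRITON_LLVM_FLAG value carrying several flags; whitespace
# separates them, so shell-style splitting yields one token per flag.
os.environ["TRITON_LLVM_FLAG"] = "--flag1 --flag2"
flags = shlex.split(os.environ["TRITON_LLVM_FLAG"])
print(flags)  # ['--flag1', '--flag2']
```

Quoting the value in the shell (as above) matters: without quotes, the shell would stop the assignment at the first space.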
