Like fp32/fp16 or fp32/bf16. Thank you very much.
@sanbuphy Unfortunately, AITemplate doesn't support mixed-precision/autocast functionality.
@chenyang78 CC'ing you for visibility, given the recent discussion on the topic.
I see, so I cannot use AIT to compile any mixed precision models with amp.autocast in PyTorch. Thank you for your response.
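For context, here is a minimal sketch of the PyTorch `amp.autocast` pattern being discussed, i.e. the mixed-precision style AITemplate cannot compile. The model and shapes are illustrative, not from the thread:

```python
import torch

# A toy model; any module with autocast-eligible ops (e.g. linear/matmul) works.
model = torch.nn.Linear(4, 2)
x = torch.randn(3, 4)  # fp32 input

# Inside the autocast region, eligible ops run in the lower-precision dtype
# while others stay in fp32 -- this per-op dtype mixing is what a compiler
# expecting a single uniform precision (like AITemplate) cannot express.
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    y = model(x)

print(y.dtype)  # the linear op ran in bf16 under autocast
```

On CUDA the same pattern is typically written with `device_type="cuda"` and `dtype=torch.float16`.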