
torch.log(): cast int arguments to float32 #2017

Merged (2 commits into apple:main, Nov 7, 2023)
Conversation

@pcuenca (Contributor) commented Oct 16, 2023

torch.log automatically casts integer arguments to float32:

>>> torch.log(torch.tensor([10], dtype=torch.int32))
tensor([2.3026])

However, the same operation fails when converting to Core ML.

This patch makes it possible to convert mistral models, such as mistralai/Mistral-7B-Instruct-v0.1.
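For reference, a minimal sketch that reproduces the conversion failure this patch addresses, using only public coremltools APIs (the model class and the input name "x" are illustrative, not part of this PR):

import numpy as np
import torch
import coremltools as ct

class LogModel(torch.nn.Module):
    def forward(self, x):
        # torch.log promotes integer inputs to float32 on the PyTorch side
        return torch.log(x)

example = torch.tensor([10], dtype=torch.int32)
traced = torch.jit.trace(LogModel().eval(), example)

# Before this patch, the conversion below failed because the Core ML log
# op does not accept integer inputs; with the fix, the converter casts
# int arguments to float32 before emitting the op.
mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="x", shape=example.shape, dtype=np.int32)],
)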

pcuenca added a commit to huggingface/exporters referencing this pull request on Oct 16, 2023: "Includes a workaround for apple/coremltools#2017"
@junpeiz (Collaborator) left a comment


Thank you for fixing this issue! LGTM.
I kicked off a CI run: https://gitlab.com/coremltools1/coremltools/-/pipelines/1043280060

@pcuenca (Contributor, Author) commented Oct 24, 2023

Thanks @junpeiz! Looks like the CI run finished successfully, let me know if you'd like any changes :)

@junpeiz (Collaborator) commented Oct 27, 2023

Great! We will merge it after the release is finished. Thank you for your patience!

@pcuenca (Contributor, Author) commented Oct 27, 2023

Oh, of course, no worries at all!

junpeiz merged commit b2f7190 into apple:main on Nov 7, 2023
pcuenca deleted the log-int-val branch on Nov 7, 2023