Problems with predictions on MacBook Air with M1 chip in Java project based on Maven #10874
You need to compile ORT from source on the M1 to get it to work in Java; the jars on Maven Central don't have a macOS ARM64 binary in them. It should compile without issue provided you have a native macOS ARM64 JVM (and compiler toolchain). Once you've done that you can copy the resulting …
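The build-from-source route described above can be sketched as follows (a sketch, not an official recipe: the branch, config, and output path are illustrative and may vary by version; it assumes git, CMake, Xcode command line tools, and an ARM64 JDK are installed):

```shell
# Sketch: building ONNX Runtime with the Java bindings on an ARM64 Mac.
# Flag names are from the ONNX Runtime build script; output paths may
# differ between versions, so check the build directory after it finishes.
git clone --recursive https://github.com/microsoft/onnxruntime.git
cd onnxruntime
./build.sh --config Release --build_shared_lib --build_java --parallel

# The jar with the native library bundled typically ends up under the
# Java build output, e.g. somewhere below build/MacOS/Release/java/
```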
Ok, thanks for the reply @Craigacp. I'm running ONNX Runtime on a Linux server, but I wanted to be able to execute tests on the MacBook Air M1 too for development purposes. I have 2 more questions:
Regards,
If you build ORT from source with the Java API enabled then you'll get a jar which has the native library packaged into it inside the … I've not investigated cross-compiling to produce fat binaries for JNI bindings, though it may be possible (facebook/rocksdb#7720 (comment)). I think the Python binaries are cross-compiled, though I've not checked this. If we do want to make fat binaries for macOS then the JNI loader will need some revision, as it currently separates out …
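For context, the platform-specific resource directory that a JNI loader of this kind resolves can be sketched roughly as below. This is an illustrative helper, not the real ai.onnxruntime loader; the directory layout follows the ai/onnxruntime/native/osx-arm64 path mentioned elsewhere in this thread.

```java
// Illustrative sketch (NOT the real ONNX Runtime loader): how a JNI
// loader typically maps os.name/os.arch onto a per-platform resource
// directory inside the jar, e.g. ai/onnxruntime/native/osx-arm64.
public class NativeDirSketch {
    static String nativeDir() {
        String os = System.getProperty("os.name").toLowerCase();
        String arch = System.getProperty("os.arch").toLowerCase();
        String osName = os.contains("mac") ? "osx"
                      : os.contains("win") ? "win"
                      : "linux";
        String archName = (arch.equals("aarch64") || arch.equals("arm64"))
                        ? "arm64" : "x64";
        return "ai/onnxruntime/native/" + osName + "-" + archName;
    }

    public static void main(String[] args) {
        System.out.println(nativeDir());
    }
}
```

A fat (universal2) binary would collapse the osx-x64/osx-arm64 split into a single directory, which is why the loader would need revision.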
No. Python's setup.py doesn't support cross-compiling. Though it's possible to cross-compile the C/C++ part and then use a separate machine to run the packaging part, that would be costly to maintain.
Ah ok. I was a bit confused by this section in the 1.10 release notes:
onnxruntime-osx-universal2-1.10.0.tgz in our release page. For mac and python, we could provide one package for x86_64 and another one for universal2. Both can be built on x86_64.
If the universal binary is not much bigger and we can get it to work on the JVM then it might be better to replace the x86_64 macOS binary in the jar with a universal one (after suitable modification of the Java native loader). However, further down in that RocksDB thread there's a note saying enabling it completely broke their Mac JVM build, so I don't think it's necessarily straightforward.
@pwittchen, as a quick fix to unblock you, you can download the onnxruntime ARM binaries from https://github.com/microsoft/onnxruntime/releases/download/v1.10.0/onnxruntime-osx-arm64-1.10.0.tgz, unzip it, extract the shared library file, and put it in your local jar file inside the ai/onnxruntime/native/osx-arm64 directory.
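The quick fix above can be sketched as a few shell commands. The library filename and the local jar path are illustrative (the exact names inside the tgz and in your Maven cache may differ by version):

```shell
# Sketch of the jar-patching quick fix (filenames and paths illustrative).
curl -LO https://github.com/microsoft/onnxruntime/releases/download/v1.10.0/onnxruntime-osx-arm64-1.10.0.tgz
tar -xzf onnxruntime-osx-arm64-1.10.0.tgz

# Recreate the resource layout the Java native loader expects.
mkdir -p ai/onnxruntime/native/osx-arm64
cp onnxruntime-osx-arm64-1.10.0/lib/libonnxruntime.*.dylib \
   ai/onnxruntime/native/osx-arm64/

# Add the library to the jar in the local Maven repository
# (adjust the version/path to match your ~/.m2 layout).
jar uf ~/.m2/repository/com/microsoft/onnxruntime/onnxruntime/1.10.0/onnxruntime-1.10.0.jar \
    ai/onnxruntime/native/osx-arm64/
```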
That won't include the JNI wrapper.
Ahh... Good point!
Thanks for your replies. I'll try to prepare a custom jar for the M1 chip later. In addition, I'd appreciate a dedicated official jar in the Maven Central repository, if possible. Regards,
@Craigacp @snnn I compiled onnxruntime on my MacBook with the command …
If you append … to the command-line arguments of build.sh, you will get universal2 binaries.
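The specific arguments were elided above. As an assumption on my part (not necessarily the exact flags the maintainer meant), the standard CMake mechanism for producing universal2 output, passed through the build script, looks like this:

```shell
# Assumption: CMAKE_OSX_ARCHITECTURES is the standard CMake variable for
# fat macOS binaries; --cmake_extra_defines is the build.sh passthrough
# for extra CMake definitions.
./build.sh --config Release --build_shared_lib --build_java --parallel \
  --cmake_extra_defines CMAKE_OSX_ARCHITECTURES="x86_64;arm64"
```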
Ok, I will check this out.
By the way, is there any chance of publishing an official macOS (or universal) jar with ONNX Runtime to the Maven Central repository, so I could use it instead of compiling and including a custom jar in the project?
We're planning this for the next release (1.12).
Release 1.12 does not fix this issue; it is still happening.
It is fixed in …
Has anyone built a working Java Inference ONNX Runtime 1.12.0 for Darwin aarch64? I am facing a weird error: …
Thanks!
Yes, though I've not done it recently. Is your JVM an x86 one or an ARM one?
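A quick way to check which architecture the running JVM reports (a native ARM JVM on an M1 typically reports "aarch64", while an Intel or Rosetta-translated JVM reports "x86_64"):

```java
public class JvmArchCheck {
    public static void main(String[] args) {
        // os.arch is a standard system property set by the JVM itself,
        // so it reflects the JVM's architecture, not the hardware's.
        System.out.println(System.getProperty("os.arch"));
    }
}
```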
Thanks, Craigacp! Yes, I accidentally used the broken Java. When I use the proper Java, I face a proxy error when running the …
I saw that this problem can be handled by turning off the proxy server (link). Unfortunately, I cannot turn off the proxy on the server I use to build this jar. Is there any alternative? Thank you a lot!
I have the following problem:
I'm using ONNX Runtime version 1.8.1 in a Java project with Maven. When I invoke: … then I get an error: … and I cannot perform prediction.
It works fine on Linux across different distros, but I have problems executing it on a MacBook Air with the M1 chip.