[Bug] Converting Mistral-7B-Instruct-V0.2 #1483
Hi @scarlettekk, could you try compiling using the new workflow? https://llm.mlc.ai/docs/compilation/compile_models.html Let us know if you run into issues.
Hi. However, I now encounter issues in Android Studio when building the Android demo application with the new model:
I think this may have something to do with how the folder structure of the Android app differs from the documentation: the MLCChat folder is missing, its contents have been moved to the ./android/ folder, and the prepare_libs.sh script has been moved to ./android/library. As a result there is a tvm4j_core.jar under both the library/build/output directory and the app/src/main/libs directory, both of which are in scope of the Android project.

Edit: I can build successfully by not copying tvm4j_core.jar to the app/src/main/libs directory. Whether this prevents the app from functioning remains to be seen.
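The workaround above amounts to keeping a single copy of tvm4j_core.jar on the app's classpath. A minimal sketch, simulated with dummy files (the paths are the ones named in the comment; your checkout may differ):

```shell
# Simulate the duplicated layout described above (hypothetical tree).
mkdir -p android/library/build/output android/app/src/main/libs
touch android/library/build/output/tvm4j_core.jar
touch android/app/src/main/libs/tvm4j_core.jar   # the duplicate that conflicts at build time

# Keep the copy produced under library/build/output by prepare_libs.sh
# and drop the stale one bundled with the app module.
rm android/app/src/main/libs/tvm4j_core.jar

find android -name tvm4j_core.jar   # exactly one copy remains
```

With only one jar left in scope, Gradle no longer sees the same classes twice.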
Running the app shows an error on start: "Add model failed: Failed requirement." Trying to manually add the model URL errors with "Model lib null is not supported." Is my Hugging Face repo configured wrong? It's linked here.
The Android PR just got merged in #1494, with an updated documentation page as well: https://llm.mlc.ai/docs/deploy/android.html Let us know if issues persist!
Hmm, I think I saw this issue posted by someone else as well. I guess it could be due to various reasons; perhaps try the prebuilt libs first, and if those do not fail it is probably Mistral-specific. @Kartik14 it might be good to add Mistral as a prebuilt as well if we have the bandwidth.
I have just added the Mistral prebuilt lib URL in #1514. @scarlettekk, can you try this lib when building the APK?
Now we get this when preparing libs. I may have set up the prebuilt incorrectly? I downloaded the Mistral-android.tar to dist/libs and edited app-config.json to match the tarfile and the new URL.
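For reference, a sketch of the kind of app-config.json entry being described. The lib name and URL below are placeholders rather than values from this thread, and the exact keys may differ between versions of the Android app:

```json
{
  "model_libs": [
    "Mistral-7B-Instruct-v0.2-q4f16_1"
  ],
  "model_list": [
    {
      "model_url": "https://huggingface.co/<user>/<converted-repo>",
      "local_id": "Mistral-7B-Instruct-v0.2-q4f16_1"
    }
  ]
}
```

The tar dropped into dist/libs has to match the name listed under model_libs; a mismatch between the two is one plausible way to end up with a "model lib is not supported" style error.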
Someone else was getting the same error. I think this might be an issue with TVM. Can you try fetching the latest version of the TVM submodule?
No dice: 3rdparty/tvm checks out at 72a7644159cc415788f4d819c7e8196e0eef751d and still produces the same error in prepare_libs.sh.
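Checking which commit a submodule is checked out at is a plain git operation; a minimal sketch, demonstrated on a scratch repository (in an mlc-llm checkout you would run the same rev-parse inside 3rdparty/tvm after `git submodule update --init --recursive`):

```shell
# Scratch repo to demonstrate; in mlc-llm, running rev-parse inside
# 3rdparty/tvm prints the checked-out TVM commit (e.g. 72a7644... above).
tmp=$(mktemp -d)
git init -q "$tmp/repo"
git -C "$tmp/repo" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "init"
git -C "$tmp/repo" rev-parse HEAD   # prints a 40-character commit hash
```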
Seeing the same issue, using the latest repo (tvm):

```
Llama-2-7b-chat-hf-q4f32_1-android.tar
[100%] Built target mlc_llm_static
Llama-2-7b-chat-hf-q4f16_1-android.tar
[100%] Building CXX object CMakeFiles/tvm4j_runtime_packed.dir/mnt/c/AndroidProjects/MLC_AI/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o
RedPajama-INCITE-Chat-3B-v1-q4f16_1-android.tar
[100%] Built target mlc_llm_static
```
🐛 Bug

Converting Mistral-7B-Instruct-v0.2 for Android results in an error.

Environment:
- How you installed MLC-LLM (conda, source): pip
- How you installed TVM-Unity (pip, source): pip