ChatLLM is an Android application built on mllm, a multimodal LLM inference engine. It supports text and image conversations fully offline.
Supported models:
| Model | Chat | Image Chat |
|---|---|---|
| PhoneLM 1.5B | ✔️ | ❌ |
| Qwen1.5 1.8B | ✔️ | ❌ |
| SmolLM 1.7B | ✔️ | ❌ |
| OpenELM 1.1B (Removed) | ✔️ | ❌ |
| Phi-3-Vision 3.8B | ✔️ | ✔️ |
| Phi-3-Vision Finetuned 3.8B | ✔️ | ✔️ |
The models can be found in the Hugging Face repository. A model is downloaded automatically on load if it is not found in the phone's download storage.
- Install and open ChatLLM.apk, and grant it permission to manage files.
- Select a model in the settings menu.
- Use the Image Reader or Chat options.
- Wait for the model to finish downloading before starting a conversation.
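If you prefer to stage a model file manually instead of waiting for the in-app download, it can be worth verifying the file's integrity first. The helper below is a hypothetical sketch (not part of ChatLLM) that compares a file against an expected SHA-256 digest:

```shell
# Hypothetical helper (not part of ChatLLM): verify a manually downloaded
# model file against an expected SHA-256 digest before moving it into the
# phone's download storage.
verify_model() {
  local file="$1" expected="$2"
  local actual
  actual=$(sha256sum "$file" | awk '{print $1}')
  if [ "$actual" = "$expected" ]; then
    echo "ok: $file"
  else
    echo "checksum mismatch: $file" >&2
    return 1
  fi
}
```

Publish the expected digest alongside the model file so users can run the check before transferring the model to the device.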
Put the downloaded libmllm_lib.a file into the following directory:

```
app/src/main/cpp/libs
```

Get the mllm code:

```shell
git clone https://github.com/UbiquitousLearning/mllm
cd mllm
```

Build mllm_lib:
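The cmake invocation below relies on `$ANDROID_NDK` pointing at an installed Android NDK. A quick pre-flight check can save a failed configure run; the helper name here is hypothetical:

```shell
# Hypothetical pre-flight check: confirm $ANDROID_NDK points at an NDK
# install that contains the CMake toolchain file the build uses.
check_ndk() {
  if [ -z "$ANDROID_NDK" ]; then
    echo "ANDROID_NDK is not set" >&2
    return 1
  fi
  if [ ! -f "$ANDROID_NDK/build/cmake/android.toolchain.cmake" ]; then
    echo "toolchain file not found under $ANDROID_NDK" >&2
    return 1
  fi
  echo "NDK found: $ANDROID_NDK"
}
```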
```shell
mkdir ../build-arm-app
cd ../build-arm-app
cmake .. \
  -DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK/build/cmake/android.toolchain.cmake \
  -DCMAKE_BUILD_TYPE=Release \
  -DANDROID_ABI="arm64-v8a" \
  -DANDROID_NATIVE_API_LEVEL=android-28 \
  -DNATIVE_LIBRARY_OUTPUT=. \
  -DNATIVE_INCLUDE_OUTPUT=. \
  -DARM=ON \
  -DAPK=ON \
  -DQNN=ON \
  -DDEBUG=OFF \
  -DTEST=OFF \
  -DQUANT=OFF \
  -DQNN_VALIDATE_NODE=ON \
  -DMLLM_BUILD_XNNPACK_BACKEND=OFF
make mllm_lib -j$(nproc)
```

Copy mllm_lib to ChatBotApp:
```shell
cp ./libmllm_lib.a ChatBotApp/app/src/main/cpp/libs/
```
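A truncated or wrong copy is easy to catch early: static libraries begin with the `ar` archive magic string `!<arch>`. The check below is a hypothetical helper, not part of the build:

```shell
# Hypothetical helper: confirm the copied file is an ar static archive
# (static libraries start with the 7-byte magic "!<arch>").
check_static_lib() {
  local lib="$1"
  if [ ! -f "$lib" ]; then
    echo "missing: $lib" >&2
    return 1
  fi
  if [ "$(head -c 7 "$lib")" = '!<arch>' ]; then
    echo "ok: $lib"
  else
    echo "not a static library: $lib" >&2
    return 1
  fi
}
```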
**Note:** ChatLLM credits the MLLM Engine and SaltedFish.



