# Use your Ollama managed models with LM Studio!

A utility script that lets LM Studio use your Ollama-managed models by creating symbolic links between Ollama's model storage and the directory layout LM Studio expects.
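The core idea can be sketched in a few lines of shell: Ollama stores model weights as content-addressed blobs, and the bridge exposes each blob under a human-readable `.gguf` name via a symlink. The sketch below uses throwaway temp directories and a fabricated digest so it is safe to run anywhere; the real paths are the ones listed under File Locations.

```shell
#!/bin/sh
# Minimal sketch of the bridging idea, using throwaway paths instead of
# the real ~/.ollama/models/blobs and ~/publicmodels/lmstudio directories.
blobs=$(mktemp -d)            # stands in for Ollama's blob store
bridge=$(mktemp -d)/lmstudio  # stands in for the bridge output directory

# Ollama keeps weights as content-addressed blobs named sha256-<digest>
printf 'fake-gguf-bytes' > "$blobs/sha256-abc123"

# The bridge gives each blob a model-shaped name LM Studio can list
mkdir -p "$bridge/llama3"
ln -s "$blobs/sha256-abc123" "$bridge/llama3/llama3.gguf"

# The symlink resolves to the original blob; no data is copied
readlink "$bridge/llama3/llama3.gguf"
cat "$bridge/llama3/llama3.gguf"
```

Because only links are created, the bridge costs no extra disk space and stays in sync with the blobs Ollama already manages.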
## Prerequisites

- Ollama installed with at least one model downloaded
- LM Studio installed
- `jq` (command-line JSON processor)
  - macOS: `brew install jq`
  - Linux: `sudo apt-get install jq` (or your distribution's equivalent)
  - Windows: `choco install jq`
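You can sanity-check the prerequisites from a terminal before running anything. The `need` helper below is just for this check, not part of the bridge script:

```shell
#!/bin/sh
# Report whether each required tool is on PATH.
# `need` is a throwaway helper for this check, not part of the bridge script.
need() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok:      $1"
  else
    echo "missing: $1"
  fi
}

need jq      # JSON processor used to parse Ollama's manifests
need ollama  # confirms the Ollama CLI is installed
```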
## Installation

- Clone the repository:
  `git clone https://github.com/eelbaz/ollama-lmstudio-bridge.git`
  `cd ollama-lmstudio-bridge`
- Make the script executable:
  `chmod +x ollama-lmstudio-bridge.sh`
## Usage

- Run the script:
  `./ollama-lmstudio-bridge.sh`

The script will:
- Scan your Ollama models
- Create a `publicmodels/lmstudio` directory in your home folder
- Create symbolic links to your Ollama model files
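The scan step can be sketched as follows. Ollama's manifests are OCI-style JSON in which the model weights appear as a layer with media type `application/vnd.ollama.image.model`; the sample manifest below is fabricated for illustration, and real manifests live under the paths listed in File Locations:

```shell
#!/bin/sh
# Sketch of the scan step against a fabricated manifest (real manifests
# live under ~/.ollama/models/manifests/registry.ollama.ai/...).
command -v jq >/dev/null 2>&1 || { echo "jq not installed; skipping"; exit 0; }

manifest=$(mktemp)
cat > "$manifest" <<'EOF'
{
  "layers": [
    { "mediaType": "application/vnd.ollama.image.model",
      "digest": "sha256:abc123" }
  ]
}
EOF

# Pick out the digest of the model-weights layer; the symlink target is the
# matching blob file (Ollama names blob files sha256-<digest>).
digest=$(jq -r '.layers[]
  | select(.mediaType == "application/vnd.ollama.image.model")
  | .digest' "$manifest")
echo "blob file: sha256-${digest#sha256:}"
```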
In LM Studio:
- Go to Settings
- Set the Models Directory to the path shown by the script
- Your Ollama models should now appear in the model dropdown
## Supported Platforms

- macOS
- Linux
- Windows (requires Developer Mode or Administrator privileges for symlink creation)
## File Locations

macOS/Linux:
- Ollama manifests: `~/.ollama/models/manifests/registry.ollama.ai`
- Bridge output: `~/publicmodels/lmstudio`
Windows:
- Ollama manifests: `%USERPROFILE%\AppData\Local\ollama\models\manifests\registry.ollama.ai`
- Bridge output: `%USERPROFILE%\Documents\publicmodels\lmstudio`
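On macOS/Linux you can confirm which models Ollama has downloaded by listing the manifests directory directly; each manifest's path encodes the registry, model name, and tag:

```shell
#!/bin/sh
# List model manifests Ollama has downloaded (macOS/Linux paths).
manifests="$HOME/.ollama/models/manifests/registry.ollama.ai"

if [ -d "$manifests" ]; then
  # Each file's path encodes registry/namespace/model/tag
  find "$manifests" -type f
else
  echo "No Ollama manifests found at $manifests"
fi
```

If this prints nothing (or the "not found" message), download a model with Ollama first; the bridge has nothing to link otherwise.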
## Troubleshooting

- **Symlink Creation Fails**: On Windows, enable Developer Mode or run the script as Administrator
- **Models Not Found**: Ensure you have downloaded models through Ollama first
- **jq Not Found**: Install `jq` using your system's package manager (see Prerequisites)
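If models appear in LM Studio but fail to load, a symlink may point at a blob that no longer exists (for example, after removing a model with `ollama rm`). `find -L` can spot such dangling links, demonstrated here on a throwaway directory; point `dir` at your bridge output directory to check real links:

```shell
#!/bin/sh
# Find symlinks whose targets no longer exist. Demonstrated on a temp
# directory; set dir to ~/publicmodels/lmstudio to check the real links.
dir=$(mktemp -d)
touch "$dir/blob"
ln -s "$dir/blob" "$dir/good.gguf"          # resolves fine
ln -s "$dir/deleted-blob" "$dir/bad.gguf"   # target never existed

# With -L, find follows links, so only links whose target is missing
# still test as type l -- these are the dangling ones.
find -L "$dir" -type l
```

Re-running the bridge script after removing models should clean up or recreate the links to match Ollama's current state.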
## License

MIT License - see the LICENSE file for details.