VS Code extension to integrate with a locally running Code Llama LLM
- Download and install `ollama`
- Pull the three codellama models:

  ```shell
  ollama pull codellama:13b-instruct
  ollama pull codellama:13b-code
  ollama pull codellama:13b-python
  ```
Both of the following modes (instruct and auto-complete) are triggered with the `Trigger Inline Suggestion` command in VS Code (open the Command Palette with Cmd + Shift + P). Automatic triggers are not supported.
Trigger `Codellama: Ask` and provide a prompt for instruction-based question answering. This uses `codellama:13b-instruct`.
Trigger `Codellama: Explain` to explain the selected code. If nothing is selected, it will aim to explain the full document. This uses `codellama:13b-instruct`.
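Under the hood, both instruct commands talk to the local Ollama server's HTTP API. A minimal sketch of the kind of request body the Ask mode might send (`buildAskRequest` is a hypothetical helper for illustration, not the extension's actual code):

```typescript
// Sketch: build a request body for Ollama's /api/generate endpoint.
// buildAskRequest is illustrative; the extension's real logic may differ.
interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

function buildAskRequest(prompt: string): GenerateRequest {
  return {
    model: "codellama:13b-instruct", // instruct model handles Q&A prompts
    prompt,
    stream: false, // wait for the full response instead of streaming tokens
  };
}

// The body would then be POSTed to the local Ollama server, e.g.:
// fetch("http://localhost:11434/api/generate", {
//   method: "POST",
//   body: JSON.stringify(buildAskRequest("Explain this regex")),
// });
```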
Write any code and trigger a code completion for it with `Trigger Inline Completion`. Based on the filetype, it will use `codellama:13b-python` for Python and `codellama:13b-code` for other languages.
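The filetype-based model choice above can be sketched as a simple mapping on the editor's language id (`pickModel` is a hypothetical helper, not the extension's actual code):

```typescript
// Sketch: choose a codellama model from a VS Code language id.
// pickModel is illustrative; the extension's real selection logic may differ.
function pickModel(languageId: string): string {
  return languageId === "python"
    ? "codellama:13b-python" // Python-tuned completion model
    : "codellama:13b-code"; // general-purpose completion model
}
```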
When switching languages or models within a session, the first prompt after a switch can be slow, because the new model must be loaded into memory. If you end up loading all three models, you might run out of RAM.
Enjoy!