Commit

📝 add more description to autocomplete docs
sestinj committed May 6, 2024
1 parent 4db5b6d commit 4cd2340
Showing 1 changed file with 16 additions and 3 deletions.
19 changes: 16 additions & 3 deletions core/autocomplete/README.md
Once it has been downloaded, you should begin to see completions in VS Code.

## Setting up a custom model

All of the configuration options available for chat models can also be used for tab-autocomplete. For example, if you wanted to use a remote vLLM instance you would edit your `config.json` like this (note that the entry is not inside the `models` array), filling in the correct model name and vLLM endpoint:

```json title=~/.continue/config.json
{
  "tabAutocompleteModel": {
    "title": "Tab Autocomplete Model",
    "provider": "openai",
    "model": "<MODEL_NAME>",
    "apiBase": "<VLLM_ENDPOINT_URL>"
  },
  ...
}
```
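As an illustrative sketch (not part of the Continue docs), the shape of this entry can be checked with a few lines of Python; the `validate_tab_autocomplete` helper and the placeholder values are hypothetical, while the key names mirror the JSON above:

```python
# Sketch: validate that a tab-autocomplete entry for an OpenAI-compatible
# server (such as vLLM) carries the fields the example above uses.
def validate_tab_autocomplete(entry: dict) -> list:
    problems = []
    for key in ("provider", "model", "apiBase"):
        if key not in entry:
            problems.append("missing key: " + key)
    if entry.get("provider") != "openai":
        problems.append("a vLLM server is reached through the 'openai' provider")
    return problems

config = {
    "tabAutocompleteModel": {
        "title": "Tab Autocomplete Model",
        "provider": "openai",
        "model": "my-model",                    # placeholder model name
        "apiBase": "http://localhost:8000/v1",  # placeholder vLLM endpoint
    }
}
print(validate_tab_autocomplete(config["tabAutocompleteModel"]))  # → []
```

A check like this is only a convenience; Continue itself reads `config.json` directly, so the real requirement is simply that the three keys are filled in.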

As another example, say you want to use a different model, `deepseek-coder:6.7b-base`, with Ollama:

```json title=~/.continue/config.json
{
  "tabAutocompleteModel": {
    "title": "Tab Autocomplete Model",
    "provider": "ollama",
    "model": "deepseek-coder:6.7b-base"
  },
  ...
}
```
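If you prefer to script the edit rather than modify the file by hand, a minimal Python sketch like the following would write the Ollama entry shown above; the `set_tab_autocomplete` helper is hypothetical, and a temporary file stands in for the real `~/.continue/config.json`:

```python
import json
import os
import tempfile

# Illustrative only: insert a tabAutocompleteModel entry into a config.json,
# preserving whatever else the file already contains.
def set_tab_autocomplete(path: str, provider: str, model: str) -> None:
    with open(path) as f:
        config = json.load(f)
    config["tabAutocompleteModel"] = {
        "title": "Tab Autocomplete Model",
        "provider": provider,
        "model": model,
    }
    with open(path, "w") as f:
        json.dump(config, f, indent=2)

# Stand-in for ~/.continue/config.json so the sketch is self-contained.
path = os.path.join(tempfile.mkdtemp(), "config.json")
with open(path, "w") as f:
    json.dump({"models": []}, f)

set_tab_autocomplete(path, "ollama", "deepseek-coder:6.7b-base")
with open(path) as f:
    print(json.load(f)["tabAutocompleteModel"]["model"])  # deepseek-coder:6.7b-base
```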
