Merge pull request #1258 from rbrisita/fix_doc
Fix Documentation
KillianLucas authored Aug 21, 2024
2 parents aa21637 + 271d60f commit 30df78f
Showing 1 changed file with 7 additions and 8 deletions: docs/language-models/custom-models.mdx
Simply replace the OpenAI-compatible `completions` function in your language model with one of your own:

```python
def custom_language_model(messages, model, stream, max_tokens):
    """
    OpenAI-compatible completions function (this one just echoes what the user said back).
    To make it OpenAI-compatible and parsable, `choices` has to be the root property.
    The property `delta` is used to signify streaming.
    """
    users_content = messages[-1].get("content")  # Get the last message's content

    for character in users_content:
        yield {"choices": [{"delta": {"content": character}}]}

# Tell Open Interpreter to power the language model with this function
interpreter.llm.completions = custom_language_model
```
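As a quick sanity check, you can drive the completions function directly and reassemble the streamed chunks. The driver snippet and sample `messages` payload below are illustrative, not part of the official docs:

```python
# Hypothetical sanity check for the echo-style completions function above.
def custom_language_model(messages, model=None, stream=True, max_tokens=None):
    users_content = messages[-1].get("content")  # Get the last message's content
    for character in users_content:
        # Each chunk follows the OpenAI streaming shape: choices -> delta -> content
        yield {"choices": [{"delta": {"content": character}}]}

# Call it with a minimal OpenAI-style messages list and join the stream back up
chunks = custom_language_model([{"role": "user", "content": "Hi!"}])
reply = "".join(chunk["choices"][0]["delta"].get("content", "") for chunk in chunks)
print(reply)  # Hi!
```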

Then, set the following settings:
And start using it:

```python
interpreter.chat("Hi!") # Returns/displays "Hi!" character by character
```
