doc on streaming usage litellm proxy
ishaan-jaff committed Dec 31, 2024
1 parent 83879d2 commit 60bdfb4
Showing 3 changed files with 50 additions and 9 deletions.
8 changes: 5 additions & 3 deletions docs/my-website/docs/completion/stream.md
@@ -3,9 +3,11 @@ import TabItem from '@theme/TabItem';

# Streaming + Async

- [Streaming Responses](#streaming-responses)
- [Async Completion](#async-completion)
- [Async + Streaming Completion](#async-streaming)
| Feature | LiteLLM SDK | LiteLLM Proxy |
|---------|-------------|---------------|
| Streaming |[start here](#streaming-responses) |[start here](../proxy/user_keys#streaming) |
| Async |[start here](#async-completion) |[start here](../proxy/user_keys#streaming) |
| Async Streaming |[start here](#async-streaming) |[start here](../proxy/user_keys#streaming) |

## Streaming Responses
LiteLLM supports streaming the model response back by passing `stream=True` as an argument to the completion function
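As a sketch of the pattern this hunk documents: with `stream=True`, both the SDK and the proxy return OpenAI-style chunks whose text lives in `choices[0].delta.content` (which is `None` on role/finish chunks). The `collect_deltas` helper below is hypothetical, shown here with faked chunk objects so the shape of the data is visible without a network call:

```python
from types import SimpleNamespace

def collect_deltas(chunks):
    """Join the non-empty content deltas from OpenAI-style streaming chunks."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # delta is None on role-only and finish chunks
            parts.append(delta)
    return "".join(parts)

# fake two content chunks plus a finish chunk to mimic a real stream
fake = [
    SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=c))])
    for c in ["Hello", ", world", None]
]
print(collect_deltas(fake))  # -> Hello, world
```

The same loop works unchanged on a real `completion(..., stream=True)` iterator, since LiteLLM normalizes provider responses to this chunk format.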
45 changes: 45 additions & 0 deletions docs/my-website/docs/proxy/user_keys.md
@@ -381,6 +381,51 @@ assert user.age == 25

```

### **Streaming**


<Tabs>
<TabItem value="curl" label="curl">

```bash
curl http://0.0.0.0:4000/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $OPTIONAL_YOUR_PROXY_KEY" \
-d '{
"model": "gpt-4-turbo",
"messages": [
{
"role": "user",
"content": "this is a test request, write a short poem"
}
],
"stream": true
}'
```
</TabItem>
<TabItem value="sdk" label="SDK">

```python
from openai import OpenAI
client = OpenAI(
api_key="sk-1234", # [OPTIONAL] set if you set one on proxy, else set ""
base_url="http://0.0.0.0:4000",
)

messages = [{"role": "user", "content": "this is a test request, write a short poem"}]
completion = client.chat.completions.create(
model="gpt-4o",
messages=messages,
stream=True
)

# with stream=True the client returns a Stream of chunks, not a full response
for chunk in completion:
    print(chunk.choices[0].delta.content or "", end="")

```
</TabItem>
</Tabs>

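The tabs above cover synchronous streaming; an async variant through the proxy uses the same OpenAI-compatible endpoint via `AsyncOpenAI`. This is a hedged sketch (the `stream_via_proxy` name, model, and key are illustrative; it assumes a proxy running on port 4000):

```python
import asyncio

async def stream_via_proxy(prompt: str, base_url: str = "http://0.0.0.0:4000") -> None:
    """Async-stream a chat completion through a LiteLLM proxy (OpenAI-compatible API)."""
    from openai import AsyncOpenAI  # imported lazily so the sketch parses without the package
    client = AsyncOpenAI(api_key="sk-1234", base_url=base_url)  # key only if the proxy requires one
    stream = await client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    async for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:
            print(delta, end="", flush=True)

# requires a running proxy on port 4000:
# asyncio.run(stream_via_proxy("write a short poem"))
```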

### Function Calling

Here are some examples of function calling with the proxy.
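The function-calling examples themselves are collapsed in this diff, but the request shape is the standard OpenAI `tools` format, which the proxy forwards unchanged. A hypothetical payload (the `get_current_weather` tool is illustrative, not from the source):

```python
import json

# hypothetical tool definition in the standard OpenAI "tools" format
payload = {
    "model": "gpt-4-turbo",
    "messages": [{"role": "user", "content": "What is the weather in Boston?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

# this JSON body can be POSTed to the proxy's /v1/chat/completions endpoint
print(json.dumps(payload, indent=2))
```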
6 changes: 0 additions & 6 deletions docs/my-website/docusaurus.config.js
@@ -43,12 +43,6 @@ const config = {
id: 'release_notes',
path: './release_notes', // Folder where your release notes are stored
routeBasePath: '/release_notes', // URL path for the release notes
sortPosts: (a, b) => {
// Extract folder names from the file paths
const folderA = a.metadata.permalink.split('/')[2]; // Get folder name from permalink
const folderB = b.metadata.permalink.split('/')[2];
return folderA.localeCompare(folderB); // Compare folder names
},
include: ['**/*.md', '**/*.mdx'], // Files to include
// Other blog options
},
