vladimirivic (Contributor)

Summary:
Adds `MessageAttachment.base64(path)` so that sending an attachment looks like this:

```python
from llama_stack_client.lib.inference.utils import MessageAttachment

# assumes `client` is an initialized LlamaStackClient (see Test Plan below)
response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-11B-Vision-Instruct",
    messages=[
        {
            "role": "user",
            "content": {
                "type": "image",
                "image": {
                    "data": MessageAttachment.base64("images/tennis-game.png")
                }
            }
        },
        {
            "role": "user",
            "content": "What's in this image?",
        }
    ]
)
```
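
For reference, a minimal sketch of what a helper like this could look like, assuming it simply reads the file and returns its contents as a base64-encoded string (an illustration, not necessarily the PR's actual implementation):

```python
import base64


class MessageAttachment:
    @staticmethod
    def base64(path: str) -> str:
        # Read the file as raw bytes and return them as a base64-encoded
        # UTF-8 string, ready to embed in an image content part's "data" field.
        with open(path, "rb") as f:
            return base64.b64encode(f.read()).decode("utf-8")
```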

Test Plan:
```bash
pip install .
```

Start a new notebook and run:

```python
from llama_stack_client import LlamaStackClient
from llama_stack_client.lib.inference.utils import MessageAttachment

# assumes a llama-stack server is running locally on port 8321
client = LlamaStackClient(
    base_url='http://localhost:8321'
)

response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-11B-Vision-Instruct",
    messages=[
        {
            "role": "user",
            "content": {
                "type": "image",
                "image": {
                    "data": MessageAttachment.base64("images/tennis-game.png")
                }
            }
        },
        {
            "role": "user",
            "content": "What's in this image?",
        }
    ]
)

print(response)
```
@hardikjshah merged commit 13ab205 into main on Feb 6, 2025 (3 checks passed)
@hardikjshah deleted the pr125 branch on February 6, 2025 at 22:41