
@baijum baijum commented Feb 10, 2025

What does this PR do?

Updates the README example to reflect the current API.

Test Plan

Copy-paste the example from the README and check the output. Without this change, an error is raised:

python3 hello.py 
Traceback (most recent call last):
  File "/Users/bmuthuka/wa/meta-llama/baiju-experiments/hello.py", line 8, in <module>
    response = client.inference.chat_completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/myml/lib/python3.12/site-packages/llama_stack_client/_utils/_utils.py", line 274, in wrapper
    raise TypeError(msg)
TypeError: Missing required arguments; Expected either ('messages' and 'model_id') or ('messages', 'model_id' and 'stream') arguments to be given
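The TypeError above comes from a keyword-argument check inside the client. This PR page does not show the README diff itself, so the sketch below is a hypothetical re-creation of that check (not the library's actual code), illustrating why an outdated call that omits `model_id` fails while a call matching the current signature passes:

```python
def check_chat_completion_args(**kwargs):
    """Mimic the required-argument validation described in the traceback:
    callers must supply either ('messages' and 'model_id') or
    ('messages', 'model_id' and 'stream')."""
    variants = (("messages", "model_id"), ("messages", "model_id", "stream"))
    if not any(all(key in kwargs for key in variant) for variant in variants):
        raise TypeError(
            "Missing required arguments; Expected either ('messages' and "
            "'model_id') or ('messages', 'model_id' and 'stream') "
            "arguments to be given"
        )
    return kwargs


messages = [{"role": "user", "content": "hello"}]

# An outdated example that passes a different keyword (here `model=`,
# an assumed old spelling) trips the check:
try:
    check_chat_completion_args(model="some-model", messages=messages)
except TypeError as exc:
    print("outdated call fails:", exc)

# A call matching the current signature passes:
ok = check_chat_completion_args(model_id="some-model", messages=messages)
print("updated call passes with keys:", sorted(ok))
```

In the real client the same idea applies: calling `client.inference.chat_completion(...)` with `messages` and `model_id` keyword arguments satisfies the validation shown in the traceback.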

@yanxi0830 yanxi0830 merged commit d6e855e into llamastack:main Feb 14, 2025
2 checks passed