
Conversation

yanxi0830 (Contributor) commented Nov 7, 2024

TL;DR

  • add provider_data to LlamaStackClient for specifying API keys

Tests

import os

from llama_stack_client import LlamaStackClient

client = LlamaStackClient(
    base_url=f"http://{host}:{port}",
    provider_data={
        "together_api_key": os.environ.get("TOGETHER_API_KEY"),
    },
)

Run in llama-stack-apps w/ Together inference
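Note: per the patch applied later in this thread, the provider_data mapping is not sent as individual fields; the client JSON-serializes it into a single X-LlamaStack-ProviderData request header. A minimal sketch of the equivalent header construction (assuming the client's standard default_headers parameter):

import json
import os

# Sketch: how provider_data reaches the server, per the patch in this thread.
provider_data = {"together_api_key": os.environ.get("TOGETHER_API_KEY")}
headers = {"X-LlamaStack-ProviderData": json.dumps(provider_data)}

# Passing provider_data=... above is equivalent to constructing the client
# with default_headers=headers.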

yanxi0830 marked this pull request as ready for review November 7, 2024 06:35
ashwinb (Contributor) left a comment


This works -- but we need to figure out how to make this work across the next Stainless generation (i.e., we need a check which makes sure this won't get overwritten silently if someone forgets something.)

yanxi0830 (Contributor, Author) replied:

> This works -- but we need to figure out how to make this work across the next Stainless generation (i.e., we need a check which makes sure this won't get overwritten silently if someone forgets something.)

Our Stainless sync script pushes changes to a separate branch of this repo, and we currently open a PR every time we sync Stainless. I think we can just add a CI check to see whether there are any changes to this _client.py file.
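For example, a minimal sketch of such a guard (hypothetical script name; the marker strings are taken from the patch below), which fails CI if the hand-applied provider_data patch is missing from the generated _client.py:

# check_client_patch.py -- hypothetical CI guard that fails if the
# hand-applied provider_data patch is missing from the generated _client.py.
import pathlib
import sys

CLIENT = pathlib.Path("src/llama_stack_client/_client.py")
REQUIRED_MARKERS = [
    "provider_data: Mapping[str, Any] | None = None,",
    'default_headers["X-LlamaStack-ProviderData"] = json.dumps(provider_data)',
]

source = CLIENT.read_text()
missing = [m for m in REQUIRED_MARKERS if m not in source]
if missing:
    print("provider_data patch missing from _client.py:", missing)
    sys.exit(1)
print("provider_data patch present")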

yanxi0830 merged commit f048589 into main Nov 7, 2024
3 checks passed
yanxi0830 deleted the patch_api_key branch November 7, 2024 06:54
ashwinb (Contributor) commented Nov 7, 2024

Can we try to apply this same exact patch to the _client.py every single time? If it fails, we know something new has come up.

yanxi0830 (Contributor, Author) commented Nov 15, 2024

Scripts for applying the patch:

# Add the json import after the __future__ import
sed -i '' '/from __future__/a\
import json
' $META_LLAMA_PYTHON_SDK_REPO/src/llama_stack_client/_client.py


# Add provider_data parameter to both sync and async client __init__ methods
sed -i '' '/def __init__/,/_strict_response_validation: bool = False,/{/_strict_response_validation: bool = False,/a\
        provider_data: Mapping[str, Any] | None = None,
}' $META_LLAMA_PYTHON_SDK_REPO/src/llama_stack_client/_client.py

# Add provider_data handling logic after base_url check in both sync and async clients
sed -i '' '/base_url = f"http:\/\/any-hosted-llama-stack.com"/a\
\
        if provider_data is not None:\
            if default_headers is None:\
                default_headers = {}\
            default_headers["X-LlamaStack-ProviderData"] = json.dumps(provider_data)' $META_LLAMA_PYTHON_SDK_REPO/src/llama_stack_client/_client.py

cdoern pushed a commit to cdoern/llama-stack-client-python that referenced this pull request Aug 8, 2025
This is a follow-up to: add client-side utility for getting OAuth tokens simply (llamastack#230)

- Add extra_headers parameter to ReActAgent.__init__ method
- Pass extra_headers to the parent Agent class (see the sketch below)
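A rough sketch of the pattern that commit describes (class and parameter names simplified for illustration; the real ReActAgent and Agent constructors take many more arguments):

# Illustrative only: a subclass accepting extra_headers and forwarding it to
# the parent class, as the referenced commit describes for ReActAgent.
from typing import Any, Dict, Optional

class Agent:
    def __init__(self, *, extra_headers: Optional[Dict[str, str]] = None, **kwargs: Any) -> None:
        # e.g. an OAuth token or the X-LlamaStack-ProviderData header
        self.extra_headers = extra_headers or {}

class ReActAgent(Agent):
    def __init__(self, *, extra_headers: Optional[Dict[str, str]] = None, **kwargs: Any) -> None:
        # Pass extra_headers through to the parent Agent class.
        super().__init__(extra_headers=extra_headers, **kwargs)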