chore: Stack server no longer depends on llama-stack-client #4094
Conversation
I had the exact same branch locally. 😅
You also need to edit the pre-commit config; it was failing for me:
```yaml
- id: provider-codegen
  name: Provider Codegen
  additional_dependencies:
    - uv==0.7.8
  entry: ./scripts/uv-run-with-index.sh run --extra client --group codegen ./scripts/provider_codegen.py
```
And change the test setup to use `--extra client`.
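For reference, a minimal sketch of what that test setup change might look like, assuming a uv-based invocation (the `tests/unit` path is an assumption):

```bash
# Hedged sketch: opt into the optional "client" extra when running tests.
# The test path is an assumption; adjust to the repo's actual layout.
uv sync --extra client
uv run --extra client pytest tests/unit
```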
| "sqlalchemy[asyncio]>=2.0.41", # server - for conversations | ||
| ] | ||
|
|
||
| [project.optional-dependencies] |
Let's add the client as an optional dependency:
```toml
[project.optional-dependencies]
client = [
    "llama-stack-client>=0.3.0", # Optional for library-only usage
]
```
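If this extra were adopted, installing it would be explicit opt-in (standard extras syntax, nothing project-specific assumed):

```bash
# The client is only pulled in when the extra is requested:
pip install "llama-stack[client]"
# or, with uv:
uv pip install "llama-stack[client]"
```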
@leseb I don't see why this is needed. There is nothing that depends on the client, and provider codegen should not need it either.
mypy, on the other hand, needs it, so I will add it there.
Also see #4097
How about when we use it as a library via src/llama_stack/core/library_client.py? That requires both llama-stack and llama-stack-client, so in this scenario it would be natural to do `pip install llama-stack[client]`.
No?
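To illustrate, a hedged sketch of the library-mode flow being described; the class name `LlamaStackAsLibraryClient` and the `"starter"` distro name are assumptions based on the module path above:

```python
# Library mode needs both llama-stack and llama-stack-client installed.
from llama_stack.core.library_client import LlamaStackAsLibraryClient

client = LlamaStackAsLibraryClient("starter")  # distro name is an assumption
client.initialize()  # runs the Stack in-process instead of over HTTP
print([m.identifier for m in client.models.list()])
```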
But that means the client does get packaged together! That's what we want to avoid.
No, optional dependencies do not automatically get packaged or installed with the main project. They are only installed if the user explicitly requests them.
wdyt?
I don't consider this a blocker on my end; I'm willing to let it slip if you still strongly disagree. I treat this as a convenience / nice-to-have type of thing.
In fact optional dependencies DO get packaged. They just don't get installed.
One advantage of bundling is that we can bundle the correct version of the client that's supposed to work with this version of the Stack.
> In fact optional dependencies DO get packaged. They just don't get installed.

Hmm, only the metadata is packaged, not the dependency code? I've done some tests.

> One advantage of bundling is that we can bundle the correct version of the client that's supposed to work with this version of the Stack.

Yes!
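One way to check this empirically is to inspect the built wheel: the extra shows up only as a `Requires-Dist` marker in the metadata, not as vendored code (the wheel filename and version below are assumptions):

```bash
# Unpack the wheel and look at its metadata (stdlib zipfile CLI).
python -m zipfile -e llama_stack-0.3.0-py3-none-any.whl /tmp/wheel
grep Requires-Dist /tmp/wheel/llama_stack-0.3.0.dist-info/METADATA
# Expected output includes a conditional marker like:
#   Requires-Dist: llama-stack-client>=0.3.0; extra == "client"
```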
```python
    )
except ImportError as e:
    raise ImportError(
        "llama-stack-client is not installed. Please install it with `pip install llama-stack-client`."
```
| "llama-stack-client is not installed. Please install it with `pip install llama-stack-client`." | |
| "llama-stack-client is not installed. Please install it with `pip install llama-stack-client` 'uv pip install llama-stack[client]'" |
I don't see why this is needed, I disagree.
This pull request has merge conflicts that must be resolved before it can be merged. @ashwinb please rebase it. https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork
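For anyone following along, the standard fork-sync and rebase flow from the linked docs looks roughly like this (remote and branch names are assumptions):

```bash
git fetch upstream                             # upstream = the main llama-stack repo
git rebase upstream/main                       # replay this branch on top of main
git push --force-with-lease origin my-branch   # update the PR branch safely
```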
leseb left a comment
I don't want to block this work on having llama-stack-client as an optional dep of the project. We can revisit later.
Let's just make pre-commit happy, thanks.
@github-actions run precommit
Hum, has the pre-commit bot started its weekend already?
@leseb we had to delete the pre-commit bot; it had a security vulnerability. Anyway, I will update the PR to do what you suggested.
The CI failure for docker is due to Hugging Face limiting downloads of the nomic model. Landing.
This dependency has been bothering folks for a long time (cc @leseb). We really needed it due to the "library client", which is primarily used for our tests and is not part of the Stack server. Anyone who needs to use the library client can certainly install `llama-stack-client` in their environment to make that work.

Updated the notebook references to install `llama-stack-client` additionally when setting things up.
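For reference, a hedged sketch of what such an updated notebook setup cell might look like (no version pins are implied by the PR):

```python
# Notebook setup cell: install both packages explicitly, since the client
# is no longer a transitive dependency of the server.
%pip install llama-stack llama-stack-client
```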