"llama-stack-client" Command Not Working #8

@dawenxi-007

Description

I installed llama-stack-client in a venv with pip install -r requirements.txt, using the requirements file from https://github.com/meta-llama/llama-stack-apps/blob/main/requirements.txt.

However, running llama-stack-client -h gave me the following error:

(llamastk_localgpu_env) tao@r7625h100:~/demo_1024/llamastk_metaflow/llama-stack-apps$ llama-stack-client -h
Traceback (most recent call last):
  File "/home/tao/demo_1024/llamastk_metaflow/llamastk_localgpu_env/bin/llama-stack-client", line 8, in <module>
    sys.exit(main())
  File "/home/tao/demo_1024/llamastk_metaflow/llamastk_localgpu_env/lib/python3.10/site-packages/llama_stack_client/lib/cli/llama_stack_client.py", line 45, in main
    parser = LlamaStackClientCLIParser()
  File "/home/tao/demo_1024/llamastk_metaflow/llamastk_localgpu_env/lib/python3.10/site-packages/llama_stack_client/lib/cli/llama_stack_client.py", line 31, in __init__
    ModelsParser.create(subparsers)
  File "/home/tao/demo_1024/llamastk_metaflow/llamastk_localgpu_env/lib/python3.10/site-packages/llama_stack_client/lib/cli/subcommand.py", line 16, in create
    return cls(*args, **kwargs)
  File "/home/tao/demo_1024/llamastk_metaflow/llamastk_localgpu_env/lib/python3.10/site-packages/llama_stack_client/lib/cli/models/models.py", line 27, in __init__
    ModelsList.create(subparsers)
  File "/home/tao/demo_1024/llamastk_metaflow/llamastk_localgpu_env/lib/python3.10/site-packages/llama_stack_client/lib/cli/subcommand.py", line 16, in create
    return cls(*args, **kwargs)
  File "/home/tao/demo_1024/llamastk_metaflow/llamastk_localgpu_env/lib/python3.10/site-packages/llama_stack_client/lib/cli/models/list.py", line 25, in __init__
    self._add_arguments()
  File "/home/tao/demo_1024/llamastk_metaflow/llamastk_localgpu_env/lib/python3.10/site-packages/llama_stack_client/lib/cli/models/list.py", line 29, in _add_arguments
    self.endpoint = get_config().get("endpoint")
AttributeError: 'NoneType' object has no attribute 'get'
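
From the traceback, the crash happens in _add_arguments of models/list.py: get_config() returns None (most likely because no client configuration file has been written yet), so calling .get("endpoint") on it raises the AttributeError. A minimal defensive sketch of that line, shown only as an illustration and not as the upstream fix:

    # Hypothetical guard around the failing line in list.py (illustration only)
    config = get_config()  # may be None when no config file exists yet
    self.endpoint = config.get("endpoint") if config is not None else None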

pip list | grep llama shows the following versions:

llama_models       0.0.45
llama_stack        0.0.45
llama_stack_client 0.0.41

However, the Python environment reports a different version number:


(llamastk_localgpu_env) tao@r7625h100:~/demo_1024/llamastk_metaflow/llama-stack-apps$ python
Python 3.10.12 (main, Sep 11 2024, 15:47:36) [GCC 11.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import llama_stack_client
>>> print(llama_stack_client.__version__)
0.0.1-alpha.0
>>>
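
The mismatch between the pip list output (0.0.41) and llama_stack_client.__version__ (0.0.1-alpha.0) most likely just means the package's internal __version__ string is not kept in sync with its distribution metadata. The installed distribution version can be read directly from that metadata instead; a quick check, assuming the distribution name llama_stack_client:

    from importlib.metadata import version

    # Reads the version recorded in the installed distribution's metadata,
    # which is what pip list reports (0.0.41 here), rather than the module's
    # __version__ attribute.
    print(version("llama_stack_client"))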
