Issues: meta-llama/llama-stack
Llama-guard and remote::vllm model name mismatch (#365, opened Nov 4, 2024 by stevegrubb)
Model IDs that contain a colon throw an error when installing on Windows (#347, opened Oct 30, 2024 by Sandstedt)
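A likely cause behind #347 (an assumption, not confirmed in the listing itself): Windows forbids `:` in file and directory names, so a model ID containing a colon cannot be used directly as a path component. A minimal sketch of a workaround, using a hypothetical `sanitize_model_id` helper:

```python
import re

def sanitize_model_id(model_id: str) -> str:
    # Hypothetical helper: replace characters Windows disallows in
    # file names (':' among them) with a safe separator before using
    # the model ID as a directory name.
    return re.sub(r'[<>:"/\\|?*]', "-", model_id)

# "Llama3.2-11B:int4" becomes "Llama3.2-11B-int4"
print(sanitize_model_id("Llama3.2-11B:int4"))
```

This is only an illustration of the failure mode; the actual fix in llama-stack may take a different form.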
ValueError: Llama3.1-8B-Instruct not registered. Make sure there is an Inference provider serving this model. [label: question] (#345, opened Oct 29, 2024 by ducktapeonmydesk)
Issue saving and querying PDF to vector store (meta-reference) (#342, opened Oct 29, 2024 by jeffxtang)
High GPU power consumption even in standby (#337, opened Oct 28, 2024 by JoseGuilherme1904)
TypeError: expected str, bytes or os.PathLike object, not NoneType (#336, opened Oct 28, 2024 by JoseGuilherme1904)
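The exception in #336 is Python's standard error when `None` reaches a filesystem call that expects a path, commonly because an unset config value or environment variable was never checked. A minimal reproduction (not the actual llama-stack code path):

```python
import os

# Simulate a config value or environment variable that was never set.
missing_path = None

try:
    # Any API that converts its argument to a path raises here.
    os.fspath(missing_path)
except TypeError as e:
    print(e)  # expected str, bytes or os.PathLike object, not NoneType
```

Guarding with an explicit `if missing_path is None: raise ...` before the filesystem call would surface a clearer error message.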
[enhancement] Add support for Llama 3.2 models using Amazon AWS Bedrock inference (#334, opened Oct 28, 2024 by shrinitg)
How to specify the model type using the pre-built Docker image? (#331, opened Oct 27, 2024 by Travis-Barton)
Guardrail loading failed with unexpectedly large GPU memory requirement on a multi-GPU server (#328, opened Oct 25, 2024 by dawenxi-007)
Server webmethod endpoint and llama-stack-spec.yaml file mismatch (#322, opened Oct 25, 2024 by cheesecake100201)
What configs to input when building from distributions/meta-reference-gpu/build.yaml (#321, opened Oct 25, 2024 by AlexHe99)
Create a remote memory provider for Pinecone [label: good first issue] (#268, opened Oct 18, 2024 by raghotham)
PyTorch CUDA not found on a host that has CUDA with working PyTorch [label: question] (#257, opened Oct 16, 2024 by nikolaydubina)