astra-assistants-api: A backend implementation of the OpenAI beta Assistants API #396
Labels
llm-applications
Topics related to practical applications of Large Language Models in various fields
llm-function-calling
Function Calling with Large Language Models
openai
OpenAI APIs, LLMs, Recipes and Evals
Astra Assistant API Service
A drop-in compatible service for the OpenAI beta Assistants API with support for persistent threads, files, assistants, messages, retrieval, function calling, and more, backed by AstraDB (DataStax's database-as-a-service offering, powered by Apache Cassandra and jvector).
Compatible with existing OpenAI apps via the OpenAI SDKs by changing a single line of code.
Getting Started
Get started by configuring your OpenAI client with:
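Since the project advertises a single-line change to the OpenAI SDK client, a minimal sketch looks like the following. The base URL (`https://open-assistant-ai.astra.datastax.com/v1`) and the `astra-api-token` header name are assumptions for illustration; verify both against the project README.

```python
import os

# Assumed values: the service base URL and the header name below are
# illustrative; check the project README for the current spelling.
ASTRA_DB_APPLICATION_TOKEN = os.environ.get(
    "ASTRA_DB_APPLICATION_TOKEN", "AstraCS:placeholder"
)

base_url = "https://open-assistant-ai.astra.datastax.com/v1"
default_headers = {
    "astra-api-token": ASTRA_DB_APPLICATION_TOKEN,
}

# With the openai package installed, the one changed line is roughly:
# client = OpenAI(base_url=base_url, default_headers=default_headers)
print(base_url, sorted(default_headers))
```

Everything else in an existing OpenAI app stays the same; only the client construction changes.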
Or, if you have an existing Astra DB, you can pass your db_id in a second header:
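For illustration, the two headers together might look like this. The `astra-db-id` header name is an assumption; your db_id is the identifier of your existing database.

```python
import os

# Both header names here are assumptions for illustration;
# verify them against the project README.
default_headers = {
    "astra-api-token": os.environ.get(
        "ASTRA_DB_APPLICATION_TOKEN", "AstraCS:placeholder"
    ),
    "astra-db-id": os.environ.get(
        "ASTRA_DB_ID", "00000000-0000-0000-0000-000000000000"
    ),
}

# client = OpenAI(base_url=..., default_headers=default_headers)
print(sorted(default_headers))
```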
By default, the service uses AstraDB as the database/vector store and OpenAI for embeddings and chat completion.
Third-party LLM Support
We now support many third-party models for both embeddings and completion thanks to litellm. Pass the API key of your service using the `api-key` and `embedding-model` headers. For AWS Bedrock, you can pass additional custom headers:
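As a sketch, the Bedrock case would pass AWS credentials and an embedding model alongside the service token. The `LLM-PARAM-*` header names and the Titan model id below are assumptions for illustration, not confirmed by this page:

```python
import os

# All header names and the model id below are illustrative assumptions;
# consult the project README for the exact spelling.
bedrock_headers = {
    "astra-api-token": os.environ.get(
        "ASTRA_DB_APPLICATION_TOKEN", "AstraCS:placeholder"
    ),
    "embedding-model": "amazon.titan-embed-text-v1",
    "LLM-PARAM-aws-access-key-id": os.environ.get("AWS_ACCESS_KEY_ID", ""),
    "LLM-PARAM-aws-secret-access-key": os.environ.get("AWS_SECRET_ACCESS_KEY", ""),
    "LLM-PARAM-aws-region-name": os.environ.get("AWS_REGION_NAME", "us-east-1"),
}

# client = OpenAI(base_url=..., default_headers=bedrock_headers)
print(len(bedrock_headers))
```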
And again, specify the custom model for the assistant.
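Concretely, the custom model is specified at assistant creation time. A hedged sketch, where the model id `anthropic.claude-v2` and the assistant fields are illustrative placeholders (the commented call mirrors the standard OpenAI SDK's `client.beta.assistants.create`):

```python
# Illustrative parameters; the model id is a sample Bedrock model name,
# not something this document confirms.
assistant_params = {
    "name": "math-tutor",
    "instructions": "You are a personal math tutor. Answer briefly.",
    "model": "anthropic.claude-v2",  # custom third-party model
}

# With a configured client:
# assistant = client.beta.assistants.create(**assistant_params)
print(assistant_params["model"])
```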
Additional examples, including third-party LLMs (Bedrock, Cohere, Perplexity, etc.), can be found under `examples`. To run the examples using poetry, create a `.env` file in this directory with your secrets.
Coverage
See our coverage report here.
Roadmap
Suggested labels
{ "key": "llm-function-calling", "value": "Integration of function calling with Large Language Models (LLMs)" }