astra-assistants-api: A backend implementation of the OpenAI beta Assistants API #396
Labels
llm-applications
Topics related to practical applications of Large Language Models in various fields
llm-function-calling
Function Calling with Large Language Models
openai
OpenAI APIs, LLMs, Recipes and Evals
Astra Assistant API Service
A drop-in compatible service for the OpenAI beta Assistants API with support for persistent threads, files, assistants, messages, retrieval, function calling and more, using AstraDB (DataStax's database-as-a-service offering powered by Apache Cassandra and jvector).
Compatible with existing OpenAI apps via the OpenAI SDKs by changing a single line of code.
Getting Started
Get started with:
Or, if you have an existing astra db, you can pass your db_id in a second header:
By default, the service uses AstraDB as the database/vector store and OpenAI for embeddings and chat completion.
Third party LLM Support
We now support many third party models for both embeddings and completion thanks to litellm. Pass the api key of your service using the `api-key` and `embedding-model` headers.

For AWS Bedrock, you can pass additional custom headers:
and again, specify the custom model for the assistant.
Additional examples including third party LLMs (bedrock, cohere, perplexity, etc.) can be found under `examples`.

To run the examples using poetry, create a `.env` file in this directory with your secrets.

Coverage
See our coverage report here.
Roadmap
Suggested labels
{ "key": "llm-function-calling", "value": "Integration of function calling with Large Language Models (LLMs)" }