ez-openai: better openai python library for assistants and function calling #683
## Related issues

### #396: astra-assistants-api: A backend implementation of the OpenAI beta Assistants API

Similarity score: 0.89

- [ ] [datastax/astra-assistants-api: A backend implementation of the OpenAI beta Assistants API](https://github.com/datastax/astra-assistants-api)

**Astra Assistant API Service**

A drop-in compatible service for the OpenAI beta Assistants API, with support for persistent threads, files, assistants, messages, retrieval, function calling, and more, using AstraDB (DataStax's DB-as-a-service offering powered by Apache Cassandra and jvector). Compatible with existing OpenAI apps via the OpenAI SDKs by changing a single line of code.

**Getting Started**

Replace:

```python
client = OpenAI(
    api_key=OPENAI_API_KEY,
)
```

with:

```python
client = OpenAI(
    base_url="https://open-assistant-ai.astra.datastax.com/v1",
    api_key=OPENAI_API_KEY,
    default_headers={
        "astra-api-token": ASTRA_DB_APPLICATION_TOKEN,
    },
)
```

Or, if you have an existing Astra DB, you can pass your db_id in a second header:

```python
client = OpenAI(
    base_url="https://open-assistant-ai.astra.datastax.com/v1",
    api_key=OPENAI_API_KEY,
    default_headers={
        "astra-api-token": ASTRA_DB_APPLICATION_TOKEN,
        "astra-db-id": ASTRA_DB_ID,
    },
)
```
```python
assistant = client.beta.assistants.create(
    instructions="You are a personal math tutor. When asked a math question, write and run code to answer the question.",
    model="gpt-4-1106-preview",
    tools=[{"type": "retrieval"}],
)
```

By default, the service uses AstraDB as the database/vector store and OpenAI for embeddings and chat completion.

**Third party LLM Support**

We now support many third party models for both embeddings and completion, thanks to litellm. Pass your service's credentials via headers; for AWS Bedrock, you can pass additional custom headers:

```python
client = OpenAI(
    base_url="https://open-assistant-ai.astra.datastax.com/v1",
    api_key="NONE",
    default_headers={
        "astra-api-token": ASTRA_DB_APPLICATION_TOKEN,
        "embedding-model": "amazon.titan-embed-text-v1",
        "LLM-PARAM-aws-access-key-id": BEDROCK_AWS_ACCESS_KEY_ID,
        "LLM-PARAM-aws-secret-access-key": BEDROCK_AWS_SECRET_ACCESS_KEY,
        "LLM-PARAM-aws-region-name": BEDROCK_AWS_REGION,
    },
)
```

and again, specify the custom model for the assistant:

```python
assistant = client.beta.assistants.create(
    name="Math Tutor",
    instructions="You are a personal math tutor. Answer questions briefly, in a sentence or less.",
    model="meta.llama2-13b-chat-v1",
)
```

Additional examples, including third party LLMs (Bedrock, Cohere, Perplexity, etc.), can be found under the `examples` directory. To run the examples using poetry:
```bash
poetry install
poetry run python examples/completion/basic.py
poetry run python examples/retreival/basic.py
poetry run python examples/function-calling/basic.py
```

**Coverage**

See our coverage report here.
**Suggested labels**

{ "key": "llm-function-calling", "value": "Integration of function calling with Large Language Models (LLMs)" }

### #129: Few-shot and function calling - API - OpenAI Developer Forum

Similarity score: 0.87

- [ ] [Few-shot and function calling - API - OpenAI Developer Forum](https://community.openai.com/t/few-shot-and-function-calling/265908/10)

The thing to understand here is that function calling introduced a new role for the chat prompt messages ("role": "function"). To use few-shot examples with chat model prompts, you provide a series of alternating (possibly "fake") messages that show the model how such an exchange should go, e.g. something like the sketch below.
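The forum excerpt cuts off before its example, so here is a minimal sketch of the pattern it describes, using the legacy `functions`/"function"-role chat format; the `get_weather` function and the fabricated prior exchange are illustrative assumptions, not the forum post's own code:

```python
import openai

client = openai.OpenAI()

messages = [
    {"role": "system", "content": "You are a weather bot."},
    # A fabricated ("fake") prior exchange showing the model how to call the function:
    {"role": "user", "content": "What's the weather in Paris?"},
    {"role": "assistant", "content": None,
     "function_call": {"name": "get_weather", "arguments": '{"city": "Paris"}'}},
    {"role": "function", "name": "get_weather", "content": '{"temp_c": 18}'},
    {"role": "assistant", "content": "It's 18 °C in Paris right now."},
    # The real user question follows the few-shot examples:
    {"role": "user", "content": "What's the weather in Tokyo?"},
]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=messages,
    functions=[{
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }],
)
print(response.choices[0].message)
```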
### #132: Examples - OpenAI API

Similarity score: 0.87

- [ ] [Examples - OpenAI API](https://platform.openai.com/examples/default-time-complexity?lang=curl)

```bash
curl https://api.openai.com/v1/chat/completions
```
### #399: openai-python api doc

Similarity score: 0.86

- [ ] [openai-python/api.md at main · openai/openai-python](https://github.com/openai/openai-python/blob/main/api.md)

Add error handling for failed API requests.

- Is this a bug or feature request?
- What is the current behavior?
- What is the expected behavior?
- What is the impact of this issue?
- Possible solutions:
- Steps to reproduce:
- Additional context:

**Suggested labels**

{ "key": "ai-platform", "value": "Platforms and tools for implementing AI solutions" }

### #638: Announcing function calling and JSON mode

Similarity score: 0.86

- [ ] [Announcing function calling and JSON mode](https://www.together.ai/blog/function-calling-json-mode)

**Introduction to JSON mode and function calling**

While both JSON mode and function calling can enhance your interaction with LLMs, it's important to understand that they are not interchangeable: they serve different purposes and offer unique benefits. Specifically:
**JSON Mode**

With JSON mode, you can specify a schema for the output of the LLM. While the OpenAI API does not inherently allow for the specification of a JSON schema, we augmented the `response_format` argument with `schema`. When a schema is passed in, we enforce the model to generate output aligned with the given schema. Here's an example of how you can use JSON mode with Mixtral:

```python
import os
import json
import openai
from pydantic import BaseModel, Field

# Create client
client = openai.OpenAI(
    base_url="https://api.together.xyz/v1",
    api_key=os.environ["TOGETHER_API_KEY"],
)

# Define the schema for the output.
class User(BaseModel):
    name: str = Field(description="user name")
    address: str = Field(description="address")

# Generate
chat_completion = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    response_format={
        "type": "json_object",
        "schema": User.model_json_schema(),
    },
    messages=[
        {"role": "system", "content": "You are a helpful assistant that answers in JSON."},
        {"role": "user", "content": "Create a user named Alice, who lives in 42, Wonderland Avenue."},
    ],
)

created_user = json.loads(chat_completion.choices[0].message.content)
print(json.dumps(created_user, indent=2))
```

In this example, we define a schema for a User object that contains their name and address. The LLM then generates a response that matches this schema, providing a structured JSON object that we can use directly in our application in a deterministic way. The expected output of this example is:

```json
{
  "address": "42, Wonderland Avenue",
  "name": "Alice"
}
```

**More examples**
For more detailed information, check out our documentation on JSON mode.

**Function Calling**

With function calling, the model will output a JSON object containing arguments for external functions that are defined. After the functions are defined, the LLM will intelligently determine if a function needs to be invoked and, if it does, it will suggest the appropriate one with the correct parameters in a JSON object. After that, you can execute the API call within your application and relay the response back to the LLM to continue working. Let's illustrate this process with a simple example: creating a chatbot that has access to weather data. The function is defined in `tools`:

```python
import os
import json
import openai

# Create client
client = openai.OpenAI(
    base_url="https://api.together.xyz/v1",
    api_key=os.environ["TOGETHER_API_KEY"],
)

# Define function(s)
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                    },
                },
            },
        },
    }
]

# Generate
response = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the current temperature of New York?"},
    ],
    tools=tools,
    tool_choice="auto",
)

print(json.dumps(response.choices[0].message.dict()["tool_calls"], indent=2))
```

In this example, we define an external function that gets the current weather in a given location. We then use this function in our chat completion request. The AI model generates a response that includes calls to this function, providing real-time weather data for the requested locations. The expected output is:
```json
[
  {
    "id": "...",
    "function": {
      "arguments": "{\"location\":\"New York\",\"unit\":\"fahrenheit\"}",
      "name": "get_current_weather"
    },
    "type": "function"
  }
]
```
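The blog excerpt stops at printing the suggested call. As a rough sketch of the relay step described above (execute the function locally, send its result back as a "tool" message, and let the model finish), assuming the `client`, `tools`, `messages`, and `response` objects from the example and a stand-in `get_current_weather` implementation:

```python
# Sketch only; not code from the blog post.
def get_current_weather(location, unit="fahrenheit"):
    # Stand-in for a real weather API call.
    return json.dumps({"location": location, "temperature": "72", "unit": unit})

message = response.choices[0].message
messages.append(message)  # Keep the assistant's tool-call turn in the history.

for tool_call in message.tool_calls:
    args = json.loads(tool_call.function.arguments)
    result = get_current_weather(**args)
    # Relay the function's output back to the model.
    messages.append({
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": result,
    })

# Ask the model to finish answering with the tool results in context.
final = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=messages,
    tools=tools,
)
print(final.choices[0].message.content)
```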
**More examples**

For more detailed information, check out our documentation on function calling.

**Conclusion**

We believe that JSON mode and function calling are a significant step forward, bringing a new level of versatility and functionality to AI applications. By enabling more structured interaction with the model and allowing for specific types of outputs and behaviors, we're confident they will be a valuable tool for developers. We can't wait to see what you build on Together AI! For more info, check out our function calling and JSON mode docs.

**Suggested labels**

{'label-name': 'JSON-structure', 'label-description': 'Describes JSON schema usage and generation for structured data output in AI interactions.', 'gh-repo': 'knowledge-repo', 'confidence': 53.09}

### #305: Home - LibreChat

Similarity score: 0.86

- [ ] [Home - LibreChat](https://docs.librechat.ai/index.html)

Features: 🖥️ UI matching ChatGPT, including Dark mode, Streaming, and 11-2023 updates.

LibreChat brings together the future of assistant AIs with the revolutionary technology of OpenAI's ChatGPT. Celebrating the original styling, LibreChat gives you the ability to integrate multiple AI models. It also integrates and enhances original client features such as conversation and message search, prompt templates, and plugins. With LibreChat, you no longer need to opt for ChatGPT Plus and can instead use free or pay-per-call APIs. We welcome contributions, cloning, and forking to enhance the capabilities of this advanced chatbot platform.

**Suggested labels**

"ai-platform"
TITLE: skorokithakis/ez-openai: Ez API, ez life
DESCRIPTION:
AGPL-3.0 license
Ez OpenAI
My opinion of the openai Python library is best illustrated by the fact that if you ask ChatGPT about it, it will usually hallucinate a more reasonable API. So, I wrote this library, because if I had to manually poll for a tool update again I would instigate the robot uprising myself.
Installation
Run this somewhere:
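The install command didn't survive this excerpt; assuming the package is published under the repository's name (worth confirming against the README), it would be:

```bash
pip install ez-openai
```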
Usage
Basic usage
Using Ez OpenAI is (hopefully) straightforward, otherwise I've failed at the one thing I set out to make:
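The README's code block was stripped from this excerpt; the following is a hypothetical sketch of what basic usage might look like, going by the description above (the `Assistant` class, `conversation.new()`, and `ask()` names are guesses, not the library's confirmed API):

```python
# Hypothetical sketch; check the ez-openai README for the real API.
from ez_openai import Assistant  # assumed import path

# Create an assistant with plain keyword arguments.
assistant = Assistant.create(
    name="Weatherbot",
    instructions="You are a helpful weather bot.",
)

# Start a conversation and ask it something.
conversation = assistant.conversation.new()
reply = conversation.ask("Hi! Can you introduce yourself?")
print(reply)
```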
Function calling
No more wizardry, just plain Python functions:
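Again, the code block was lost in the excerpt; a hypothetical sketch of what decorator-based function calling could look like here (the `openai_function` decorator, its `descriptions` argument, and the `functions=` parameter are assumptions):

```python
# Hypothetical sketch; names below are assumptions, not confirmed API.
from ez_openai import Assistant, openai_function  # assumed imports

@openai_function(descriptions={
    "city": "The city to fetch the weather for.",
})
def get_weather(city: str) -> str:
    # A real implementation would call a weather API here.
    return f"It's sunny in {city}."

# Register the plain Python function with the assistant.
assistant = Assistant.create(
    name="Weatherbot",
    instructions="You are a helpful weather bot.",
    functions=[get_weather],
)

conversation = assistant.conversation.new()
print(conversation.ask("What's the weather in Thessaloniki?"))
```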
Because assistants change (e.g. if you want to add some more functions), and it's tedious to create new ones every time, there's a helper method that will update an assistant with new functions/instructions:
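The helper's actual name and signature are not in this excerpt; purely as an illustration of the idea (the `get_and_modify` name is hypothetical):

```python
# Hypothetical helper name and signature, for illustration only.
assistant = Assistant.get_and_modify(
    id="asst_...",
    instructions="You are a helpful weather bot. Be extra cheerful.",
    functions=[get_weather],
)
```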