Built on SBERT (Sentence-BERT) and SpaCy, ChatsAPI enables seamless natural language processing for AI-powered conversational agents. With hybrid search capabilities and an extensible architecture, ChatsAPI offers blazing-fast performance and intuitive route management.
- SBERT & SpaCy-Based NLP: Combines the power of Sentence-BERT embeddings and SpaCy for intelligent semantic matching and entity extraction.
- Hybrid Search: Combines HNSWlib-based approximate nearest-neighbor search with Okapi BM25 token matching for efficient query handling.
- Dynamic Routing: Easily define conversational routes with decorators.
- Parameter Extraction: Automatically extract parameters from user input with flexible type handling.
- LLM Integration: Integrates with popular LLMs such as OpenAI, Gemini, and LlamaAPI for extended conversational capabilities.
- Conversation Management: Supports multi-session conversation handling with unique session IDs.
Install the package via pip:
```bash
pip install chatsapi
```
Initialize a ChatsAPI instance with your chosen LLM backend:

```python
from chatsapi import ChatsAPI

chat = ChatsAPI(
    llm_type="gemini",              # LLM type (e.g., gemini, openai, ollama)
    llm_model="models/gemini-pro",  # model to use
    llm_api_key="YOUR_API_KEY",     # API key for the LLM
)
```
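Hard-coding the key works for a quick test, but in real projects it is safer to read it from an environment variable. A minimal sketch, assuming a variable name of your choosing (`GEMINI_API_KEY` below is only an example, not something ChatsAPI requires):

```python
import os

from chatsapi import ChatsAPI

chat = ChatsAPI(
    llm_type="gemini",
    llm_model="models/gemini-pro",
    llm_api_key=os.environ["GEMINI_API_KEY"],  # example variable name
)
```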
Define conversational routes using decorators. Routes map user inputs to specific handler functions.
@chat.trigger("Want to cancel a credit card.")
@chat.extract([("card_number", "Credit card number (a 12-digit number)", str, None)])
async def cancel_credit_card(chat_message: str, extracted: dict):
return {"message": chat_message, "extracted": extracted}
Explanation:

- `@chat.trigger`: Registers the route with a user-friendly description.
- `@chat.extract`: Automatically extracts parameters from user input.
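Several routes can be registered on the same instance; each trigger phrase describes the intent that incoming messages are matched against. An illustrative second route following the same pattern (the route name and parameters below are made up for the example):

```python
@chat.trigger("Want to increase my credit limit.")
@chat.extract([
    ("card_number", "Credit card number (a 12-digit number)", str, None),
    ("new_limit", "Requested credit limit (a number)", int, None),
])
async def increase_credit_limit(chat_message: str, extracted: dict):
    return {"message": chat_message, "extracted": extracted}
```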
Use the `run` method to handle user inputs:
```python
import asyncio

async def main():
    response = await chat.run("I want to cancel my credit card.")
    print(response)

asyncio.run(main())
```
ChatsAPI also integrates cleanly with web frameworks such as FastAPI:

```python
from fastapi import FastAPI
from pydantic import BaseModel
from chatsapi import ChatsAPI

app = FastAPI()

chat = ChatsAPI(
    llm_type="gemini",
    llm_model="models/gemini-pro",
    llm_api_key="YOUR_API_KEY",
)

@chat.trigger("Need help with account settings.")
@chat.extract([
    ("account_number", "Account number (a 9-digit number)", int, None),
    ("holder_name", "Account holder's name (a person name)", str, None),
])
async def account_help(chat_message: str, extracted: dict):
    return {"message": chat_message, "extracted": extracted}

class RequestModel(BaseModel):
    message: str

@app.post("/chat")
async def message(request: RequestModel):
    reply = await chat.run(request.message)
    return {"response": reply}
```
ChatsAPI supports multi-session conversations using unique session IDs:
```python
session_id = chat.set_session()  # Start a new session

response = await chat.conversation("Tell me about my account", session_id)
print(response)

chat.end_session(session_id)  # End the session
```
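As a rough sketch, the session API can also be exposed through the FastAPI app from the previous section. Apart from `set_session`, `conversation`, and `end_session`, everything below (the endpoint paths, request models, and the assumption that session IDs are strings) is illustrative:

```python
from fastapi import FastAPI
from pydantic import BaseModel
from chatsapi import ChatsAPI

app = FastAPI()
chat = ChatsAPI(
    llm_type="gemini",
    llm_model="models/gemini-pro",
    llm_api_key="YOUR_API_KEY",
)

class SessionRef(BaseModel):
    session_id: str  # assumed to be a string; use whatever type set_session() returns

class SessionMessage(SessionRef):
    message: str

@app.post("/session/start")
async def start_session():
    # set_session() starts a new multi-turn session and returns its ID
    return {"session_id": chat.set_session()}

@app.post("/session/chat")
async def session_chat(req: SessionMessage):
    # conversation() keeps the reply tied to the given session's history
    reply = await chat.conversation(req.message, req.session_id)
    return {"response": reply}

@app.post("/session/end")
async def end_session(req: SessionRef):
    chat.end_session(req.session_id)
    return {"status": "ended"}
```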
Supported LLM backends:

- OpenAI (ChatGPT)
- Gemini
- LlamaAPI
- Ollama
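Switching providers only changes the constructor arguments shown earlier. The `llm_type` values below match those mentioned in the quick-start comment (gemini, openai, ollama); the model names are placeholders, not a verified list:

```python
from chatsapi import ChatsAPI

# OpenAI backend (model name is a placeholder)
chat_openai = ChatsAPI(
    llm_type="openai",
    llm_model="gpt-4o-mini",
    llm_api_key="YOUR_OPENAI_API_KEY",
)

# Local Ollama backend; whether a key is needed depends on your setup
chat_ollama = ChatsAPI(
    llm_type="ollama",
    llm_model="llama3",
    llm_api_key="",
)
```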
ChatsAPI builds on the following core technologies:

- SBERT: Used for creating sentence embeddings.
- HNSWlib: Provides fast approximate nearest neighbor search.
- BM25: Implements Okapi BM25 for token-based matching.
- SpaCy: Handles natural language parsing and entity recognition.
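These libraries are easiest to understand in isolation. The standalone sketch below (plain library calls, not ChatsAPI internals) ranks a few route descriptions against a query by mixing SBERT/HNSWlib semantic similarity with Okapi BM25 keyword scores; the 50/50 weighting is an arbitrary choice for illustration:

```python
import hnswlib
import numpy as np
from rank_bm25 import BM25Okapi
from sentence_transformers import SentenceTransformer

routes = [
    "Want to cancel a credit card.",
    "Need help with account settings.",
    "Tell me about my account",
]

# Dense side: SBERT embeddings indexed with HNSWlib
model = SentenceTransformer("all-MiniLM-L6-v2")  # any SBERT model works
embeddings = model.encode(routes, normalize_embeddings=True)

index = hnswlib.Index(space="cosine", dim=embeddings.shape[1])
index.init_index(max_elements=len(routes), ef_construction=200, M=16)
index.add_items(embeddings, np.arange(len(routes)))

# Sparse side: Okapi BM25 over whitespace tokens
bm25 = BM25Okapi([r.lower().split() for r in routes])

def hybrid_search(query: str, k: int = 2):
    # Semantic scores: cosine similarity = 1 - cosine distance
    q_emb = model.encode([query], normalize_embeddings=True)
    labels, distances = index.knn_query(q_emb, k=len(routes))
    semantic = {int(l): 1.0 - float(d) for l, d in zip(labels[0], distances[0])}

    # Keyword scores, normalized to [0, 1]
    bm25_scores = bm25.get_scores(query.lower().split())
    max_bm25 = max(float(bm25_scores.max()), 1e-9)

    combined = {
        i: 0.5 * semantic.get(i, 0.0) + 0.5 * (bm25_scores[i] / max_bm25)
        for i in range(len(routes))
    }
    return sorted(combined.items(), key=lambda kv: kv[1], reverse=True)[:k]

print(hybrid_search("I want to cancel my credit card"))
```

Dense retrieval catches paraphrases ("I want to cancel my credit card" vs. "Want to cancel a credit card."), while BM25 rewards exact token overlap, which is why combining the two tends to be more robust than either alone.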
This project is licensed under the MIT License. See the LICENSE file for details.
Contributions are welcome! Feel free to open issues or submit pull requests.