Toolkit for Chat API, supporting multi-turn dialogue, proxy, and asynchronous data processing.

Installation

pip install chattool --upgrade

Usage

Set API Key and Base URL

Set environment variables in ~/.bashrc or ~/.zshrc:

export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENAI_API_BASE="https://api.example.com/v1" 
export OPENAI_API_BASE_URL="https://api.example.com" # optional

Or in Python code:

import chattool
chattool.api_key = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
chattool.api_base = "https://api.example.com/v1"

Note: OPENAI_API_BASE takes precedence over OPENAI_API_BASE_URL; you only need to set one of them.
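The precedence rule can be sketched in plain Python. This is an illustrative sketch of the lookup order, not chattool's actual code; the function name and the `/v1` fallback suffix are assumptions based on the examples above:

```python
import os

def resolve_api_base(env=os.environ):
    """Illustrative sketch (not chattool internals): OPENAI_API_BASE wins;
    OPENAI_API_BASE_URL is a fallback with the conventional /v1 suffix."""
    base = env.get("OPENAI_API_BASE")
    if base:
        return base
    base_url = env.get("OPENAI_API_BASE_URL")
    if base_url:
        return base_url.rstrip("/") + "/v1"
    return "https://api.openai.com/v1"  # default official endpoint

# Only OPENAI_API_BASE_URL set: the /v1 suffix is appended
print(resolve_api_base({"OPENAI_API_BASE_URL": "https://api.example.com"}))
```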

Examples

Example 1, simulate multi-turn dialogue:

from chattool import Chat

# first chat
chat = Chat("Hello, GPT-3.5!")
resp = chat.getresponse()

# continue the chat
chat.user("How are you?")
next_resp = chat.getresponse()

# add response manually
chat.user("What's your name?")
chat.assistant("My name is GPT-3.5.")

# save the chat history
chat.save("chat.json", mode="w") # default to "a"

# print the chat history
chat.print_log()

Example 2, process data in batch with a checkpoint file; interrupted runs resume from the last completed item:

from chattool import Chat, process_chats

# write a function to process the data
def msg2chat(msg):
    chat = Chat()
    chat.system("You are a helpful translator for numbers.")
    chat.user(f"Please translate the digit to Roman numerals: {msg}")
    # We need to call `getresponse` here to get the response
    chat.getresponse()
    return chat

checkpoint = "chat.jsonl"
msgs = ["%d" % i for i in range(1, 10)]
# process the data in batch, if the checkpoint file exists, it will continue from the last checkpoint
continue_chats = process_chats(msgs, msg2chat, checkpoint)
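The checkpoint-resume behavior can be illustrated with plain Python. This is a generic sketch of the idea, not chattool's implementation; the function name and the "one JSON record per line, line count = items done" convention are assumptions:

```python
import json
import os
import tempfile

def process_with_checkpoint(msgs, msg2record, checkpoint):
    """Illustrative sketch: skip inputs whose results are already
    recorded in the JSONL checkpoint file, then append new results."""
    done = 0
    if os.path.exists(checkpoint):
        with open(checkpoint) as f:
            done = sum(1 for _ in f)  # one finished record per line
    with open(checkpoint, "a") as f:
        for i, msg in enumerate(msgs):
            if i < done:
                continue  # processed in a previous run
            f.write(json.dumps(msg2record(msg)) + "\n")
    with open(checkpoint) as f:
        return [json.loads(line) for line in f]

ckpt = os.path.join(tempfile.mkdtemp(), "chat.jsonl")
msgs = ["%d" % i for i in range(1, 10)]
first = process_with_checkpoint(msgs, lambda m: {"input": m}, ckpt)
# A second run finds 9 records in the checkpoint and processes nothing new
second = process_with_checkpoint(msgs, lambda m: {"input": m}, ckpt)
```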

Example 3, process data in batch asynchronously: print hello world in different languages using two concurrent coroutines:

from chattool import Chat, async_chat_completion, load_chats

langs = ["python", "java", "Julia", "C++"]
def data2chat(msg):
    chat = Chat()
    chat.user("Please print hello world using %s" % msg)
    # Note that we don't need to call `getresponse` here, and leave it to the asynchronous processing
    return chat

async_chat_completion(langs, chkpoint="async_chat.jsonl", nproc=2, data2chat=data2chat)
chats = load_chats("async_chat.jsonl")

When using async_chat_completion in a Jupyter notebook, use the await keyword and pass wait=True:

await async_chat_completion(langs, chkpoint="async_chat.jsonl", nproc=2, data2chat=data2chat, wait=True)
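The effect of the nproc=2 limit can be sketched with a plain asyncio semaphore: at most two requests are in flight at any moment. This is a generic concurrency sketch, not chattool's implementation; fake_request stands in for the real HTTP call:

```python
import asyncio

async def process_all(items, worker, nproc=2):
    """Run worker over items with at most nproc coroutines active at once."""
    sem = asyncio.Semaphore(nproc)

    async def bounded(item):
        async with sem:  # blocks when nproc workers are already running
            return await worker(item)

    return await asyncio.gather(*(bounded(i) for i in items))

async def fake_request(lang):
    await asyncio.sleep(0.01)  # stand-in for an API call
    return "hello from %s" % lang

results = asyncio.run(process_all(["python", "java", "Julia", "C++"], fake_request))
```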

Tool Call

Define functions:

def add(a: int, b: int) -> int:
    """
    This function adds two numbers.

    Parameters:
        a (int): The first number.
        b (int): The second number.

    Returns:
        int: The sum of the two numbers.
    """
    return a + b

def mult(a: int, b: int) -> int:
    """This function multiplies two numbers.
    It is a useful calculator!

    Args:
        a (int): The first number.
        b (int): The second number.

    Returns:
        int: The product of the two numbers.
    """
    return a * b

Add functions to the Chat object:

from chattool import Chat
chat = Chat("find the value of (23723 * 1322312 ) + 12312")
chat.settools([add, mult])

Let the model call the tools automatically, feeding each tool's return value back into the conversation. The default value of maxturns is 3:

chat.autoresponse(display=True, tool_type='tool_choice', maxturns=3)
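The dispatch step inside such an auto-response loop can be sketched as follows. This is an illustrative sketch of the general tool-call pattern (the model returns a function name plus JSON-encoded arguments, and the client executes the matching Python function), not chattool's internals:

```python
import json

def add(a: int, b: int) -> int:
    return a + b

def mult(a: int, b: int) -> int:
    return a * b

# Map tool names to the registered Python functions
TOOLS = {fn.__name__: fn for fn in (add, mult)}

def run_tool_call(call):
    """Execute one model-issued tool call: look up the function by name
    and apply the JSON-decoded arguments."""
    fn = TOOLS[call["name"]]
    args = json.loads(call["arguments"])
    return fn(**args)

# e.g. the model first asks for mult(23723, 1322312), then adds 12312
step1 = run_tool_call({"name": "mult", "arguments": '{"a": 23723, "b": 1322312}'})
step2 = run_tool_call({"name": "add", "arguments": json.dumps({"a": step1, "b": 12312})})
```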

Use the general-purpose python function, which lets the model run generated Python code:

from chattool.functioncall import python
chat = Chat("find the value of (23723 * 1322312 ) + 12312")
chat.settools([python])
chat.autoresponse(display=True, tool_type='tool_choice', maxturns=3)

Note that executing any code generated by the model carries security risks.

License

This package is licensed under the MIT license. See the LICENSE file for more details.

Update log

Current version: 2.3.0. Function calls, asynchronous processing, and fine-tuning are supported.

Beta version

  • Since version 0.2.0, the Chat type is used to handle data.
  • Since version 0.3.0, you can use different API keys to send requests.
  • Since version 0.4.0, this package is maintained by cubenlp.
  • Since version 0.5.0, you can use process_chats to process data with a customized msg2chat function and a checkpoint file.
  • Since version 0.6.0, the function call feature was added.
  • Since version 1.0.0, the function call feature was removed and the asynchronous processing tool was added.
  • Since version 2.0.0, the package was renamed to chattool and the asynchronous processing tool was improved.