From cd7837c245e3d646d00713382dd1ad1f2f921352 Mon Sep 17 00:00:00 2001
From: Ikko Eltociear Ashimine
Date: Tue, 24 Sep 2024 16:39:04 +0900
Subject: [PATCH] docs: update README.md

comptaible -> compatible
---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index ae868cd6..2f26d044 100644
--- a/README.md
+++ b/README.md
@@ -114,8 +114,8 @@ Please note that the naming convention described above for the `model` attribute
 
 > [!TIP]
 > optillm is a transparent proxy and will work with any LLM API or provider that has an OpenAI API compatible chat completions endpoint, and in turn, optillm also exposes
-the same OpenAI API comptaible chat completions endpoint. This should allow you to integrate it into any existing tools or frameworks easily. If the LLM you want to use
-doesn't have an OpenAI API comptaible endpoint (like Google or Anthropic) you can use [LiteLLM proxy server](https://docs.litellm.ai/docs/proxy/quick_start) that supports most LLMs.
+the same OpenAI API compatible chat completions endpoint. This should allow you to integrate it into any existing tools or frameworks easily. If the LLM you want to use
+doesn't have an OpenAI API compatible endpoint (like Google or Anthropic) you can use [LiteLLM proxy server](https://docs.litellm.ai/docs/proxy/quick_start) that supports most LLMs.
 
 The following sequence diagram illustrates how the request and responses go through optillm.
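
For context, the TIP being corrected describes optillm exposing an OpenAI API compatible chat completions endpoint. Below is a minimal sketch of the request an OpenAI-compatible client would send to such a proxy. The base URL, port, and the `moa-` model-name prefix are assumptions for illustration (the prefix follows the model naming convention the README references above the hunk), not values taken from this patch.

```python
import json

# Assumed local address for the proxy; your optillm host/port may differ.
base_url = "http://localhost:8000/v1"

# A standard OpenAI-style chat completions request body. The approach
# prefix on the model name (here "moa-", a hypothetical example) is how
# the README's naming convention selects an optimization technique.
payload = {
    "model": "moa-gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}],
}

# Any OpenAI-compatible client would POST this JSON to the endpoint below.
body = json.dumps(payload)
endpoint = base_url + "/chat/completions"
print(endpoint)
```

Because the endpoint shape matches the OpenAI API, existing tools only need their base URL pointed at the proxy; the request body itself is unchanged.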