[Bug]: pytest includes function calls w/ api key when passed via litellm_params #6668

Open
gutzbenj opened this issue Nov 8, 2024 · 2 comments
Labels
bug Something isn't working good first issue Good for newcomers

Comments


gutzbenj commented Nov 8, 2024

What happened?

Dear litellm developers,

We recently found that, when using litellm's Router and passing an api_key via litellm_params, the key may be leaked whenever anything goes wrong during the model call.

Example:

from litellm import Router

AZURE_OPENAI_ENDPOINT = "<AZURE_OPENAI_ENDPOINT>"
AZURE_OPENAI_API_VERSION = "<AZURE_OPENAI_API_VERSION>"
AZURE_OPENAI_API_KEY = "<AZURE_OPENAI_API_KEY>"  # <-- wrong api key here

model_list = [
    {
        "model_name": "gpt-4o",
        "litellm_params": {
            "model": "azure/gpt-4o",
            "base_model": "gpt-4o",
            "api_version": AZURE_OPENAI_API_VERSION,
            "api_base": AZURE_OPENAI_ENDPOINT,
            "api_key": AZURE_OPENAI_API_KEY,
        },
    },
]
litellm_router = Router(
    model_list=model_list,
)

def test_litellm():
    model = "gpt-4o"
    messages = [{"role": "user", "content": "Hello World"}]
    response = litellm_router.chat.completions.create(messages=messages, model=model)  # <--- api_key is passed somewhere in the kwargs here
    assert response.choices[0].message.content

The issue seems to be that all arguments, including the api_key, are passed into one function, litellm.llms.AzureOpenAI.azure.AzureChatCompletion; if that call fails, pytest prints the api_key argument along with the rest of the frame's arguments.
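
This is easy to reproduce without litellm at all; a minimal sketch (the function and test names are illustrative):

def call_api(api_key: str, prompt: str) -> str:
    # simulate an authentication failure inside the SDK call
    raise RuntimeError("401 Unauthorized")

def test_call_api():
    call_api(api_key="sk-super-secret", prompt="Hello World")

With pytest's default (long) traceback style, the entry for the failing frame is headed by its arguments, e.g. api_key = 'sk-super-secret', prompt = 'Hello World', so the real key lands in the test output. (Shorter styles such as pytest --tb=short omit the argument listing, but that only hides the symptom.)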

In comparison, openai separates client creation from the actual create request:

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=AZURE_OPENAI_API_KEY,  # <--- api_key is set here
    api_version=AZURE_OPENAI_API_VERSION,
    azure_endpoint=AZURE_OPENAI_ENDPOINT,
)

def test_azure():
    model = "gpt-4o"
    messages = [{"role": "user", "content": "Hello World"}]
    response = client.chat.completions.create(messages=messages, model=model)  # <--- error happens in another scope, api_key is not passed directly
    assert response.choices[0].message.content

How to prevent this? A possible solution could be wrapping the api_key in a dedicated dataclass that masks its value, like

from dataclasses import dataclass

@dataclass
class ApiKey:
    value: str

    def __str__(self):
        return "***"

    def __repr__(self):
        return "ApiKey(***)"

# wherever the api_key is passed around, pass the wrapper instead of the raw string
AzureChatCompletion(
    ...,
    api_key=ApiKey(AZURE_OPENAI_API_KEY)
)

Only in the actual request to the API would the raw api_key then be retrieved from the wrapper.
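
A quick sketch of the effect, using the ApiKey wrapper from above:

key = ApiKey("<AZURE_OPENAI_API_KEY>")

print(key)        # ***
print(repr(key))  # ApiKey(***)

# only the code that actually builds the HTTP request reads the raw value
headers = {"api-key": key.value}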

But probably the best solution would be to do it like openai and separate the setup of the client from sending the request.
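
A hypothetical sketch of what that separation could look like (the names are illustrative, not litellm's actual API):

class AzureChatCompletionClient:
    def __init__(self, api_key: str, api_base: str, api_version: str):
        # the key is bound to the client once, outside any test call
        self._api_key = api_key
        self.api_base = api_base
        self.api_version = api_version

    def completion(self, model: str, messages: list) -> dict:
        # if this raises, the frame's arguments are only self, model and
        # messages, so pytest's traceback never shows the key
        raise NotImplementedError("request logic elided in this sketch")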

Cheers
Benjamin

/CC @klemenstraeger


gutzbenj added the bug Something isn't working label Nov 8, 2024
ishaan-jaff (Contributor) commented

good idea @gutzbenj - would you be open to contributing this change?

krrishdholakia changed the title [Bug]: api key leaked when passed via litellm_params [Bug]: pytest function api key leaked when passed via litellm_params Nov 11, 2024
krrishdholakia changed the title [Bug]: pytest function api key leaked when passed via litellm_params [Bug]: pytest includes function calls w/ api key when passed via litellm_params Nov 11, 2024
krrishdholakia (Contributor) commented

@gutzbenj this doesn't seem like litellm leaking keys (interpreted as printing to console / returning in the response in prod) so much as the way pytest debugging works.

i don't view this as a leak, but i do believe we can improve here. A PR would be welcome!

krrishdholakia added the good first issue Good for newcomers label Nov 11, 2024