What happened?

Dear litellm developers,

we recently found that when using litellm's Router and passing an api_key via litellm_params, the key may be leaked whenever anything in the model call goes wrong. Example:
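(The reporter's original example block is not reproduced here; the following is a minimal sketch of such a setup. Router, model_list, and litellm_params are litellm's documented API; the AZURE_OPENAI_* constants are placeholders standing in for real credentials.)

```python
from litellm import Router

# AZURE_OPENAI_* are placeholder constants, not part of the original report
router = Router(
    model_list=[
        {
            "model_name": "gpt-4o",
            "litellm_params": {
                "model": "azure/gpt-4o",
                "api_key": AZURE_OPENAI_API_KEY,  # plain string, travels through every call frame
                "api_base": AZURE_OPENAI_ENDPOINT,
                "api_version": AZURE_OPENAI_API_VERSION,
            },
        }
    ]
)


def test_router():
    response = router.completion(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello World"}],
    )
    assert response.choices[0].message.content
```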
The issue seems to be that all arguments are passed into one function, litellm.llms.AzureOpenAI.azure.AzureChatCompletion, and if that call fails, pytest prints out the api_key argument as well.
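A minimal sketch of the failure mode, assuming pytest is run with --showlocals (-l) so that each frame's locals appear in the failure report (fake_completion below is hypothetical, standing in for the litellm call):

```python
# run with `pytest -l`: the report prints each frame's locals,
# so a plain-string api_key shows up next to the traceback
def fake_completion(api_key: str, model: str, messages: list):
    raise RuntimeError("upstream call failed")  # api_key sits in this frame's locals


def test_leak():
    fake_completion(
        api_key="sk-not-a-real-key",
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello World"}],
    )
```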
In comparison, openai separates client creation from the actual create request:
```python
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=AZURE_OPENAI_API_KEY,  # <--- api_key is set here
    api_version=AZURE_OPENAI_API_VERSION,
    azure_endpoint=AZURE_OPENAI_ENDPOINT,
)


def test_azure():
    model = "gpt-4o"
    messages = [{"role": "user", "content": "Hello World"}]
    # <--- error happens in another scope, api_key is not passed directly
    response = client.chat.completions.create(messages=messages, model=model)
    assert response.choices[0].message.content
```
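With this layout, a failure inside create happens in frames whose arguments are messages and model; the raw key is held by the client object rather than passed through the test's call, so a pytest report of the test frame does not contain it.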
How can this be prevented? A possible solution could be wrapping the api_key in some kind of known dataclass, like:
```python
from dataclasses import dataclass


@dataclass
class ApiKey:
    value: str

    def __str__(self):
        return "***"

    def __repr__(self):
        return "ApiKey(***)"


# wherever the api_key is actually used, use the hidden value instead
AzureChatCompletion(
    ...,
    api_key=ApiKey(AZURE_OPENAI_API_KEY),
)
```
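To illustrate the effect (a usage sketch, not part of the original report): anything that stringifies the wrapper sees only the mask, while request-building code explicitly unwraps .value.

```python
key = ApiKey("sk-not-a-real-key")

print(key)        # ***            -- str() is masked
print(repr(key))  # ApiKey(***)    -- repr() is masked, so locals dumps stay clean
print([key])      # [ApiKey(***)]  -- containers render elements via repr()

# only the code that actually builds the HTTP request unwraps the secret
headers = {"api-key": key.value}
```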
In the actual request to the API, the api_key then has to be retrieved from the wrapper. But probably the best solution would be to do it like openai and separate the setup of the client from the sending of the request.

Cheers
Benjamin

/CC @klemenstraeger

Relevant log output

No response

Twitter / LinkedIn details

No response
good idea @gutzbenj - would you be open to contributing this change?
krrishdholakia changed the title from "[Bug]: api key leaked when passed via litellm_params" to "[Bug]: pytest function api key leaked when passed via litellm_params" on Nov 11, 2024

krrishdholakia changed the title from "[Bug]: pytest function api key leaked when passed via litellm_params" to "[Bug]: pytest includes function calls w/ api key when passed via litellm_params" on Nov 11, 2024