[Bug]: litellm.cost_calculator.py::response_cost_calculator - Returning None #5610
Comments
As a suggestion, shouldn't the warning above occur only once? On future requests in the same run, we shouldn't get the same warning about the same issue. More generally, many people will ultimately use models that have not been added to the cost calculator. Should we really bombard them with warnings over that, on every request? I'm a lot less familiar than you guys are, but it seems like cost calculation should just return None.
@okhat, open to suggestions here. The motivation was to make litellm more observable, since we support returning the calculated response cost in the response. I have some possible ideas, and I'm open to your suggestions too.
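For context, a minimal sketch of how that calculated cost is surfaced on a response, assuming litellm's documented `completion_cost` helper and the `response_cost` hidden param (field names may vary by version):

```python
import litellm

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello"}],
)

# litellm attaches the computed cost to the response's hidden params
# (assumption: the "response_cost" key, per the docs at time of writing).
print(response._hidden_params.get("response_cost"))

# It also exposes a helper that computes cost from a completion response.
print(litellm.completion_cost(completion_response=response))
```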
Looks like we already have support for treating it as a debug error. I'll move to just doing this, @okhat.
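A hedged sketch of what that change looks like in shape (the logger and helper names here are illustrative stand-ins, not the exact litellm internals):

```python
import logging
from typing import Optional

# Illustrative stand-in for litellm's internal logger; the real fix lives in
# litellm/cost_calculator.py.
verbose_logger = logging.getLogger("litellm")

def _lookup_price(model: str, usage: dict) -> float:
    """Hypothetical pricing lookup; raises for models missing from the map."""
    raise ValueError(f"model not in model_prices_and_context_window.json: {model}")

def response_cost_calculator(model: str, usage: dict) -> Optional[float]:
    try:
        return _lookup_price(model, usage)
    except Exception as e:
        # Previously a warning emitted on every request; demoting it to debug
        # means users of unmapped models are no longer spammed (#5610).
        verbose_logger.debug("Cost calculation failed, returning None: %s", e)
        return None
```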
fix(cost_calculator.py): move to debug for noisy warning message on cost calculation error. Fixes #5610
Will also add the missing databricks model prices.
Databricks has moved their pricing to DBUs. So what we can do is store the DBU information and apply a default conversion (which can be overridden by the user).
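A minimal sketch of that idea, with made-up DBU rates and a hypothetical override parameter (none of these numbers or names come from litellm or Databricks):

```python
# Hypothetical DBU-based pricing: store per-model DBU consumption, apply a
# default $/DBU conversion, and let the user override the rate.
DEFAULT_USD_PER_DBU = 0.07  # assumed default; real rates vary by workspace tier

# Assumed per-model DBU consumption, expressed as DBUs per 1M tokens.
DATABRICKS_DBU_PRICES = {
    "databricks/databricks-meta-llama-3-1-70b-instruct": {
        "input_dbu_per_1m_tokens": 14.0,
        "output_dbu_per_1m_tokens": 43.0,
    },
}

def databricks_cost(
    model: str,
    prompt_tokens: int,
    completion_tokens: int,
    usd_per_dbu: float = DEFAULT_USD_PER_DBU,  # user-overridable conversion
) -> float:
    """Convert stored DBU consumption into a dollar cost."""
    rates = DATABRICKS_DBU_PRICES[model]
    input_dbus = prompt_tokens / 1_000_000 * rates["input_dbu_per_1m_tokens"]
    output_dbus = completion_tokens / 1_000_000 * rates["output_dbu_per_1m_tokens"]
    return (input_dbus + output_dbus) * usd_per_dbu
```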
All of these sound good to me! Thanks a ton @krrishdholakia
* fix(cost_calculator.py): move to debug for noisy warning message on cost calculation error. Fixes #5610
* fix(databricks/cost_calculator.py): handle model name issues for databricks models
* fix(main.py): fix stream chunk builder for multiple tool calls. Fixes #5591
* fix: correctly set user_alias when passed in. Fixes #5612
* fix(types/utils.py): allow passing role for message object. #5621
* fix(litellm_logging.py): fix langfuse logging across multiple projects. Fixes issue where the langfuse logger was re-using the old logging object
* feat(proxy/_types.py): support adding key-based tags for tag-based routing. Enables tag-based routing at the key level
* fix(proxy/_types.py): fix inheritance
* test(test_key_generate_prisma.py): fix test
* test: fix test
* fix(litellm_logging.py): return used callback object
What happened?
This is an extension of #5597, which I can't re-open.
The fix by @krrishdholakia was great, but it doesn't yet handle most Databricks models, only one LM. Can we consider a more general fix? This is less pressing now, because we only get a warning, not an error.
When using model=`databricks/databricks-meta-llama-3-1-70b-instruct`, the error complains about `databricks/meta-llama-3.1-70b-instruct-082724`. The list of `databricks/*` models at this path https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json is great for reference.
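For the general fix, a hedged sketch of one approach: normalize the incoming model name against the keys of model_prices_and_context_window.json before giving up. The normalization rules and local-file loading below are assumptions for illustration, not litellm's actual logic:

```python
import json
import re
from typing import Optional

# Assumes the pricing map from model_prices_and_context_window.json has been
# downloaded locally; litellm itself bundles/fetches this file.
with open("model_prices_and_context_window.json") as f:
    PRICES = json.load(f)

def resolve_databricks_model(model: str) -> Optional[str]:
    """Try a few name variants before declaring the model unmapped."""
    candidates = [model]
    # Databricks endpoint names often repeat the provider, e.g.
    # "databricks/databricks-meta-llama-3-1-70b-instruct".
    prefix = "databricks/databricks-"
    if model.startswith(prefix):
        candidates.append("databricks/" + model[len(prefix):])
    # Normalize "3-1"-style version segments to "3.1".
    candidates += [re.sub(r"(\d)-(\d)", r"\1.\2", c) for c in candidates]
    for candidate in candidates:
        if candidate in PRICES:
            return candidate
    return None
```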