Describe the bug
With the OpenAI gpt-4o-mini model, I went ahead and set the price param to the price per 1k tokens, since that is what the documentation says it should be. However, I soon noticed that the reported cost was far too high. Digging deeper, I found that the code committed via #2902 contains a bug, specifically at autogen/autogen/oai/client.py, line 798 (commit 0cdbc34), where it multiplies the price per 1k tokens by the raw token count instead of the token count divided by 1,000, giving a cost that is 1000x too large. As a workaround I changed the price in my llm config to be the price per token, but the code needs to be updated so that other people don't run into this issue.
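For illustration, here is a minimal sketch of the difference between the two calculations (the function and variable names are mine, not the ones used in client.py):

```python
# A minimal sketch of the cost calculation, assuming a price expressed per 1k tokens.
# Function and variable names are illustrative, not the ones in autogen/oai/client.py.

def cost_as_currently_computed(price_per_1k: float, n_tokens: int) -> float:
    # Multiplies the per-1k price by the raw token count, so the result is 1000x too high.
    return price_per_1k * n_tokens


def cost_expected(price_per_1k: float, n_tokens: int) -> float:
    # Scales the token count down to units of 1k tokens before applying the price.
    return price_per_1k * n_tokens / 1000


# With an example prompt price of $0.00015 per 1k tokens and a 2,000-token prompt:
print(cost_as_currently_computed(0.00015, 2000))  # ~0.3   (1000x too high)
print(cost_expected(0.00015, 2000))               # ~0.0003
```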
Steps to reproduce
No response
Model Used
No response
Expected Behavior
The cost should be computed from the configured price per 1k tokens, i.e. price_per_1k * (number_of_tokens / 1000).
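For example, assuming the price param takes [prompt_price_per_1k, completion_price_per_1k] as introduced in #2902 (the dollar values below are illustrative, not authoritative pricing):

```python
# Illustrative llm_config with prices given per 1k tokens.
# Assumes "price" is [prompt_price_per_1k, completion_price_per_1k] per #2902.
llm_config = {
    "config_list": [
        {
            "model": "gpt-4o-mini",
            "api_key": "sk-...",
            "price": [0.00015, 0.0006],  # [prompt, completion] per 1k tokens
        }
    ]
}

# For a response with 1,000 prompt tokens and 500 completion tokens, the expected cost is
#   0.00015 * (1000 / 1000) + 0.0006 * (500 / 1000) = 0.00045
# whereas the current code effectively computes
#   0.00015 * 1000 + 0.0006 * 500 = 0.45
```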
Screenshots and logs
No response
Additional Information
No response