Status: Closed
Labels: bug (Something isn't working)
### Description

**Confirm this is an issue with the Python library and not an underlying OpenAI API**
- [x] This is an issue with the Python library
### Describe the bug

Calling the `create` method on `completions` introduces a memory leak, according to `tracemalloc`.
Example (note: `api_key` is keyword-only in the v1 client):

```python
client = OpenAI(api_key=MY_KEY)
client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Can you write a poem?",
        }
    ],
    model="gpt-3.5-turbo",
)
```
**How to determine it's a memory leak?**

I use `tracemalloc` with my Flask application. (`tracemalloc.start()` must have been called at application startup; `grep` is the substring used to filter the statistics, which the original snippet left undefined.)

```python
import tracemalloc

@blueprint.route("/admin/sys/stats")
def admin_sys_stats():
    snapshot = tracemalloc.take_snapshot()
    top_stats = snapshot.statistics('lineno')

    from openai import OpenAI
    client = OpenAI(api_key=KEY)
    client.chat.completions.create(
        messages=[
            {
                "role": "user",
                "content": "Can you write a poem?",
            }
        ],
        model="gpt-3.5-turbo",
    )

    grep = "openai"  # keep only allocation sites inside the openai package
    stats = ""
    for stat in top_stats[:1000]:
        if grep in str(stat):
            stats += str(stat) + "\n"
    return f"<pre>{stats}</pre>", 200
```
When this endpoint is hit repeatedly, one line sits at the very top (meaning it is the most expensive allocation site):

```
\venv\Lib\site-packages\openai\_response.py:227: size=103 KiB, count=1050, average=100 B
```

On every refresh, the `size` increases. In a production environment, the numbers grow much faster.
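The growth can be isolated from the rest of the application by diffing two snapshots with `Snapshot.compare_to`. Below is a minimal, self-contained sketch (not from the report) that substitutes a deliberately leaky stand-in function for the OpenAI call, so the diffing technique itself can be seen in isolation:

```python
import tracemalloc

tracemalloc.start()

cache = []  # stand-in for whatever is retaining the responses

def leaky_call():
    # retains ~100 B per invocation, mimicking the reported
    # "count=1050, average=100 B" pattern from _response.py
    cache.append(bytearray(100))

before = tracemalloc.take_snapshot()
for _ in range(1000):
    leaky_call()
after = tracemalloc.take_snapshot()

# entries are sorted by how much their size changed between snapshots
top = after.compare_to(before, 'lineno')
for stat in top[:3]:
    print(stat)
```

Allocation sites whose `size_diff` keeps growing across successive diffs are leak candidates; a one-off spike is not.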
### To Reproduce

There is no single way to prove a memory leak, but here is what I did:
- Set up a Flask application
- Create the route provided in the bug description
- Hit the route multiple times; you'll see the `size` of the object increase
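The steps above show *that* memory grows; to see *which call chain* retains it, `tracemalloc` can record deeper frames and group statistics by full traceback. A hedged sketch under the same stand-in assumption (a deliberately leaky helper in place of `client.chat.completions.create`):

```python
import tracemalloc

# record up to 25 frames per allocation so the traceback shows the
# whole call chain, not just the allocating line
tracemalloc.start(25)

leaked = []

def helper():
    leaked.append(bytearray(1000))  # ~1 KB retained per call

def caller():
    helper()

for _ in range(100):
    caller()

snapshot = tracemalloc.take_snapshot()
# group by full traceback instead of a single source line
top = snapshot.statistics('traceback')
for line in top[0].traceback.format():
    print(line)
```

The printed traceback for the top entry walks from the allocating line up through its callers, which is usually enough to find the object that should have been released.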
### Code snippets

No response

### OS

Linux, macOS

### Python version

Python v3.11.2

### Library version

openai v1.2.4