Replies: 2 comments
-
I am not aware of any way to do this, currently. Even if you could, the cost would likely remain the same because the entire context needs to be presented to the model every time you want a response, and that large context is what drives the cost.
-
It seems like you're doing some classification. Maybe consider using embeddings?
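The embeddings suggestion above can be sketched as follows: embed the category list once, cache those vectors, and then classify each company by finding the category with the highest cosine similarity. The toy 3-dimensional vectors below stand in for real embeddings; in practice they would come from an embeddings endpoint (e.g. `text-embedding-3-small`), and only each new company would need a fresh, cheap embedding call.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def nearest_category(company_vec, category_vecs):
    """Return the category whose cached embedding is closest to the company's."""
    return max(
        category_vecs,
        key=lambda name: cosine_similarity(company_vec, category_vecs[name]),
    )

# Category embeddings are computed once and cached (hypothetical toy values);
# the ~1000-token category list never has to be resent in a prompt.
categories = {
    "software": [0.9, 0.1, 0.0],
    "retail": [0.1, 0.9, 0.0],
    "logistics": [0.0, 0.2, 0.9],
}
print(nearest_category([0.8, 0.2, 0.1], categories))  # software
```

This avoids the repeated-prompt cost entirely, at the price of the classification being a nearest-neighbour lookup rather than a free-form model judgment.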
-
I have a use case with a large set of categories (around 1000 tokens). I want to send it once, then ask ChatGPT to categorize company x, company y, etc. against the category set I have already sent.
The point is to reduce the cost of sending a fresh prompt with the category set every time.
How can I implement this?
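One hedged sketch of a partial workaround, assuming the Chat Completions message format: since the category list must accompany every request anyway, its cost can at least be amortized by batching many companies into a single request instead of one request per company. The prompt text and model name here are hypothetical placeholders.

```python
# Hypothetical category prompt; in the real use case this would be the
# ~1000-token category list.
CATEGORIES_PROMPT = "Categorize each company using exactly one category from: ..."

def build_batch_request(companies, model="gpt-3.5-turbo"):
    """Build one chat-completion payload that covers many companies,
    so the category list is sent once per batch, not once per company."""
    listing = "\n".join(f"{i + 1}. {name}" for i, name in enumerate(companies))
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": CATEGORIES_PROMPT},
            {"role": "user", "content": f"Categorize these companies:\n{listing}"},
        ],
    }

payload = build_batch_request(["company x", "company y"])
print(len(payload["messages"]))  # 2
```

With N companies per batch, the category list is paid for once per N classifications rather than once each, though the total context per request grows accordingly.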