Dynamic FewShot GPTClassifier: does it cache embeddings locally? #55
Comments
Hello, @KennyNg-19
Hi, @iryna-kondr. If we cannot reuse the embeddings generated here, the embedding functions (especially paid API services) will be called again, which increases the cost.
I am having the same problem. I don't want to recreate the embeddings on every request. I want to do it once and reuse both the embeddings and the fitted classifier for future calls in my system.
One additional point to consider: if we rerun experiments at a later date, it would be nice to simply point to pre-existing embeddings instead of re-embedding them (same exact task, same exact data). @iryna-kondr, is this something you might consider implementing?
Hi, @AndreasKarasenko. You can pickle the estimator (with embeddings) and then load it at a later date. See our discussion here: https://discord.com/channels/1112768381406425138/1125476385750782012/1125478710427009044 |
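A minimal sketch of that round trip, assuming the usual sklearn-style fit/predict interface; the import path, the `n_examples` parameter, the file name, and the toy data below are illustrative and may differ between scikit-llm versions:

```python
import pickle

# Import path may differ between scikit-llm versions.
from skllm import DynamicFewShotGPTClassifier

X = ["I loved this movie", "Terrible acting", "A masterpiece", "Waste of time"]
y = ["positive", "negative", "positive", "negative"]

clf = DynamicFewShotGPTClassifier(n_examples=2)
clf.fit(X, y)  # embeddings are computed here (the paid API calls)

# Persist the fitted estimator, embeddings included.
with open("dynamic_few_shot_clf.pkl", "wb") as f:
    pickle.dump(clf, f)

# In a later session: load and predict without re-embedding the training data.
with open("dynamic_few_shot_clf.pkl", "rb") as f:
    clf_loaded = pickle.load(f)

print(clf_loaded.predict(["An instant classic"]))
```

The same approach should work with joblib if you prefer it for larger serialized objects.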
Thanks for the info! Based on that, I figured out a way to get the data and embedding lists so I can store them locally. I think this issue can be closed now?
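For storing the embeddings themselves (rather than the whole estimator), a generic on-disk cache keyed by the text works independently of scikit-llm's internals. This is only a sketch: `embed_fn` stands in for whatever embedding call you use (e.g. an OpenAI client) and is not part of scikit-llm's API.

```python
import hashlib
import json
import os


def cached_embedding(text: str, embed_fn, cache_dir: str = "embedding_cache") -> list:
    """Return the embedding for `text`, calling `embed_fn` only on a cache miss."""
    os.makedirs(cache_dir, exist_ok=True)
    key = hashlib.sha256(text.encode("utf-8")).hexdigest()
    path = os.path.join(cache_dir, f"{key}.json")
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)  # reuse the embedding stored on a previous run
    vector = embed_fn(text)  # the only place the paid embedding API is hit
    with open(path, "w") as f:
        json.dump(vector, f)
    return vector
```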
I wonder whether DynamicFewShotGPTClassifier caches the embeddings generated by OpenAI locally the first time it is called.