Extract client #72
Conversation
Left a couple of comments, mostly about the client having a different name. If you could change it to something that reflects more specifically what it does, that would be awesome 🙌 I left code change suggestions, but feel free to pick your own name, or something even simpler like just "LLMClient".
Another point is how we now pass the API client into the LLMQuestionProxy instantiation. I wonder if it would make sense to have one client for webhook calls and another for OpenAI API spec compatible calls, and decide which client to pass when instantiating LLMQuestionProxy. I think it's OK to stay as is at this point though, so no changes required here ;)
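To illustrate the suggestion above, here is one possible sketch of that split: two client implementations sharing a common interface, with LLMQuestionProxy receiving whichever one fits. All class names, method names, and parameters here are hypothetical examples, not the actual code in this PR.

```python
from typing import Protocol


class LLMClient(Protocol):
    """Hypothetical common interface both client flavors satisfy."""

    def ask(self, question: str) -> str: ...


class WebhookLLMClient:
    """Illustrative client that would forward questions to a webhook URL."""

    def __init__(self, url: str) -> None:
        self.url = url

    def ask(self, question: str) -> str:
        # A real implementation would POST to self.url; stubbed for the sketch.
        return f"webhook[{self.url}]: {question}"


class OpenAICompatibleClient:
    """Illustrative client for any OpenAI-API-spec-compatible endpoint."""

    def __init__(self, base_url: str, model: str) -> None:
        self.base_url = base_url
        self.model = model

    def ask(self, question: str) -> str:
        # A real implementation would call the chat/completions endpoint.
        return f"openai[{self.model}]: {question}"


class LLMQuestionProxy:
    """Delegates to whichever client it was constructed with."""

    def __init__(self, client: LLMClient) -> None:
        self.client = client

    def ask(self, question: str) -> str:
        return self.client.ask(question)


# The caller decides which transport to inject at instantiation time.
proxy = LLMQuestionProxy(OpenAICompatibleClient("http://localhost:8080", "llama3"))
print(proxy.ask("hello"))
```

The design choice being discussed is plain constructor injection: the proxy depends only on the interface, so swapping webhook for OpenAI-style transport needs no change to LLMQuestionProxy itself.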
Co-authored-by: Paulo Truta <pinheirotruta5@gmail.com>
Updated with the suggestions, merged main in, and updated the layout.
No description provided.