Update openai_ai_handler.py #74
Conversation
Reviewer's Guide by Sourcery

This PR improves error logging in the OpenAI chat completion handler by adding more context when API errors occur. The changes improve debugging by including the model name and the messages payload in the error log.

Sequence diagram for error logging in OpenAI chat completion:

```mermaid
sequenceDiagram
    participant User
    participant OpenAIHandler
    participant Logger
    User->>OpenAIHandler: Request chat completion
    OpenAIHandler->>OpenAIHandler: Process request
    alt APIError or Timeout
        OpenAIHandler->>Logger: Log error with model and messages
        Logger-->>OpenAIHandler: Log entry created
        OpenAIHandler->>User: Raise exception
    else RateLimitError
        OpenAIHandler->>Logger: Log rate limit error
        Logger-->>OpenAIHandler: Log entry created
        OpenAIHandler->>User: Raise exception
    end
```
PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

PR Code Suggestions ✨

Explore these optional code suggestions:
Hey @NxPKG - I've reviewed your changes - here's some feedback:
Overall Comments:
- Please update the PR description to explain the motivation for this change and reference any related issues.
- Consider applying the same detailed error logging format to the RateLimitError case for consistency across error handling.
Here's what I looked at during the review
- 🟡 General issues: 2 issues found
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟢 Documentation: all looks good
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.
```diff
@@ -58,7 +58,7 @@ async def chat_completion(self, model: str, system: str, user: str, temperature:
             model=model, usage=usage)
         return resp, finish_reason
     except (APIError, Timeout) as e:
-        get_logger().error("Error during OpenAI inference: ", e)
+        get_logger().error("Error during OpenAI inference - Model: %s, Messages: %s", self.model, messages, exc_info=e)
         raise
     except (RateLimitError) as e:
         get_logger().error("Rate limit error during OpenAI inference: ", e)
```
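The replacement line relies on two standard-library logging behaviors worth noting: `%s` placeholders are interpolated lazily (only if the record is actually emitted), and `exc_info=` attaches the exception's traceback to the record. Assuming `get_logger()` returns a standard `logging.Logger` (an assumption; the project's logger wrapper isn't shown in this diff), the pattern behaves like this minimal sketch with hypothetical stand-in values:

```python
import logging

logging.basicConfig(level=logging.ERROR)
logger = logging.getLogger("openai_handler")

# Hypothetical values for illustration; the real handler gets these per request.
model = "gpt-4o"
messages = [{"role": "user", "content": "hello"}]

try:
    raise TimeoutError("request timed out")  # stand-in for openai's APIError/Timeout
except TimeoutError as e:
    # Lazy %-formatting: the message is only built if the record is emitted.
    # exc_info=e appends the full traceback to the log output.
    logger.error("Error during OpenAI inference - Model: %s, Messages: %s",
                 model, messages, exc_info=e)
```

By contrast, the original `error("Error during OpenAI inference: ", e)` passes the exception as an unused format argument, so the exception text never appears in the message at all; the `exc_info=e` form fixes that.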
suggestion: Consider using consistent error logging patterns across different exception handlers
The APIError handler includes model and messages context, while the RateLimitError handler doesn't. Consider applying the same detailed logging pattern here for consistency in debugging information.
```diff
-        get_logger().error("Rate limit error during OpenAI inference: ", e)
+        get_logger().error(f"Rate limit error during OpenAI inference - Model: {self.model}, Messages: {messages}", e)
```
```diff
@@ -58,7 +58,7 @@ async def chat_completion(self, model: str, system: str, user: str, temperature:
             model=model, usage=usage)
         return resp, finish_reason
     except (APIError, Timeout) as e:
-        get_logger().error("Error during OpenAI inference: ", e)
+        get_logger().error("Error during OpenAI inference - Model: %s, Messages: %s", self.model, messages, exc_info=e)
```
suggestion (performance): Consider truncating or summarizing the messages payload in error logs
Logging the entire messages array could lead to very large log entries. Consider implementing a truncation or summary mechanism for the messages to keep logs concise while still maintaining useful debugging information.
```diff
-        get_logger().error("Error during OpenAI inference - Model: %s, Messages: %s", self.model, messages, exc_info=e)
+        get_logger().error("Error during OpenAI inference - Model: %s, Messages summary: %d messages, last message: %.100s...",
+                           self.model,
+                           len(messages),
+                           messages[-1]["content"] if messages else "")
```
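The truncation idea above could also be factored into a small helper so every exception handler logs the same summary. The sketch below is one possible shape, not part of the PR; the name `summarize_messages` is hypothetical:

```python
def summarize_messages(messages, max_chars=100):
    """Summarize a chat-messages list for logging: message count plus a
    truncated preview of the last message's content.

    Hypothetical helper for illustration; not part of the reviewed code.
    """
    if not messages:
        return "0 messages"
    last = str(messages[-1].get("content", ""))
    # Truncate long content so a single log line stays bounded in size.
    preview = last[:max_chars] + ("..." if len(last) > max_chars else "")
    return f"{len(messages)} messages, last message: {preview}"
```

A handler could then log `get_logger().error("... - Model: %s, %s", self.model, summarize_messages(messages), exc_info=e)` in both the `APIError`/`Timeout` and `RateLimitError` branches, keeping the two code paths consistent.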
Signed-off-by: NxPKG <iconmamundentist@gmail.com>
* update openai api
* Update openai_ai_handler.py
* Update openai_ai_handler.py (#74)
* Update pr_insight/algo/ai_handlers/openai_ai_handler.py

Signed-off-by: NxPKG <iconmamundentist@gmail.com>
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
User description
Notes for Reviewers
This PR fixes #
Signed commits
PR Type
bug_fix, enhancement
Description
openai_ai_handler.py
to include model and message details when anAPIError
orTimeout
occurs.Changes walkthrough 📝
`openai_ai_handler.py`: Enhance error logging in OpenAI AI handler
(`pr_insight/algo/ai_handlers/openai_ai_handler.py`)