
feat: add support for custom reasoning models #1551

Merged
merged 3 commits into main from tr/custom_reasoning_model on Feb 18, 2025

Conversation

@mrT23 mrT23 (Collaborator) commented Feb 18, 2025

User description

#1540


PR Type

Enhancement, Documentation


Description

  • Added support for custom reasoning models in AI handler.

  • Updated logic to handle models without temperature or chat-style inputs (see the sketch after this list).

  • Documented configuration for custom reasoning models in usage guide.

  • Introduced a new configuration option custom_reasoning_model.

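For orientation, a minimal sketch of the intended behavior, assuming a simplified handler (illustrative only, not the repository's exact code): when custom_reasoning_model is enabled, the system prompt is folded into the user prompt and the temperature argument is omitted.

    def build_completion_kwargs(model, system, user, temperature, custom_reasoning_model):
        # Simplified sketch; the real handler also consults per-model capability lists
        # (user_message_only_models, no_support_temperature_models).
        if custom_reasoning_model:
            # Reasoning models without chat-style inputs get a single user message.
            combined = f"{system}\n\n\n{user}"
            messages = [{"role": "user", "content": combined}]
        else:
            messages = [{"role": "system", "content": system},
                        {"role": "user", "content": user}]
        kwargs = {"model": model, "messages": messages}
        if not custom_reasoning_model:
            # Only models that support it receive the temperature parameter.
            kwargs["temperature"] = temperature
        return kwargs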

Changes walkthrough 📝

Relevant files

Enhancement
litellm_ai_handler.py: Enhance AI handler to support custom reasoning models
pr_agent/algo/ai_handlers/litellm_ai_handler.py
  • Modified logic to combine system and user prompts for custom reasoning models.
  • Updated temperature handling to accommodate custom reasoning models.
  +2/-2

Documentation
changing_a_model.md: Update usage guide for custom reasoning models
docs/docs/usage-guide/changing_a_model.md
  • Added documentation for custom reasoning models.
  • Explained limitations of reasoning models regarding temperature and chat-style inputs.
  +3/-0

Configuration changes
configuration.toml: Add configuration for custom reasoning models
pr_agent/settings/configuration.toml
  • Introduced custom_reasoning_model configuration option (a usage sketch follows this walkthrough).
  • Documented its purpose and default value.
  +1/-0

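As a rough usage sketch, the new option could be read through pr-agent's settings accessor roughly as below; the import path and the getattr fallback are assumptions for illustration, and configuration.toml ships the flag as false by default.

    from pr_agent.config_loader import get_settings  # assumed import path

    def custom_reasoning_enabled() -> bool:
        # configuration.toml defaults custom_reasoning_model to false;
        # getattr keeps the lookup safe if the key is absent.
        return bool(getattr(get_settings().config, "custom_reasoning_model", False))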
    Need help?
  • Type /help how to ... in the comments thread for any questions about Qodo Merge usage.
  • Check out the documentation for more information.

    PR Reviewer Guide 🔍

    Here are some key observations to aid the review process:

    🎫 Ticket compliance analysis ✅

    1540 - Fully compliant

    Compliant requirements:

    • Handle Azure OpenAI Service models that don't support the temperature parameter
    • Fix the error when using the o3-mini model with the temperature parameter

    ⏱️ Estimated effort to review: 2 🔵🔵⚪⚪⚪
    🧪 No relevant tests
    🔒 No security concerns identified
    ⚡ Recommended focus areas for review

    Logic Validation

    The condition for skipping the temperature parameter should be verified to ensure it correctly identifies models that don't support temperature.

    if model not in self.no_support_temperature_models or get_settings().config.custom_reasoning_model:
        kwargs["temperature"] = temperature

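    To make the concern concrete, a small standalone check (the model names and list contents are hypothetical) compares the flagged condition with the "and not" form adopted later in this PR:

    no_support_temperature_models = ["o3-mini"]  # hypothetical entry for illustration

    for model, custom_reasoning_model in [("o3-mini", False), ("gpt-4o", False), ("o3-mini", True)]:
        flagged = model not in no_support_temperature_models or custom_reasoning_model
        corrected = model not in no_support_temperature_models and not custom_reasoning_model
        # 'flagged' would still send temperature for a custom reasoning model,
        # while 'corrected' skips it for unsupported and custom reasoning models alike.
        print(model, custom_reasoning_model, flagged, corrected)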

    qodo-merge-pro-for-open-source bot commented Feb 18, 2025

    PR Code Suggestions ✨

    Latest suggestions up to 35059ca

    Category    Suggestion    Impact
    Possible issue
    Add null check for config

    Add a check to verify that get_settings().config exists before accessing
    custom_reasoning_model, to prevent a potential runtime error if the settings object is not properly initialized.

    pr_agent/algo/ai_handlers/litellm_ai_handler.py [208]

    -if model in self.user_message_only_models or get_settings().config.custom_reasoning_model:
    +if model in self.user_message_only_models or (hasattr(get_settings(), 'config') and get_settings().config.custom_reasoning_model):
    Suggestion importance[1-10]: 7


    Why: The suggestion adds important defensive programming by checking for the existence of the config attribute before accessing it, which could prevent runtime errors if the settings object is not properly initialized.

    Medium
    Add consistent null check

    Apply the same null check pattern for the second occurrence of config access to
    maintain consistency and prevent potential crashes.

    pr_agent/algo/ai_handlers/litellm_ai_handler.py [230]

    -if model not in self.no_support_temperature_models and not get_settings().config.custom_reasoning_model:
    +if model not in self.no_support_temperature_models and not (hasattr(get_settings(), 'config') and get_settings().config.custom_reasoning_model):
    Suggestion importance[1-10]: 7


    Why: This suggestion maintains consistency with the first fix and adds the same defensive programming pattern to prevent potential null pointer exceptions in another similar code location.

    Medium

    Previous suggestions

    ✅ Suggestions up to commit 0317951
    Category    Suggestion    Impact
    Possible issue
    Fix incorrect temperature handling logic

    The condition for handling temperature settings should be inverted: if a model
    doesn't support temperature or is a custom reasoning model, temperature should
    be excluded, not included.

    pr_agent/algo/ai_handlers/litellm_ai_handler.py [230-231]

    -if model not in self.no_support_temperature_models or get_settings().config.custom_reasoning_model:
    +if model not in self.no_support_temperature_models and not get_settings().config.custom_reasoning_model:
         kwargs["temperature"] = temperature

    [Suggestion has been applied]

    Suggestion importance[1-10]: 10


    Why: The current logic is incorrect and would add temperature to models that don't support it. The suggested fix correctly excludes temperature for both unsupported models and custom reasoning models, preventing potential API errors.

    High
    ✅ Suggestions up to commit 4edb8b8
    Category    Suggestion    Impact
    General
    Add warning for custom model settings

    Add a warning log when custom_reasoning_model is enabled to track when system
    messages and temperature settings are being bypassed.

    pr_agent/algo/ai_handlers/litellm_ai_handler.py [208-212]

     if model in self.user_message_only_models or get_settings().config.custom_reasoning_model:
    +    if get_settings().config.custom_reasoning_model:
    +        get_logger().warning("Custom reasoning model enabled: bypassing system message and temperature settings")
         user = f"{system}\n\n\n{user}"
         system = ""
         get_logger().info(f"Using model {model}, combining system and user prompts")
         messages = [{"role": "user", "content": user}]
    Suggestion importance[1-10]: 7


    Why: Adding a warning log when custom_reasoning_model is enabled provides better observability and helps track when important model settings are being bypassed, making it easier to debug and monitor system behavior.

    Medium
    Clarify configuration comment documentation

    Improve the configuration comment to clearly explain the implications of
    enabling custom_reasoning_model.

    pr_agent/settings/configuration.toml [20]

    -custom_reasoning_model = false # when true, no system message and not temperature will be used
    +custom_reasoning_model = false # when true, disables system messages and temperature controls for models that don't support chat-style inputs

    [Suggestion has been applied]

    Suggestion importance[1-10]: 6


    Why: The improved comment provides clearer and more precise documentation about the implications of enabling custom_reasoning_model, helping users better understand its impact on chat-style inputs.

    Low

    Co-authored-by: qodo-merge-pro-for-open-source[bot] <189517486+qodo-merge-pro-for-open-source[bot]@users.noreply.github.com>
    @mrT23 mrT23 merged commit 9de9b39 into main Feb 18, 2025
    2 checks passed
    @mrT23 mrT23 deleted the tr/custom_reasoning_model branch February 18, 2025 10:04