
[Enhancement]: Integrate OLLAMA API Support in AI Handlers #836

Merged: 1 commit merged into Codium-ai:main on Apr 2, 2024

Conversation

gregoryboue (Contributor) commented Apr 2, 2024

User description

Fix #657


Type

enhancement


Description

  • Introduced support for OLLAMA API base configuration in litellm_ai_handler.py, allowing for expanded model support.
  • Ensured that the AI handler correctly sets the api_base for both Huggingface and OLLAMA, enhancing the flexibility of AI model integrations.
  • Preserved existing configurations and functionalities for Huggingface and VertexAI, ensuring backward compatibility.
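The change itself is small; as the diff later in this thread shows, it amounts to reading the configured base URL and propagating it to litellm. A minimal sketch for orientation (get_settings() is pr_agent's settings accessor; this is a fragment of the handler's __init__, not the full file):

  if get_settings().get("OLLAMA.API_BASE", None):
      # Point both litellm and the handler at the configured OLLAMA server
      litellm.api_base = get_settings().ollama.api_base
      self.api_base = get_settings().ollama.api_base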

Changes walkthrough

Relevant files:

Enhancement: litellm_ai_handler.py (pr_agent/algo/ai_handlers/litellm_ai_handler.py), +4/-1
Integrate OLLAMA API Base Configuration Support

  • Added support for OLLAMA API base configuration.
  • Ensured the api_base is set correctly for both Huggingface and OLLAMA configurations.
  • Maintained existing functionality and configurations for Huggingface and VertexAI.

    PR-Agent usage:
    Comment /help on the PR to get a list of all available PR-Agent tools and their descriptions

    codiumai-pr-agent-pro bot added the enhancement (New feature or request) label on Apr 2, 2024

    PR Description updated to latest commit (501b059)

    codiumai-pr-agent-pro bot (Contributor) commented Apr 2, 2024

    PR Review

    (Review updated until commit 501b059)

    ⏱️ Estimated effort to review [1-5]

    2, because the PR introduces a straightforward enhancement by adding support for an additional API base configuration. The changes are localized within a single file and involve basic conditional logic and assignment operations. The complexity is low, and the changes are easy to understand.

    🏅 Score

    85

    🧪 Relevant tests

    No

    🎫 Relevant ticket

    Yes

    🔍 Possible issues

    Possible Configuration Error: The PR adds support for OLLAMA.API_BASE but does not handle potential exceptions or errors that might occur if the ollama key is missing in the settings. This could lead to runtime errors if the configuration is not properly set up.
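    One way to sidestep that failure mode, sketched here rather than taken from the PR: read the value once through the safe get accessor and reuse it, so get_settings().ollama is never dereferenced when the [ollama] section is absent.

    ollama_api_base = get_settings().get("OLLAMA.API_BASE", None)
    if ollama_api_base:
        # Reuse the safely-read value instead of touching get_settings().ollama.*
        litellm.api_base = ollama_api_base
        self.api_base = ollama_api_base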

    🔒 Security concerns

    No

    🔀 Multiple PR themes

    No


    ✨ Review tool usage guide:

    Overview:
    The review tool scans the PR code changes and generates a PR review that includes several types of feedback, such as possible PR issues, security threats, and relevant tests in the PR. More feedback can be added by configuring the tool.

    The tool can be triggered automatically every time a new PR is opened, or can be invoked manually by commenting on any PR.

    • When commenting, to edit configurations related to the review tool (pr_reviewer section), use the following template:

    /review --pr_reviewer.some_config1=... --pr_reviewer.some_config2=...

    • With a configuration file, use the following template:

    [pr_reviewer]
    some_config1=...
    some_config2=...


    See the review usage page for a comprehensive guide on using this tool.


    PR Code Suggestions

    Best practice
    Add a check for get_settings().ollama being not None before accessing its attributes.

    Consider checking if get_settings().ollama is not None before accessing its api_base
    attribute. This adds an extra layer of safety in case the ollama settings are not properly
    configured.

    pr_agent/algo/ai_handlers/litellm_ai_handler.py [64-66]

    -if get_settings().get("OLLAMA.API_BASE", None) :
    +if get_settings().get("OLLAMA.API_BASE", None) and get_settings().ollama is not None:
         litellm.api_base = get_settings().ollama.api_base
         self.api_base = get_settings().ollama.api_base
     
    Add type hints to the return value of chat_completion.

    The method chat_completion returns a tuple, but there's no documentation or type hinting
    indicating the types of resp and finish_reason. Consider adding type hints for the return
    value to improve code readability and maintainability.

    pr_agent/algo/ai_handlers/litellm_ai_handler.py [156]

    -return resp, finish_reason
    +from typing import Tuple, Any
    +# Add the appropriate type in place of `Any`
    +def chat_completion(...) -> Tuple[Any, Any]:
    +    ...
    +    return resp, finish_reason
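
    One concrete shape the suggestion could take; the parameter list below is an illustrative assumption, not quoted from the PR:

    from typing import Any, Tuple

    # Hypothetical typed stub of the handler method; parameter names are assumed.
    async def chat_completion(self, model: str, system: str, user: str,
                              temperature: float = 0.2) -> Tuple[Any, str]:
        """Return the raw completion response and the finish-reason string."""
        ...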
     
    Maintainability
    Refactor the setting of api_base to use a single source of truth.

    To ensure consistency and avoid potential errors, consider using a single source of truth
    for setting api_base. Currently, api_base is set multiple times which might lead to
    unexpected behavior if the settings change.

    pr_agent/algo/ai_handlers/litellm_ai_handler.py [61-66]

    -if get_settings().get("OLLAMA.API_BASE", None) :
    -    litellm.api_base = get_settings().ollama.api_base
    -    self.api_base = get_settings().ollama.api_base
    +api_base = None
    +if get_settings().get("OLLAMA.API_BASE", None):
    +    api_base = get_settings().ollama.api_base
    +elif get_settings().get("HUGGINGFACE.API_BASE", None) and 'huggingface' in get_settings().config.model:
    +    api_base = get_settings().huggingface.api_base
    +if api_base is not None:
    +    litellm.api_base = api_base
    +    self.api_base = api_base
     

    ✨ Improve tool usage guide:

    Overview:
    The improve tool scans the PR code changes and automatically generates suggestions for improving the PR code. The tool can be triggered automatically every time a new PR is opened, or it can be invoked manually by commenting on a PR.

    • When commenting, to edit configurations related to the improve tool (pr_code_suggestions section), use the following template:

    /improve --pr_code_suggestions.some_config1=... --pr_code_suggestions.some_config2=...

    • With a configuration file, use the following template:

    [pr_code_suggestions]
    some_config1=...
    some_config2=...


    See the improve usage page for a comprehensive guide on using this tool.

    mrT23 (Collaborator) commented Apr 2, 2024

    @gregoryboue
    Looks good.

    Is the guide here still accurate?
    https://pr-agent-docs.codium.ai/usage-guide/additional_configurations/#huggingface

    gregoryboue (Contributor, Author) commented:

    > Looks good.
    > Is the guide here still accurate?
    > https://pr-agent-docs.codium.ai/usage-guide/additional_configurations/#huggingface

    Yes, the guide is still accurate.

    mrT23 merged commit dfe8301 into Codium-ai:main on Apr 2, 2024
    gregoryboue (Contributor, Author) commented:

    @mrT23, I made a mistake in my previous answer.

    With my PR change, when I run a /review command, model_type=ModelType.TURBO is passed, so in pr_processing._get_all_models the model used is get_settings().config.model_turbo, not get_settings().config.model.

    So the right configuration for using a locally hosted ollama model is:

    [config] # in configuration.toml
    model = "ollama/llama2"
    model_turbo = "ollama/llama2"

    If I don't set both, the command uses the fallback model and never tries the config.model setting.

    Does this seem right to you? Should the code be changed, or only the documentation?
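
    For readers following along, the selection behavior being described is roughly this (an approximation, not the verbatim pr_agent source; import paths are assumed):

    from pr_agent.algo.utils import ModelType
    from pr_agent.config_loader import get_settings

    # Approximate sketch: with ModelType.TURBO, model_turbo is selected and the
    # fallback models are tried next; config.model is never consulted.
    def _get_all_models(model_type):
        if model_type == ModelType.TURBO:
            model = get_settings().config.model_turbo
        else:
            model = get_settings().config.model
        fallback_models = get_settings().config.fallback_models
        if not isinstance(fallback_models, list):
            fallback_models = [m.strip() for m in fallback_models.split(",")]
        return [model] + fallback_models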

    mrT23 (Collaborator) commented Apr 5, 2024

    @gregoryboue
    Only the documentation; I updated it.

    hussam789 (Collaborator) commented:

    /review
    --pr_reviewer.require_soc2_ticket=true
    --pr_reviewer.require_can_be_split_review=true
    --pr_reviewer.require_score_review=true
    --pr_reviewer.num_code_suggestions="2"
    --pr_reviewer.inline_code_comments=true


    Persistent review updated to latest commit 501b059

    @@ -61,6 +61,9 @@ def __init__(self):
             if get_settings().get("HUGGINGFACE.API_BASE", None) and 'huggingface' in get_settings().config.model:
                 litellm.api_base = get_settings().huggingface.api_base
                 self.api_base = get_settings().huggingface.api_base
    +        if get_settings().get("OLLAMA.API_BASE", None) :

    Consider adding error handling or a default value for OLLAMA.API_BASE to ensure robustness in cases where the configuration might be missing or incorrect. This could prevent runtime errors and improve the application's stability. [important]
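    One way to implement that hardening, sketched under the assumption that a missing [ollama] section raises AttributeError from the settings object (this is not code from the PR):

    try:
        ollama_api_base = get_settings().ollama.api_base
    except AttributeError:
        ollama_api_base = None  # no [ollama] section configured; skip quietly
    if ollama_api_base:
        litellm.api_base = ollama_api_base
        self.api_base = ollama_api_base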

    @@ -61,6 +61,9 @@ def __init__(self):
             if get_settings().get("HUGGINGFACE.API_BASE", None) and 'huggingface' in get_settings().config.model:
                 litellm.api_base = get_settings().huggingface.api_base
                 self.api_base = get_settings().huggingface.api_base
    +        if get_settings().get("OLLAMA.API_BASE", None) :
    +            litellm.api_base = get_settings().ollama.api_base
    +            self.api_base = get_settings().ollama.api_base

    Ensure consistency in accessing settings for different APIs by using a similar pattern or method. This can improve code maintainability and readability. For instance, you might consider abstracting the settings access logic into a method if the pattern becomes more complex with additional APIs. [medium]
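    A sketch of what that abstraction might look like; the helper name _resolve_api_base is hypothetical, not pr_agent API:

    from typing import Optional

    # Hypothetical helper centralizing api_base resolution; the handler's
    # __init__ would call it once and assign the result to litellm.api_base
    # and self.api_base.
    def _resolve_api_base() -> Optional[str]:
        settings = get_settings()
        if settings.get("OLLAMA.API_BASE", None):
            return settings.ollama.api_base
        if settings.get("HUGGINGFACE.API_BASE", None) and 'huggingface' in settings.config.model:
            return settings.huggingface.api_base
        return None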

    hussam789 (Collaborator) commented:

    /describe
    --pr_description.extra_instructions="
    For the title, use the format: [type]: [summary]
    "
    --pr_description.keep_original_user_title=false
    --pr_description.final_update_message=false

    codiumai-pr-agent-pro bot changed the title from "feat: allows ollama usage" to "[Enhancement]: Integrate OLLAMA API Support in AI Handlers" on Apr 15, 2024

    Successfully merging this pull request may close the following issue:

    OLLAMA Api Base not considered (#657)