
Chat UI doesn't respect custom model/endpoint configuration #82

Open
eyalsela opened this issue Feb 14, 2025 · 2 comments

eyalsela commented Feb 14, 2025

(Issue description generated by Claude)

Currently, there is an inconsistency between how model configuration is handled in the main DAILA operations and in the sidebar chat UI.

Current Behavior

  • Main DAILA operations (like function summarization, variable renaming, etc.) properly use custom model and endpoint configurations from DAILAConfig
  • The sidebar chat UI bypasses these configurations and directly uses litellm with just the model name, ignoring any custom endpoint settings

Expected Behavior

The sidebar chat should respect the same configuration as the main DAILA operations, including:

  • Custom endpoints
  • Custom model configurations
  • Any other LiteLLM-related settings from DAILAConfig

Technical Details

The issue is in dailalib/llm_chat/llm_chat_ui.py, where LLMThread.run() calls litellm directly without accessing the configuration from the parent LiteLLMAIAPI instance:

def run(self):
    import litellm
    litellm.modify_params = True
    # Only the model name is passed here; the custom endpoint, API key, and
    # other LiteLLM settings held by DAILAConfig never reach this call.
    response = litellm.completion(
        model=self.model_name,
        messages=self.chat_history,
        timeout=60,
    )
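For context, litellm.completion already accepts per-call overrides for the endpoint and credentials (for example api_base and api_key), so the chat call only needs the values from DAILAConfig threaded through. A minimal illustration with placeholder values (the endpoint and key below are hypothetical, not DAILA's actual config fields):

import litellm

# Placeholder values standing in for whatever DAILAConfig holds.
response = litellm.completion(
    model="gpt-4o",
    messages=[{"role": "user", "content": "hello"}],
    api_base="http://localhost:8000/v1",  # custom endpoint override
    api_key="sk-placeholder",             # custom API key override
    timeout=60,
)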

Suggested Fix

Modify the chat UI to use the same configuration mechanism as the main DAILA operations by doing the following (a rough sketch follows this list):

  1. Passing the full configuration from LiteLLMAIAPI to the chat UI
  2. Using the same completion call pattern in LLMThread as used in the main operations
  3. Ensuring configuration changes are immediately reflected in both the main operations and the chat UI
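A minimal sketch of what the LLMThread.run() change could look like, assuming the thread is handed a reference to the parent LiteLLMAIAPI instance; the attribute names used here (ai_api, custom_endpoint, api_key) are illustrative, not necessarily the real DAILA field names:

def run(self):
    import litellm
    litellm.modify_params = True

    # Build the completion kwargs from the parent API object so the chat UI
    # follows the same configuration as the main DAILA operations.
    kwargs = {
        "model": self.ai_api.model,
        "messages": self.chat_history,
        "timeout": 60,
    }
    if getattr(self.ai_api, "custom_endpoint", None):
        kwargs["api_base"] = self.ai_api.custom_endpoint
    if getattr(self.ai_api, "api_key", None):
        kwargs["api_key"] = self.ai_api.api_key

    response = litellm.completion(**kwargs)

Ideally LLMThread would reuse whatever helper LiteLLMAIAPI already uses to build its completion calls, so any future configuration options are picked up automatically.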

mahaloz (Owner) commented Feb 14, 2025

Hey @eyalsela, what OS and what Decompiler are you on?

eyalsela (Author) commented

Windows, IDA.
