Support overriding UI configuration for LlamaIndexServer #519
Conversation
Walkthrough

The pull request updates the chat UI version constant in the chat component and enhances the server by introducing a new optional `starter_questions` parameter, along with logic to write the UI configuration into the served `config.js` file.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant S as LlamaIndexServer
    participant F as config.js File
    S->>S: mount_ui()
    S->>S: _override_ui_config()
    S->>S: _ui_config() returns {endpoint, starter_questions}
    alt File exists
        S->>F: Write UI configuration
    else File missing
        S->>S: Log error
    end
```
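The flow in the diagram can be sketched roughly as follows. This is a simplified, hypothetical stand-in for the server's file-writing step, not the actual `LlamaIndexServer` code; the `window.LLAMAINDEX` payload shape is taken from the review excerpts, while the function name and signature are illustrative.

```python
import json
import logging
import os

logger = logging.getLogger("llama_index_server_sketch")


def override_ui_config(ui_path: str, ui_config: dict) -> None:
    """Write the UI config into an existing config.js, or log an error."""
    config_path = os.path.join(ui_path, "config.js")
    if not os.path.exists(config_path):
        # Mirrors the "File missing" branch in the diagram.
        logger.error("Config file not found")
        return
    # Mirrors the "File exists" branch: overwrite config.js with the
    # configuration serialized onto the window.LLAMAINDEX global.
    with open(config_path, "w") as f:
        f.write(f"window.LLAMAINDEX = {json.dumps(ui_config, indent=2)};")
```

Because the file is rewritten wholesale, whatever was previously in `config.js` is replaced by the server-side configuration on each mount.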
Actionable comments posted: 0
🧹 Nitpick comments (2)
llama-index-server/llama_index/server/server.py (2)
107-122: Consider adding validation for starter questions.

The implementation of `_override_ui_config` looks good with proper error handling. However, consider adding validation for the `starter_questions` attribute to ensure it meets any format or length requirements before using it in the UI configuration. Also, the error message "Config file not found" could be more descriptive about what this means for functionality or what action should be taken.
```diff
 def _override_ui_config(self) -> None:
     """
     Override the UI config by writing a complete configuration file.
     """
+    # Validate starter_questions if provided
+    if self.starter_questions is not None and not isinstance(self.starter_questions, list):
+        self.logger.warning("starter_questions must be a list, ignoring invalid value")
+        self.starter_questions = None
+
     try:
         config_path = os.path.join(self.ui_path, "config.js")
         if not os.path.exists(config_path):
-            self.logger.error("Config file not found")
+            self.logger.error(
+                "UI config file not found in UI directory. UI configuration cannot be updated."
+            )
             return
         config_content = (
             f"window.LLAMAINDEX = {json.dumps(self._ui_config, indent=2)};"
         )
         with open(config_path, "w") as f:
             f.write(config_content)
     except Exception as e:
         self.logger.error(f"Error overriding UI config: {e}")
```
1-164: Consider adding tests for the new UI configuration functionality.

The changes look good overall, but I don't see tests added for this new functionality. Consider adding tests to verify:

- The `_ui_config` property returns the expected values
- The `_override_ui_config` method writes the correct configuration
- Error handling works as expected
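The suggested tests could take a shape like the following pytest sketch. The helper here is a hypothetical, self-contained stand-in for `server._override_ui_config` (the real method lives on the server class and uses `self.ui_path` and `self.logger`), so fixture usage and names are assumptions.

```python
import json
import logging
import os


def override_ui_config(ui_path, ui_config, logger=logging.getLogger(__name__)):
    """Hypothetical stand-in for server._override_ui_config."""
    config_path = os.path.join(ui_path, "config.js")
    if not os.path.exists(config_path):
        logger.error("Config file not found")
        return
    with open(config_path, "w") as f:
        f.write(f"window.LLAMAINDEX = {json.dumps(ui_config, indent=2)};")


def test_writes_expected_config(tmp_path):
    # The config file must already exist; the method only overwrites it.
    (tmp_path / "config.js").write_text("window.LLAMAINDEX = {};")
    override_ui_config(str(tmp_path), {"STARTER_QUESTIONS": ["Hello?"]})
    assert "Hello?" in (tmp_path / "config.js").read_text()


def test_missing_config_file_logs_error(tmp_path, caplog):
    # No config.js in the directory: expect the error branch, no file created.
    with caplog.at_level(logging.ERROR):
        override_ui_config(str(tmp_path), {})
    assert "Config file not found" in caplog.text
    assert not (tmp_path / "config.js").exists()
```

Using `tmp_path` keeps each test isolated on disk, and `caplog` lets the error-handling branch be asserted without patching the logger.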
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- llama-index-server/llama_index/server/chat_ui.py (1 hunks)
- llama-index-server/llama_index/server/server.py (7 hunks)
- llama-index-server/llama_index/server/utils/agent_tool.py (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (3)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file)
🔇 Additional comments (10)
llama-index-server/llama_index/server/chat_ui.py (1)
8-8: Version update looks good.

The version update from "0.0.2" to "0.0.3" aligns with the new UI configuration features being introduced in this PR.
llama-index-server/llama_index/server/utils/agent_tool.py (2)
4-4: Removal of unused import is good.

Removing the unused `Callable` import from `typing` helps keep the codebase clean.
8-8: Removal of unused import is good.

Removing the unused `MessageRole` import helps maintain clean code.

llama-index-server/llama_index/server/server.py (7)
1-1: Import addition is appropriate.

The addition of the `json` import is necessary for the new UI configuration functionality.
19-19: Class attribute addition looks good.

The new `starter_questions` attribute is well-integrated with the existing class structure.
30-30: Parameter addition is appropriate.

Adding the optional `starter_questions` parameter with a default value of `None` maintains backward compatibility.
44-44: Documentation is clear and helpful.

The added documentation clearly explains the purpose of the `starter_questions` parameter.
53-53: Attribute initialization is correct.

Properly initializes the class attribute with the parameter value.
68-74: Property implementation looks good.

The `_ui_config` property correctly builds the configuration dictionary. The naming with a leading underscore appropriately indicates this is an internal method.
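Per the sequence diagram, `_ui_config` returns a dictionary built from the endpoint and `starter_questions`. A property of that general shape might look like the sketch below; the class, endpoint value, and dictionary key names are illustrative guesses based on the review, not the real `server.py` code.

```python
from typing import List, Optional


class ServerSketch:
    """Illustrative stand-in; attribute and key names are assumptions."""

    def __init__(self, starter_questions: Optional[List[str]] = None):
        self.starter_questions = starter_questions

    @property
    def _ui_config(self) -> dict:
        # Always include the chat endpoint; add optional entries only
        # when they were actually configured.
        config = {"CHAT_API": "/api/chat"}
        if self.starter_questions is not None:
            config["STARTER_QUESTIONS"] = self.starter_questions
        return config
```

Omitting unset options keeps the written `config.js` minimal, so the UI can fall back to its own defaults for anything the server did not specify.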
105-105: Appropriate method call.

Adding the call to `_override_ui_config()` in the `mount_ui` method ensures the UI configuration is updated when the UI is mounted.
Summary by CodeRabbit
New Features
Chores