feat: allow passing base_url in openai section of config.toml #328
Conversation
@CodeRabbit full review
Walkthrough
The pull request introduces updates to the
Changes
Possibly related PRs
Suggested reviewers
Actionable comments posted: 1
🧹 Outside diff range and nitpick comments (1)
src/rai/rai/utils/model_initialization.py (1)
54-58: Consider adding default values for better usability

While the structure is good, consider adding default values:
- base_url should default to "https://api.openai.com/v1/"
- api_key should default to an empty string

 @dataclass
 class OpenAIConfig(ModelConfig):
-    base_url: str
-    api_key: str
+    base_url: str = "https://api.openai.com/v1/"
+    api_key: str = ""
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
📒 Files selected for processing (2)
- config.toml (1 hunks)
- src/rai/rai/utils/model_initialization.py (5 hunks)
🔇 Additional comments (6)
config.toml (2)
17-17: Document API key precedence and security best practices
The comment indicates that this overrides the environment variable, but consider:
- Adding a warning comment about avoiding API key storage in config files
- Documenting the precedence order (config vs env var) in the README
- Recommending the use of environment variables or secure vaults for production deployments
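As a minimal sketch of one way that precedence could be resolved (the helper name, the [openai] table layout, and the OPENAI_API_KEY fallback are illustrative assumptions, not the project's actual API):

```python
import os
import tomllib  # Python 3.11+; older interpreters can use the third-party `tomli` package


def resolve_openai_api_key(config_path: str = "config.toml") -> str:
    """Hypothetical helper: a key set in config.toml wins over the environment."""
    with open(config_path, "rb") as f:
        config = tomllib.load(f)
    config_key = config.get("openai", {}).get("api_key", "")
    # Fall back to the conventional environment variable when the config is silent.
    return config_key or os.environ.get("OPENAI_API_KEY", "")
```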
Let's check if there's existing documentation about API key handling:
16-16: Enforce HTTPS for base_url in production environments
While the default OpenAI URL uses HTTPS, users might configure insecure URLs for local deployments. Consider adding a warning or validation when non-HTTPS URLs are used in production environments to prevent potential security risks.
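A rough sketch of what such a check might look like; the function name and warning text are assumptions for illustration, not code from this PR:

```python
import warnings
from urllib.parse import urlparse


def warn_if_insecure_base_url(base_url: str) -> None:
    """Warn when a configured base_url does not use HTTPS."""
    if urlparse(base_url).scheme != "https":
        warnings.warn(
            f"OpenAI base_url '{base_url}' does not use HTTPS; "
            "prefer HTTPS outside of local development.",
            stacklevel=2,
        )
```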
Let's check if there's any URL validation logic in the codebase:
src/rai/rai/utils/model_initialization.py (4)
18-18: LGTM! Good security practices with SecretStr
The addition of SecretStr for handling sensitive data and cast for type safety are appropriate improvements.
Also applies to: 23-23
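For reference, a small illustration of what SecretStr provides (the value is a placeholder, not taken from the codebase):

```python
from pydantic import SecretStr

api_key = SecretStr("sk-placeholder")
print(api_key)                     # prints ********** instead of the real value
print(api_key.get_secret_value())  # the raw key is only exposed on explicit request
```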
82-82: LGTM! Proper type annotation
The update to use OpenAIConfig is consistent with the new configuration structure.
120-120: LGTM! Consistent type casting implementation
The type casting is properly implemented across all vendors, improving type safety and maintainability.
Also applies to: 135-135, 144-144
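A self-contained sketch of the casting pattern being described; the stand-in ModelConfig base class and the loader function are hypothetical, while OpenAIConfig's base_url and api_key fields mirror the diff suggested above:

```python
from dataclasses import dataclass
from typing import cast


@dataclass
class ModelConfig:
    """Stand-in base class; the real one in model_initialization.py has more fields."""


@dataclass
class OpenAIConfig(ModelConfig):
    base_url: str = "https://api.openai.com/v1/"
    api_key: str = ""


def load_vendor_config() -> ModelConfig:
    """Hypothetical loader that only exposes the base type to callers."""
    return OpenAIConfig(base_url="http://localhost:11434/v1")


# cast() narrows the static type so base_url/api_key type-check;
# it performs no runtime conversion or validation.
openai_config = cast(OpenAIConfig, load_vendor_config())
print(openai_config.base_url)
```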
Line range hint 82-93: Verify documentation and migration impact
The configuration structure changes might require updates to documentation and could affect existing users.
✅ Verification successful
Configuration changes are well-documented and supported
The changes to the configuration structure are properly supported across the codebase:
- The configuration is managed through a comprehensive configurator tool (src/rai/rai/utils/configurator.py) that guides users through setup
- Documentation in docs/developer_guide.md clearly references the configuration usage and provides examples
- The configuration structure in model_initialization.py is well-defined with proper dataclasses and type hints
- The changes are backward compatible as they add new options (Ollama, tracing) while maintaining existing ones (OpenAI, AWS)
No additional documentation updates are needed as the existing documentation and configuration tool already provide sufficient guidance for users.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Check for potential documentation files that might need updates
fd -e md -e rst | xargs rg -l "config\.toml|configuration|openai"
# Check for potential test files that might need updates
fd -e py | xargs rg -l "config\.toml|ModelConfig|OpenAIConfig"
Length of output: 487
Script:
#!/bin/bash
# Check the content of documentation files for specific configuration examples
echo "=== Checking docs/developer_guide.md ==="
cat docs/developer_guide.md | rg -A 5 -B 5 "config\.toml|configuration"
echo "=== Checking configurator.py for configuration structure ==="
cat src/rai/rai/utils/configurator.py | rg -A 10 -B 10 "class.*Config"
echo "=== Checking for recent changes in configuration files ==="
git diff HEAD~5 -- src/rai/rai/utils/model_initialization.py src/rai/rai/utils/configurator.py
Length of output: 3070
Script:
#!/bin/bash
# Let's check the actual content of the configuration files
echo "=== Checking model_initialization.py ==="
cat src/rai/rai/utils/model_initialization.py
echo "=== Checking configurator.py ==="
cat src/rai/rai/utils/configurator.py
# Let's also check if there are any example config files
echo "=== Checking for example config files ==="
fd "config.*\.toml" -t f
Length of output: 37264
@maciejmajek
Haven't tested the functionality yet, but I have some questions. Could you have a look?
Force-pushed from 050c31f to 09f483b
Force-pushed from aa0e206 to 1d064f0
@maciejmajek
Force-pushed from 7039ed4 to 0a37ab7
Force-pushed from 0a37ab7 to 64180c1
I tested it, and tool calling worked great through the OpenAI API!
Purpose
Some local model vendors support the OpenAI API.
Proposed Changes
This PR introduces changes that allow the user to specify base_url in the openai section of config.toml, so that LangChain's OpenAI abstractions can be used with Ollama, vLLM, and so on.
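Roughly, the idea looks like the sketch below, assuming the langchain-openai package; the key names and values are illustrative and may not match the final config.toml:

```python
import tomllib  # Python 3.11+

from langchain_openai import ChatOpenAI

# Illustrative [openai] section pointing LangChain's OpenAI client at a local
# OpenAI-compatible server such as Ollama or vLLM.
example_config = tomllib.loads(
    """
    [openai]
    simple_model = "llama3.1"
    base_url = "http://localhost:11434/v1"
    api_key = "not-needed-for-local-servers"
    """
)

openai_cfg = example_config["openai"]
llm = ChatOpenAI(
    model=openai_cfg["simple_model"],
    base_url=openai_cfg["base_url"],
    api_key=openai_cfg["api_key"],
)
```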
Issues
Testing
config.toml
Summary by CodeRabbit
New Features
- Support for specifying base_url in the openai section of config.toml.

Bug Fixes
- Clearer handling when api_key is not set, ensuring better user feedback.