Support custom LLM endpoint that lacks `/models` #932
Conversation
📝 Walkthrough

The changes update the localization infrastructure and UI text for custom LLM endpoint configuration in the desktop application. They add new translation keys and messages, update example URLs, improve user guidance for model selection, and enhance runtime translation support. Localization files for English and Korean are expanded and synchronized with the new UI messages.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant SettingsUI
    participant Lingui
    participant LocaleFiles
    User->>SettingsUI: Open LLM settings
    SettingsUI->>Lingui: Request localized strings (e.g., placeholders, descriptions)
    Lingui->>LocaleFiles: Fetch translations for current language
    LocaleFiles-->>Lingui: Return localized messages
    Lingui-->>SettingsUI: Provide localized UI text
    SettingsUI-->>User: Display updated UI with new guidance and placeholders
    User->>SettingsUI: Enter or select model name / endpoint URL
    SettingsUI-->>User: Show input fields and guidance based on endpoint/model availability
```
Actionable comments posted: 1
🔭 Outside diff range comments (1)
apps/desktop/src/components/settings/components/ai/llm-view.tsx (1)
122-150: 🛠️ Refactor suggestion

Align `RadioGroup` value with option identifiers.

`currentLLM` uses `"Llama3p2_3bQ4"` while the corresponding `RadioGroupItem` is `"llama-3.2-3b-q4"`. Because the values differ, the default option is never selected and the toggling logic becomes inconsistent.

```diff
-const currentLLM: SupportedModelLLM | "custom" = customLLMEnabled.data ? "custom" : "Llama3p2_3bQ4";
+const LLAMA_DEFAULT_VALUE = "llama-3.2-3b-q4";
+const currentLLM: SupportedModelLLM | "custom" =
+  customLLMEnabled.data ? "custom" : LLAMA_DEFAULT_VALUE;
 …
-<RadioGroupItem value="llama-3.2-3b-q4" id="llama-3.2-3b-q4" className="peer sr-only" />
+<RadioGroupItem
+  value={LLAMA_DEFAULT_VALUE}
+  id={LLAMA_DEFAULT_VALUE}
+  className="peer sr-only" />
```
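The mismatch matters because radio groups of this kind mark an item checked only when its `value` prop strictly equals the group's current value. A minimal sketch of that comparison (function names here are illustrative, not from the PR):

```typescript
// Radix-style radio groups check an item via strict string equality
// between the group's value and the item's value.
const LLAMA_DEFAULT_VALUE = "llama-3.2-3b-q4";

function resolveCurrentLLM(customEnabled: boolean): string {
  return customEnabled ? "custom" : LLAMA_DEFAULT_VALUE;
}

function isItemChecked(groupValue: string, itemValue: string): boolean {
  return groupValue === itemValue;
}

// With the original "Llama3p2_3bQ4" constant, this comparison is always
// false, so the default option never renders as selected.
```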
🧹 Nitpick comments (4)
apps/desktop/src/components/settings/components/ai/llm-view.tsx (2)
204-211: Localisation placeholder still hard-coded.

The placeholder `"http://localhost:11434"` is user-visible but not wrapped for translation, whereas neighbouring strings are. Consider:

```diff
-placeholder="http://localhost:11434"
+placeholder={t`http://localhost:11434`}
```
269-277: Make “Select model” placeholder translatable.

```diff
-<SelectValue placeholder="Select model" />
+<SelectValue placeholder={t`Select model`} />
```

apps/desktop/src/locales/ko/messages.ts (1)
1-1: Duplicate & outdated English strings in Korean catalogue.

Several English fallback strings (e.g. the obsolete `http://localhost:8080/v1` message) remain in the KO bundle. Running `lingui extract && lingui compile` with pruning would drop them.
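Why the stale English entries are user-visible: compiled catalogues fall back to the English source string whenever a translation is blank, so obsolete English text keeps surfacing until it is pruned. A rough sketch of that lookup behaviour (a plain key-to-string map, not the real Lingui runtime API):

```typescript
// Return the translation if present and non-blank; otherwise fall back
// to the provided default (the English msgid).
function lookupMessage(
  catalog: Record<string, string>,
  id: string,
  fallback: string,
): string {
  const msg = catalog[id];
  return msg !== undefined && msg.trim().length > 0 ? msg : fallback;
}

// An obsolete English string left in the KO catalogue behaves like a
// "translation" and keeps showing even after the source text changed.
```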
1-1: Consider dropping compiled Lingui message files from VCS.

`messages.ts` is a generated artifact (`lingui extract`/`compile`). Storing the compiled JSON bloats diffs, makes reviews noisy, and risks merge conflicts when two branches regenerate messages. Prefer committing only the source `.po` files and letting CI/packaging regenerate the compiled bundles.
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (6)

- apps/desktop/src/components/settings/components/ai/llm-view.tsx (5 hunks)
- apps/desktop/src/locales/en/messages.po (10 hunks)
- apps/desktop/src/locales/en/messages.ts (1 hunks)
- apps/desktop/src/locales/ko/messages.po (10 hunks)
- apps/desktop/src/locales/ko/messages.ts (1 hunks)
- apps/docs/data/i18n.json (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
`**/*.{js,ts,tsx,rs}`:
1. No error handling.
2. No unused imports, variables, or functions.
3. For comments, keep it minimal. It should be about "Why", not "What".

Applied to:
- apps/desktop/src/components/settings/components/ai/llm-view.tsx
- apps/desktop/src/locales/ko/messages.ts
- apps/desktop/src/locales/en/messages.ts
🧬 Code Graph Analysis (2)
apps/desktop/src/locales/ko/messages.ts (1)
- apps/desktop/src/locales/en/messages.ts (1): `messages` (1-1)

apps/desktop/src/locales/en/messages.ts (1)
- apps/desktop/src/locales/ko/messages.ts (1): `messages` (1-1)
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: ci (macos, macos-latest)
- GitHub Check: ci (windows, windows-latest)
🔇 Additional comments (26)
apps/desktop/src/components/settings/components/ai/llm-view.tsx (2)
254-259: Untranslated helper text.

The explanatory sentence was updated and remains inside `<Trans>` – great. ✅
289-294: Synchronise manual-entry input with model mutation.

Typing directly in the input updates `form`, but the mutation only happens via `form.watch` once validation passes, which is acceptable. No change needed.

apps/docs/data/i18n.json (1)
4-5: Confirm updated counters are intentional.

`"total": 266` while `"missing": 266` for Korean implies zero translated strings after this PR. Double-check the script that updates `i18n.json`; counts often drift when obsolete keys aren’t purged.

Also applies to: 9-10
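One way to sanity-check those numbers is to recompute them from the catalogue rather than trusting the update script. A hypothetical sketch, assuming a flat key-to-translation map (the real `i18n.json` update script may work differently):

```typescript
// Recount catalogue keys: "missing" is any key whose translation is
// empty or whitespace-only.
function recountCatalog(
  catalog: Record<string, string>,
): { total: number; missing: number } {
  const keys = Object.keys(catalog);
  const missing = keys.filter((k) => catalog[k].trim() === "").length;
  return { total: keys.length, missing };
}

// total === missing means not a single string is translated, which is
// worth confirming before merging.
```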
apps/desktop/src/locales/en/messages.ts (1)
1-1: `http://localhost:8080/v1` example URL is still present – contradicts PR description.

The PR summary says the example base URL was updated to `http://localhost:11434`, but the English message `QKOjtC` still embeds `http://localhost:8080/v1`. This will surface the outdated hint in the UI.

```diff
-"Enter the base URL for your custom LLM endpoint (e.g., http://localhost:8080/v1)"
+"Enter the base URL for your custom LLM endpoint (e.g., http://localhost:11434)"
```

Make sure the same change is mirrored in the `.po` and other locale files to avoid mismatched guidance.

Likely an incorrect or invalid review comment.
apps/desktop/src/locales/ko/messages.po (11)
349-352: New translation key for “API Base URL” added
This matches the updated UI label in `llm-view.tsx`. Leaving `msgstr` blank is correct for pending translation.
353-356: New translation key for “API Key” added
Consistent with the corresponding input field; `msgstr` intentionally empty awaiting translation.
445-448: Added description for connecting to a custom LLM endpoint
The msgid reflects the updated guidance in the UI. Blank `msgstr` is expected.
491-494: New translation key “Custom Endpoint” added
Aligns with the dropdown label in the component.
554-557: New placeholder prompt for entering model name
Matches the example in the React component.
558-561: New placeholder for “Enter the API key for your custom LLM endpoint”
Accurately reflects the UI change.
562-565: Updated placeholder for custom LLM API base URL example
Example URL has been updated to `http://localhost:11434` per code changes.
696-699: New loading state message “Loading available models…” added
Matches the dynamic state in `llm-view.tsx`.
738-741: New translation key for “Model Name” label added
Consistent with the new form field.
934-937: New instructional text for model selection
Reflects the updated UI guidance to select or manually enter a model.
1095-1098: New translation key for local Llama model description
Accurately informs users about offline capability.

apps/desktop/src/locales/en/messages.po (11)
349-352: New key “API Base URL” introduced
Correctly synchronized with the UI; msgstr mirrors msgid.
353-356: New key “API Key” introduced
Aligned with the new input field; msgstr is appropriate.
445-448: Instructional text for custom LLM endpoint connection added
Matches the updated component copy.
491-494: New key “Custom Endpoint” added
Reflects the UI dropdown label.
554-557: Placeholder prompt for entering model name added
Consistent with the example in code.
558-561: Placeholder for API key entry added
Matches the UI update.
562-565: Example base URL updated to `http://localhost:11434`
Correctly reflects the code change.
696-699: “Loading available models…” key added
Supports the new loading state in the UI.
738-741: New label “Model Name” introduced
Aligned with the new form field.
934-937: Detailed model selection instruction added
Reflects the enhanced UI guidance.
1095-1098: Key for local Llama 3.2 model description added
Accurately explains offline/privacy benefits.
Force-pushed from 6c1c453 to 71c4042
No description provided.