Conversation

Contributor

@plyght commented on Jun 13, 2025

No description provided.

Contributor

coderabbitai bot commented on Jun 13, 2025

📝 Walkthrough

The changes update the localization infrastructure and UI text for custom LLM endpoint configuration in the desktop application. They add new translation keys and messages, update example URLs, improve user guidance for model selection, and enhance runtime translation support. Localization files for English and Korean are expanded and synchronized with the new UI messages.

Changes

Files/Paths — Change Summary

  • apps/desktop/src/components/settings/components/ai/llm-view.tsx — Integrated useLingui for runtime translation, updated placeholder/example URLs, enhanced model-selection guidance, and improved input behavior for unavailable models.
  • apps/desktop/src/locales/en/messages.po, apps/desktop/src/locales/ko/messages.po — Updated translation references, added new messages for model selection and guidance, changed example URLs, and commented out obsolete messages.
  • apps/desktop/src/locales/en/messages.ts, apps/desktop/src/locales/ko/messages.ts — Added new localization keys and values for UI elements, time phrases, and model selection; expanded message content.
  • apps/docs/data/i18n.json — Increased total/missing translation entry counts for English and Korean to reflect the new messages.
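The useLingui integration described above ultimately resolves message ids against the compiled catalogs in messages.ts. The lookup can be sketched with a hand-rolled catalog rather than the real @lingui/core runtime — the message id "QKOjtC" is taken from the review below, but the catalog contents and fallback behavior here are illustrative assumptions:

```typescript
// Simplified model of a compiled Lingui catalog: message ids map to translated
// strings; an empty string means "not yet translated", so we fall back to the
// source-locale catalog. This mirrors the shape of messages.ts, not the real
// @lingui/core API.
type Catalog = Record<string, string>;

const en: Catalog = {
  "QKOjtC": "Enter the base URL for your custom LLM endpoint (e.g., http://localhost:11434)",
  "modelName": "Model Name",
};

const ko: Catalog = {
  "QKOjtC": "", // pending translation -> fall back to English
  "modelName": "모델 이름",
};

function translate(locale: "en" | "ko", id: string): string {
  const catalogs = { en, ko };
  return catalogs[locale][id] || en[id] || id;
}

console.log(translate("ko", "modelName")); // "모델 이름"
console.log(translate("ko", "QKOjtC"));    // empty msgstr -> English string
```

The empty-msgstr fallback is why blank Korean entries in the .po files below are acceptable: users see the English source string until a translation lands.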

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant SettingsUI
    participant Lingui
    participant LocaleFiles

    User->>SettingsUI: Open LLM settings
    SettingsUI->>Lingui: Request localized strings (e.g., placeholders, descriptions)
    Lingui->>LocaleFiles: Fetch translations for current language
    LocaleFiles-->>Lingui: Return localized messages
    Lingui-->>SettingsUI: Provide localized UI text
    SettingsUI-->>User: Display updated UI with new guidance and placeholders
    User->>SettingsUI: Enter or select model name / endpoint URL
    SettingsUI-->>User: Show input fields and guidance based on endpoint/model availability

📜 Recent review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 71c4042 and 6c1c453.

📒 Files selected for processing (1)
  • apps/desktop/src/components/settings/components/ai/llm-view.tsx (5 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • apps/desktop/src/components/settings/components/ai/llm-view.tsx

@coderabbitai bot left a comment

Actionable comments posted: 1

🔭 Outside diff range comments (1)
apps/desktop/src/components/settings/components/ai/llm-view.tsx (1)

122-150: 🛠️ Refactor suggestion

Align RadioGroup value with option identifiers.

currentLLM uses "Llama3p2_3bQ4" while the corresponding RadioGroupItem is "llama-3.2-3b-q4".
Because the values differ, the default option is never selected and toggling logic becomes inconsistent.

-const currentLLM: SupportedModelLLM | "custom" = customLLMEnabled.data ? "custom" : "Llama3p2_3bQ4";
+const LLAMA_DEFAULT_VALUE = "llama-3.2-3b-q4";
+const currentLLM: SupportedModelLLM | "custom" =
+  customLLMEnabled.data ? "custom" : LLAMA_DEFAULT_VALUE;

-<RadioGroupItem value="llama-3.2-3b-q4" id="llama-3.2-3b-q4" className="peer sr-only" />
+<RadioGroupItem
+  value={LLAMA_DEFAULT_VALUE}
+  id={LLAMA_DEFAULT_VALUE}
+  className="peer sr-only"
+/>
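The effect of the mismatch can be demonstrated outside React. This sketch (option values assumed from the diff above) shows why the default radio option never matches:

```typescript
// RadioGroup highlights the item whose `value` strictly equals the group's
// current value, so an internal enum name never matches a DOM value string.
const options = ["llama-3.2-3b-q4", "custom"];

function selectedOption(current: string): string | undefined {
  return options.find((value) => value === current);
}

// Internal enum name vs. DOM value: no option matches, nothing is selected.
console.log(selectedOption("Llama3p2_3bQ4")); // undefined

// Sharing one constant for both sides guarantees a match.
const LLAMA_DEFAULT_VALUE = "llama-3.2-3b-q4";
console.log(selectedOption(LLAMA_DEFAULT_VALUE)); // "llama-3.2-3b-q4"
```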
🧹 Nitpick comments (4)
apps/desktop/src/components/settings/components/ai/llm-view.tsx (2)

204-211: Localisation placeholder still hard-coded.

The placeholder "http://localhost:11434" is user-visible but not wrapped for translation, whereas neighbouring strings are.
Consider:

-placeholder="http://localhost:11434"
+placeholder={t`http://localhost:11434`}

269-277: Make “Select model” placeholder translatable.

-<SelectValue placeholder="Select model" />
+<SelectValue placeholder={t`Select model`} />
apps/desktop/src/locales/ko/messages.ts (1)

1-1: Duplicate & outdated English strings in Korean catalogue.

Several English fallback strings (e.g. the obsolete http://localhost:8080/v1 message) remain in the KO bundle.
Running lingui extract && lingui compile with prune would drop them.

apps/desktop/src/locales/en/messages.ts (1)

1-1: Consider dropping compiled Lingui message files from VCS

messages.ts is a generated artifact (lingui extract/compile). Storing the compiled JSON bloats diffs, makes reviews noisy, and risks merge-conflicts when two branches regenerate messages. Prefer committing only the source .po files and letting CI/packaging regenerate the compiled bundles.
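One way to act on this suggestion — the paths match the files in this PR, but the script names and build wiring are assumptions, not taken from the repository:

```
# .gitignore — stop tracking compiled Lingui output
apps/desktop/src/locales/*/messages.ts

# package.json scripts — regenerate catalogs at build time instead
#   "i18n:extract": "lingui extract --clean",
#   "i18n:compile": "lingui compile",
#   "build": "npm run i18n:extract && npm run i18n:compile && <existing build>"
```

The `--clean` flag also addresses the pruning nitpick above by removing obsolete messages during extraction.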

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 0bf210c and 71c4042.

📒 Files selected for processing (6)
  • apps/desktop/src/components/settings/components/ai/llm-view.tsx (5 hunks)
  • apps/desktop/src/locales/en/messages.po (10 hunks)
  • apps/desktop/src/locales/en/messages.ts (1 hunks)
  • apps/desktop/src/locales/ko/messages.po (10 hunks)
  • apps/desktop/src/locales/ko/messages.ts (1 hunks)
  • apps/docs/data/i18n.json (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
`**/*.{js,ts,tsx,rs}`: 1. No error handling. 2. No unused imports, variables, or functions. 3. For comments, keep it minimal. It should be about "Why", not "What".


  • apps/desktop/src/components/settings/components/ai/llm-view.tsx
  • apps/desktop/src/locales/ko/messages.ts
  • apps/desktop/src/locales/en/messages.ts
🧬 Code Graph Analysis (2)
apps/desktop/src/locales/ko/messages.ts (1)
apps/desktop/src/locales/en/messages.ts (1)
  • messages (1-1)
apps/desktop/src/locales/en/messages.ts (1)
apps/desktop/src/locales/ko/messages.ts (1)
  • messages (1-1)
⏰ Context from checks skipped due to timeout of 90000ms (2)
  • GitHub Check: ci (macos, macos-latest)
  • GitHub Check: ci (windows, windows-latest)
🔇 Additional comments (26)
apps/desktop/src/components/settings/components/ai/llm-view.tsx (2)

254-259: Untranslated helper text.

The explanatory sentence was updated but remains inside <Trans> – great. ✅


289-294: Synchronise manual-entry input with model mutation.

Typing directly in the input updates the form, but the mutation only fires via form.watch once validation passes, which is acceptable.
No change needed.

apps/docs/data/i18n.json (1)

4-5: Confirm updated counters are intentional.

"total": 266 while "missing": 266 for Korean implies zero translated strings after this PR.
Double-check the script that updates i18n.json; counts often drift when obsolete keys aren’t purged.

Also applies to: 9-10
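The drift this comment warns about is easy to reproduce. A sketch of how an i18n.json updater might count entries — the catalog shape is assumed to match the compiled messages.ts files, and the keys are illustrative:

```typescript
// Count total and missing (untranslated) entries per locale. Obsolete keys
// that are never pruned keep inflating both counters, so "total" and
// "missing" can both grow even as real translations land.
type Catalog = Record<string, string>;

function countEntries(catalog: Catalog): { total: number; missing: number } {
  const keys = Object.keys(catalog);
  const missing = keys.filter((key) => catalog[key] === "").length;
  return { total: keys.length, missing };
}

const ko: Catalog = {
  "API Base URL": "",
  "API Key": "",
  "obsolete-key": "", // left behind by a previous extract -> counted forever
};

console.log(countEntries(ko)); // { total: 3, missing: 3 }
```

When total equals missing, as in the Korean counts here, every counted entry has an empty msgstr — exactly the "zero translated strings" symptom flagged above.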

apps/desktop/src/locales/en/messages.ts (1)

1-1: http://localhost:8080/v1 example URL is still present – contradicts PR description

The PR summary says the example base URL was updated to http://localhost:11434, but the English message QKOjtC still embeds http://localhost:8080/v1. This will surface the outdated hint in the UI.

-"Enter the base URL for your custom LLM endpoint (e.g., http://localhost:8080/v1)"
+"Enter the base URL for your custom LLM endpoint (e.g., http://localhost:11434)"

Make sure the same change is mirrored in the .po and other locale files to avoid mismatched guidance.

Likely an incorrect or invalid review comment.
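A quick consistency check between locale catalogs can catch this kind of mismatch before it ships. The catalog shape and the "base-url-hint" id below are assumptions for illustration:

```typescript
// Verify every message id in the source locale exists in each target locale,
// so guidance like the example base URL cannot diverge between languages.
type Catalog = Record<string, string>;

function missingIds(source: Catalog, target: Catalog): string[] {
  return Object.keys(source).filter((id) => !(id in target));
}

const en: Catalog = { "base-url-hint": "e.g., http://localhost:11434" };
const ko: Catalog = {}; // key not yet synchronized

console.log(missingIds(en, ko)); // ["base-url-hint"]
```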

apps/desktop/src/locales/ko/messages.po (11)

349-352: New translation key for “API Base URL” added
This matches the updated UI label in llm-view.tsx. Leaving msgstr blank is correct for pending translation.


353-356: New translation key for “API Key” added
Consistent with the corresponding input field; msgstr intentionally empty awaiting translation.


445-448: Added description for connecting to a custom LLM endpoint
The msgid reflects the updated guidance in the UI. Blank msgstr is expected.


491-494: New translation key “Custom Endpoint” added
Aligns with the dropdown label in the component.


554-557: New placeholder prompt for entering model name
Matches the example in the React component.


558-561: New placeholder for “Enter the API key for your custom LLM endpoint”
Accurately reflects the UI change.


562-565: Updated placeholder for custom LLM API base URL example
Example URL has been updated to http://localhost:11434 per code changes.


696-699: New loading state message “Loading available models…” added
Matches the dynamic state in llm-view.tsx.


738-741: New translation key for “Model Name” label added
Consistent with the new form field.


934-937: New instructional text for model selection
Reflects the updated UI guidance to select or manually enter a model.


1095-1098: New translation key for local Llama model description
Accurately informs users about offline capability.

apps/desktop/src/locales/en/messages.po (11)

349-352: New key “API Base URL” introduced
Correctly synchronized with the UI; msgstr mirrors msgid.


353-356: New key “API Key” introduced
Aligned with the new input field; msgstr is appropriate.


445-448: Instructional text for custom LLM endpoint connection added
Matches the updated component copy.


491-494: New key “Custom Endpoint” added
Reflects the UI dropdown label.


554-557: Placeholder prompt for entering model name added
Consistent with the example in code.


558-561: Placeholder for API key entry added
Matches the UI update.


562-565: Example base URL updated to http://localhost:11434
Correctly reflects the code change.


696-699: “Loading available models…” key added
Supports the new loading state in the UI.


738-741: New label “Model Name” introduced
Aligned with the new form field.


934-937: Detailed model selection instruction added
Reflects the enhanced UI guidance.


1095-1098: Key for local Llama 3.2 model description added
Accurately explains offline/privacy benefits.

@plyght force-pushed the plyght/command-box-for-custom-llms branch from 6c1c453 to 71c4042 on Jun 13, 2025 at 05:48
@yujonglee changed the title from "plyght/command box for custom llms" to "support custom llm endpoint that lack /models" on Jun 13, 2025
@yujonglee merged commit 3fcca24 into fastrepl:main on Jun 13, 2025 (10 checks passed)
@coderabbitai bot mentioned this pull request on Sep 2, 2025