Fix toast message issues + Changed the ThankYou Note #1196
duckduckhero merged 4 commits into main
Conversation
📝 Walkthrough
The changes replace the previous logic for detecting downloaded models with explicit per-model existence checks (`sttModelExists` / `llmModelExists`) polled via queries.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant UI as ModelDownloadNotification
    participant Query as QueryClient
    UI->>Query: sttModelExists()
    Query-->>UI: Boolean (STT model exists)
    UI->>Query: llmModelExists()
    Query-->>UI: Boolean (LLM model exists)
    UI->>UI: Update UI and button labels based on existence flags
    Note right of UI: Initiate download or dismiss as before
```
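The flow above can be sketched as plain TypeScript (a sketch only: `anyModelDownloaded`, `buttonLabel`, and the stubbed checkers are illustrative names, not the component's actual Tauri bindings; the real code runs these checks inside polling `useQuery` hooks):

```typescript
// Sketch of the existence-check flow driving the toast's button label.
type ModelCheck = (name: string) => Promise<boolean>;

// Returns true if any model in the list is already on disk.
async function anyModelDownloaded(names: string[], isDownloaded: ModelCheck): Promise<boolean> {
  const results = await Promise.all(names.map(isDownloaded));
  return results.some(Boolean);
}

// Illustrative stubs: pretend only the STT "QuantizedBase" model is on disk.
const stubStt: ModelCheck = async (name) => name === "QuantizedBase";
const stubLlm: ModelCheck = async () => false;

// Plural label when both model kinds are missing, singular when only one is.
async function buttonLabel(): Promise<string> {
  const stt = await anyModelDownloaded(["QuantizedTiny", "QuantizedBase"], stubStt);
  const llm = await anyModelDownloaded(["HyprLLM"], stubLlm);
  return !stt && !llm ? "Download Models" : "Download Model";
}

buttonLabel().then((label) => console.log(label)); // prints "Download Model"
```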
Estimated code review effort: 2 (~15 minutes)
Actionable comments posted: 1
🧹 Nitpick comments (2)
apps/desktop/src/components/toast/model-download.tsx (2)
62-78: Consider extracting hard-coded model names to constants.

The STT model names are hard-coded in the query function, which makes maintenance difficult if model names change in the future.
```diff
+const STT_MODEL_NAMES = [
+  "QuantizedTiny",
+  "QuantizedTinyEn",
+  "QuantizedBase",
+  "QuantizedBaseEn",
+  "QuantizedSmall",
+  "QuantizedSmallEn",
+  "QuantizedLargeTurbo"
+] as const;
+
 const sttModelExists = useQuery({
   queryKey: ["stt-model-exists"],
   queryFn: async () => {
-    const results = await Promise.all([
-      localSttCommands.isModelDownloaded("QuantizedTiny"),
-      localSttCommands.isModelDownloaded("QuantizedTinyEn"),
-      localSttCommands.isModelDownloaded("QuantizedBase"),
-      localSttCommands.isModelDownloaded("QuantizedBaseEn"),
-      localSttCommands.isModelDownloaded("QuantizedSmall"),
-      localSttCommands.isModelDownloaded("QuantizedSmallEn"),
-      localSttCommands.isModelDownloaded("QuantizedLargeTurbo"),
-    ]);
+    const results = await Promise.all(
+      STT_MODEL_NAMES.map(modelName => localSttCommands.isModelDownloaded(modelName))
+    );
     return results.some(Boolean);
   },
   refetchInterval: 3000,
 });
```
80-91: Extract hard-coded LLM model names to constants.

Similar to the STT models, the LLM model names should be extracted to constants for better maintainability.
```diff
+const LLM_MODEL_NAMES = [
+  "Llama3p2_3bQ4",
+  "HyprLLM"
+] as const;
+
 const llmModelExists = useQuery({
   queryKey: ["llm-model-exists"],
   queryFn: async () => {
-    const results = await Promise.all([
-      localLlmCommands.isModelDownloaded("Llama3p2_3bQ4"),
-      localLlmCommands.isModelDownloaded("HyprLLM"),
-    ]);
+    const results = await Promise.all(
+      LLM_MODEL_NAMES.map(modelName => localLlmCommands.isModelDownloaded(modelName))
+    );
     return results.some(Boolean);
   },
   refetchInterval: 3000,
 });
```
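A side benefit of the suggested `as const` (a sketch; `LlmModelName` and the stubbed `isModelDownloaded` are illustrative, the real checker is the `localLlmCommands` Tauri binding): the array becomes a readonly tuple of string literals, so a union type of valid model names falls out for free and typos are rejected at compile time.

```typescript
// `as const` turns the array into a readonly tuple of literal types.
const LLM_MODEL_NAMES = ["Llama3p2_3bQ4", "HyprLLM"] as const;
type LlmModelName = (typeof LLM_MODEL_NAMES)[number]; // "Llama3p2_3bQ4" | "HyprLLM"

// A checker typed against the union only accepts known model names.
async function isModelDownloaded(name: LlmModelName): Promise<boolean> {
  // Stub: the real call goes through the localLlmCommands binding.
  return name === "HyprLLM";
}

async function llmModelExists(): Promise<boolean> {
  const results = await Promise.all(LLM_MODEL_NAMES.map((n) => isModelDownloaded(n)));
  return results.some(Boolean);
}

llmModelExists().then((exists) => console.log(exists)); // prints true
```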
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- apps/desktop/src/components/toast/model-download.tsx (3 hunks)
- crates/db-user/assets/thank-you.md (1 hunks)
📓 Path-based instructions (1)
**/*.{js,ts,tsx,rs}
⚙️ CodeRabbit Configuration File
**/*.{js,ts,tsx,rs}: 1. No error handling.
2. No unused imports, variables, or functions.
3. For comments, keep it minimal. It should be about "Why", not "What".
Files:
apps/desktop/src/components/toast/model-download.tsx
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: ci (macos, macos-latest)
- GitHub Check: ci (windows, windows-latest)
- GitHub Check: ci
🔇 Additional comments (2)
apps/desktop/src/components/toast/model-download.tsx (1)
120-124: LGTM! Button label improvements enhance UX.

The updated button labels ("Download Models" and "Download Model") are more concise and clearer than the previous versions.
crates/db-user/assets/thank-you.md (1)
1-5: Excellent UX improvement with contextual messaging.

The expanded thank-you message provides valuable context about the model download process and gives users meaningful activities during the wait time. The flow from acknowledgment → suggested activities → community engagement is well-structured.