
New onboarding #1277

Merged
duckduckhero merged 4 commits into main from new-onboarding on Aug 2, 2025

Conversation

@duckduckhero (Contributor) commented Aug 2, 2025

Summary by cubic

Added a new onboarding flow that lets users choose between a local language model (HyprLLM) and bringing their own model, with guided setup for custom endpoints and improved step navigation.

  • New Features
    • Added LLM selection step with options for local or custom models.
    • Built a custom endpoint configuration view supporting OpenAI, Gemini, OpenRouter, and other providers.
    • Updated download progress and model selection steps to reflect the chosen LLM path.
    • Improved onboarding navigation with back buttons and clearer step transitions.
    • Updated English and Korean translations for new onboarding screens.

@coderabbitai bot (Contributor) commented Aug 2, 2025

📝 Walkthrough

This update introduces a multi-step onboarding flow in the welcome modal for configuring large language models (LLMs). It adds new UI components for LLM selection and custom endpoint configuration, integrates form validation for multiple providers, and updates localization files with new strings. Conditional logic for downloading and setting up LLMs is also introduced.

Changes

  • LLM Selection and Custom Endpoint Components
    Files: apps/desktop/src/components/welcome-modal/llm-selection-view.tsx, apps/desktop/src/components/welcome-modal/custom-endpoint-view.tsx
    Introduced new React components for LLM selection (LLMSelectionView) and custom endpoint configuration (CustomEndpointView), supporting multiple providers and dynamic forms with validation and model fetching.
  • Welcome Modal Flow and State Management
    Files: apps/desktop/src/components/welcome-modal/index.tsx
    Added a multi-step onboarding flow with new steps for LLM selection and custom endpoint configuration, extended state management, integrated the new forms, and updated navigation and backend configuration logic.
  • Download Progress Conditional Logic
    Files: apps/desktop/src/components/welcome-modal/download-progress-view.tsx
    Updated to conditionally handle LLM downloads and progress display based on the selected LLM type, via a new prop.
  • Transcribing Model Selection UI
    Files: apps/desktop/src/components/welcome-modal/model-selection-view.tsx
    Updated the heading text to clarify the model selection purpose.
  • Localization Updates (English and Korean)
    Files: apps/desktop/src/locales/en/messages.po, apps/desktop/src/locales/ko/messages.po
    Added and updated translation entries for new UI elements and onboarding steps, with updated source references.
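
For orientation, the extended step state could be modeled along these lines. This is a hedged sketch: the PR summary does not show the actual step names or state shape in index.tsx, so every identifier below is an assumption.

// Illustrative types only; the real names in index.tsx may differ.
type LLMSelection = "hyprllm" | "byom";

type OnboardingStep =
  | "welcome"
  | "llm-selection"
  | "custom-endpoint" // reached only when "byom" is chosen
  | "model-selection" // transcribing (STT) model
  | "download-progress";

interface OnboardingState {
  step: OnboardingStep;
  llmSelection: LLMSelection | null;
}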

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant WelcomeModal
    participant LLMSelectionView
    participant CustomEndpointView
    participant Backend

    User->>WelcomeModal: Opens onboarding
    WelcomeModal->>LLMSelectionView: Render LLM selection step
    User->>LLMSelectionView: Selects LLM type and continues
    LLMSelectionView->>WelcomeModal: onContinue(selection)
    alt selection == "hyprllm"
        WelcomeModal->>WelcomeModal: Proceed to model selection & download steps
    else selection == "byom"
        WelcomeModal->>CustomEndpointView: Render custom endpoint step
        User->>CustomEndpointView: Selects provider, enters config, continues
        CustomEndpointView->>WelcomeModal: onContinue()
        WelcomeModal->>Backend: configureCustomEndpoint(config)
        Backend-->>WelcomeModal: Ack
        WelcomeModal->>WelcomeModal: Proceed to model selection step
    end
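
A minimal sketch of the branch the diagram describes, assuming a continue handler inside the welcome modal (the handler and setter names are hypothetical, not taken from the PR):

// Hypothetical handler mirroring the alt/else branch above.
const handleLLMContinue = (selection: "hyprllm" | "byom") => {
  setLlmSelection(selection);
  if (selection === "hyprllm") {
    // Local model: proceed straight to STT model selection and downloads.
    setStep("model-selection");
  } else {
    // BYOM: collect provider, endpoint, and API key first.
    setStep("custom-endpoint");
  }
};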

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~40 minutes

Possibly related PRs

  • Others endpoint smooth #1251: Refactors and improves model fetching for the "Others" custom endpoint in a different component, related to the handling of custom LLM endpoints and model retrieval logic.


📜 Recent review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between bf42320 and fa73ce7.

📒 Files selected for processing (3)
  • apps/desktop/src/components/welcome-modal/custom-endpoint-view.tsx (1 hunks)
  • apps/desktop/src/locales/en/messages.po (15 hunks)
  • apps/desktop/src/locales/ko/messages.po (15 hunks)
✅ Files skipped from review due to trivial changes (1)
  • apps/desktop/src/locales/ko/messages.po
🚧 Files skipped from review as they are similar to previous changes (2)
  • apps/desktop/src/components/welcome-modal/custom-endpoint-view.tsx
  • apps/desktop/src/locales/en/messages.po

@coderabbitai bot left a comment

Actionable comments posted: 2

🧹 Nitpick comments (4)
apps/desktop/src/components/welcome-modal/llm-selection-view.tsx (1)

22-36: Consider extracting options to a constant outside the component

The options array is recreated on every render, which is unnecessary. Consider moving it outside the component or using useMemo for better performance.

+const LLM_OPTIONS = [
+  {
+    id: "hyprllm" as const,
+    title: "HyprLLM (Local)",
+    subtitle: "Privacy matters more than anything to me",
+    icon: ShieldIcon,
+  },
+  {
+    id: "byom" as const,
+    title: "Bring Your Own Model",
+    subtitle: "I want first-in-class meeting summarization",
+    icon: Network,
+  },
+];

 export function LLMSelectionView({ onContinue }: LLMSelectionViewProps) {
   const [selected, setSelected] = useState<"hyprllm" | "byom" | null>(null);

   const handleContinue = () => {
     if (selected) {
       onContinue(selected);
     }
   };

-  const options = [
-    {
-      id: "hyprllm",
-      title: "HyprLLM (Local)",
-      subtitle: "Privacy matters more than anything to me",
-      icon: ShieldIcon,
-    },
-    {
-      id: "byom",
-      title: "Bring Your Own Model",
-      subtitle: "I want first-in-class meeting summarization",
-      icon: Network,
-    },
-  ] as const;
+  const options = LLM_OPTIONS;
apps/desktop/src/components/welcome-modal/custom-endpoint-view.tsx (2)

22-46: Consider fetching model lists dynamically.

Hardcoded model lists may become outdated as providers release new models. Consider fetching these lists from a configuration API or making them configurable.
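
A sketch of what dynamic fetching could look like, assuming the provider exposes an OpenAI-compatible GET /models endpoint (true for OpenAI and OpenRouter; other providers would need per-provider handling):

// List model ids from an OpenAI-compatible endpoint; the response shape is an assumption.
async function fetchAvailableModels(apiBase: string, apiKey: string): Promise<string[]> {
  const res = await fetch(`${apiBase}/models`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) {
    throw new Error(`Model list request failed: ${res.status}`);
  }
  const body = (await res.json()) as { data: { id: string }[] };
  return body.data.map((m) => m.id);
}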


194-196: Document the model filtering logic.

The code filters out certain model types but doesn't explain why. Add a comment explaining the reasoning.

+// Filter out non-chat models (image generation, audio, code completion)
 const excludeKeywords = ["dall-e", "codex", "whisper"];
apps/desktop/src/components/welcome-modal/index.tsx (1)

4-4: Remove unnecessary comment.

The comment is stating the obvious.

-import { ArrowLeft } from "lucide-react"; // Add this import
+import { ArrowLeft } from "lucide-react";
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 70f35bd and bf42320.

📒 Files selected for processing (7)
  • apps/desktop/src/components/welcome-modal/custom-endpoint-view.tsx (1 hunks)
  • apps/desktop/src/components/welcome-modal/download-progress-view.tsx (7 hunks)
  • apps/desktop/src/components/welcome-modal/index.tsx (11 hunks)
  • apps/desktop/src/components/welcome-modal/llm-selection-view.tsx (1 hunks)
  • apps/desktop/src/components/welcome-modal/model-selection-view.tsx (1 hunks)
  • apps/desktop/src/locales/en/messages.po (15 hunks)
  • apps/desktop/src/locales/ko/messages.po (15 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
**/*.{js,ts,tsx,rs}

⚙️ CodeRabbit Configuration File

**/*.{js,ts,tsx,rs}: 1. No error handling.
2. No unused imports, variables, or functions.
3. For comments, keep it minimal. It should be about "Why", not "What".

Files:

  • apps/desktop/src/components/welcome-modal/model-selection-view.tsx
  • apps/desktop/src/components/welcome-modal/download-progress-view.tsx
  • apps/desktop/src/components/welcome-modal/llm-selection-view.tsx
  • apps/desktop/src/components/welcome-modal/custom-endpoint-view.tsx
  • apps/desktop/src/components/welcome-modal/index.tsx
🪛 GitHub Actions: .github/workflows/fmt.yaml
apps/desktop/src/components/welcome-modal/custom-endpoint-view.tsx

[error] 4-11: dprint formatting check failed. The import statements are not formatted correctly. Run 'dprint fmt' to fix code style issues.

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: cubic · AI code reviewer
  • GitHub Check: ci (windows, windows-latest)
  • GitHub Check: ci (macos, macos-latest)
🔇 Additional comments (17)
apps/desktop/src/components/welcome-modal/llm-selection-view.tsx (3)

1-8: Imports are used and well-organized

All imports appear to be used correctly in the component. The imports are well-organized with external libraries first, then UI components, and finally React hooks.


13-21: LGTM! Clean component initialization and event handling

The component properly initializes state and handles the continue action with appropriate null checking; the guard ensures onContinue is never called with a null selection.


97-102: Good conditional UX pattern

The conditional message display when no option is selected provides clear user guidance. This is a good UX pattern.

apps/desktop/src/components/welcome-modal/model-selection-view.tsx (1)

73-73: Good clarification in heading text

Adding "(STT)" to the heading clarifies that this is for Speech-to-Text model selection, which helps distinguish it from the new LLM selection step in the onboarding flow.

apps/desktop/src/locales/ko/messages.po (4)

263-265: LGTM! Proper localization entries added

The new localization entries support the custom endpoint configuration UI with appropriate source references.


439-443: LGTM! Back button localization added

Proper localization entries for the Back button functionality in the welcome modal.


521-524: LGTM! LLM selection localization

Appropriate localization entries for the new LLM selection component.


549-556: LGTM! Custom endpoint configuration localization

Good localization entries for the custom endpoint configuration flow.

apps/desktop/src/components/welcome-modal/download-progress-view.tsx (5)

20-24: LGTM! Proper prop interface extension

The interface correctly adds the new llmSelection prop with appropriate typing to support the conditional LLM download flow.


112-117: LGTM! Clean conditional LLM download initiation

The conditional logic properly initiates the LLM download only when "hyprllm" is selected, and marks it as completed immediately for other cases. This prevents unnecessary downloads for the "bring your own model" flow.


131-145: LGTM! Consistent conditional event handling

The event handler setup follows the same conditional pattern as the download initiation, ensuring progress tracking only occurs when needed.


185-197: LGTM! Proper conditional model setup

The LLM completion handler correctly checks both completion status and selection type before setting up the model and starting the server. The effect dependencies are properly updated to include llmSelection.


244-252: LGTM! Conditional UI rendering matches logic

The UI correctly shows the LLM progress card only when llmSelection is "hyprllm", maintaining consistency with the download logic.

apps/desktop/src/locales/en/messages.po (1)

263-266: LGTM!

The localization strings for the new LLM onboarding flow are appropriately added.

apps/desktop/src/components/welcome-modal/custom-endpoint-view.tsx (1)

71-78: Add error handling for configuration.

The configureCustomEndpoint call lacks error handling. If the configuration fails, users won't receive any feedback.

Consider wrapping in try-catch and showing an error message:

-configureCustomEndpoint({
-  provider: "openai",
-  api_base: "",
-  api_key: values.api_key,
-  model: values.model,
-});
-setIsConfigured(true);
+try {
+  await configureCustomEndpoint({
+    provider: "openai",
+    api_base: "",
+    api_key: values.api_key,
+    model: values.model,
+  });
+  setIsConfigured(true);
+} catch (error) {
+  console.error('Failed to configure OpenAI endpoint:', error);
+  // Show error to user
+  setIsConfigured(false);
+}
⛔ Skipped due to learnings
Learnt from: plyght
PR: fastrepl/hyprnote#921
File: plugins/location-connectivity/src/error.rs:1-41
Timestamp: 2025-06-06T16:31:46.457Z
Learning: In the location-connectivity plugin (plugins/location-connectivity/), comprehensive error handling via LocationConnectivityError is necessary despite the "no error handling" guideline, as the plugin performs system-level operations (WiFi detection, persistent storage) that require proper error propagation to the frontend.
apps/desktop/src/components/welcome-modal/index.tsx (2)

320-326: Good conditional logic!

The code correctly checks if HyprLLM was selected before attempting to download the model.


158-161: Improve error handling with user feedback.

The error is only logged to console without informing the user. Consider propagating the error or showing a user-friendly message.

 } catch (error) {
   console.error("Failed to configure custom endpoint:", error);
+  // Show error message to user
+  throw error; // Let the caller handle the error
 }
⛔ Skipped due to learnings
Learnt from: plyght
PR: fastrepl/hyprnote#921
File: plugins/location-connectivity/src/error.rs:1-41
Timestamp: 2025-06-06T16:31:46.457Z
Learning: In the location-connectivity plugin (plugins/location-connectivity/), comprehensive error handling via LocationConnectivityError is necessary despite the "no error handling" guideline, as the plugin performs system-level operations (WiFi detection, persistent storage) that require proper error propagation to the frontend.

<div className="flex items-center gap-2 mb-3">
<svg
fill="currentColor"
fill-rule="evenodd"
@coderabbitai bot commented on this code

⚠️ Potential issue

Fix SVG attribute naming.

Use camelCase for React SVG attributes.

-fill-rule="evenodd"
+fillRule="evenodd"
📝 Committable suggestion


Suggested change
fill-rule="evenodd"
fillRule="evenodd"
🤖 Prompt for AI Agents
In apps/desktop/src/components/welcome-modal/custom-endpoint-view.tsx at line
423, the SVG attribute "fill-rule" is used with a hyphen, which is not valid in
React. Change the attribute name to camelCase "fillRule" to comply with React's
SVG attribute naming conventions.

@cubic-dev-ai bot left a comment

cubic analysis

4 issues found across 7 files • Review in cubic


const options = [
{
id: "hyprllm",
title: "HyprLLM (Local)",

User-visible string is hard-coded without a wrapper, so it will not be included in translation extraction.

Prompt for AI agents
Address the following comment on apps/desktop/src/components/welcome-modal/llm-selection-view.tsx at line 25:

<comment>User-visible string is hard-coded without a <Trans> wrapper, so it will not be included in translation extraction.</comment>

<file context>
@@ -0,0 +1,104 @@
+import { Trans } from "@lingui/react/macro";
+import { Network, ShieldIcon } from "lucide-react";
+
+import { Card, CardContent } from "@hypr/ui/components/ui/card";
+import PushableButton from "@hypr/ui/components/ui/pushable-button";
+import { cn } from "@hypr/ui/lib/utils";
+import { useState } from "react";
+
+interface LLMSelectionViewProps {
</file context>
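
One common fix is to store extractable message descriptors instead of raw strings. A sketch using Lingui's msg macro, assuming the v5-style macro packages implied by the @lingui/react/macro import above:

import { msg } from "@lingui/core/macro";
import { useLingui } from "@lingui/react";

// msg`` yields descriptors the Lingui extractor picks up at build time.
const LLM_OPTIONS = [
  { id: "hyprllm" as const, title: msg`HyprLLM (Local)` },
  { id: "byom" as const, title: msg`Bring Your Own Model` },
];

function OptionTitle({ option }: { option: (typeof LLM_OPTIONS)[number] }) {
  const { i18n } = useLingui();
  return <span>{i18n._(option.title)}</span>;
}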

? "ring-2 ring-blue-500 border-blue-500 bg-blue-50"
: "hover:border-gray-400",
)}
onClick={() => setSelected(option.id)}

Interactive Card is not keyboard-accessible because it lacks role and tabIndex attributes (Based on your team's feedback about ensuring all interactive elements are accessible via keyboard).

Prompt for AI agents
Address the following comment on apps/desktop/src/components/welcome-modal/llm-selection-view.tsx at line 63:

<comment>Interactive Card is not keyboard-accessible because it lacks role and tabIndex attributes (Based on your team's feedback about ensuring all interactive elements are accessible via keyboard).</comment>

<file context>
@@ -0,0 +1,104 @@
+import { Trans } from "@lingui/react/macro";
+import { Network, ShieldIcon } from "lucide-react";
+
+import { Card, CardContent } from "@hypr/ui/components/ui/card";
+import PushableButton from "@hypr/ui/components/ui/pushable-button";
+import { cn } from "@hypr/ui/lib/utils";
+import { useState } from "react";
+
+interface LLMSelectionViewProps {
</file context>
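
A minimal way to address this, assuming the Card component forwards standard div props (the surrounding option-mapping code is abbreviated, so treat this as a sketch rather than a drop-in patch):

<Card
  role="button"
  tabIndex={0}
  aria-pressed={selected === option.id}
  onClick={() => setSelected(option.id)}
  onKeyDown={(e) => {
    // Mirror click activation for Enter and Space.
    if (e.key === "Enter" || e.key === " ") {
      e.preventDefault();
      setSelected(option.id);
    }
  }}
>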

if (llmSelection === "hyprllm") {
localLlmCommands.downloadModel("HyprLLM", llmDownload.channel);
} else {
setLlmDownload(prev => ({ ...prev, completed: true }));

The else-branch marks the LLM download as completed even though no download occurred; if the user later changes the selection to "hyprllm", llmDownload.completed remains true and handleLlmCompletion will start the LLM server before the model is actually present, leading to runtime failures. A more robust approach would reset llmDownload state (progress 0, completed false) when switching from BYOM to HyprLLM instead of permanently setting completed true.

Prompt for AI agents
Address the following comment on apps/desktop/src/components/welcome-modal/download-progress-view.tsx at line 115:

<comment>The else-branch marks the LLM download as completed even though no download occurred; if the user later changes the selection to "hyprllm", llmDownload.completed remains true and handleLlmCompletion will start the LLM server before the model is actually present, leading to runtime failures. A more robust approach would reset llmDownload state (progress 0, completed false) when switching from BYOM to HyprLLM instead of permanently setting completed true.</comment>

<file context>
@@ -107,7 +109,11 @@ export const DownloadProgressView = ({
   useEffect(() => {
     localSttCommands.downloadModel(selectedSttModel, sttDownload.channel);

-    localLlmCommands.downloadModel("HyprLLM", llmDownload.channel);
+    if (llmSelection === "hyprllm") {
+      localLlmCommands.downloadModel("HyprLLM", llmDownload.channel);
+    } else {
+      setLlmDownload(prev => ({ ...prev, completed: true }));
+    }
</file context>
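
A sketch of the suggested reset, inside DownloadProgressView and assuming llmDownload carries progress and completed flags alongside the channel (the state shape is inferred from the diff above, not confirmed):

// Re-derive LLM download state from the current selection instead of
// permanently marking it completed on the BYOM path.
useEffect(() => {
  if (llmSelection === "hyprllm") {
    setLlmDownload(prev => ({ ...prev, progress: 0, completed: false }));
    localLlmCommands.downloadModel("HyprLLM", llmDownload.channel);
  } else {
    setLlmDownload(prev => ({ ...prev, completed: true }));
  }
}, [llmSelection, llmDownload.channel]);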

}));
};
}
}, [selectedSttModel, sttDownload.channel, llmDownload.channel, llmSelection]);

Adding llmSelection to the dependency array forces this effect to run every time the user toggles the LLM option, which re-invokes localSttCommands.downloadModel and begins a new STT download even when the STT model has not changed. This can create duplicate downloads and waste bandwidth. (Based on your team's feedback about preventing unnecessary re-downloads in effects)

Prompt for AI agents
Address the following comment on apps/desktop/src/components/welcome-modal/download-progress-view.tsx at line 145:

<comment>Adding llmSelection to the dependency array forces this effect to run every time the user toggles the LLM option, which re-invokes localSttCommands.downloadModel and begins a new STT download even when the STT model has not changed. This can create duplicate downloads and waste bandwidth. (Based on your team's feedback about preventing unnecessary re-downloads in effects)</comment>

<file context>
@@ -122,19 +128,21 @@ export const DownloadProgressView = ({
       }));
     };
 
-    llmDownload.channel.onmessage = (progress) => {
-      if (progress < 0) {
-        setLlmDownload(prev => ({ ...prev, error: true }));
-        return;
-      }
+    if (llmSelection === "hyprllm") {
</file context>
Suggested change
}, [selectedSttModel, sttDownload.channel, llmDownload.channel, llmSelection]);
}, [selectedSttModel, sttDownload.channel, llmDownload.channel]);
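
Beyond dropping the dependency, splitting the combined effect in two keeps each download keyed to its own trigger. A sketch using the command names from the context above:

// STT download re-runs only when the STT model actually changes.
useEffect(() => {
  localSttCommands.downloadModel(selectedSttModel, sttDownload.channel);
}, [selectedSttModel, sttDownload.channel]);

// LLM download reacts to the LLM selection independently.
useEffect(() => {
  if (llmSelection === "hyprllm") {
    localLlmCommands.downloadModel("HyprLLM", llmDownload.channel);
  }
}, [llmSelection, llmDownload.channel]);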

duckduckhero merged commit 4acdc42 into main on Aug 2, 2025
7 checks passed
ComputelessComputer deleted the new-onboarding branch on December 14, 2025