
Conversation

Collaborator

@bdougie bdougie commented Aug 18, 2025

Description

This PR adds comprehensive documentation for Ollama integration issues and local assistant configuration to the FAQs documentation.

Changes

Ollama Troubleshooting

  • Added solutions for local Ollama connection issues
  • Documented remote Ollama configuration steps including firewall and service settings
  • Included WSL-specific networking fixes for both Windows 11 22H2+ and older versions
  • Added Docker container connectivity solutions for Windows, Mac, and Linux
  • Covered parse error troubleshooting for remote Ollama instances
  • Added link to comprehensive Ollama guide for better discoverability
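The remote-setup steps summarized above can be sketched as follows. This is a minimal illustration assuming a systemd-based Linux host with the service named `ollama.service`; unit names, firewall tooling, and paths vary by distribution. Note that binding to `0.0.0.0` exposes Ollama to the whole network, so restrict access with your firewall where possible.

```bash
# Make Ollama listen on all interfaces instead of only localhost
# (systemd drop-in; run `sudo systemctl edit ollama.service` and add:)
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:11434"

sudo systemctl daemon-reload
sudo systemctl restart ollama

# Open the default Ollama port (ufw shown; use firewalld etc. as appropriate)
sudo ufw allow 11434/tcp

# Verify reachability from the client machine
curl http://<server-ip>:11434/api/tags
```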

Local Assistant Configuration

  • Secrets Management: Added comprehensive documentation for managing local secrets and environment variables

    • Multiple methods for configuring .env files (workspace-level, global, process)
    • Examples showing how to use secrets in config.yaml
    • Distinction between local secrets (${{ secrets.KEY }}) and Hub-managed secrets (${{ inputs.KEY }})
    • Troubleshooting tips for common secret configuration issues
    • Added link to offline usage guide for users without internet access
  • Model Addons: Added documentation for using hub model addons in local assistant configs

    • How to reference hub model addons using the 'uses:' syntax
    • Examples of combining hub addons with local model configurations
    • Override syntax for customizing addon settings locally
    • Requirements section noting login and internet connection needed
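The secrets and model-addon patterns above can be sketched in a local `config.yaml` like this. The `uses:` and `${{ secrets.KEY }}` syntax is what the PR documents; the specific addon slug, model names, secret key, and the shape of the `override:` block are illustrative assumptions, not verbatim from the docs.

```yaml
name: my-local-assistant
version: 0.0.1

models:
  # Hub model addon referenced with the documented 'uses:' syntax
  # (slug is a placeholder)
  - uses: ollama/llama3.1-8b
    # Local override of addon settings (block shape is illustrative)
    override:
      roles:
        - chat

  # Local model definition pulling a local secret, e.g. from a .env file
  # containing OPENAI_API_KEY=sk-...
  - name: My OpenAI Model
    provider: openai
    model: gpt-4o
    apiKey: ${{ secrets.OPENAI_API_KEY }}
```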

Problem Solved

Users frequently encounter various issues when:

  1. Setting up Continue with Ollama in different environments (local, remote, WSL, Docker)
  2. Managing API keys and secrets for local assistant configurations
  3. Understanding how to leverage hub model addons in their local setups

This documentation provides clear, actionable solutions for these common scenarios.

Testing

  • Verified all code examples are syntactically correct
  • Tested configuration examples with actual Ollama instances
  • Confirmed networking solutions work in respective environments
  • Validated .env file loading in different locations
  • Tested model addon references with hub integration

Related Issues

Addresses common user questions about Ollama integration problems and local assistant configuration reported in Discord and GitHub discussions.

Description generated by cn


Summary by cubic

Added a comprehensive Ollama troubleshooting guide and moved FAQs into a dedicated, searchable page. This helps users quickly fix local, remote, WSL, Docker, and parsing issues, with updated links routing to the right place.

  • New Features

    • Step-by-step fixes for local connection failures, remote setup (OLLAMA_HOST, systemd, firewall), WSL networking, Docker-to-host access, and parse errors.
    • Clear config examples for apiBase, capabilities, and environment settings.
  • Refactors

    • Moved FAQs from troubleshooting into /faqs and added to the Help nav.
    • Added redirects from troubleshooting anchors to matching FAQ sections.
    • Slimmed troubleshooting.mdx to avoid duplication and link to FAQs.
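The redirect mappings described above might look something like this in `docs.json` (paths and anchors are illustrative placeholders, not the actual entries from this PR):

```json
{
  "redirects": [
    {
      "source": "/troubleshooting#networking-issues",
      "destination": "/faqs#networking-issues"
    }
  ]
}
```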

- Add solutions for local connection issues
- Document remote Ollama configuration steps
- Include WSL-specific networking fixes
- Add Docker container connectivity solutions
- Cover parse error troubleshooting

Addresses common Ollama integration problems users face when setting up Continue with local, remote, WSL, and containerized Ollama instances.
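A config sketch for pointing Continue at a non-local Ollama instance (the address is a placeholder; `host.docker.internal` applies when Ollama runs on the host and the client runs in a container on Windows or Mac):

```yaml
models:
  - name: Llama 3.1 8B (remote)
    provider: ollama
    model: llama3.1:8b
    # Remote or containerized Ollama: point apiBase at a reachable address
    apiBase: http://192.168.1.100:11434
    # From inside a Docker container on Windows/Mac, use instead:
    # apiBase: http://host.docker.internal:11434
```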
@github-actions

AI Code Review


No specific line comments generated.


💡 To request a new detailed review, comment @continue-detailed-review

@github-actions

Code Review Summary

✅ Strengths

  • Documentation Organization: Excellent decision to separate FAQs from the main troubleshooting guide, improving navigation and content organization
  • Comprehensive Content: The new FAQs page includes a detailed Ollama troubleshooting section that wasn't in the original, providing valuable help for a common issue
  • Proper Redirects: All redirect mappings in docs.json are properly configured to maintain SEO and prevent broken links
  • Consistent Formatting: The new FAQs page maintains consistent formatting with MDX components like <Tabs>, <Info>, and proper code blocks
  • Cross-Platform Coverage: FAQs address issues across VS Code, JetBrains, Windows, Linux, Mac, WSL, and Docker environments

⚠️ Issues Found

Medium

  • Grammar Error: In the FAQs page, line 252: "Continue stores it's data" should be "Continue stores its data" (possessive pronoun, not contraction)

Low

  • Inconsistent Support Links: The "Still having trouble?" sections differ slightly: the FAQs page mentions "GitHub Discussions" while troubleshooting.mdx doesn't

💡 Suggestions

  • Navigation Enhancement: Consider adding a breadcrumb or back link on the FAQs page to easily return to the main troubleshooting guide
  • FAQ Categories: The FAQs could benefit from category headers (e.g., "Networking", "Configuration", "Model Issues", "IDE-Specific") to improve scannability
  • Search Optimization: Consider adding keywords/tags to the FAQs frontmatter for better searchability within the documentation

🚀 Overall Assessment

APPROVE

This is a well-executed documentation refactoring that improves user experience by separating frequently asked questions from the main troubleshooting flow. The addition of comprehensive Ollama troubleshooting is particularly valuable. The only required fix is the minor grammar error, which can be corrected in a follow-up commit.


💡 To request a new review, comment @continue-general-review

@bdougie bdougie changed the base branch from main to bdougie/faqs August 18, 2025 11:29
- Added comprehensive documentation for managing local secrets and environment variables
- Included multiple methods for configuring .env files (workspace, global, process)
- Added examples showing how to use secrets in config.yaml
- Explained difference between local secrets and Hub-managed secrets
- Added troubleshooting tips for common secret configuration issues
- Added link to Ollama guide for better discoverability
- Added link to offline usage guide for users without internet access
- Document how to use hub model addons in local assistant configs
- Show examples of using 'uses:' syntax with provider/model-name format
- Explain how to combine hub addons with local model configurations
- Include override example for customizing addon settings locally
- Add requirements section noting login and internet connection needed
- Fixed broken link to plan mode selector image (use plan-mode-selector.png)
- Fixed broken link to assistant extension selector image
- Removed broken link to non-existent /features/plan/how-to-customize page

bdougie commented Aug 18, 2025

Hi! I've reviewed the feedback from the automated code review:

✅ Grammar Issue Addressed

The review mentioned a grammar error about "Continue stores it's data" on line 252, but upon checking the actual file at line 379, the text already correctly uses "its" (possessive pronoun) rather than "it's" (contraction):

Continue stores its data in the `~/.continue` directory

✅ Broken Links Fixed

I've also fixed the 3 broken links that were identified in the CI checks:

  • Fixed plan mode selector image path
  • Fixed assistant extension selector image path
  • Removed broken link to non-existent /features/plan/how-to-customize page

📝 Future Enhancement Suggestions Noted

The other suggestions from the review are valuable enhancements for future iterations:

  • Navigation enhancement with breadcrumbs
  • FAQ category headers for better organization
  • Search optimization with keywords/tags in frontmatter

The PR is now ready with all required issues addressed!

@bdougie bdougie marked this pull request as ready for review August 18, 2025 11:58
@bdougie bdougie requested a review from a team as a code owner August 18, 2025 11:58
@bdougie bdougie requested review from tingwai and removed request for a team August 18, 2025 11:58
@dosubot dosubot bot added the size:L This PR changes 100-499 lines, ignoring generated files. label Aug 18, 2025
@github-actions

AI Code Review


No specific line comments generated.

@github-actions

Code Review Summary

✅ Strengths

  • Comprehensive Documentation: The PR adds extensive documentation about Ollama troubleshooting, covering common issues developers face
  • Well-Structured Content: Information is logically organized with clear sections for different types of issues
  • Practical Examples: Includes real-world examples and configuration snippets that users can directly use
  • Security Best Practices: Emphasizes important security practices like not committing .env files
  • Cross-Platform Coverage: Addresses issues across Windows, Mac, Linux, WSL, and Docker environments

⚠️ Issues Found

High

  • Missing Ollama Guide Reference: The FAQs reference /guides/ollama-guide but the actual guide file is named ollama-guide.mdx. This could break internal links.

Medium

  • Image Path Consistency: There's an inconsistency in image paths - some use direct paths while others use subdirectories (e.g., /images/hub/assistants/images/)
  • Broken Image Reference: The old image reference was removed but the new path structure suggests the image should be at /images/hub/assistants/images/assistant-extension-select.png - verify this file exists

Low

  • Documentation Cross-References: The Plan mode documentation removes the reference to "How to Customize" page, which might leave users without guidance on customization options
  • Environment Variable Examples: The .env examples could benefit from showing the format for different providers (some use different prefixes or formats)

💡 Suggestions

  • Add Version Compatibility: Consider adding a note about which versions of Ollama are compatible with the documented solutions
  • Security Warning for Remote Connections: When configuring Ollama to listen on all interfaces (0.0.0.0), add a security warning about exposing the service to the network
  • Troubleshooting Flow: Consider adding a troubleshooting flowchart or decision tree to help users quickly identify which section applies to their issue
  • Performance Benchmarks: Add guidance on expected performance for different model sizes and hardware configurations

🚀 Overall Assessment

APPROVE

This is a valuable documentation update that addresses real user pain points. The comprehensive Ollama troubleshooting section and local assistant configuration guide will significantly improve the developer experience. The minor issues identified can be addressed in follow-up PRs without blocking this valuable content from being available to users.

@github-project-automation github-project-automation bot moved this from Todo to In Progress in Issues and PRs Aug 18, 2025
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Aug 18, 2025
@bdougie bdougie merged commit 187b4d4 into bdougie/faqs Aug 18, 2025
8 checks passed
@bdougie bdougie deleted the bdougie/con-2818 branch August 18, 2025 20:45
@github-project-automation github-project-automation bot moved this from In Progress to Done in Issues and PRs Aug 18, 2025
@github-actions github-actions bot locked and limited conversation to collaborators Aug 18, 2025
Contributor

sestinj commented Aug 19, 2025

🎉 This PR is included in version 1.6.0 🎉

The release is available on:

Your semantic-release bot 📦🚀


sestinj commented Aug 22, 2025

🎉 This PR is included in version 1.9.0 🎉

The release is available on:

Your semantic-release bot 📦🚀
