The fully private, local-first AI document assistant.
QueryDoc is a high-performance, privacy-focused PDF analysis platform. It combines the power of local LLMs via Ollama with a sophisticated client-side RAG (Retrieval-Augmented Generation) engine to give you instant answers about your documents without your data ever leaving your machine.
- 100% Client-Side Processing: Your PDFs are processed in your browser.
- Local AI Integration: Connects to your local Ollama instance for inference. No subscriptions, no data tracking.
- Library View: Upload and manage multiple documents in a single session.
- Cross-Doc Q&A: Ask a question once and QueryDoc searches across your entire library to find the most relevant context.
- State Persistence: Documents and chat history are saved securely in your browser's IndexedDB and LocalStorage, surviving page reloads.
- Intelligent Chunking: Preserves document structure and coordinates for pixel-perfect source highlighting.
- Query Expansion: Uses conversation history to understand follow-up questions (e.g., "What did he say about the budget?").
- Source Pinpointing: When AI answers, it highlights the exact sentence or phrase in the PDF viewer.
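The query-expansion feature above can be sketched as a small heuristic: if a follow-up question is short or contains a pronoun, fold recent conversation turns into the retrieval query. This is an illustrative sketch only — the function name `expandQuery` and the heuristic are assumptions, not QueryDoc's actual implementation.

```typescript
// Hypothetical sketch of history-aware query expansion. A follow-up like
// "What did he say about the budget?" lacks retrieval signal on its own,
// so we prepend recent user turns before embedding/searching.
const FOLLOW_UP_HINTS = /\b(he|she|it|they|this|that|those)\b/i;

function expandQuery(question: string, history: string[]): string {
  const isFollowUp =
    FOLLOW_UP_HINTS.test(question) || question.split(/\s+/).length <= 4;
  if (!isFollowUp || history.length === 0) return question;
  // Fold the two most recent turns into the retrieval query.
  const context = history.slice(-2).join(" ");
  return `${context} ${question}`;
}
```

For example, `expandQuery("What did he say about the budget?", ["Summarize John's interview transcript"])` yields a query that still mentions the transcript, so the retriever can find the right chunks.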
- Local Default: Optimized for Qwen 2.5 and Gemma 2 via Ollama.
- Cloud Fallback: Integrated with Groq Cloud for blazing-fast responses when you're away from your local GPU.
- Auto-Fallback: Seamlessly switches to Cloud models if Ollama is unreachable.
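The auto-fallback behavior boils down to one decision: probe the local Ollama endpoint, and route to the cloud if it doesn't answer. A minimal sketch, assuming an injectable probe (the `pickBackend` name and URLs are illustrative, not QueryDoc's actual API):

```typescript
// Hypothetical sketch of the auto-fallback decision. The probe is injected
// so the routing logic stays testable; a real probe might hit Ollama's
// /api/tags endpoint with a short timeout, e.g.
//   fetch(`${url}/api/tags`, { signal: AbortSignal.timeout(1500) })
type Probe = (url: string) => Promise<boolean>;

async function pickBackend(
  ollamaUrl: string,
  cloudUrl: string,
  probe: Probe
): Promise<string> {
  try {
    return (await probe(ollamaUrl)) ? ollamaUrl : cloudUrl;
  } catch {
    // Any network error (refused, timed out) means Ollama is unreachable.
    return cloudUrl;
  }
}
```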
- Auto-OCR: Scanned PDFs are automatically detected and processed via client-side Tesseract.js.
- Context-Aware Summaries: Specialized summarization for Resumes, Contracts, and Research Papers.
- Voice Reader (TTS): Listen to your documents with built-in Text-to-Speech support.
- Quick Actions: Select any text in the PDF to instantly Explain, Summarize, or Search.
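The "Intelligent Chunking" and "Source Pinpointing" features above rely on one piece of bookkeeping: every chunk must remember where it came from, so an answer can be mapped back to an exact span in the viewer. A simplified sketch, assuming sentence-level splitting and character offsets (QueryDoc's real chunker also tracks PDF page coordinates; `chunkBySentence` is a hypothetical name):

```typescript
// Illustrative sketch: chunk page text by sentence while recording each
// chunk's character offsets, so a retrieved chunk can be highlighted at
// its exact position in the source text.
interface Chunk {
  text: string;
  start: number; // character offset of the chunk in the page text
  end: number;
}

function chunkBySentence(pageText: string, maxChars = 200): Chunk[] {
  const chunks: Chunk[] = [];
  // Split on sentence boundaries, keeping the terminator with the sentence.
  const sentences = pageText.split(/(?<=[.!?])\s+/);
  let cursor = 0;
  let current: Chunk | null = null;
  for (const s of sentences) {
    const start = pageText.indexOf(s, cursor);
    cursor = start + s.length;
    if (current && current.text.length + s.length + 1 <= maxChars) {
      // Merge short sentences into the current chunk, extending its span.
      current.text += " " + s;
      current.end = cursor;
    } else {
      current = { text: s, start, end: cursor };
      chunks.push(current);
    }
  }
  return chunks;
}
```

Because each chunk carries `start`/`end`, `pageText.slice(chunk.start, chunk.end)` recovers the original span — exactly what a viewer needs to draw a highlight.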
- Node.js 18+
- Ollama (for local processing)
Download Ollama from ollama.com and pull the recommended models:
```bash
ollama pull qwen2.5:3b
ollama pull gemma2:2b
```

```bash
# Clone the repository
git clone https://github.com/your-username/querydoc.git
cd querydoc

# Install dependencies
npm install

# Start development server
npm run dev
```

Open http://localhost:3000 to start querying!
If you're using the Live Demo and want to connect it to your local Ollama:
- Expose Ollama: Use a tunnel like ngrok:
  ```bash
  ngrok http 11434
  ```
- Set Origins: Ensure Ollama accepts the connection:
  - Windows:
    ```powershell
    $env:OLLAMA_ORIGINS="*"; ollama serve
    ```
  - Linux/Mac:
    ```bash
    OLLAMA_ORIGINS="*" ollama serve
    ```
- Connect: Click "Change" next to the Ollama URL in the app and paste your tunnel URL.
- Framework: Next.js 15 (App Router)
- UI: Tailwind CSS
- PDF Engine: PDF.js
- OCR: Tesseract.js
- Database: IndexedDB (via idb)
- AI Connectivity: Groq Cloud SDK & Ollama API
This project is licensed under the MIT License - see the LICENSE file for details.
Built with ❤️ for privacy and productivity.