GauravFrontend/querydoc

QueryDoc 📄🤖

The fully private, local-first AI document assistant.

QueryDoc is a high-performance, privacy-focused PDF analysis platform. It combines the power of local LLMs via Ollama with a sophisticated client-side RAG (Retrieval-Augmented Generation) engine to give you instant answers about your documents without your data ever leaving your machine.



🌟 Key Features

🔒 Local-First & Private

  • 100% Client-Side Processing: Your PDFs are processed in your browser.
  • Local AI Integration: Connects to your local Ollama instance for inference. No subscriptions, no data tracking.

📚 Multi-Document Management

  • Library View: Upload and manage multiple documents in a single session.
  • Cross-Doc Q&A: Ask a question once and QueryDoc searches across your entire library to find the most relevant context.
  • State Persistence: Documents and chat history are saved securely in your browser's IndexedDB and LocalStorage, surviving page reloads.
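
Cross-document Q&A of this kind typically ranks every stored chunk against the query embedding and keeps the best matches regardless of which document they came from. The sketch below illustrates that idea; the `Chunk` shape and `topChunks` helper are assumptions for illustration, not QueryDoc's actual code.

```typescript
// Hypothetical sketch of cross-document retrieval: score each chunk in the
// library against the query embedding and return the top-k matches.
// Field names and shapes are illustrative, not the real QueryDoc schema.

interface Chunk {
  docId: string;      // which document this chunk belongs to
  text: string;
  embedding: number[];
}

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Rank the whole library at once, so answers can draw on any document.
function topChunks(query: number[], library: Chunk[], k = 3): Chunk[] {
  return [...library]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}
```

Because ranking happens over the combined library rather than per document, a single question can surface context from several PDFs at once.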

🧠 Advanced RAG Engine

  • Intelligent Chunking: Preserves document structure and coordinates for pixel-perfect source highlighting.
  • Query Expansion: Uses conversation history to understand follow-up questions (e.g., "What did he say about the budget?").
  • Source Pinpointing: When the AI answers, it highlights the exact sentence or phrase in the PDF viewer.
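
Pixel-perfect highlighting depends on chunks that remember where they came from. A minimal sketch of structure-preserving chunking, assuming each chunk keeps its page number and character offsets (the `PositionedChunk` shape and `chunkPage` helper are illustrative, not QueryDoc's real implementation):

```typescript
// Illustrative sketch: split page text into chunks that retain page number
// and character offsets, so an answer's source can be mapped back to an
// exact span in the PDF viewer.

interface PositionedChunk {
  page: number;
  start: number; // character offset within the page text
  end: number;
  text: string;
}

function chunkPage(page: number, pageText: string, maxLen = 200): PositionedChunk[] {
  const chunks: PositionedChunk[] = [];
  let start = 0;
  while (start < pageText.length) {
    let end = Math.min(start + maxLen, pageText.length);
    // Prefer to break at a sentence boundary inside the window.
    const dot = pageText.lastIndexOf(". ", end);
    if (dot > start) end = dot + 1;
    chunks.push({ page, start, end, text: pageText.slice(start, end) });
    start = end;
  }
  return chunks;
}
```

Because the offsets are contiguous and the chunks reassemble into the original text, a matched chunk can be projected straight back onto the page for highlighting.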

⚡ Hybrid AI Strategy

  • Local Default: Optimized for Qwen 2.5 and Gemma 2 via Ollama.
  • Cloud Fallback: Integrated with Groq Cloud for blazing-fast responses when you're away from your local GPU.
  • Auto-Fallback: Seamlessly switches to Cloud models if Ollama is unreachable.
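
The auto-fallback behavior can be sketched as a small wrapper that tries the local backend first and retries against the cloud backend on failure. The `withFallback` helper below is an illustrative pattern, assuming the two function arguments stand in for real Ollama and Groq calls:

```typescript
// Minimal sketch of the auto-fallback pattern: attempt the local model,
// and if it is unreachable, transparently retry against the cloud model.

async function withFallback<T>(
  local: () => Promise<T>,
  cloud: () => Promise<T>,
): Promise<{ source: "local" | "cloud"; value: T }> {
  try {
    return { source: "local", value: await local() };
  } catch {
    // Local backend unreachable: fall back to the cloud model.
    return { source: "cloud", value: await cloud() };
  }
}
```

Tagging the result with its `source` also lets the UI show which backend actually answered.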

🔍 Smart Features

  • Auto-OCR: Scanned PDFs are automatically detected and processed via client-side Tesseract.js.
  • Context-Aware Summaries: Specialized summarization for Resumes, Contracts, and Research Papers.
  • Voice Reader (TTS): Listen to your documents with built-in Text-to-Speech support.
  • Quick Actions: Select any text in the PDF to instantly Explain, Summarize, or Search.
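
Auto-OCR detection commonly reduces to a text-density check: if PDF text extraction yields little or no selectable text for a page, the page is likely a scan and gets routed through Tesseract.js. The `needsOcr` heuristic and its threshold below are illustrative guesses, not QueryDoc's actual logic:

```typescript
// Hedged sketch of the auto-OCR decision: scanned pages typically produce
// an empty or near-empty text layer, so a low character count after
// stripping whitespace suggests the page needs OCR.

function needsOcr(extractedText: string, minChars = 25): boolean {
  const visible = extractedText.replace(/\s+/g, "");
  return visible.length < minChars;
}
```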

🚀 Getting Started

1. Prerequisites

  • Node.js 18+
  • Ollama (for local processing)

2. Setup Ollama

Download Ollama from ollama.com and pull the recommended models:

ollama pull qwen2.5:3b
ollama pull gemma2:2b

3. Install & Run

# Clone the repository
git clone https://github.com/your-username/querydoc.git
cd querydoc

# Install dependencies
npm install

# Start development server
npm run dev

Open http://localhost:3000 to start querying!


☁️ Using Deployed Version (Remote Ollama)

If you're using the Live Demo and want to connect it to your local Ollama:

  1. Expose Ollama: Use a tunnel like ngrok:
    ngrok http 11434
  2. Set Origins: Ensure Ollama accepts the connection:
    • Windows (PowerShell): $env:OLLAMA_ORIGINS="*"; ollama serve
    • Linux/Mac: OLLAMA_ORIGINS="*" ollama serve
  3. Connect: Click "Change" next to the Ollama URL in the app and paste your tunnel URL.

🛠️ Technical Stack

  • Framework: Next.js with TypeScript
  • Local AI: Ollama (Qwen 2.5, Gemma 2)
  • Cloud Fallback: Groq Cloud
  • OCR: Tesseract.js (client-side)
  • Storage: IndexedDB and LocalStorage


📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


Built with ❤️ for privacy and productivity.

About

Query your PDFs, privately. https://querydoc-five.vercel.app/
