LLM-powered summarizer for Omnivore App articles using Ollama and LangChain.
For now, it works with subscriptions only.
(Optional) Install the `uv` package manager.
Then run `make init`, or copy the pip commands from the Makefile if you don't use uv.
After that:
- Fill in `.env-example` and rename it to `.env` (see the sketch after this list):
  - `OMNIVORE_API_KEY` - your Omnivore API key
  - `OLLAMA_MODEL_ID` - your Ollama model ID
- Create `summarized` and `read later` labels in your Omnivore App (will be automated in the future)
- (Optional) Create saved searches in your Omnivore App for the labels you created
- Download the Llama 3 model for Ollama
  - other models also work, but the token count estimation might be off (for now)
- Run `ollama serve` to start the Ollama server
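A minimal sketch of how the configuration above could be wired together. The use of `python-dotenv` and LangChain's `ChatOllama` wrapper here is an assumption for illustration; the project's actual code may differ:

```python
# Sketch only: load the .env values and build a LangChain client for the local Ollama server.
import os

from dotenv import load_dotenv  # assumes python-dotenv is installed
from langchain_community.chat_models import ChatOllama

load_dotenv()  # reads .env from the working directory

OMNIVORE_API_KEY = os.getenv("OMNIVORE_API_KEY")          # used for Omnivore API calls
OLLAMA_MODEL_ID = os.getenv("OLLAMA_MODEL_ID", "llama3")  # falls back to llama3 if unset

# Talks to the locally running `ollama serve` instance.
llm = ChatOllama(model=OLLAMA_MODEL_ID)
```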
Finally, run `make run` (or `python main.py`).
Work in progress!
- Fetches all subscriptions from the last 24 hours
- Filters out articles that are larger than the model's context window (8192 tokens); see the sketch below
- Summarizes the articles, with follow-up options to `read later` or `archive`
- TBA
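As an illustration of the filtering and summarization steps above, here is a rough sketch assuming a ~4 characters per token estimate and LangChain's `ChatOllama`; the function names and the heuristic are not the project's actual implementation:

```python
# Sketch only: skip articles that exceed the model's context window, then summarize the rest.
from langchain_community.chat_models import ChatOllama

CONTEXT_WINDOW = 8192  # tokens (Llama 3 default context size)


def estimate_tokens(text: str) -> int:
    """Rough estimate: ~4 characters per token for English text."""
    return len(text) // 4


def summarize(article_text: str, model_id: str = "llama3") -> str | None:
    """Return a summary, or None if the article is too large for the context window."""
    if estimate_tokens(article_text) > CONTEXT_WINDOW:
        return None  # filtered out, matching the behaviour described above
    llm = ChatOllama(model=model_id)
    prompt = f"Summarize the following article in a few sentences:\n\n{article_text}"
    return llm.invoke(prompt).content
```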