
AssertionError on filetype=anki + Local LLM Integration Issues #16

Closed
@SiHaag

Description


Hello dear developer, I am unfortunately running into problems when querying my Anki database with wdoc through a local LLM.

System Information
• OS: macOS (M1 MacBook Air, 8GB RAM)
• Python Version: 3.11
• Wdoc Version: 2.5.7
• LLM Backend: Ollama (serving deepseek-coder:1.3b through an OpenAI-compatible API at http://localhost:4000)
• Anki Profile Name: Neumed
• Command Used:
wdoc --task=query --filetype=anki --anki_profile "Neumed" --llms_api_bases='{"model": "http://localhost:4000"}'

Issue Description

I am trying to query my Anki collection (Neumed is the name of one of my Anki profiles) using wdoc, with Ollama's DeepSeek model as the LLM backend. However, I encounter an AssertionError:

  • AssertionError: Could not infer filetype of --filetype=anki. Use the 'filetype' argument.

Note that the error quotes the literal string --filetype=anki, as if the flag itself were being treated as a path whose filetype needs to be inferred.

Additionally, when running without --llms_api_bases, I get:

  • Exception: No environment variable named OPENAI_API_KEY found

indicating that wdoc falls back to OpenAI's API even when a custom backend (Ollama) is the intended target.
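A guess on my side, untested: if wdoc only checks that the variable exists before handing the request to the local endpoint, a placeholder key might be enough to get past the check. The key value below is made up:

  # Untested workaround sketch: export a placeholder key so the presence
  # check passes; the local endpoint at localhost:4000 should never see it.
  export OPENAI_API_KEY="sk-placeholder"
  wdoc --task=query --filetype=anki --anki_profile "Neumed" \
    --llms_api_bases='{"model": "http://localhost:4000"}'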

Steps to Reproduce
1. Installed and configured:
   • Ollama via Homebrew (brew install ollama)
   • wdoc (v2.5.7)
   • DeepSeek Coder (1.3B) as my local LLM backend
2. Verified that the model is listed by the API (a further chat-completion check is sketched after this list):

   curl -X GET http://localhost:4000/v1/models

   Response:

   {"data":[{"id":"ollama/deepseek-coder:1.3b","object":"model","created":1677610602,"owned_by":"openai"}],"object":"list"}

3. Tried querying Anki with:

   wdoc --task=query --filetype=anki --anki_profile "Neumed" --llms_api_bases='{"model": "http://localhost:4000"}'

4. Result: AssertionError for filetype=anki.
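For completeness, here is how one could confirm that the model actually answers chat completions, not just the model listing. This is a sketch, assuming the endpoint is OpenAI-compatible, which the /v1/models response above suggests:

  # Sketch: request a one-off completion from the local endpoint.
  # Model id taken from the /v1/models response above.
  curl -X POST http://localhost:4000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
      "model": "ollama/deepseek-coder:1.3b",
      "messages": [{"role": "user", "content": "Say hello"}]
    }'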

Expected Behavior
• The command should correctly load and search the Anki collection for relevant cards.
• The custom LLM (DeepSeek via Ollama) should be used without requiring OPENAI_API_KEY.
• Passing --filetype=anki explicitly should be accepted instead of triggering filetype inference.

Observed Issues
1. filetype=anki is not recognized despite being a documented option.
2. Custom LLM backend (Ollama) is not seamlessly integrated; wdoc still expects an OpenAI API key.
3. The documentation lacks a clear example of querying Anki decks, in particular with a local LLM instead of OpenAI.

Suggested Fixes
• Explicitly document how to query Anki with a local LLM backend (see the sketch after this list for the kind of example I have in mind).
• Improve filetype inference for anki or allow manual specification.
• Ensure that llms_api_bases fully overrides OpenAI as the default LLM backend.
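To make the first suggestion concrete, here is roughly the invocation I expected the docs to show. The --model flag name and the model-string format are guesses on my part, not verified against wdoc --help, so please treat this as a sketch of the desired documentation rather than a working command:

  # Hypothetical documented example; --model and the model string are guesses
  wdoc --task=query \
    --filetype=anki \
    --anki_profile="Neumed" \
    --model="openai/ollama/deepseek-coder:1.3b" \
    --llms_api_bases='{"model": "http://localhost:4000"}'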

Additional Debugging Information
• When running:

  wdoc --task=list --filetype=anki

  Error: AssertionError: invalid task value

• I also tested with:

  wdoc --task=query --filetype=anki --anki_profile "Neumed"

  Error: Exception: No environment variable named OPENAI_API_KEY found
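Since the original assertion quotes the literal string --filetype=anki, I wonder whether wdoc expects a --path argument and, in its absence, runs filetype inference on the flag string itself. An untested guess along those lines:

  # Untested guess: supply an explicit --path so filetype inference never
  # runs on the raw flag string. The path value here is a placeholder.
  wdoc --task=query --path="Neumed" --filetype=anki --anki_profile "Neumed" \
    --llms_api_bases='{"model": "http://localhost:4000"}'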

What I Need Help With
1. Is filetype=anki fully implemented in wdoc v2.5.7, or does it require a workaround?
2. How do I properly specify an Ollama backend for wdoc queries?
3. Could you provide an example of a working Anki query using a local LLM?

Thank you once again @thiswillbeyourgithub! ❤ I hope you find the time to help with this issue.

Sihaag
