VimLM brings the power of AI directly into your Vim workflow. Maintain focus with keyboard-driven interactions while leveraging AI for code generation, refactoring, and documentation.
Get started quickly with the tutorial.
- Native Vim Integration - Split-window responses & intuitive keybindings
- Offline First - 100% local execution with MLX-compatible models
- Contextual Awareness - Integrates seamlessly with your codebase and external resources
- Conversational Workflow - Iterate on responses with follow-up queries
- Project Scaffolding - Generate and deploy code blocks to directories
- Extensible - Create custom LLM workflows with command chains
- Apple Silicon (M-series)
- Python 3.12.8
- Vim 9.1
Install and launch:

pip install vimlm
vimlm
| Key Binding | Mode | Action |
|---|---|---|
| `Ctrl-l` | Normal/Visual | Prompt LLM |
| `Ctrl-j` | Normal | Continue conversation |
| `Ctrl-p` | Normal/Visual | Import generated code |
| `Esc` | Prompt | Cancel input |
Press `Ctrl-l` to prompt the LLM with context:

- Normal mode: current file + current line
- Visual mode: current file + selected block

Example prompt: `Create a Chrome extension`
Press `Ctrl-j` to continue the current thread.

Example prompt: `Use manifest V3 instead`
Press `Ctrl-p` to insert the generated code block:

- Normal mode: into the last visual selection
- Visual mode: into the current visual selection

Example workflow:

1. Select a block of code in Visual mode
2. Prompt with `Ctrl-l`: `Use regex to remove html tags from item.content`
3. Press `Ctrl-p` to replace the selection with the generated code
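For illustration only, the block pasted by `Ctrl-p` for that prompt might resemble the sketch below; the actual output depends on the model, and `strip_html` is a hypothetical name:

```python
import re

def strip_html(text: str) -> str:
    """Remove HTML tags from a string with a simple regex."""
    return re.sub(r"<[^>]+>", "", text)

# applied to the selection, e.g.: item.content = strip_html(item.content)
```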
:VimLM [PROMPT] [!command1] [!command2]...
Use the `!` prefix to embed inline directives in prompts:
| Directive | Description |
|---|---|
| `!include PATH` | Add file/directory/shell output to context |
| `!deploy DEST` | Save code blocks to a directory |
| `!continue N` | Continue a stopped response |
| `!followup` | Continue the conversation |
`!include [PATH]`  # Add files/folders to context

- `!include` (no path): current folder
- `!include ~/projects/utils.py`: a specific file
- `!include ~/docs/api-specs/`: an entire folder
- `!include $(...)`: shell command output

Example: `Summarize recent changes !include $(git log --oneline -n 50)`
`!deploy [DEST_DIR]`  # Extract code blocks to directory

- `!deploy` (no path): current directory
- `!deploy ./src`: a specific directory

Example: `Create REST API endpoint !deploy ./api`
`!continue [MAX_TOKENS]`  # Continue a stopped response

- `!continue`: default 2000 tokens
- `!continue 3000`: custom token limit

Example: `tl;dr !include large-file.txt !continue 5000`
:VimLM prompt [!command1] [!command2]...
Simplify complex tasks by chaining multiple commands together into a single, reusable Vim command.
Examples:
" Debug CI failures using error logs
:VimLM Fix Dockerfile !include .gitlab-ci.yml !include $(tail -n 20 ci.log)
" Generate unit tests for selected functions and save to test/
:VimLM Write pytest tests for this !include ./src !deploy ./test
" Add docstrings to all Python functions in file
:VimLM Add Google-style docstrings !include % !continue 4000
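A chain you use often can be wrapped in a user-defined Vim command. A minimal sketch, assuming `:VimLM` receives its arguments exactly as typed on the command line (the `DocGen` name is arbitrary):

" Run a fixed VimLM chain with :DocGen
command! DocGen VimLM Add Google-style docstrings !include % !continue 4000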
To configure VimLM, edit `~/vimlm/cfg.json`:
{
"LLM_MODEL": "mlx-community/DeepSeek-R1-Distill-Qwen-7B-4bit",
"NUM_TOKEN": 32768
}
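Key bindings can also be customized in the same file. Assuming `KEY_MAP` maps each default shortcut key to its replacement, the example below would move the `Ctrl-l`, `Ctrl-j`, and `Ctrl-p` shortcuts to `<Leader>]`, `<Leader>[`, and `<Leader>p`: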
{
"USE_LEADER": true,
"KEY_MAP": {
"l": "]",
"j": "[",
"p": "p"
}
}
Apache 2.0 - See LICENSE for details.