
feat(context)!: switch from contexts to function calling #1029


Open · wants to merge 46 commits into base: main

Commits (46)
52dfc5e
feat(context)!: switch from contexts to tools with function calling
deathbeam Mar 17, 2025
46f6f3c
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Apr 7, 2025
793f9fb
refactor!: remove copilot extension agents
deathbeam Apr 7, 2025
d8939e9
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Apr 7, 2025
f75b28d
fix: handle empty arguments in tool calls
deathbeam Apr 7, 2025
176c4e6
feat: add support for tool groups
deathbeam Apr 7, 2025
da42dd6
feat: simplify tool groups
deathbeam Apr 7, 2025
94acafa
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Apr 7, 2025
4097286
cleanup readme
deathbeam Apr 7, 2025
6aad1ad
add back references virtual
deathbeam Apr 7, 2025
462c21e
fix default params
deathbeam Apr 7, 2025
623026c
fix system prompt
deathbeam Apr 8, 2025
f7ec7eb
improve tool usage system prompt slightly
deathbeam Apr 8, 2025
cc1b28a
cleanup edit file instructions a bit
deathbeam Apr 8, 2025
013991d
fix gemini api
deathbeam Apr 8, 2025
d52d146
fix outline cache
deathbeam Apr 8, 2025
31fbef4
group tools properly in completion
deathbeam Apr 8, 2025
8738955
fix diagnostics
deathbeam Apr 8, 2025
96e1abf
remove actions as they are deprecated for a while now
deathbeam Apr 8, 2025
c2414d5
remove more deprecation
deathbeam Apr 8, 2025
23f6085
improve tool errors
deathbeam Apr 8, 2025
d4b04c0
remove default system tool, use vim.uri_from_fname
deathbeam Apr 9, 2025
9f9bbfc
improve model listing
deathbeam Apr 9, 2025
bb9bf0d
add min severity to diagnostics
deathbeam Apr 9, 2025
12bdc25
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Apr 9, 2025
1280dcb
improve resource resolution, add better headers
deathbeam Apr 10, 2025
13a4bf2
rename tools to functions, add resource support
deathbeam Apr 11, 2025
d978555
reorganize a bit
deathbeam Apr 12, 2025
43cf120
simplify function output format
deathbeam Apr 12, 2025
2a66e89
improve info
deathbeam Apr 12, 2025
09bec05
improve resources display
deathbeam Apr 12, 2025
886a3fd
fix issues with selection
deathbeam Apr 12, 2025
e5b9f05
fix filetype conversions
deathbeam Apr 12, 2025
81fe707
start updating docs
deathbeam Apr 12, 2025
813c5b1
update config desc
deathbeam Apr 12, 2025
74a7835
more readme updates
deathbeam Apr 12, 2025
595c75e
remove unused stuff
deathbeam Apr 12, 2025
2f553ef
improve selection and resources display
deathbeam Apr 12, 2025
65c66f8
fix token count and simplify stuff
deathbeam Apr 14, 2025
49336a9
remove unused stuff
deathbeam Apr 14, 2025
c090c76
start working on allowing sharing single tool
deathbeam Apr 15, 2025
1150c2a
rename agent to group, properly allow single tool @
deathbeam Apr 15, 2025
076b203
fix empty resource sharing
deathbeam Apr 15, 2025
793eb21
show groups in @ select
deathbeam Apr 15, 2025
4b5e445
use add_highlight instead of vim.hl.range
deathbeam Apr 16, 2025
38e583c
let people handle their diff coloring by themselves
deathbeam Apr 20, 2025
175 changes: 80 additions & 95 deletions README.md
@@ -18,14 +18,14 @@ https://github.com/user-attachments/assets/8cad5643-63b2-4641-a5c4-68bc313f20e6

CopilotChat.nvim is a Neovim plugin that brings GitHub Copilot Chat capabilities directly into your editor. It provides:

- 🤖 GitHub Copilot Chat integration with official model and agent support (GPT-4o, Claude 3.7 Sonnet, Gemini 2.0 Flash, and more)
- 🤖 GitHub Copilot Chat integration with official model support (GPT-4o, Claude 3.7 Sonnet, Gemini 2.0 Flash, and more)
- 💻 Rich workspace context powered by smart embeddings system
- 🔒 Explicit context sharing - only sends what you specifically request, either as context or selection (by default visual selection)
- 🔌 Modular provider architecture supporting both official and custom LLM backends (Ollama, LM Studio, Mistral.ai and more)
- 🔒 Explicit data sharing - only sends what you specifically request, either as resource or selection (by default visual selection)
- 🔌 Modular provider architecture supporting both official and custom LLM backends (Ollama, Gemini, Mistral.ai and more)
- 📝 Interactive chat UI with completion, diffs and quickfix integration
- 🎯 Powerful prompt system with composable templates and sticky prompts
- 🔄 Extensible context providers for granular workspace understanding (buffers, files, git diffs, URLs, and more)
- ⚡ Efficient token usage with tiktoken token counting and memory management
- 🔄 Extensible function calling system for granular workspace understanding (buffers, files, git diffs, URLs, and more)
- ⚡ Efficient token usage with tiktoken token counting and history management

# Requirements

@@ -62,8 +62,7 @@ Plugin features that use picker:

- `:CopilotChatPrompts` - for selecting prompts
- `:CopilotChatModels` - for selecting models
- `:CopilotChatAgents` - for selecting agents
- `#<context>:<input>` - for selecting context input
- `#<function>:<input>` - for selecting function input

# Installation

@@ -148,7 +147,6 @@ Commands are used to control the chat interface:
| `:CopilotChatLoad <name>?` | Load chat history |
| `:CopilotChatPrompts` | View/select prompt templates |
| `:CopilotChatModels` | View/select available models |
| `:CopilotChatAgents` | View/select available agents |
| `:CopilotChat<PromptName>` | Use specific prompt template |

## Key Mappings
@@ -253,7 +251,7 @@ Define your own system prompts in the configuration (similar to `prompts`):

### Sticky Prompts

Sticky prompts persist across chat sessions. They're useful for maintaining context or agent selection. They work as follows:
Sticky prompts persist across chat sessions. They're useful for maintaining model or resource selection. They work as follows:

1. Prefix text with `> ` using markdown blockquote syntax
2. The prompt will be copied at the start of every new chat prompt
@@ -262,7 +260,7 @@ Sticky prompts persist across chat sessions.
Examples:

```markdown
> #files
> #glob:`*.lua`
> List all files in the workspace

> @models Using Mistral-small
@@ -274,15 +272,12 @@ You can also set default sticky prompts in the configuration:
```lua
{
sticky = {
'@models Using Mistral-small',
'#files',
'#glob:*.lua',
}
}
```

## Models and Agents

### Models
## Models

You can control which AI model to use in three ways:

@@ -295,69 +290,65 @@ For supported models, see:
- [Copilot Chat Models](https://docs.github.com/en/copilot/using-github-copilot/ai-models/changing-the-ai-model-for-copilot-chat#ai-models-for-copilot-chat)
- [GitHub Marketplace Models](https://github.com/marketplace/models) (experimental, limited usage)

### Agents

Agents determine the AI assistant's capabilities. Control agents in three ways:

1. List available agents with `:CopilotChatAgents`
2. Set agent in prompt with `@agent_name`
3. Configure default agent via `agent` config key

The default "noop" agent is `none`. For more information:

- [Extension Agents Documentation](https://docs.github.com/en/copilot/using-github-copilot/using-extensions-to-integrate-external-tools-with-copilot-chat)
- [Available Agents](https://github.com/marketplace?type=apps&copilot_app=true)

## Contexts

Contexts provide additional information to the chat. Add context using `#context_name[:input]` syntax:

| Context | Input Support | Description |
| ----------- | ------------- | ----------------------------------- |
| `buffer` | ✓ (number) | Current or specified buffer content |
| `buffers` | ✓ (type) | All buffers content (listed/all) |
| `file` | ✓ (path) | Content of specified file |
| `files` | ✓ (glob) | Workspace files |
| `filenames` | ✓ (glob) | Workspace file names |
| `git` | ✓ (ref) | Git diff (unstaged/staged/commit) |
| `url` | ✓ (url) | Content from URL |
| `register` | ✓ (name) | Content of vim register |
| `quickfix` | - | Quickfix list file contents |
| `system` | ✓ (command) | Output of shell command |

> [!TIP]
> The AI is aware of these context providers and may request additional context
> if needed by asking you to input a specific context command like `#file:path/to/file.js`.
## Functions

Functions provide additional information and behaviour to the chat.
Functions can be organized into groups by setting the `group` property. Grouped functions are not automatically made available to the LLM; they must be explicitly activated by including `@group_name` in your message, which exposes every function in that group for the current interaction (see the sketch after the examples below).
Add functions using `#function_name[:input]` syntax:

| Function | Input Support | Description |
| ------------- | ------------- | ------------------------------------------------------ |
| `buffer` | ✓ (name) | Retrieves content from a specific buffer |
| `buffers` | ✓ (scope) | Fetches content from multiple buffers (listed/visible) |
| `diagnostics` | ✓ (scope) | Collects code diagnostics (errors, warnings) |
| `file` | ✓ (path) | Reads content from a specified file path |
| `gitdiff` | ✓ (sha) | Retrieves git diff information (unstaged/staged/sha) |
| `gitstatus` | - | Retrieves git status information |
| `glob` | ✓ (pattern) | Lists filenames matching a pattern in workspace |
| `grep` | ✓ (pattern) | Searches for a pattern across files in workspace |
| `quickfix` | - | Includes content of files in quickfix list |
| `register` | ✓ (register) | Provides access to specified Vim register |
| `url` | ✓ (url) | Fetches content from a specified URL |

Examples:

```markdown
> #buffer
> #buffer:2
> #files:\*.lua
> #filenames
> #buffer:init.lua
> #buffers:visible
> #diagnostics:current
> #file:path/to/file.js
> #git:staged
> #glob:`**/*.lua`
> #grep:`function setup`
> #quickfix
> #register:+
> #url:https://example.com
> #system:`ls -la | grep lua`
```
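
As a sketch of the grouping described above (the `run_tests` function and the `dev` group name are hypothetical, and the field names are assumed to match the custom function example below):

```lua
{
  functions = {
    -- Hypothetical grouped function; not part of the default configuration.
    run_tests = {
      group = 'dev', -- grouped: only exposed to the LLM when @dev is used
      description = 'Runs the project test suite and returns its output',
      resolve = function()
        return {
          { type = 'text', data = vim.fn.system('make test') },
        }
      end,
    },
  },
}
```

With a definition like this, adding `@dev` to a prompt would expose every function in the `dev` group for that interaction.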

Define your own contexts in the configuration with input handling and resolution:
Define your own functions in the configuration with input handling and schema:

```lua
{
contexts = {
functions = {
birthday = {
input = function(callback)
vim.ui.select({ 'user', 'napoleon' }, {
prompt = 'Select birthday> ',
}, callback)
end,
description = "Retrieves birthday information for a person",
schema = {
type = 'object',
required = { 'name' },
properties = {
name = {
type = 'string',
enum = { 'Alice', 'Bob', 'Charlie' },
description = "Person's name",
},
},
},
resolve = function(input)
return {
{
content = input .. ' birthday info',
filename = input .. '_birthday',
filetype = 'text',
type = 'text',
data = input.name .. ' birthday info',
}
}
end
@@ -366,9 +357,9 @@ Define your own contexts in the configuration with input handling and resolution
}
```
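
Assuming the raw `#` input maps onto the required `name` property from the schema (an assumption, since the input handling is not shown here), the hypothetical `birthday` function above could then be referenced in a prompt:

```markdown
> #birthday:Alice
> When is her birthday?
```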

### External Contexts
### External Functions

For external contexts, see the [contexts discussion page](https://github.com/CopilotC-Nvim/CopilotChat.nvim/discussions/categories/contexts).
For external function implementations, see the [discussion page](https://github.com/CopilotC-Nvim/CopilotChat.nvim/discussions/categories/functions).

## Selections

@@ -430,9 +421,6 @@ Custom providers can implement these methods:

-- Optional: Get available models
get_models?(headers: table): table<CopilotChat.Provider.model>,

-- Optional: Get available agents
get_agents?(headers: table): table<CopilotChat.Provider.agent>,
}
```

@@ -454,15 +442,14 @@ Below are all available configuration options with their default values:
system_prompt = 'COPILOT_INSTRUCTIONS', -- System prompt to use (can be specified manually in prompt via /).

model = 'gpt-4o-2024-11-20', -- Default model to use, see ':CopilotChatModels' for available models (can be specified manually in prompt via $).
agent = 'copilot', -- Default agent to use, see ':CopilotChatAgents' for available agents (can be specified manually in prompt via @).
context = nil, -- Default context or array of contexts to use (can be specified manually in prompt via #).
sticky = nil, -- Default sticky prompt or array of sticky prompts to use at start of every new chat.
group = nil, -- Default group of tools or array of groups to use (can be specified manually in prompt via @).
sticky = nil, -- Default sticky prompt or array of sticky prompts to use at start of every new chat (can be specified manually in prompt via >).

temperature = 0.1, -- GPT result temperature
headless = false, -- Do not write to chat buffer and use history (useful for using custom processing)
stream = nil, -- Function called when receiving stream updates (returned string is appended to the chat buffer)
callback = nil, -- Function called when full response is received (returned string is stored to history)
remember_as_sticky = true, -- Remember model/agent/context as sticky prompts when asking questions
remember_as_sticky = true, -- Remember model as sticky prompts when asking questions

-- default selection
-- see select.lua for implementation
@@ -520,27 +507,31 @@ Below are all available configuration options with their default values:
},
},

-- default contexts
-- see config/contexts.lua for implementation
contexts = {
-- default tools
-- see config/tools.lua for implementation
tools = {
buffer = {
},
buffers = {
},
file = {
},
files = {
glob = {
},
git = {
grep = {
},
quickfix = {
},
diagnostics = {
},
gitdiff = {
},
gitstatus = {
},
url = {
},
register = {
},
quickfix = {
},
system = {
}
},

-- default prompts
@@ -568,7 +559,7 @@ Below are all available configuration options with their default values:
},
Commit = {
prompt = 'Write commit message for the change with commitizen convention. Keep the title under 50 characters and wrap message at 72 characters. Format as a gitcommit code block.',
context = 'git:staged',
sticky = '#git:staged',
},
},

@@ -618,9 +609,6 @@ Below are all available configuration options with their default values:
full_diff = false, -- Show full diff instead of unified diff when showing diff window
},
show_info = {
normal = 'gi',
},
show_context = {
normal = 'gc',
},
show_help = {
@@ -660,8 +648,8 @@ Types of copilot highlights:
- `CopilotChatStatus` - Status and spinner in chat buffer
- `CopilotChatHelp` - Help messages in chat buffer (help, references)
- `CopilotChatSelection` - Selection highlight in source buffer
- `CopilotChatKeyword` - Keyword highlight in chat buffer (e.g. prompts, contexts)
- `CopilotChatInput` - Input highlight in chat buffer (for contexts)
- `CopilotChatKeyword` - Keyword highlight in chat buffer (e.g. prompts, tools)
- `CopilotChatAnnotation` - Annotation highlight in chat buffer (file headers, tool call headers, tool call body)
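
These are regular Neovim highlight groups, so they can be linked to existing ones; the link targets below are only illustrative choices, not plugin defaults:

```lua
-- Reuse built-in highlight groups for the chat UI (example targets).
vim.api.nvim_set_hl(0, 'CopilotChatSelection', { link = 'Visual' })
vim.api.nvim_set_hl(0, 'CopilotChatAnnotation', { link = 'Comment' })
```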

# API Reference

@@ -674,8 +662,7 @@ local chat = require("CopilotChat")
chat.ask(prompt, config) -- Ask a question with optional config
chat.response() -- Get the last response text
chat.resolve_prompt() -- Resolve prompt references
chat.resolve_context() -- Resolve context embeddings (WARN: async, requires plenary.async.run)
chat.resolve_agent() -- Resolve agent from prompt (WARN: async, requires plenary.async.run)
chat.resolve_tools() -- Resolve tools that are available for automatic use by LLM
chat.resolve_model() -- Resolve model from prompt (WARN: async, requires plenary.async.run)

-- Window Management
@@ -693,10 +680,9 @@ chat.set_source(winnr) -- Set the source window
chat.get_selection() -- Get the current selection
chat.set_selection(bufnr, start_line, end_line, clear) -- Set or clear selection

-- Prompt & Context Management
-- Prompt & Model Management
chat.select_prompt(config) -- Open prompt selector with optional config
chat.select_model() -- Open model selector
chat.select_agent() -- Open agent selector
chat.prompts() -- Get all available prompts

-- Completion
@@ -747,22 +733,21 @@
```lua
-- Open chat, ask a question and handle response
require("CopilotChat").open()
require("CopilotChat").ask("Explain this code", {
require("CopilotChat").ask("#buffer Explain this code", {
callback = function(response)
vim.notify("Got response: " .. response:sub(1, 50) .. "...")
return response
end,
context = "buffer"
})

-- Save and load chat history
require("CopilotChat").save("my_debugging_session")
require("CopilotChat").load("my_debugging_session")

-- Use custom context and model
-- Use custom sticky and model
require("CopilotChat").ask("How can I optimize this?", {
model = "gpt-4o",
context = {"buffer", "git:staged"}
sticky = {"#buffer", "#git:staged"}
})
```

49 changes: 0 additions & 49 deletions lua/CopilotChat/actions.lua

This file was deleted.
