Ever wonder where all your AI tokens are going? This plugin gives you a clear, visual breakdown of exactly how tokens are being used in your OpenCode sessions.
- See Your Token Usage: Get instant insights into how tokens are distributed across your conversations
- Track Individual Tools: Find out which tools (`read`, `bash`, `webfetch`, etc.) consume the most tokens
- Visual Charts: Easy-to-read bar charts show percentages and counts at a glance
- Smart Analysis: Automatically identifies different types of content (system prompts, user messages, tools, etc.)
- Works Everywhere: Compatible with OpenAI, Claude, Llama, Mistral, DeepSeek, and more
1. Clone the plugin:

   ```bash
   git clone https://github.com/IgorWarzocha/Opencode-Context-Analysis-Plugin.git
   ```

2. Copy the `.opencode` folder to your project:

   ```bash
   cp -r Opencode-Context-Analysis-Plugin/.opencode ./
   ```

3. Restart OpenCode and type `/context`
Quick Installation: Just paste this entire README into OpenCode and ask it to install the plugin for you!
That's it! You'll see a detailed breakdown of where your tokens are going.
1. Clone the plugin:

   ```bash
   git clone https://github.com/IgorWarzocha/Opencode-Context-Analysis-Plugin.git
   ```

2. Copy the `.opencode` folder to your project:

   ```bash
   cp -r Opencode-Context-Analysis-Plugin/.opencode ./
   ```

3. Restart OpenCode: the `/context` command will be available
Verify it worked: Type `/` in OpenCode and you should see `/context` in the suggestions.
Want `/context` available everywhere? Copy the plugin to your global OpenCode config:
1. Clone the plugin (if you haven't already):

   ```bash
   git clone https://github.com/IgorWarzocha/Opencode-Context-Analysis-Plugin.git
   ```

2. Copy the `.opencode` folder to the global config location:

   ```bash
   cp -r Opencode-Context-Analysis-Plugin/.opencode ~/.config/opencode/
   ```

3. Restart OpenCode: `/context` will work in any project
Note: Creates `~/.config/opencode/` if it doesn't exist.
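If `cp` on your platform doesn't create missing destination directories, you can create the path explicitly first. A minimal sketch, assuming the default OpenCode global config location:

```shell
# Ensure the global config directory exists (mkdir -p is a no-op if it
# already does), then copy the plugin into it.
mkdir -p ~/.config/opencode
cp -r Opencode-Context-Analysis-Plugin/.opencode ~/.config/opencode/
```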
| Method  | Scope          | Location                        | Use Case                          |
|---------|----------------|---------------------------------|-----------------------------------|
| Project | Single project | `your-project/.opencode/`       | Project-specific context analysis |
| Global  | All projects   | `~/.config/opencode/.opencode/` | Universal access across projects  |
- Plugin not loading: Check that `.opencode/plugin/context-usage.ts` exists
- Command not found: Make sure you copied the `.opencode` folder to your project root
- Git clone failed: Check your internet connection and GitHub access
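A quick shell check covers the first two items at once. This sketch assumes a project-level install and is run from your project root:

```shell
# Report whether the two files the plugin needs are in place.
for f in .opencode/plugin/context-usage.ts .opencode/command/context.md; do
  if [ -f "$f" ]; then
    echo "found: $f"
  else
    echo "missing: $f"
  fi
done
```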
```
/context            # Standard analysis
/context detailed   # More detailed breakdown
/context short      # Quick summary
/context verbose    # Everything included
```

Custom verbosity - use any description you want:

```
/context "extremely detailed"   # Maximum detail
/context "just the basics"      # Minimal info
/context "focus on tools"       # Whatever you prefer
```

Specific sessions:

```
/context sessionID:your-session-id
```

Limit analysis depth:

```
/context limitMessages:5   # Only analyze the last 5 messages
```

- Which tools cost the most - see if `bash`, `read`, or `webfetch` are using the most tokens
- System prompt impact - understand how much context is being set up
- Your conversation patterns - see if you're writing long prompts or getting long responses
- Reasoning costs - for models that support it, see how many tokens reasoning consumes
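To illustrate the kind of breakdown described above, here is a minimal TypeScript sketch that aggregates per-tool token counts and renders them as percentage bars. The numbers and the `toolTokens` shape are hypothetical, not the plugin's actual internals:

```typescript
// Hypothetical per-tool token counts; the real plugin derives these
// from the messages in your session.
const toolTokens: Record<string, number> = {
  read: 12400,
  bash: 8300,
  webfetch: 4100,
};

// Render a simple bar chart: one block per 5% of the total.
function renderBars(counts: Record<string, number>): string[] {
  const total = Object.values(counts).reduce((a, b) => a + b, 0);
  return Object.entries(counts)
    .sort(([, a], [, b]) => b - a) // largest consumer first
    .map(([name, n]) => {
      const pct = (n / total) * 100;
      const bar = "█".repeat(Math.round(pct / 5));
      return `${name.padEnd(10)} ${bar} ${pct.toFixed(1)}% (${n})`;
    });
}

console.log(renderBars(toolTokens).join("\n"));
```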
Dependencies: The plugin uses two main libraries for accurate token counting:

- `js-tiktoken` - Official OpenAI tokenizer for GPT models
- `@huggingface/transformers` - Hugging Face tokenizers for Claude, Llama, Mistral, etc.
Installation Process: The plugin automatically handles tokenizer dependencies when you first use it:

- Downloads tokenizer libraries to a local `vendor` directory
- Sets up everything without affecting your main project
- All token counting happens locally on your machine
Privacy: All token counting happens locally on your machine. No data is sent to external services.
If you prefer to set things up yourself:
1. Clone the plugin and copy the `.opencode` directory to your OpenCode project
2. Install the tokenizer dependencies manually:

   ```bash
   npm install js-tiktoken@latest @huggingface/transformers@^3.3.3 --prefix .opencode/plugin/vendor
   ```
```
.
├── .opencode/
│   ├── command/
│   │   └── context.md           # Command definition
│   └── plugin/
│       └── context-usage.ts     # Main plugin implementation
└── README.md                    # This file
```
The plugin is written in TypeScript and runs directly in the OpenCode environment. No build step is required.
To test locally:

1. Install in a test OpenCode project
2. Start a session and run `/context`
3. Verify the token analysis appears correctly
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
This project is open source. See the repository for license details.
For issues, questions, or contributions:
- Open an issue on GitHub
- Check OpenCode documentation for plugin development
- Review the source code for implementation details
Made for OpenCode - Enhance your AI development workflow with detailed context analysis.


