OpenCode Context Analysis Plugin

A powerful OpenCode plugin that provides detailed token usage analysis for your AI sessions. Track and understand how tokens are distributed across system prompts, user messages, assistant responses, tool outputs, and reasoning traces.

Ever wonder where all your AI tokens are going? This plugin gives you a clear, visual breakdown of exactly how tokens are being used in your OpenCode sessions.

🎯 What It Does

  • See Your Token Usage: Get instant insights into how tokens are distributed across your conversations
  • Track Individual Tools: Find out which tools (read, bash, webfetch, etc.) consume the most tokens
  • Visual Charts: Easy-to-read bar charts show percentages and counts at a glance
  • Smart Analysis: Automatically identifies different types of content (system prompts, user messages, tools, etc.)
  • Works Everywhere: Compatible with OpenAI, Claude, Llama, Mistral, DeepSeek, and more
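To make the bar-chart idea concrete, here is a minimal, self-contained sketch (not the plugin's actual code; the categories and numbers are invented for illustration) of rendering a per-category token breakdown as a text chart:

// Conceptual sketch only -- not the plugin's implementation. It shows the
// general idea: bucket token counts by content type and print percentage bars.
type Category = "system" | "user" | "assistant" | "tool" | "reasoning";

function renderBreakdown(counts: Record<Category, number>): string {
  const total = Object.values(counts).reduce((sum, n) => sum + n, 0) || 1;
  return (Object.entries(counts) as [Category, number][])
    .sort((a, b) => b[1] - a[1]) // largest consumers first
    .map(([name, n]) => {
      const pct = (n / total) * 100;
      const bar = "#".repeat(Math.round(pct / 5)); // one "#" per ~5%
      return `${name.padEnd(10)} ${bar.padEnd(20)} ${pct.toFixed(1)}% (${n} tokens)`;
    })
    .join("\n");
}

// Example numbers, invented for illustration.
console.log(
  renderBreakdown({ system: 1200, user: 450, assistant: 2300, tool: 3100, reasoning: 0 })
);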

🚀 Quick Start (3 Steps)

  1. Clone the plugin

    git clone https://github.com/IgorWarzocha/Opencode-Context-Analysis-Plugin.git
  2. Copy .opencode folder to your project

    cp -r Opencode-Context-Analysis-Plugin/.opencode ./
  3. Restart OpenCode and type /context

Quick Installation: Just paste this entire README into OpenCode and ask it to install the plugin for you!

That's it! You'll see a detailed breakdown like this:

No arguments:

(screenshot of the standard breakdown)

/context "extremely detailed":

(screenshots of the maximum-detail breakdown)

πŸ› οΈ Installation Options

For a Single Project

  1. Clone the plugin

    git clone https://github.com/IgorWarzocha/Opencode-Context-Analysis-Plugin.git
  2. Copy .opencode folder to your project

    cp -r Opencode-Context-Analysis-Plugin/.opencode ./
  3. Restart OpenCode - the /context command will be available

Verify it worked: Type / in OpenCode and you should see /context in suggestions.

For All Projects (Global)

Want /context available everywhere? Copy the plugin to your global OpenCode config:

  1. Clone the plugin (if you haven't already)

    git clone https://github.com/IgorWarzocha/Opencode-Context-Analysis-Plugin.git
  2. Copy .opencode folder to global config location

    cp -r Opencode-Context-Analysis-Plugin/.opencode ~/.config/opencode/
  3. Restart OpenCode - /context will work in any project

Note: if ~/.config/opencode/ doesn't exist yet, create it first with mkdir -p ~/.config/opencode.

Installation Summary

Method    Scope            Location                         Use Case
Project   Single project   your-project/.opencode/         Project-specific context analysis
Global    All projects     ~/.config/opencode/.opencode/   Universal access across projects

Troubleshooting

  • Plugin not loading: Check that .opencode/plugin/context-usage.ts exists
  • Command not found: Make sure you copied the .opencode folder to your project root
  • Git clone failed: Check your internet connection and GitHub access

📖 Usage Guide

Basic Commands

/context            # Standard analysis
/context detailed   # More detailed breakdown
/context short      # Quick summary
/context verbose    # Everything included

Advanced Options

Custom verbosity - Use any description you want:

/context "extremely detailed"  # Maximum detail
/context "just the basics"     # Minimal info
/context "focus on tools"      # Whatever you prefer

Specific sessions:

/context sessionID:your-session-id

Limit analysis depth:

/context limitMessages:5    # Only analyze last 5 messages

What You'll Learn

  • Which tools cost the most - See if bash, read, or webfetch are using the most tokens
  • System prompt impact - Understand how much context is being set up
  • Your conversation patterns - See if you're writing long prompts or getting long responses
  • Reasoning costs - For models that support it, see how much reasoning tokens cost

🔧 How It Works

Dependencies: The plugin uses two main libraries for accurate token counting:

  • js-tiktoken - A JavaScript port of OpenAI's tiktoken tokenizer, used for GPT models
  • @huggingface/transformers - Hugging Face tokenizers for Claude, Llama, Mistral, etc.
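As a rough illustration of how these libraries count tokens (a sketch under assumptions, not the plugin's implementation; the Hugging Face model id is only an example):

// Sketch: counting tokens with the two libraries above. Not the plugin's
// actual code; the model id below is just an example.
import { getEncoding } from "js-tiktoken";
import { AutoTokenizer } from "@huggingface/transformers";

const text = "Count these tokens, please.";

// GPT-style models: cl100k_base is the encoding used by GPT-3.5/GPT-4-era models.
const enc = getEncoding("cl100k_base");
console.log("GPT tokens:", enc.encode(text).length);

// Other model families: load a tokenizer from the Hugging Face Hub once
// (it is cached afterwards), then encode locally.
const llamaTokenizer = await AutoTokenizer.from_pretrained("Xenova/llama-tokenizer");
console.log("Llama tokens:", llamaTokenizer.encode(text).length);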

Installation Process: The plugin automatically handles tokenizer dependencies when you first use it:

  1. Downloads tokenizer libraries to a local vendor directory
  2. Sets up everything without affecting your main project
  3. All token counting happens locally on your machine

Privacy: All token counting happens locally on your machine. No data is sent to external services.

Manual Installation (Advanced)

If you prefer to set things up yourself:

  1. Clone the plugin and copy .opencode directory to your OpenCode project
  2. Install tokenizer dependencies manually:
    npm install js-tiktoken@latest @huggingface/transformers@^3.3.3 --prefix .opencode/plugin/vendor
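That command places the packages under .opencode/plugin/vendor/node_modules. The sketch below shows one possible way to resolve them from that directory without touching your project's own node_modules; it is an assumption about the layout, not necessarily how context-usage.ts loads them:

// Assumption-based sketch: resolve the vendored tokenizer libraries from
// .opencode/plugin/vendor/node_modules (created by the --prefix install above).
import { createRequire } from "node:module";
import path from "node:path";

// createRequire only needs a path inside the directory whose node_modules
// should be searched; the referenced file does not have to exist.
const vendorRequire = createRequire(
  path.resolve(".opencode/plugin/vendor/anchor.js")
);

const { getEncoding } = vendorRequire("js-tiktoken");
console.log(getEncoding("cl100k_base").encode("hello world").length);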

πŸ› οΈ Development

Project Structure

.
├── .opencode/
│   ├── command/
│   │   └── context.md           # Command definition
│   └── plugin/
│       └── context-usage.ts     # Main plugin implementation
└── README.md                    # This file

Building and Testing

The plugin is written in TypeScript and runs directly in the OpenCode environment. No build step is required.

To test locally:

  1. Install in a test OpenCode project
  2. Start a session and run /context
  3. Verify token analysis appears correctly

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is open source. See the repository for license details.

Support

For issues, questions, or contributions:

  • Open an issue on GitHub
  • Check OpenCode documentation for plugin development
  • Review the source code for implementation details

Made for OpenCode - Enhance your AI development workflow with detailed context analysis.
