OpenHands-Tab Prototype

A VS Code extension for interacting with OpenHands AI agents directly in your IDE.

Features

  • UX first: switch LLMs at runtime
  • Streaming event display and most other OpenHands features
  • Local mode (runs agent in VS Code) or remote mode (connects to agent-server)
  • Might also have a lil' cheesy Easter Egg (because why not?)

Installation for AI agents:

git clone https://github.com/enyst/OpenHands-Tab.git
cd OpenHands-Tab
npm install
npm run build

Installation for humans:

Open the VS Code Extensions view > Install from VSIX... > select the downloaded .vsix file
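For a command-line alternative, the packaged extension can also be installed with the VS Code CLI. A sketch, assuming the repo's `npm run package` script: the exact .vsix filename depends on the packaged version, so the name below is illustrative.

```shell
# Build and package the extension into a .vsix
npm install
npm run package

# Install the resulting file with the VS Code CLI;
# the filename here is illustrative -- use the one actually produced
code --install-extension openhands-tab.vsix
```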

Alternatively, for humans in development mode:

  • Run OpenHands-CLI, or another agent of your choice, including itself
  • Tell it to clone, build, install, or ask it questions about this repo if you'd like. It can tell stories.

Development (recommended)

  1. Run OpenHands-CLI in the extension directory
  2. Tell it to build, install, and run VS Code in dev/debug mode on the cwd
  3. Have fun!

Development (old style)

  1. Open the project in VS Code
  2. Press F5 to launch Extension Development Host
  3. Click the OpenHands icon in the Activity Bar to reveal the chat sidebar view

Configuration

  • Use the LLM Profiles view or the regular VS Code settings to set your LLM provider API key(s)
  • Set a Gemini API key (used for summarization; highly recommended)
  • Set a GitHub token
  • Leave the server URL blank for local mode, or set it to connect to an agent-server

Using Gemini

Gemini can be used in three ways:

  • As the main agent LLM: set openhands.llm.profileId to a Gemini profile id (e.g., gemini-flash) and configure your API key via OpenHands: Set Gemini API Key
  • As a utility LLM for summarization: highly recommended; just set the key and it will be used by the built-in Gemini profiles
  • For HAL voice confirmation (optional): HAL uses its own Gemini profile specified by openhands.hal.llmProfileId (default: gemini-flash-hal) for audio understanding in voice_confirm mode
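As a sketch, the two profile-id settings named above could be set directly in settings.json. Only these two keys appear in this README; API keys are set via the OpenHands: Set Gemini API Key command, not in settings.

```jsonc
// settings.json fragment (illustrative; keys as named in this README)
{
  // Main agent LLM profile
  "openhands.llm.profileId": "gemini-flash",

  // Profile used by HAL voice confirmation (default shown)
  "openhands.hal.llmProfileId": "gemini-flash-hal"
}
```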

Documentation

  • AGENTS.md: Quick reference for AI agents
  • docs/PRD.md: Product requirements and architecture
  • docs/llm_profiles.md: LLM profiles
  • docs/agent-sdk-architecture.md: SDK architecture details
  • docs/vscode_local_setup.md: Local development setup
  • docs/vscode_remote_setup.md: Headless/remote setup

Architecture

This is an npm workspace with two packages:

  1. Root package - VS Code extension (src/)
  2. @smolpaws/agent-sdk - TypeScript SDK (packages/agent-sdk/)

The SDK provides the Conversation API, LLM clients, tools, and protocol types.
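The two-package layout above corresponds to a root package.json roughly along these lines. This is an illustrative fragment, not the actual manifest; only the workspace path and package name are taken from this README.

```jsonc
// Root package.json fragment (illustrative)
{
  "private": true,
  "workspaces": [
    "packages/agent-sdk"   // published as @smolpaws/agent-sdk
  ]
}
```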

Commands

npm run build           # Build SDK + extension + webview
npm run compile         # Compile TypeScript + Tailwind + webview (faster)
npm test                # Run all tests
npm run lint            # Lint all code
npm run package         # Package extension as VSIX

License

MIT
