84 changes: 84 additions & 0 deletions crates/goose-mcp/src/developer/editor_models/EDITOR_API_EXAMPLE.md
@@ -0,0 +1,84 @@
# Enhanced Code Editing with AI Models

The developer extension now supports AI models for enhanced code editing through the `str_replace` command. When configured, it uses an AI model to apply code changes intelligently instead of performing a simple string replacement.

## Configuration

Set these environment variables to enable AI-powered code editing:

```bash
export GOOSE_EDITOR_API_KEY="your-api-key-here"
export GOOSE_EDITOR_HOST="https://api.openai.com/v1"
export GOOSE_EDITOR_MODEL="gpt-4o"
```

**All three environment variables must be set and non-empty for the feature to activate.**

### Supported Providers

Any OpenAI-compatible API endpoint should work. Examples:

**OpenAI:**
```bash
export GOOSE_EDITOR_API_KEY="sk-..."
export GOOSE_EDITOR_HOST="https://api.openai.com/v1"
export GOOSE_EDITOR_MODEL="gpt-4o"
```

**Anthropic (via OpenAI-compatible proxy):**
```bash
export GOOSE_EDITOR_API_KEY="sk-ant-..."
export GOOSE_EDITOR_HOST="https://api.anthropic.com/v1"
export GOOSE_EDITOR_MODEL="claude-3-5-sonnet-20241022"
```

**Morph:**
```bash
export GOOSE_EDITOR_API_KEY="sk-..."
export GOOSE_EDITOR_HOST="https://api.morphllm.com/v1"
export GOOSE_EDITOR_MODEL="morph-v0"
```

**Relace:**
```bash
export GOOSE_EDITOR_API_KEY="rlc-..."
export GOOSE_EDITOR_HOST="https://instantapply.endpoint.relace.run/v1/apply"
export GOOSE_EDITOR_MODEL="auto"
```

**Local/Custom endpoints:**
```bash
export GOOSE_EDITOR_API_KEY="your-key"
export GOOSE_EDITOR_HOST="http://localhost:8000/v1"
export GOOSE_EDITOR_MODEL="your-model"
```
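
For hosts that point at a chat-completions API, you normally pass just the base URL; the editor appends `/chat/completions` if it is not already present. A condensed sketch of the URL handling used by the MorphLLM editor (see `morphllm_editor.rs` below):

```rust
// Condensed from MorphLLMEditor::edit_code: build the full endpoint
// from the configured GOOSE_EDITOR_HOST value.
fn provider_url(host: &str) -> String {
    if host.ends_with("/chat/completions") {
        host.to_string()                     // already a full endpoint
    } else if host.ends_with('/') {
        format!("{}chat/completions", host)  // host has a trailing slash
    } else {
        format!("{}/chat/completions", host) // typical case: ".../v1"
    }
}
```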

## How it works

When you use the `str_replace` command in the text editor:

1. **Configuration check**: The system first checks if all three environment variables are properly set and non-empty.

2. **With AI enabled**: If configured, the system sends the original code and your requested change to the AI model, which applies the change intelligently while preserving code structure, formatting, and context.

3. **Fallback**: If the editor API is not configured or the API call fails, the system falls back to simple string replacement, as before (see the sketch after this list).

4. **User feedback**: The first time you use `str_replace` without AI configuration, you'll see a helpful message explaining how to enable the feature.
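
A minimal sketch of that try-AI-then-fall-back flow. The actual `str_replace` handler in the developer extension is not part of this diff, so the plain-replacement branch below is illustrative only:

```rust
// Sketch only. `EditorModel` comes from the editor_models module; `editor`
// is the Option returned by create_editor_model().
async fn apply_str_replace(
    editor: Option<&EditorModel>,
    original_code: &str,
    old_str: &str,
    new_str: &str,
) -> String {
    if let Some(editor) = editor {
        // AI-powered path: ask the configured editor model to apply the change.
        if let Ok(updated) = editor.edit_code(original_code, old_str, new_str).await {
            return updated;
        }
        // Fall through to plain replacement if the API call fails.
    }
    // Unconfigured (or failed) path: simple string replacement, shown here as
    // a first-occurrence replacement for illustration.
    original_code.replacen(old_str, new_str, 1)
}
```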

## Benefits

- **Context-aware editing**: The AI understands code structure and can make more intelligent changes
- **Better formatting**: Maintains consistent code style and formatting
- **Error prevention**: Can catch and fix potential issues during the edit
- **Flexible**: Works with any OpenAI-compatible API
- **Clean implementation**: Uses explicit control flow rather than error handling for configuration checks

## Implementation Details

The implementation follows idiomatic Rust patterns:
- Environment variables are checked upfront before attempting API calls
- Errors are not used for normal control flow; missing configuration simply yields `None`
- Clear separation between configured and unconfigured states
- Graceful fallback behavior in all cases
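
The upfront check is a straightforward chain of `std::env::var` lookups; a condensed view of `create_editor_model` from `mod.rs` (shown in full in this PR):

```rust
// Inside create_editor_model() -> Option<EditorModel>: all three variables
// must be present and non-empty, otherwise no AI editor is constructed.
let api_key = std::env::var("GOOSE_EDITOR_API_KEY").ok()?;
let host = std::env::var("GOOSE_EDITOR_HOST").ok()?;
let model = std::env::var("GOOSE_EDITOR_MODEL").ok()?;

if api_key.is_empty() || host.is_empty() || model.is_empty() {
    return None; // not configured: caller falls back to plain string replacement
}
```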

The feature is completely optional and backwards compatible: if not configured, the system works exactly as before, using simple string replacement.
98 changes: 98 additions & 0 deletions crates/goose-mcp/src/developer/editor_models/mod.rs
@@ -0,0 +1,98 @@
mod morphllm_editor;
mod openai_compatible_editor;
mod relace_editor;

use anyhow::Result;

pub use morphllm_editor::MorphLLMEditor;
pub use openai_compatible_editor::OpenAICompatibleEditor;
pub use relace_editor::RelaceEditor;

/// Enum for different editor models that can perform intelligent code editing
#[derive(Debug)]
pub enum EditorModel {
MorphLLM(MorphLLMEditor),
OpenAICompatible(OpenAICompatibleEditor),
Relace(RelaceEditor),
}

impl EditorModel {
/// Call the editor API to perform intelligent code replacement
pub async fn edit_code(
&self,
original_code: &str,
old_str: &str,
update_snippet: &str,
) -> Result<String, String> {
match self {
EditorModel::MorphLLM(editor) => {
editor
.edit_code(original_code, old_str, update_snippet)
.await
}
EditorModel::OpenAICompatible(editor) => {
editor
.edit_code(original_code, old_str, update_snippet)
.await
}
EditorModel::Relace(editor) => {
editor
.edit_code(original_code, old_str, update_snippet)
.await
}
}
}

/// Get the description for the str_replace command when this editor is active
pub fn get_str_replace_description(&self) -> &'static str {
match self {
EditorModel::MorphLLM(editor) => editor.get_str_replace_description(),
EditorModel::OpenAICompatible(editor) => editor.get_str_replace_description(),
EditorModel::Relace(editor) => editor.get_str_replace_description(),
}
}
}

/// Trait for individual editor implementations
pub trait EditorModelImpl {
/// Call the editor API to perform intelligent code replacement
async fn edit_code(
&self,
original_code: &str,
old_str: &str,
update_snippet: &str,
) -> Result<String, String>;

/// Get the description for the str_replace command when this editor is active
fn get_str_replace_description(&self) -> &'static str;
}

/// Factory function to create the appropriate editor model based on environment variables
pub fn create_editor_model() -> Option<EditorModel> {
// Don't use Editor API during tests
if cfg!(test) {
return None;
}

// Check if basic editor API variables are set
let api_key = std::env::var("GOOSE_EDITOR_API_KEY").ok()?;
let host = std::env::var("GOOSE_EDITOR_HOST").ok()?;
let model = std::env::var("GOOSE_EDITOR_MODEL").ok()?;

if api_key.is_empty() || host.is_empty() || model.is_empty() {
return None;
}

// Determine which editor to use based on the host
if host.contains("relace.run") {
Some(EditorModel::Relace(RelaceEditor::new(api_key, host, model)))
} else if host.contains("api.morphllm") {
Some(EditorModel::MorphLLM(MorphLLMEditor::new(
api_key, host, model,
)))
} else {
Some(EditorModel::OpenAICompatible(OpenAICompatibleEditor::new(
api_key, host, model,
)))
}
}
119 changes: 119 additions & 0 deletions crates/goose-mcp/src/developer/editor_models/morphllm_editor.rs
@@ -0,0 +1,119 @@
use super::EditorModelImpl;
use anyhow::Result;
use reqwest::Client;
use serde_json::{json, Value};

/// MorphLLM editor that uses the standard chat completions format
#[derive(Debug)]
pub struct MorphLLMEditor {
api_key: String,
host: String,
model: String,
}

impl MorphLLMEditor {
pub fn new(api_key: String, host: String, model: String) -> Self {
Self {
api_key,
host,
model,
}
}
}

impl EditorModelImpl for MorphLLMEditor {
async fn edit_code(
&self,
original_code: &str,
_old_str: &str,
update_snippet: &str,
) -> Result<String, String> {
eprintln!("Calling MorphLLM Editor API");

// Construct the full URL
let provider_url = if self.host.ends_with("/chat/completions") {
self.host.clone()
} else if self.host.ends_with('/') {
format!("{}chat/completions", self.host)
} else {
format!("{}/chat/completions", self.host)
};

// Create the client
let client = Client::new();

// Format the prompt as specified in the Python example
let user_prompt = format!(
"<code>{}</code>\n<update>{}</update>",
original_code, update_snippet
);

// Prepare the request body for OpenAI-compatible API
let body = json!({
"model": self.model,
"messages": [
{
"role": "user",
"content": user_prompt
}
]
});

// Send the request
let response = match client
.post(&provider_url)
.header("Content-Type", "application/json")
.header("Authorization", format!("Bearer {}", self.api_key))
.json(&body)
.send()
.await
{
Ok(resp) => resp,
Err(e) => return Err(format!("Request error: {}", e)),
};

// Process the response
if !response.status().is_success() {
return Err(format!("API error: HTTP {}", response.status()));
}

// Parse the JSON response
let response_json: Value = match response.json().await {
Ok(json) => json,
Err(e) => return Err(format!("Failed to parse response: {}", e)),
};

// Extract the content from the response
let content = response_json
.get("choices")
.and_then(|choices| choices.get(0))
.and_then(|choice| choice.get("message"))
.and_then(|message| message.get("content"))
.and_then(|content| content.as_str())
.ok_or_else(|| "Invalid response format".to_string())?;

eprintln!("MorphLLM Editor API worked");
Ok(content.to_string())
}

fn get_str_replace_description(&self) -> &'static str {
"Use the edit_file to propose an edit to an existing file.
This will be read by a less intelligent model, which will quickly apply the edit. You should make it clear what the edit is, while also minimizing the unchanged code you write.
When writing the edit, you should specify each edit in sequence, with the special comment // ... existing code ... to represent unchanged code in between edited lines.

For example:
// ... existing code ...
FIRST_EDIT
// ... existing code ...
SECOND_EDIT
// ... existing code ...
THIRD_EDIT
// ... existing code ...

You should bias towards repeating as few lines of the original file as possible to convey the change.
Each edit should contain sufficient context of unchanged lines around the code you're editing to resolve ambiguity.
If you plan on deleting a section, you must provide surrounding context to indicate the deletion.
DO NOT omit spans of pre-existing code without using the // ... existing code ... comment to indicate its absence.
"
}
}