Merged
2 changes: 2 additions & 0 deletions crates/goose-server/src/routes/mod.rs
@@ -9,6 +9,7 @@ pub mod prompts;
pub mod recipe;
pub mod recipe_utils;
pub mod reply;
pub mod sampling;
pub mod schedule;
pub mod session;
pub mod setup;
@@ -39,4 +40,5 @@ pub fn configure(state: Arc<crate::state::AppState>, secret_key: String) -> Rout
.merge(tunnel::routes(state.clone()))
.merge(mcp_ui_proxy::routes(secret_key.clone()))
.merge(mcp_app_proxy::routes(secret_key))
.merge(sampling::routes(state))
}
87 changes: 87 additions & 0 deletions crates/goose-server/src/routes/sampling.rs
@@ -0,0 +1,87 @@
use axum::{
extract::{Path, State},
http::StatusCode,
routing::post,
Json, Router,
};
use goose::conversation::message::Message;
use rmcp::model::{
CreateMessageRequestParams, CreateMessageResult, Role, SamplingContent, SamplingMessage,
SamplingMessageContent,
};
use std::sync::Arc;

use crate::state::AppState;

pub fn routes(state: Arc<AppState>) -> Router {
Router::new()
.route(
"/sessions/{session_id}/sampling/message",
post(create_message),
)
Comment on lines +17 to +21

Copilot AI Feb 6, 2026

This endpoint is intended to support image sampling, but there’s no explicit body-size limit layer here (unlike /reply and dictation), so base64 image requests may be rejected by the default request body limit; consider adding a DefaultBodyLimit::max(...) appropriate for image payloads.
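The body-limit concern above can be sized concretely. Below is a minimal sketch, not code from this PR: `body_limit_for_images` and the 64 KiB slack value are illustrative assumptions. The arithmetic it encodes is standard, though: base64 turns every 3 raw bytes into 4 characters, so a JSON body carrying an image needs roughly 4/3 of the raw image size plus room for the rest of the request envelope.

```rust
// Hypothetical helper (not in this PR): size a request-body limit for
// base64-encoded image payloads.
fn body_limit_for_images(max_raw_image_bytes: usize) -> usize {
    // Base64 encodes each 3-byte chunk as 4 characters, rounding up
    // for the final partial chunk.
    let base64_len = max_raw_image_bytes.div_ceil(3) * 4;
    // Arbitrary 64 KiB of slack for JSON framing and the other request fields.
    base64_len + 64 * 1024
}

fn main() {
    // A 10 MiB raw image needs a limit a bit over 13 MiB once encoded.
    println!("{}", body_limit_for_images(10 * 1024 * 1024));
}
```

The resulting value could then feed an axum `DefaultBodyLimit::max(...)` layer on this router, mirroring how the comment describes /reply and dictation handling larger bodies.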
Comment on lines +17 to +21

Copilot AI Feb 11, 2026

Add an integration test for /sessions/{session_id}/sampling/message (similar to other route tests) to lock in the MCP request/response shape and prevent regressions.
.with_state(state)
}

Copilot AI Feb 18, 2026

This new HTTP endpoint isn't annotated with #[utoipa::path] and isn't added to crates/goose-server/src/openapi.rs's paths(...) list, so it won't appear in the generated OpenAPI schema unlike the other routes. If this is intended to be a supported API for the desktop app, add the utoipa path annotation (and register it in ApiDoc) so schema validation/clients stay in sync.

Suggested change
#[utoipa::path(
    post,
    path = "/sessions/{session_id}/sampling/message",
    params(
        ("session_id" = String, Path, description = "Session identifier")
    ),
    request_body = CreateMessageRequestParams,
    responses(
        (status = 200, description = "Sampling message created", body = CreateMessageResult),
        (status = 500, description = "Internal server error")
    )
)]
async fn create_message(
State(state): State<Arc<AppState>>,
Path(session_id): Path<String>,
Json(request): Json<CreateMessageRequestParams>,
) -> Result<Json<CreateMessageResult>, StatusCode> {
let agent = state.get_agent_for_route(session_id.clone()).await?;

let provider = agent.provider().await.map_err(|e| {
tracing::error!("Failed to get provider: {}", e);
StatusCode::INTERNAL_SERVER_ERROR
})?;

let messages: Vec<Message> = request
.messages
.iter()
.map(|msg| {
let base = match msg.role {
Role::User => Message::user(),
Role::Assistant => Message::assistant(),
};
content_to_message(base, &msg.content)
Copilot AI Feb 11, 2026

In this repo, rmcp sampling message content is handled as a list of blocks (msg.content.first() in crates/goose/src/agents/mcp_client.rs), but this route treats msg.content as a single Content; align the conversion with mcp_client.rs (iterate/take first block).

Suggested change
content_to_message(base, &msg.content)
if let Some(first_content) = msg.content.first() {
    content_to_message(base, first_content)
} else {
    base
}
})
.collect();

let system = request
.system_prompt
.as_deref()
.unwrap_or("You are a helpful AI assistant.");

let model_config = provider.get_model_config();
let (response, usage) = provider
.complete(&model_config, &session_id, system, &messages, &[])
.await
.map_err(|e| {
tracing::error!("Sampling completion failed: {}", e);
Comment on lines 55 to 59

Copilot AI Feb 6, 2026

The request accepts maxTokens but it’s currently ignored (the provider call uses the default model config), so clients won’t be able to constrain output length as requested; either plumb request.max_tokens into an overridden ModelConfig via complete_with_model(...) or remove the parameter from the client contract.
StatusCode::INTERNAL_SERVER_ERROR
})?;
Comment on lines 55 to 61

Copilot AI Feb 13, 2026

The max_tokens parameter from the sampling request is received but never used. The CreateMessageRequestParams includes this field, but it's not passed to the provider's complete method. This means MCP apps cannot control the maximum length of responses. Consider passing request.max_tokens to the provider if the API supports it, or document that this parameter is currently ignored.
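The maxTokens plumbing the two comments above ask for can be reduced to a small pure function. This is a sketch under stated assumptions: `effective_max_tokens` and the `server_cap` parameter are hypothetical names, not code from this PR. The idea is that a client-supplied maxTokens overrides the model default but is clamped by a server-side cap before being applied to an overridden ModelConfig.

```rust
// Hypothetical helper (not in this PR): pick the effective max_tokens for a
// sampling call. A client-requested value overrides the model default, but is
// clamped to a server-side cap so clients cannot request unbounded output.
fn effective_max_tokens(requested: Option<u32>, model_default: u32, server_cap: u32) -> u32 {
    requested.unwrap_or(model_default).min(server_cap)
}

fn main() {
    // Client asked for 1000 tokens: honored, since it is under the cap.
    println!("{}", effective_max_tokens(Some(1000), 4096, 8192));
    // No request: fall back to the model default.
    println!("{}", effective_max_tokens(None, 4096, 8192));
    // Oversized request: clamped to the server cap.
    println!("{}", effective_max_tokens(Some(100_000), 4096, 8192));
}
```

The returned value would then be set on a cloned ModelConfig passed to the provider, along the lines of the complete_with_model(...) route the first comment suggests.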

let text = response.as_concat_text();

Ok(Json(CreateMessageResult {
model: usage.model,
stop_reason: Some(CreateMessageResult::STOP_REASON_END_TURN.to_string()),
message: SamplingMessage::new(Role::Assistant, SamplingMessageContent::text(&text)),
}))
}

fn content_to_message(base: Message, content: &SamplingContent<SamplingMessageContent>) -> Message {
let items = match content {
SamplingContent::Single(item) => vec![item],
SamplingContent::Multiple(items) => items.iter().collect(),
};

let mut msg = base;
for item in items {
msg = match item {
SamplingMessageContent::Text(text) => msg.with_text(&text.text),
SamplingMessageContent::Image(image) => msg.with_image(&image.data, &image.mime_type),
_ => msg,
};
}
msg
}
Copilot AI Feb 13, 2026

The sampling route lacks test coverage. Other similar routes in the codebase (e.g., action_required.rs at lines 63-100, reply.rs at lines 468-506) have integration tests. Consider adding tests to verify the sampling endpoint works correctly, especially given the data transformation between MCP protocol types and the response format expected by the frontend.

Suggested change
}
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn content_to_message_with_empty_multiple_does_not_panic() {
        let base = Message::user();
        let content: SamplingContent<SamplingMessageContent> = SamplingContent::Multiple(vec![]);
        let _result = content_to_message(base, &content);
    }
}
11 changes: 7 additions & 4 deletions crates/goose/src/goose_apps/cache.rs
@@ -7,6 +7,7 @@ use tracing::warn;
use super::app::GooseApp;

static CLOCK_HTML: &str = include_str!("../goose_apps/clock.html");
static CHAT_HTML: &str = include_str!("../goose_apps/chat.html");
const APPS_EXTENSION_NAME: &str = "apps";

pub struct McpAppCache {
@@ -23,10 +24,12 @@ impl McpAppCache {
}

fn ensure_default_apps(&self) {
if self.get_app(APPS_EXTENSION_NAME, "apps://clock").is_none() {
if let Ok(mut clock_app) = GooseApp::from_html(CLOCK_HTML) {
clock_app.mcp_servers = vec![APPS_EXTENSION_NAME.to_string()];
let _ = self.store_app(&clock_app);
for (uri, html) in [("apps://clock", CLOCK_HTML), ("apps://chat", CHAT_HTML)] {
if self.get_app(APPS_EXTENSION_NAME, uri).is_none() {
if let Ok(mut app) = GooseApp::from_html(html) {
app.mcp_servers = vec![APPS_EXTENSION_NAME.to_string()];
Comment on lines +27 to +30

Copilot AI Feb 6, 2026

This default-app cache check uses apps://... URIs, but GooseApp::from_html() sets resource.uri to ui://apps/{name} and store_app() keys the cache by app.resource.uri, so this get_app() call will never hit and will rewrite the default apps on every startup.

Suggested change
for (uri, html) in [("apps://clock", CLOCK_HTML), ("apps://chat", CHAT_HTML)] {
    if self.get_app(APPS_EXTENSION_NAME, uri).is_none() {
        if let Ok(mut app) = GooseApp::from_html(html) {
            app.mcp_servers = vec![APPS_EXTENSION_NAME.to_string()];
for html in [CLOCK_HTML, CHAT_HTML] {
    if let Ok(mut app) = GooseApp::from_html(html) {
        app.mcp_servers = vec![APPS_EXTENSION_NAME.to_string()];
        if self
            .get_app(APPS_EXTENSION_NAME, &app.resource.uri)
            .is_none()
        {
let _ = self.store_app(&app);
}
}
}
}
184 changes: 184 additions & 0 deletions crates/goose/src/goose_apps/chat.html
@@ -0,0 +1,184 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>Chat</title>
<script type="application/ld+json">
{
"@context": "https://goose.ai/schema",
"@type": "GooseApp",
"name": "chat",
"description": "Simple Chat UI",
"width": 400,
"height": 500,
"resizable": true
}
</script>
<style>
* { box-sizing: border-box; margin: 0; padding: 0; }
html, body { height: 100%; font-family: -apple-system, BlinkMacSystemFont, sans-serif; }
body { display: flex; flex-direction: column; background: #fff; }

.messages {
flex: 1;
overflow-y: auto;
padding: 16px;
display: flex;
flex-direction: column;
gap: 12px;
}

.message {
max-width: 80%;
padding: 10px 14px;
border-radius: 16px;
line-height: 1.4;
font-size: 14px;
word-wrap: break-word;
}

.message.user {
align-self: flex-end;
background: #000;
color: #fff;
}

.message.assistant {
align-self: flex-start;
background: #f0f0f0;
color: #000;
}

.message.loading {
font-style: italic;
color: #666;
}

.input-area {
display: flex;
gap: 8px;
padding: 12px;
border-top: 1px solid #e0e0e0;
background: #fafafa;
}

#messageInput {
flex: 1;
padding: 10px 14px;
border: 1px solid #ddd;
border-radius: 20px;
font-size: 14px;
outline: none;
}

#messageInput:focus { border-color: #999; }

#sendBtn {
padding: 10px 20px;
background: #000;
color: #fff;
border: none;
border-radius: 20px;
font-size: 14px;
cursor: pointer;
}

#sendBtn:disabled {
background: #ccc;
cursor: not-allowed;
}
</style>
</head>
<body>
<div class="messages" id="messages"></div>
<div class="input-area">
<input type="text" id="messageInput" placeholder="Type a message..." />
<button id="sendBtn">Send</button>
</div>

<script>
const messagesEl = document.getElementById('messages');
const inputEl = document.getElementById('messageInput');
const sendBtn = document.getElementById('sendBtn');
const conversationHistory = [];
const pendingRequests = new Map();
let requestId = 0;

function addMessage(role, text, isLoading = false) {
const div = document.createElement('div');
div.className = `message ${role}${isLoading ? ' loading' : ''}`;
div.textContent = text;
messagesEl.appendChild(div);
messagesEl.scrollTop = messagesEl.scrollHeight;
return div;
}

function request(method, params) {
return new Promise((resolve, reject) => {
const id = ++requestId;
pendingRequests.set(id, { resolve, reject });
window.parent.postMessage({ jsonrpc: '2.0', id, method, params }, '*');
});
}

window.addEventListener('message', (event) => {
const data = event.data;
if (!data || typeof data !== 'object') return;

if ('id' in data && pendingRequests.has(data.id)) {
const { resolve, reject } = pendingRequests.get(data.id);
pendingRequests.delete(data.id);
if (data.error) {
reject(new Error(data.error.message || 'Unknown error'));
} else {
resolve(data.result);
}
}
});
Comment on lines +124 to +137

Copilot AI Feb 18, 2026

The message event handler accepts postMessages from any source/origin, which lets unrelated frames/scripts spoof JSON-RPC responses and resolve/reject pending requests. Restrict handling to event.source === window.parent (and, if possible, validate event.origin against the expected sandbox/host origin).

request('ui/initialize', {}).then(() => {
window.parent.postMessage({ jsonrpc: '2.0', method: 'ui/notifications/initialized', params: {} }, '*');
});

async function sendMessage() {
const text = inputEl.value.trim();
if (!text) return;

inputEl.value = '';
sendBtn.disabled = true;

addMessage('user', text);
conversationHistory.push({ role: 'user', content: { type: 'text', text } });

const loadingEl = addMessage('assistant', 'Thinking...', true);

try {
const response = await request('sampling/createMessage', {
messages: conversationHistory,
systemPrompt: 'You are a helpful assistant. Keep responses concise.',
maxTokens: 1000
});

const responseText = response.content.text;
conversationHistory.push({ role: 'assistant', content: { type: 'text', text: responseText } });

loadingEl.textContent = responseText;
Comment on lines +162 to +165

Copilot AI Feb 13, 2026

The response structure doesn't match what the frontend expects. The CreateMessageResult from rmcp returns { model, stop_reason, message: { role, content } }, but the code tries to access response.content.text. This should be response.message.content.text or similar, depending on how SamplingMessage serializes its content field (it may be an array or object).

Suggested change
const responseText = response.content.text;
conversationHistory.push({ role: 'assistant', content: { type: 'text', text: responseText } });
loadingEl.textContent = responseText;
const message = response && response.message ? response.message : response;
let responseText = '';
if (message && Array.isArray(message.content)) {
    const textPart = message.content.find(
        (part) => part && typeof part === 'object' && typeof part.text === 'string'
    );
    if (textPart) {
        responseText = textPart.text;
    }
} else if (message && message.content && typeof message.content === 'object') {
    if (typeof message.content.text === 'string') {
        responseText = message.content.text;
    }
}
conversationHistory.push({ role: 'assistant', content: { type: 'text', text: responseText } });
loadingEl.textContent = responseText || '[no response text]';
loadingEl.classList.remove('loading');
Comment on lines +155 to +166

Copilot AI Feb 18, 2026

The response from sampling/createMessage is an MCP CreateMessageResult with a nested message field, so response.content.text will be undefined and the demo UI won't render replies. Read response.message.content (and handle non-text content) or adjust the host to return the flattened shape.
} catch (err) {
loadingEl.textContent = 'Error: ' + err.message;
loadingEl.classList.remove('loading');
}

sendBtn.disabled = false;
inputEl.focus();
}

sendBtn.addEventListener('click', sendMessage);
inputEl.addEventListener('keypress', (e) => {
if (e.key === 'Enter') sendMessage();
});

inputEl.focus();
</script>
</body>
</html>