
Error when calling MultimodalWebSurfer after using a tool #6084

Closed
@SongChiYoung

Description

What happened?

Describe the bug
When using MultimodalWebSurfer after a tool call, the agent crashes due to an unhandled message type in on_messages_stream.

To Reproduce

import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import TextMentionTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.ui import Console
from autogen_ext.agents.web_surfer import MultimodalWebSurfer
from autogen_ext.models.anthropic import AnthropicChatCompletionClient

# The original report does not show how anthropic_client is constructed;
# any Anthropic model client reproduces the issue, e.g.:
anthropic_client = AnthropicChatCompletionClient(model="claude-3-5-sonnet-latest")

termination = TextMentionTermination("TERMINATE")

def test():
    """A trivial tool that returns a fixed string."""
    return "Hello, World!"

multi_modal_web_surfer = MultimodalWebSurfer(
    name="web_surfer",
    description="A web surfer agent that can surf the web for information.",
    model_client=anthropic_client,
    # system_message="You are a web surfer agent and your task is to surf the web for information. Return `TERMINATE` once the information is found.",
)

test_agent = AssistantAgent(
    name="test_agent",
    description="A test agent that returns a string.",
    model_client=anthropic_client,
    system_message="You are a test agent that run test function.",
    tools = [test],
)

team = RoundRobinGroupChat(
    participants=[test_agent, multi_modal_web_surfer],
    termination_condition=termination,
)

async def team_run():
    await Console(
        team.run_stream(
            task="try test function"
        )
    )

asyncio.run(team_run())

Observed Behavior

The agent throws:

/Users/cysong/Documents/MOTOV_업무/codes/TIPS/ai-agent/autogen/python/packages/autogen-ext/src/autogen_ext/models/openai/_openai_client.py:397: UserWarning: Missing required field 'structured_output' in ModelInfo. This field will be required in a future version of AutoGen.
  validate_model_info(self._model_info)
---------- user ----------
try test function
---------- test_agent ----------
I'll help you run the test function. Let me do that for you now.
---------- test_agent ----------
Error processing publish message for web_surfer_d5a78cc6-f002-46ef-9eae-ab0ccfe113a9/d5a78cc6-f002-46ef-9eae-ab0ccfe113a9
Traceback (most recent call last):
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_single_threaded_agent_runtime.py", line 533, in _on_message
    return await agent.on_message(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-agentchat/src/autogen_agentchat/teams/_group_chat/_sequential_routed_agent.py", line 67, in on_message_impl
    return await super().on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_routed_agent.py", line 268, in wrapper
    return_value = await func(self, message, ctx)  # type: ignore
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-agentchat/src/autogen_agentchat/teams/_group_chat/_chat_agent_container.py", line 69, in handle_request
    async for msg in self._agent.on_messages_stream(self._message_buffer, ctx.cancellation_token):
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-ext/src/autogen_ext/agents/web_surfer/_multimodal_web_surfer.py", line 438, in on_messages_stream
    raise ValueError(f"Unexpected message in MultiModalWebSurfer: {chat_message}")
ValueError: Unexpected message in MultiModalWebSurfer: source='test_agent' models_usage=None metadata={} content='Hello, World!' type='ToolCallSummaryMessage'
[FunctionCall(id='toolu_01PgBqKGtjgWJ4ib6x2jx5T5', arguments='{}', name='test')]
---------- test_agent ----------
[FunctionExecutionResult(content='Hello, World!', name='test', call_id='toolu_01PgBqKGtjgWJ4ib6x2jx5T5', is_error=False)]
---------- test_agent ----------
Hello, World!
Traceback (most recent call last):
  File "/Users/cysong/Documents/ai-agent/test-agent/my-app/errortest.py", line 72, in <module>
    asyncio.run(team_run())
  File "/Users/cysong/.local/share/uv/python/cpython-3.11.11-macos-aarch64-none/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/Users/cysong/.local/share/uv/python/cpython-3.11.11-macos-aarch64-none/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/.local/share/uv/python/cpython-3.11.11-macos-aarch64-none/lib/python3.11/asyncio/base_events.py", line 654, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/test-agent/my-app/errortest.py", line 66, in team_run
    await Console(
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-agentchat/src/autogen_agentchat/ui/_console.py", line 117, in Console
    async for message in stream:
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-agentchat/src/autogen_agentchat/teams/_group_chat/_base_group_chat.py", line 482, in run_stream
    await shutdown_task
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-agentchat/src/autogen_agentchat/teams/_group_chat/_base_group_chat.py", line 426, in stop_runtime
    await self._runtime.stop_when_idle()
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_single_threaded_agent_runtime.py", line 769, in stop_when_idle
    await self._run_context.stop_when_idle()
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_single_threaded_agent_runtime.py", line 120, in stop_when_idle
    await self._run_task
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_single_threaded_agent_runtime.py", line 109, in _run
    await self._runtime._process_next()  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_single_threaded_agent_runtime.py", line 604, in _process_next
    raise e from None
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_single_threaded_agent_runtime.py", line 551, in _process_publish
    await asyncio.gather(*responses)
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_single_threaded_agent_runtime.py", line 546, in _on_message
    raise e
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_single_threaded_agent_runtime.py", line 533, in _on_message
    return await agent.on_message(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-agentchat/src/autogen_agentchat/teams/_group_chat/_sequential_routed_agent.py", line 67, in on_message_impl
    return await super().on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-core/src/autogen_core/_routed_agent.py", line 268, in wrapper
    return_value = await func(self, message, ctx)  # type: ignore
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-agentchat/src/autogen_agentchat/teams/_group_chat/_chat_agent_container.py", line 69, in handle_request
    async for msg in self._agent.on_messages_stream(self._message_buffer, ctx.cancellation_token):
  File "/Users/cysong/Documents/ai-agent/autogen/python/packages/autogen-ext/src/autogen_ext/agents/web_surfer/_multimodal_web_surfer.py", line 438, in on_messages_stream
    raise ValueError(f"Unexpected message in MultiModalWebSurfer: {chat_message}")
ValueError: Unexpected message in MultiModalWebSurfer: source='test_agent' models_usage=None metadata={} content='Hello, World!' type='ToolCallSummaryMessage'

Expected behavior
Root Cause

In _multimodal_web_surfer.py:

    async def on_messages_stream(
        self, messages: Sequence[ChatMessage], cancellation_token: CancellationToken
    ) -> AsyncGenerator[AgentEvent | ChatMessage | Response, None]:
        for chat_message in messages:
            if isinstance(chat_message, TextMessage | MultiModalMessage):
                self._chat_history.append(UserMessage(content=chat_message.content, source=chat_message.source))
            else:
                raise ValueError(f"Unexpected message in MultiModalWebSurfer: {chat_message}")

Currently, only TextMessage and MultiModalMessage are accepted. However, in team-based workflows (especially after tool calls), messages such as ToolCallSummaryMessage or FunctionExecutionResultMessage may also be streamed to the agent.

Expected Behavior

MultimodalWebSurfer should either:

Gracefully skip unsupported message types, or

Handle those messages with fallback content (e.g., append str(chat_message.content)); see the sketch below.
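
As a rough illustration of the second option, the ingestion loop could route every incoming message through a small conversion helper instead of raising. This is only a sketch against the public autogen APIs; the helper name to_user_message is made up here and is not part of the codebase:

import warnings

from autogen_agentchat.messages import ChatMessage, MultiModalMessage, TextMessage
from autogen_core.models import UserMessage


def to_user_message(chat_message: ChatMessage) -> UserMessage:
    """Convert any incoming chat message into a UserMessage for _chat_history."""
    if isinstance(chat_message, (TextMessage, MultiModalMessage)):
        # Current behavior: text and multimodal content are passed through as-is.
        return UserMessage(content=chat_message.content, source=chat_message.source)
    # Fallback for other types (e.g. ToolCallSummaryMessage after a tool call):
    # warn and keep the string form of the content so the web surfer still
    # sees the conversation context instead of crashing.
    warnings.warn(
        f"MultimodalWebSurfer received an unsupported message type "
        f"({type(chat_message).__name__}); appending its string content.",
        stacklevel=2,
    )
    return UserMessage(content=str(chat_message.content), source=chat_message.source)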

Suggestion

Would it make sense to generalize this logic to handle all ChatMessage types, or at least skip the unrecognized ones with a warning?

Also, could you clarify the rationale for limiting _chat_history to only TextMessage and MultiModalMessage? Would broader support break assumptions elsewhere?

Which packages was the bug in?

Python Extensions (autogen-ext)

AutoGen library version.

Python dev (main branch)

Other library version.

No response

Model used

No response

Model provider

Anthropic

Other model provider

No response

Python version

3.11

.NET version

None

Operating system

MacOS

Activity


ekzhu (Collaborator) commented on Mar 24, 2025

This will be resolved by #5998, which abstracts away how a message is converted into LLM context.
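
For context, the idea is that each message type owns its own conversion into LLM context, so agents no longer need per-type isinstance checks. A rough sketch of that shape (class and method names here are illustrative, not necessarily what #5998 ships):

from abc import ABC, abstractmethod
from dataclasses import dataclass

from autogen_core.models import LLMMessage, UserMessage


@dataclass
class ExampleMessage(ABC):
    source: str
    content: str

    @abstractmethod
    def to_model_message(self) -> LLMMessage:
        """Each message type knows how to render itself into LLM context."""
        ...


@dataclass
class ExampleToolCallSummary(ExampleMessage):
    def to_model_message(self) -> LLMMessage:
        # A tool-call summary is surfaced to downstream agents as plain user text.
        return UserMessage(content=self.content, source=self.source)


# An agent such as MultimodalWebSurfer could then ingest any message uniformly:
# self._chat_history.extend(m.to_model_message() for m in messages)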

SongChiYoung (Contributor, Author) commented on Mar 27, 2025

@ekzhu I was able to confirm that the test case works well after your commit.
Thanks for your great work!
