
Conversation

Contributor

@ashwinb ashwinb commented Oct 14, 2025

This PR updates the Conversation item-related types and improves a
couple of critical parts of the implementation:

  • It creates a streaming output item for the final assistant message output by
    the model. Until now, we only added content parts and included that
    message in the final response.

  • It rewrites the conversation update code completely to account for items
    other than messages (tool calls, outputs, etc.).
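The first change can be sketched roughly as follows. This is an illustrative Python sketch, not the actual llama-stack implementation; the type and function names (`ContentPart`, `OutputItemDone`, `finalize_assistant_message`) are hypothetical stand-ins.

```python
from dataclasses import dataclass, field


@dataclass
class ContentPart:
    """One streamed chunk of assistant text (hypothetical type)."""
    text: str


@dataclass
class OutputItemDone:
    """Terminal streaming event carrying the assembled assistant message."""
    type: str
    role: str
    parts: list[ContentPart] = field(default_factory=list)


def finalize_assistant_message(parts: list[ContentPart]) -> OutputItemDone:
    # Previously only the individual content parts were streamed and the
    # message appeared solely in the final response; the change also emits
    # the assembled message as its own output item event.
    return OutputItemDone(type="message", role="assistant", parts=parts)


event = finalize_assistant_message([ContentPart("Hello, "), ContentPart("world!")])
assert "".join(p.text for p in event.parts) == "Hello, world!"
```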

Test Plan

Used the test script from llamastack/llama-stack-client-python#281 for this:

TEST_API_BASE_URL=http://localhost:8321/v1 \
  pytest tests/integration/test_agent_turn_step_events.py::test_client_side_function_tool -xvs

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Meta Open Source bot. label Oct 14, 2025
| OpenAIResponseOutputMessageMCPCall
# Fallback to the generic message type as a last resort
| OpenAIResponseMessage,
Contributor

This is similar to the change I'm proposing in #3385, but there I'm directly adding OpenAIResponseOutput as a possible input, to always keep both lists in sync (based on a comment I received in that PR).

As the other PR is simpler and shorter, perhaps we can go with it and rebase this one on top.
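The two approaches being compared can be sketched as follows. The class and alias names mirror real llama-stack types, but the bodies are empty stand-ins and the unions are abbreviated; this is an assumption-laden sketch, not the actual definitions.

```python
from typing import Union, get_args


# Stand-in types (the real ones are Pydantic models in llama-stack).
class OpenAIResponseMessage: ...
class OpenAIResponseOutputMessageMCPCall: ...
class OpenAIResponseOutputMessageFunctionToolCall: ...


# This PR's approach: enumerate every allowed input item explicitly.
ExplicitInput = Union[
    OpenAIResponseOutputMessageMCPCall,
    OpenAIResponseOutputMessageFunctionToolCall,
    # Fallback to the generic message type as a last resort
    OpenAIResponseMessage,
]

# The #3385 approach: reuse the output union as an input type, so the
# input and output item lists can never drift out of sync.
OpenAIResponseOutput = Union[
    OpenAIResponseMessage,
    OpenAIResponseOutputMessageMCPCall,
    OpenAIResponseOutputMessageFunctionToolCall,
]
ReusedInput = OpenAIResponseOutput

# Both unions admit the same members here; only the maintenance burden differs.
assert set(get_args(ExplicitInput)) == set(get_args(ReusedInput))
```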

Contributor Author


@luis5tb It is actually far more complex; even my changes are only a patch. We need to review everything carefully and thoroughly. My changes here were not motivated by trying to override your PR but came from an independent motivation. It's unfortunate that your PR languished for so long. I also still need to run tests.

Contributor Author


@luis5tb In fact, I went your route first but then looked at OpenAI's definitions. They are very subtly different! You may well be right, but we need a thorough automated check of all Responses type definitions -- there are far too many holes there right now.

@leseb
Collaborator

leseb commented Oct 15, 2025

@github-actions run precommit

Contributor

⏳ Running pre-commit hooks on PR #3810...

Collaborator

@leseb leseb left a comment


Did a pass and nothing caught my eye. Thanks!

Contributor

✅ Pre-commit hooks completed successfully!

🔧 Changes have been committed and pushed to the PR branch.

| OpenAIResponseOutputMessageFunctionToolCall
| OpenAIResponseOutputMessageFileSearchToolCall
| OpenAIResponseOutputMessageWebSearchToolCall
Collaborator


Nice, I was planning to follow up on these after the latest changes. Thanks!

Collaborator

@franciscojavierarceo franciscojavierarceo left a comment

🚀

@ashwinb
Contributor Author

ashwinb commented Oct 15, 2025

@leseb wtf, how did that pre-commit magic work, man! This is in my forked repo!

@ashwinb
Contributor Author

ashwinb commented Oct 15, 2025

Still fixing some responses tests. Not broken by this change honestly, but still (since these things are not in CI yet)...

@ashwinb
Contributor Author

ashwinb commented Oct 15, 2025

Note that CI will appear red on this PR because llamastack/llama-stack-client-python#281 needs to be landed concurrently.

@ashwinb ashwinb merged commit e9b4278 into llamastack:main Oct 15, 2025
20 of 22 checks passed
@ashwinb ashwinb deleted the agent_rewrite branch October 15, 2025 16:36
messages = await convert_response_input_to_chat_messages(input)
else:
# Use stored messages directly and convert only new input
messages = stored_messages or []
Contributor


can stored_messages actually be None here?
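The `or []` guard in the quoted snippet appears to cover a `None` value; whether `None` can actually occur is exactly what the reviewer is asking. A minimal, hypothetical sketch of the pattern (the real code operates on typed OpenAI message params, not strings):

```python
def build_messages(stored_messages, new_input):
    """Fall back to an empty list when no stored messages exist."""
    messages = list(stored_messages or [])  # guards against None
    messages.extend(new_input)              # append newly converted input
    return messages


# With no stored history, only the new input survives.
assert build_messages(None, ["user: hi"]) == ["user: hi"]
# With stored history, new input is appended after it.
assert build_messages(["system: be brief"], ["user: hi"]) == [
    "system: be brief",
    "user: hi",
]
```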

filter(lambda x: not isinstance(x, OpenAISystemMessageParam), orchestrator.final_messages)
)
if store:
# TODO: we really should work off of output_items instead of "final_messages"
Contributor


does this still apply?
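For context, the quoted snippet strips system messages before persisting the conversation. A simplified sketch with stand-in classes (the real ones are OpenAI chat message param models in llama-stack):

```python
class OpenAISystemMessageParam: ...
class OpenAIUserMessageParam: ...
class OpenAIAssistantMessageParam: ...


final_messages = [
    OpenAISystemMessageParam(),
    OpenAIUserMessageParam(),
    OpenAIAssistantMessageParam(),
]

# Drop system messages so they are not stored with the conversation.
to_store = list(
    filter(lambda x: not isinstance(x, OpenAISystemMessageParam), final_messages)
)

assert len(to_store) == 2
assert not any(isinstance(m, OpenAISystemMessageParam) for m in to_store)
```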

leseb added a commit to leseb/llama-stack that referenced this pull request Oct 16, 2025
followup on llamastack#3810

Signed-off-by: Sébastien Han <seb@redhat.com>
@leseb leseb mentioned this pull request Oct 16, 2025
leseb added a commit that referenced this pull request Oct 16, 2025
followup on #3810

Signed-off-by: Sébastien Han <seb@redhat.com>

Labels

CLA Signed This label is managed by the Meta Open Source bot.


5 participants