feat: show document artifact after generating report #658
Status: Merged

Commits (27)
0e6505c  thucpn  feat: show document artifact after generating report
5a8f1a3  thucpn  keep chat message content as it is
0d10af0  thucpn  Merge branch 'main' into tp/show-document-artifact-after-generate-report
e4db821  thucpn  use artifactEvent from server
2c033d8  thucpn  Merge branch 'main' into tp/show-document-artifact-after-generate-report
1905f7e  thucpn  add deep research example
f7326ca  thucpn  bump chat-ui for new editor
78d5efb  thucpn  import editor css
02f4922  thucpn  hide warning for workflowEvent<{}>() in eject mode
4f2cdbd  thucpn  fix format
bd1baa3  thucpn  use CL for better testing
3e28276  thucpn  generate artifact after streaming report in Python
b0b12b8  thucpn  bump chat-ui to support citations
a471085  thucpn  use isinstance to check stream
b345945  thucpn  fix document editor spacing
38ac299  thucpn  Create tame-wolves-obey.md
b97dc3e  thucpn  add sources to document artifact
f521f12  thucpn  add sources to document artifact in python
bd1294e  thucpn  type cast
f6e6997  thucpn  no need score
b682e55  thucpn  fix lint
c244605  thucpn  move handle stream logic to server
9f062df  thucpn  refactor: use chunk.text and chunk.raw
ba415e0  thucpn  bump chat-ui 0.5.6 to fix citations
2b4dbba  thucpn  update changset
4ce3cf6  thucpn  Merge branch 'main' into tp/show-document-artifact-after-generate-report
b1e59f6  thucpn  fix lock
tame-wolves-obey.md (changeset, new file, +7 lines; see commit 38ac299):

```md
---
"create-llama": patch
"@llamaindex/server": patch
"@create-llama/llama-index-server": patch
---

feat: show document artifact after generating report
```
python/llama-index-server/llama_index/server/utils/stream.py (new file, 45 additions, 0 deletions):
```python
from typing import AsyncGenerator, Union

from llama_index.core.base.llms.types import (
    CompletionResponse,
    CompletionResponseAsyncGen,
)
from llama_index.core.workflow import Context
from llama_index.core.agent.workflow.workflow_events import AgentStream


async def write_response_to_stream(
    res: Union[CompletionResponse, CompletionResponseAsyncGen],
    ctx: Context,
    current_agent_name: str = "assistant",
) -> str:
    """
    Handle both streaming and non-streaming LLM responses.

    Args:
        res: The LLM response (either streaming or non-streaming)
        ctx: The workflow context for writing events to stream
        current_agent_name: The name of the current agent (default: "assistant")

    Returns:
        The final response text as a string
    """
    final_response = ""

    if isinstance(res, AsyncGenerator):
        # Handle streaming response (CompletionResponseAsyncGen)
        async for chunk in res:
            ctx.write_event_to_stream(
                AgentStream(
                    delta=chunk.delta or "",
                    response=final_response,
                    current_agent_name=current_agent_name,
                    tool_calls=[],
                    raw=chunk.raw or "",
                )
            )
            final_response = chunk.text
    else:
        # Handle non-streaming response (CompletionResponse)
        final_response = res.text

    return final_response
```
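The core of this utility (per commit a471085, "use isinstance to check stream") is dispatching on whether the response is an async generator or a plain response object. The following is a minimal, stdlib-only sketch of that same dispatch pattern; `FakeChunk`, `fake_stream`, and `collect` are hypothetical stand-ins, and it assumes (as the PR code does with `chunk.text`) that each streamed chunk carries the cumulative text so far:

```python
import asyncio
from collections.abc import AsyncGenerator


class FakeChunk:
    """Hypothetical stand-in for a CompletionResponse-like object."""

    def __init__(self, delta: str, text: str):
        self.delta = delta  # the new fragment in this chunk
        self.text = text    # cumulative text so far (assumption mirroring chunk.text)


async def fake_stream():
    """Simulates a streaming response: yields chunks with cumulative text."""
    text = ""
    for word in ["Report", " body", " here"]:
        text += word
        yield FakeChunk(delta=word, text=text)


async def collect(res) -> str:
    """Same branch structure as write_response_to_stream, minus event emission."""
    final = ""
    if isinstance(res, AsyncGenerator):
        # Streaming path: an async generator object is an instance of
        # collections.abc.AsyncGenerator, so isinstance selects this branch.
        async for chunk in res:
            final = chunk.text  # last chunk holds the full text
    else:
        # Non-streaming path: read the complete text directly.
        final = res.text
    return final


print(asyncio.run(collect(fake_stream())))                # → Report body here
print(asyncio.run(collect(FakeChunk("", "full text"))))   # → full text
```

Dispatching with `isinstance` against the abstract base class keeps a single call site working for both response shapes, which is why the PR moved this handling into one server-side helper.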