fix: The workflow cannot function properly after the conversation flow is disconnected #2428
Changes from all commits
@@ -44,10 +44,10 @@ def process_response(self, request, response):
             if response.is_async:
                 # pull to lexical scope to capture fixed reference in case
                 # streaming_content is set again later.
-                orignal_iterator = response.streaming_content
+                original_iterator = response.streaming_content

                 async def gzip_wrapper():
-                    async for chunk in orignal_iterator:
+                    async for chunk in original_iterator:
                         yield compress_string(
                             chunk,
                             max_random_bytes=self.max_random_bytes,
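For context on why the comment says "pull to lexical scope": in Django's GZipMiddleware, which this hunk appears to mirror, `streaming_content` is reassigned to the wrapper itself right after this block, so the generator must keep its own reference to the original iterator or it would end up iterating over itself. A minimal, self-contained sketch of that pattern, using a plain object and `gzip.compress` as stand-ins for the real response class and `compress_string` (both stand-ins are assumptions, not this project's code):

    import asyncio
    import gzip


    class FakeStreamingResponse:
        """Hypothetical stand-in for a streaming HTTP response object."""

        def __init__(self, streaming_content):
            self.streaming_content = streaming_content


    async def source():
        yield b"hello "
        yield b"streaming world"


    def wrap_with_gzip(response):
        # Pull to lexical scope to capture a fixed reference, because
        # streaming_content is reassigned below to the wrapper itself.
        original_iterator = response.streaming_content

        async def gzip_wrapper():
            async for chunk in original_iterator:
                yield gzip.compress(chunk)

        response.streaming_content = gzip_wrapper()
        return response


    async def main():
        response = wrap_with_gzip(FakeStreamingResponse(source()))
        async for chunk in response.streaming_content:
            print(len(chunk), "compressed bytes")


    asyncio.run(main())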
The provided code processes API responses that can include streamed content. The line of concern is:

    async def gzip_wrapper():
        async for chunk in orignal_iterator:

`orignal_iterator` is a misspelling of `original_iterator`. The code only works because the assignment above uses the same misspelled name, so renaming both occurrences, as this change does, keeps the identifier consistent and readable. Corrected version:

    async def gzip_wrapper():
        async for chunk in original_iterator:
            yield compress_string(
                chunk,
                max_random_bytes=self.max_random_bytes,
            )

Additional suggestion: a more robust version of the wrapper could include basic exception handling:

    async def gzip_wrapper():
        try:
            async for chunk in original_iterator:
                yield compress_string(chunk, max_random_bytes=self.max_random_bytes)
        except Exception as e:
            # Log or re-raise depending on requirements.
            print(f"An error occurred while processing stream: {e}")
            return

In this revised code, an error raised while consuming the original iterator is reported and the stream ends cleanly instead of propagating an unhandled exception to whoever is consuming the response.
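A quick, self-contained way to exercise the error-handling path of that suggestion, with `gzip.compress` standing in for `compress_string` and a deliberately failing source iterator (both are illustrative assumptions):

    import asyncio
    import gzip


    async def failing_source():
        # Hypothetical upstream stream that breaks after the first chunk.
        yield b"first chunk"
        raise RuntimeError("conversation flow disconnected")


    async def gzip_wrapper(original_iterator):
        try:
            async for chunk in original_iterator:
                yield gzip.compress(chunk)
        except Exception as e:
            # Report the failure and end the stream instead of crashing the consumer.
            print(f"An error occurred while processing stream: {e}")
            return


    async def main():
        async for chunk in gzip_wrapper(failing_source()):
            print(len(chunk), "compressed bytes")


    asyncio.run(main())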
The code you provided appears to be part of an asynchronous implementation where `await_result` is responsible for gathering chunks from some process until it is done. The current version has a few potential improvements:

- Ensure resource cleanup: make sure all resources are properly released when `self.is_run()` returns `False`. This might include closing connections, stopping background tasks, or releasing locks.
- Avoid an infinite loop when not running: in the final lines with the `while self.is_run():` loop, there isn't enough context to understand why they exist. If they were intended to perform cleanup once `is_run()` becomes false, consider removing them or replacing them with explicit cleanup logic.
- Optimization: `get_runtime_details()` can return its values directly without additional processing.
- Code readability: the function is concise, and adding a few comments would help readability, though too many may add noise; alternatively, refactor parts of the function if it becomes too long.

Here’s a simplified version with these considerations:
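A minimal sketch of what such a simplified version could look like, assuming `await_result` drains chunks from an `asyncio.Queue` while `self.is_run()` is true; the class name, the queue, and the one-second timeout below are illustrative assumptions rather than this project's actual code:

    import asyncio


    class WorkflowRunner:
        def __init__(self) -> None:
            self._queue: asyncio.Queue = asyncio.Queue()
            self._running = True
            self._runtime_details: dict = {}

        def is_run(self) -> bool:
            return self._running

        def get_runtime_details(self) -> dict:
            # Return the collected metrics directly, without extra processing.
            return self._runtime_details

        async def await_result(self) -> list:
            """Gather chunks while the workflow is still running."""
            chunks = []
            try:
                while self.is_run():
                    try:
                        # Wake up periodically so a disconnect (is_run() -> False)
                        # cannot leave the loop blocked forever on an empty queue.
                        chunk = await asyncio.wait_for(self._queue.get(), timeout=1.0)
                    except asyncio.TimeoutError:
                        continue
                    chunks.append(chunk)
            finally:
                # Mark the runner as stopped; a real implementation would also
                # close connections, cancel background tasks, and release locks here.
                self._running = False
            return chunks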
This approach assumes that `is_run()` should stop the loop after a certain point and that `get_runtime_details()` provides useful performance metrics needed elsewhere. Adjustments can be made based on the specific requirements and constraints of your system architecture.