fix: assign streaming_callback to OpenAIGenerator in run() method #8054
Conversation
Pull Request Test Coverage Report for Build 10077478588 (Coveralls)
```python
streaming_callback = generation_kwargs.pop("streaming_callback", None)
# check if streaming_callback is passed to run()
if streaming_callback:
    self.streaming_callback = streaming_callback
```
- This is one possible and simple fix in the current `run()` method.
- Another option is to separate `run()` and `invoke()`, as implemented for OpenAIGenerator. I proposed this for code modularity, but if it's unnecessary, we can just add the above checks for both generators.

In any case, we can choose the same approach for both, and I'll update the PR.
Not a great idea, this is quite dangerous.
If you call `run` passing a `streaming_callback`, all subsequent calls will reuse that same callback even if it is not explicitly set. That's extremely confusing in my opinion.
The best choice would be something like this:
`streaming_callback = streaming_callback or self.streaming_callback`
This also means adding `streaming_callback` as another input with a `None` default.
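The suggested fallback pattern can be sketched with a minimal stand-in class (illustrative only, not Haystack's actual OpenAIGenerator): the per-call callback wins, but the instance attribute is never mutated, so later calls are unaffected.

```python
from typing import Callable, Optional


class ToyGenerator:
    """Minimal stand-in for a generator with streaming support (hypothetical)."""

    def __init__(self, streaming_callback: Optional[Callable[[str], None]] = None):
        self.streaming_callback = streaming_callback

    def run(self, prompt: str, streaming_callback: Optional[Callable[[str], None]] = None):
        # Per-call callback takes precedence over the instance-level one,
        # but self.streaming_callback is never reassigned here.
        callback = streaming_callback or self.streaming_callback
        for chunk in prompt.split():
            if callback:
                callback(chunk)
        return {"replies": [prompt]}
```

With this shape, `gen.run("hi", streaming_callback=my_fn)` streams through `my_fn` for that call only; a following `gen.run("hi")` falls back to whatever was set at construction time, which may be `None`.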
Good point!
In general, I think I would also like @silvanocerza to take a look and validate/suggest other approaches.
I agree with @anakin87 here, adding another input is the best choice. It also has the best UX in my opinion. With the current solution one would have to run a
Having an extra input would be simpler:
That too was an option, but as mentioned in #7836, wouldn't that be a breaking change?
No, adding a new Optional parameter to the `run()` method is not a breaking change.
@Amnah199 adding optional inputs is not considered a breaking change since it won't change the behaviour of existing code.
I would suggest adding tests for the new parameter.
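A test along those lines could look roughly like this; it is sketched against a hypothetical stand-in class, since a real test would patch the OpenAI client, and all names here are illustrative:

```python
from typing import Callable, Optional


class FakeGenerator:
    """Hypothetical stand-in mirroring the proposed run() signature."""

    def __init__(self, streaming_callback: Optional[Callable[[str], None]] = None):
        self.streaming_callback = streaming_callback

    def run(self, prompt: str, streaming_callback: Optional[Callable[[str], None]] = None):
        # Fall back to the instance-level callback without mutating it.
        callback = streaming_callback or self.streaming_callback
        if callback:
            callback(prompt)
        return {"replies": [prompt]}


def test_streaming_callback_passed_to_run_is_not_sticky():
    received = []
    gen = FakeGenerator()
    gen.run("first", streaming_callback=received.append)
    assert received == ["first"]
    # A second call without a callback must not reuse the previous one.
    gen.run("second")
    assert received == ["first"]
    assert gen.streaming_callback is None
```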
Looks good.
@silvanocerza could you take a final look?
Small important change to make, the rest is good.
Related Issues
Proposed Changes:
Allow passing a `streaming_callback` parameter to the `run()` method of OpenAIGenerator and OpenAIChatGenerator. This allows passing `streaming_callback` during the pipeline run and removes the need to recreate the pipeline to change streaming callbacks.
How did you test it?
Notes for the reviewer
N/A
Checklist
Conventional commit prefixes: `fix:`, `feat:`, `build:`, `chore:`, `ci:`, `docs:`, `style:`, `refactor:`, `perf:`, `test:`.