feat: Add warm_up() method to ChatGenerators for tool initialization #9942
Conversation
- Add warm_up() method that calls warm_up_tools()
- Add _is_warmed_up flag for idempotency
- Import warm_up_tools from haystack.tools
- Add comprehensive tests:
  - test_warm_up_with_tools: single tool case
  - test_warm_up_with_no_tools: no tools case
  - test_warm_up_with_multiple_tools: multiple tools case
- All tests passing

Part of issue deepset-ai#9907
- Add warm_up_tools import from haystack.tools.utils
- Add _is_warmed_up flag for idempotency
- Enhance existing warm_up() to also warm up tools
- Preserve existing pipeline initialization logic
- Add comprehensive tests:
  - test_warm_up_with_tools: single tool case
  - test_warm_up_with_no_tools: no tools case
  - test_warm_up_with_multiple_tools: multiple tools case

Part of issue deepset-ai#9907
- Add warm_up() method that delegates to underlying generators
- Uses hasattr check to gracefully handle generators without warm_up
- Add comprehensive tests:
  - test_warm_up_delegates_to_generators: verify delegation works
  - test_warm_up_with_no_warm_up_method: handle missing warm_up gracefully
  - test_warm_up_mixed_generators: mix of generators with/without warm_up
- All tests passing

Part of issue deepset-ai#9907
@HamidOna13 is attempting to deploy a commit to the deepset Team on Vercel. A member of the Team first needs to authorize it.
While implementing this feature, I discovered a potential issue with the warm_up_tools() utility. When a Toolset object is passed to a component's tools parameter, warm_up_tools() only calls Toolset.warm_up(), which is a no-op; it never warms the individual Tool instances inside the Toolset.

This means if someone does:

    generator = OpenAIChatGenerator(tools=[my_toolset])
    generator.warm_up()

the individual tools within my_toolset are never warmed up. This is beyond the scope of this PR, but I wanted to document it. Should I open a separate issue for this?
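The gap being reported can be reproduced with a stripped-down sketch. `Tool`, `Toolset`, and `warm_up_tools_buggy` below are stand-ins for the real haystack.tools classes, simplified to show only the behavior under discussion.

```python
# Illustrative reproduction of the reported gap; not the real Haystack code.

class Tool:
    def __init__(self, name):
        self.name = name
        self.warmed = False

    def warm_up(self):
        self.warmed = True


class Toolset:
    def __init__(self, tools):
        self.tools = list(tools)

    def warm_up(self):
        pass  # base-class no-op, as described in the report


def warm_up_tools_buggy(tools):
    # Old behavior: warms the Toolset object only, never its members.
    if isinstance(tools, Toolset):
        tools.warm_up()
    elif isinstance(tools, list):
        for tool in tools:
            tool.warm_up()
```

With this shape, passing a Toolset leaves every contained tool un-warmed, while a plain list of tools works fine, which is exactly the asymmetry the comment points out.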
Hey @HamidOna this is a good catch. The disconnect happened due to two usage approaches for Toolset: using a Toolset directly as a plain collection of Tools, and subclassing Toolset so the subclass manages its tools itself.

For the first one, yes, it makes sense to call warm_up on each Tool in the Toolset. For the second, it doesn't, because the Toolset subclass manages, and delegates management to, all the tools in it. Having said all that, the argument for making the default Toolset warm_up call warm_up on its Tool instances does make sense, because Toolset subclasses can change that behaviour to customize the warm_up, as they rightly should. We'll talk internally about this and report back. Thanks for pointing this out! cc @sjrl
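The two usage styles contrasted above can be sketched like this. The classes are illustrative placeholders, and `ManagedToolset` with its `shared_resource` attribute is a hypothetical example of a subclass customizing initialization.

```python
# Sketch of the two Toolset usage styles discussed; names are illustrative.

class Tool:
    def __init__(self):
        self.warmed = False

    def warm_up(self):
        self.warmed = True


class Toolset:
    def __init__(self, tools):
        self.tools = list(tools)

    def warm_up(self):
        # Proposed default: warm each contained Tool.
        for tool in self.tools:
            tool.warm_up()


class ManagedToolset(Toolset):
    def warm_up(self):
        # A subclass can override warm_up to initialize all of its tools
        # at once, e.g. opening one shared connection, instead of
        # per-tool warm-up. (shared_resource is a hypothetical example.)
        self.shared_resource = "connected"
```

Because the subclass overrides `warm_up()`, making the base class warm individual tools by default does not constrain subclasses that need different behavior, which is the argument made above.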
Pull Request Test Coverage Report for Build 18879776280

💛 - Coveralls
Also @HamidOna please don't forget to sign the CLA, see above. This is standard procedure for the Contributor License Agreement.
LG, let's wait to see what's up with this CLA server, maybe it unblocks by tomorrow. If not, we'll figure it out, I've seen this before.
@vblagoje I figured out the problem. For some reason, it also needed me to sign with my other GitHub account.
```
@@ -201,6 +202,18 @@ def __init__(  # pylint: disable=too-many-positional-arguments
        self.async_client = AsyncAzureOpenAI(
            http_client=init_http_client(self.http_client_kwargs, async_client=True), **client_args
        )
        self._is_warmed_up = False

    def warm_up(self):
```
Ah @HamidOna do we need this one, since its parent already implements warm_up?
Yes, Azure needs both the flag initialization and the warm_up() method because:

Azure does not call super().__init__() (see line 73 and the comment on lines 152-53). It explicitly skips the parent's initialization because it only needs to instantiate the Azure-specific client.

Since Azure's __init__() doesn't call the parent's __init__():
- The self._is_warmed_up = False flag from OpenAI's __init__() is never set
- Without this flag, calling any inherited warm_up() would fail with AttributeError, making it a requirement for Azure

The warm_up() method implementation is identical to OpenAI's, so technically we could remove Azure's warm_up() method and let it inherit from OpenAI (since the flag is now initialized). However, it makes sense to keep both for explicitness. Let me know if you'd prefer I remove the warm_up() method from Azure and just keep the flag initialization!
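The inheritance point made here can be demonstrated with a minimal sketch. These are placeholder classes, not the real OpenAIChatGenerator or AzureOpenAIChatGenerator.

```python
# Minimal sketch of the inheritance issue; placeholder classes only.

class OpenAISketch:
    def __init__(self):
        self._is_warmed_up = False

    def warm_up(self):
        # Reads the flag set in __init__; raises AttributeError if missing.
        if self._is_warmed_up:
            return
        self._is_warmed_up = True


class BrokenAzureSketch(OpenAISketch):
    def __init__(self):
        # Skips super().__init__() and forgets the flag: the inherited
        # warm_up() will fail with AttributeError.
        pass


class AzureSketch(OpenAISketch):
    def __init__(self):
        # Also skips super().__init__() (mirroring the real Azure
        # generator), so the flag must be initialized here explicitly.
        self._is_warmed_up = False
```

`AzureSketch` inherits `warm_up()` safely because it sets the flag itself; `BrokenAzureSketch` shows why omitting the flag is not an option when the parent's `__init__` is bypassed.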
Hmm, right, this is a tension between DRY and the explicitness you advocate. We could simply add:

    self._is_warmed_up = False

in Azure's __init__() and be done with it. Let me ask my colleague @sjrl for an opinion.
I'd rather reimplement the warm_up method so it's more explicit.
🙏 @HamidOna keep these PRs coming. Great work!
@HamidOna you can, if you want, also add this method to all Chat Generators in https://github.com/deepset-ai/haystack-core-integrations/ Just lmk
Related Issues:
* Follows up on PR deepset-ai#9942 (feat: Add warm_up() method to ChatGenerators)
* Addresses bug discovered during implementation of PR deepset-ai#9942 for issue deepset-ai#9907

Proposed Changes:
The warm_up_tools() utility function was only calling warm_up() on Toolset objects themselves, but not on the individual Tool instances contained within them. This meant tools inside a Toolset were not properly initialized before use. This PR modifies warm_up_tools() to iterate through Toolsets and call warm_up() on each individual tool, in addition to calling warm_up() on the Toolset itself.

Changes:
- Modified warm_up_tools() in haystack/tools/utils.py to iterate through Toolsets when encountered (both as single argument and within lists)
- Added iteration to call warm_up() on each individual Tool inside Toolsets
- Added comprehensive test class TestWarmUpTools with 7 test cases

How did you test it:
- Added 7 comprehensive unit tests in test/tools/test_tools_utils.py:
  * test_warm_up_tools_with_none - handles None input
  * test_warm_up_tools_with_single_tool - single tool in list
  * test_warm_up_tools_with_single_toolset - KEY TEST: verifies both Toolset and individual tools are warmed
  * test_warm_up_tools_with_list_containing_toolset - toolset within list
  * test_warm_up_tools_with_multiple_toolsets - multiple toolsets
  * test_warm_up_tools_with_mixed_tools_and_toolsets - mixed scenarios
  * test_warm_up_tools_idempotency - safe to call multiple times

Notes for the reviewer:
I discovered this bug while implementing PR deepset-ai#9942 (for issue deepset-ai#9907). When a Toolset object is passed to a component's tools parameter, the warm_up_tools() function only calls Toolset.warm_up(), which is a no-op. It doesn't iterate through the individual tools inside the Toolset to warm them up. Acknowledged by @vblagoje and @sjrl.

This implementation:
- Modified warm_up_tools() to iterate through Toolsets and call warm_up() on each individual tool
- Added comprehensive tests for Toolset warming behavior
- Verified both the Toolset and its contained tools are warmed up

Checklist:
- [x] I have read the contributors guidelines and the code of conduct
- [x] I have updated the related issue with new insights and changes
- [x] I added unit tests and updated the docstrings
- [x] I've used one of the conventional commit types for my PR title: fix:
- [x] I documented my code
- [x] I ran pre-commit hooks and fixed any issue
Follow-up commits:
* added release note
* refactor: move tool warm-up iteration to Toolset.warm_up()

Addresses architectural feedback: moved iteration logic from warm_up_tools() to the base Toolset.warm_up() for better encapsulation. Subclasses can now override warm_up() to customize initialization without breaking the contract.

- Toolset.warm_up() now iterates and warms tools by default
- warm_up_tools() simplified to delegate to warm_up()
- Updated tests and release notes

Co-authored-by: HamidOna13 <abdulhamid.onawole@aizatron.com>
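The refactored shape described in the commit message above can be sketched as follows. These are simplified stand-ins for the haystack.tools classes, reduced to the None / Toolset / list handling mentioned in the PR description.

```python
# Sketch of the refactor: iteration lives on Toolset.warm_up(), and
# warm_up_tools() delegates. Simplified stand-ins, not the real code.

class Tool:
    def __init__(self):
        self.warmed = False

    def warm_up(self):
        self.warmed = True


class Toolset:
    def __init__(self, tools):
        self.tools = list(tools)

    def warm_up(self):
        # Iteration now lives on the base class; subclasses may override.
        for tool in self.tools:
            tool.warm_up()


def warm_up_tools(tools):
    # Simplified delegate: accepts None, a Toolset, or a list of tools.
    if tools is None:
        return
    if isinstance(tools, Toolset):
        tools.warm_up()
        return
    for item in tools:
        item.warm_up()
```

With this split, `warm_up_tools()` no longer needs to know how a Toolset initializes its members; that decision belongs to the Toolset (or its subclass), which is the encapsulation argument in the commit message.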
Related Issues
- feat: Add warm_up() method to ChatGenerators for tool initialization #9907

Proposed Changes:
This PR adds the warm_up() method to all ChatGenerator components to properly initialize tools that require warm-up (database connections, model loading, etc.) before pipeline execution.

Components Modified:

Implementation approach:
- Uses the warm_up_tools() utility function
- Adds an _is_warmed_up flag for idempotency

How did you test it?

Notes for the reviewer

Checklist
- I've used one of the conventional commit types for my PR title: feat: