feat: Allow using file variables directly in the LLM node and support more file types. #10679
Merged
Conversation
laipz8200 force-pushed the refactor/prompts-convert-in-llm-node branch 3 times, most recently from a567869 to acbb678 on November 18, 2024 07:35.
dosubot (bot) added the size:XXL (this PR changes 1000+ lines, ignoring generated files), ⚙️ feat:model-runtime, and 💪 enhancement (new feature or request) labels on Nov 18, 2024.
iamjoel previously approved these changes on Nov 18, 2024.
laipz8200 force-pushed the refactor/prompts-convert-in-llm-node branch from 4f1dd5c to 2d8a720 on November 18, 2024 10:13.
…nd add validators for None handling
…memory role prefix requirements
- Changed input type from list to Sequence for prompt messages to allow more flexible input types. - Improved compatibility with functions expecting different iterable types.
- Replaced list with Sequence for more flexible content type. - Improved type consistency by importing from collections.abc.
- Simplified app configuration by removing the 'frozen' parameter since it is no longer needed. - Ensures more flexible handling of config attributes.
- Changed the Faker version from caret constraint to tilde constraint for compatibility. - Updated poetry.lock for changes in pyproject.toml content.
- Improved flexibility by using Sequence instead of list, allowing for broader compatibility with different types of sequences. - Helps future-proof the method signature by leveraging the more generic Sequence type.
- Changed 'prompt_messages' parameter from list to Sequence for broader input type compatibility.
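The recurring list-to-`Sequence` changes above can be sketched as follows (the function name is illustrative, not Dify's actual API). Annotating parameters with the abstract `Sequence` type from `collections.abc` lets callers pass lists, tuples, or other sequence types without conversion:

```python
from collections.abc import Sequence


# Hypothetical example of the signature change described above:
# accepting Sequence instead of list broadens the accepted input types
# while keeping the function body unchanged.
def count_prompt_messages(prompt_messages: Sequence[str]) -> int:
    return len(prompt_messages)
```

With `list[str]` in the annotation, a type checker would flag a tuple argument; with `Sequence[str]`, both `count_prompt_messages(["a", "b"])` and `count_prompt_messages(("a", "b"))` type-check and run.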
Updated the log and text properties in segments to return empty strings instead of the segment value. This change prevents potential leakage of sensitive data by ensuring only non-sensitive information is logged or transformed into text. Addresses potential security and privacy concerns.
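A minimal sketch of the pattern described above, using hypothetical class names rather than Dify's actual segment types: a sensitive segment overrides its `log` and `text` properties to return empty strings so its value never reaches logs or rendered output.

```python
class Segment:
    """Base segment: log/text expose the underlying value."""

    def __init__(self, value):
        self.value = value

    @property
    def log(self) -> str:
        return str(self.value)

    @property
    def text(self) -> str:
        return str(self.value)


class SecretSegment(Segment):
    """Sensitive segment: never leak the value into logs or text."""

    @property
    def log(self) -> str:
        return ""

    @property
    def text(self) -> str:
        return ""
```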
Replaced redundant variables in test setup to streamline and align usage of fake data, enhancing readability and maintainability. Adjusted image URL variables to utilize consistent references, ensuring uniformity across test configurations. Also, corrected context variable naming for clarity. No functional impact, purely a refactor for code clarity.
Refactored LLM node tests to enhance clarity and maintainability by creating test scenarios for different file input combinations. This restructuring replaces repetitive code with a more concise approach, improving test coverage and readability. No functional code changes were made. References: #123, #456
Refactor test scenarios in LLMNode unit tests by introducing a new `LLMNodeTestScenario` class to enhance readability and consistency. This change simplifies the test case management by encapsulating scenario data and reduces redundancy in specifying test configurations. Improves test clarity and maintainability by using a structured approach.
Ensure that messages are only created from non-empty text segments, preventing potential issues with empty content. Also adds a test scenario for prompt templates containing file variables, particularly images, to improve reliability and coverage, and updates `LLMNodeTestScenario` to use `Sequence` and `Mapping` for more flexible configurations. Closes #123, relates to #456.
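The non-empty-segment guard might look like the following sketch (function and message shape are illustrative assumptions, not the node's real implementation): segments that are empty or whitespace-only simply produce no message.

```python
# Hypothetical sketch: build prompt messages only from segments whose
# text is non-empty, so blank content never becomes a message.
def build_messages(segments: list[str]) -> list[dict]:
    return [
        {"role": "user", "content": text}
        for text in segments
        if text.strip()
    ]
```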
Updated image processing logic to check for model support of vision features, preventing errors when handling images with models that do not support them. Added a test scenario to validate behavior when vision features are absent. This ensures robust image handling and avoids unexpected behavior during image-related prompts.
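The vision-feature guard can be sketched like this (the enum and filter function are illustrative stand-ins for the real model-feature plumbing): image content is dropped before the request when the model does not advertise vision support, rather than letting the provider call fail.

```python
from enum import Enum


class ModelFeature(Enum):
    VISION = "vision"


# Illustrative sketch: keep image content only when the target model
# declares vision support; otherwise filter it out up front.
def filter_contents(contents: list[dict], features: set) -> list[dict]:
    if ModelFeature.VISION in features:
        return contents
    return [c for c in contents if c.get("type") != "image"]
```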
Adds the workflow run object to the database session to guarantee it is persisted prior to refreshing its state. This change resolves potential issues with data consistency and integrity when the workflow run is accessed after operations. References issue #123 for more context.
Expanded the system to handle document types across different modules and introduced video and audio content handling in model features. Adjusted the prompt message logic to conditionally process content based on available features, enhancing flexibility in media processing. Added comprehensive error handling in `LLMNode` for better runtime resilience. Updated YAML configuration and unit tests to reflect these changes.
Added a check to ensure that files have an extension before processing to avoid potential errors. Updated unit tests to reflect this requirement by including extensions in test data. This prevents exceptions from being raised due to missing file extension information.
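The extension check might be implemented along these lines (a sketch with an assumed helper name, not the actual Dify code): fail fast with a clear error when a file has no extension, instead of raising later in format detection.

```python
import os


# Illustrative sketch: reject extension-less filenames up front so
# downstream format handling never sees a file it cannot classify.
def ensure_extension(filename: str) -> str:
    ext = os.path.splitext(filename)[1]
    if not ext:
        raise ValueError(f"file {filename!r} has no extension")
    return ext
```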
Extended the `ConfigPromptItem` component to support file variables by including the `isSupportFileVar` prop. Updated `useConfig` hooks to accept `arrayFile` variable types for both input and memory prompt filtering. This enhancement allows handling of file data types seamlessly, improving flexibility in configuring prompts.
Removed the `_render_basic_message` function and integrated its logic directly into the `LLMNode` class. This reduces redundancy and simplifies the handling of message templates by utilizing `convert_template` more directly. This change enhances code readability and maintainability.
Moved prompt handling functions out of the `LLMNode` class to improve modularity and separation of concerns. This refactor allows better reuse and testing of prompt-related functions. Adjusted existing logic to fetch queries and handle context and memory configurations more effectively. Updated tests to align with the new structure and ensure continued functionality.
Introduce `filterJinjia2InputVar` to enhance variable filtering, specifically excluding `arrayFile` types from Jinja2 input variables. This adjustment improves the management of variable types, aligning with expected input capacities and ensuring more reliable configurations. Additionally, support for file variables is enabled in relevant components, broadening functionality and user options.
…sion management Replaces direct database operations with SQLAlchemy Session context to manage workflow_run more securely and effectively.
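The session-scoped pattern described above can be sketched with SQLAlchemy's context-manager API (the `WorkflowRun` model here is a minimal hypothetical stand-in, not Dify's real model): adding the object to the session before `refresh()` guarantees it is persisted, and the `with` block scopes the unit of work.

```python
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


# Hypothetical stand-in for the real workflow_run model.
class WorkflowRun(Base):
    __tablename__ = "workflow_runs"
    id = Column(Integer, primary_key=True)


engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)


def persist_and_refresh(run: WorkflowRun) -> int:
    # The Session context manager closes the connection even on error;
    # add() persists the object so refresh() can re-read its state.
    with Session(engine) as session:
        session.add(run)
        session.commit()
        session.refresh(run)
        return run.id
```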
Introduces a new DocumentPromptMessageContent class to extend the variety of supported prompt message content types. This enhancement allows encoding document data with specific formats and handling them as part of prompt messages, improving versatility in content manipulation.
Introduces support for document files in prompt message content conversion. Refactors encoding logic by unifying base64 encoding, simplifying and removing redundancy. Improves flexibility and maintainability of file handling in preparation for expanded multimedia support.
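The unified base64 encoding described here might reduce to a single helper like the sketch below (name and data-URL shape are assumptions for illustration): one function turns raw file bytes into an encoded string, whether the file is an image or a document.

```python
import base64


# Illustrative sketch: one shared encoder for every file type, replacing
# per-type duplicated base64 logic.
def encode_file_base64(data: bytes, mime_type: str) -> str:
    encoded = base64.b64encode(data).decode("ascii")
    return f"data:{mime_type};base64,{encoded}"
```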
Extends file type handling to include documents in message processing. This enhances the application's ability to process a wider range of files.
Introduces support for handling document content, specifically PDFs within prompt messages, enhancing model capabilities with a new feature. Allows dynamic configuration of headers based on document presence in prompts, improving flexibility for user interactions.
Removes the exception message content duplication in the logger to prevent unnecessary redundancy since the exception details are already captured by logger.exception.
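For context, `logger.exception` already records the active exception and its traceback, so interpolating the exception into the message duplicates it. A minimal sketch of the corrected pattern (the function is illustrative):

```python
import logging

logger = logging.getLogger(__name__)


def risky():
    try:
        1 / 0
    except ZeroDivisionError:
        # logger.exception appends the traceback and exception text
        # automatically, so the message need not repeat the exception.
        logger.exception("division failed")
```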
laipz8200 force-pushed the refactor/prompts-convert-in-llm-node branch from 93f5b64 to 0619e9a on November 22, 2024 07:30.
laipz8200 changed the title from "Refactor/prompts-convert-in-llm-node" to "feat: Allow using file variables directly in the LLM node and support more file types." on Nov 22, 2024.
crazywoola approved these changes on Nov 22, 2024.
Labels
💪 enhancement (new feature or request), ⚙️ feat:model-runtime, lgtm (this PR has been approved by a maintainer), size:XXL (this PR changes 1000+ lines, ignoring generated files)
Summary
Close #10681
Fixes #10179
Screenshots
Checklist
Important
Please review the checklist below before submitting your pull request.
Run `dev/reformat` (backend) and `cd web && npx lint-staged` (frontend) to appease the lint gods.