
Conversation

@danielenricocahall (Contributor) commented Oct 31, 2025

Support for:

  • max pooling
  • average pooling
  • CTC loss
  • one hot encoding
  • CELU
  • log sigmoid
  • dot product attention

To ensure these tests run in CI, I removed keras/src/ops/nn_test.py from keras/src/backend/openvino/excluded_tests.txt (which previously skipped the whole file for the OpenVINO backend) and expanded keras/src/backend/openvino/excluded_concrete_tests.txt with the still-unsupported tests, so that test omission is now decided per test in a fixture.
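
For readers unfamiliar with that mechanism, here is a minimal, hypothetical sketch of how an exclusion file can drive per-test skipping from a pytest fixture. The file path, parsing, and matching rules below are illustrative assumptions, not the actual Keras conftest code.

```python
# Hypothetical sketch only: not the actual Keras conftest implementation.
# An autouse fixture reads the exclusion file and skips any test whose node id
# matches an excluded entry.
import pathlib

import pytest

EXCLUDED_FILE = pathlib.Path(
    "keras/src/backend/openvino/excluded_concrete_tests.txt"
)


def _load_exclusions():
    if not EXCLUDED_FILE.exists():
        return set()
    return {
        line.strip()
        for line in EXCLUDED_FILE.read_text().splitlines()
        if line.strip() and not line.startswith("#")
    }


@pytest.fixture(autouse=True)
def skip_excluded_openvino_tests(request):
    # Skip before the test body runs if its node id matches an excluded name.
    exclusions = _load_exclusions()
    if any(name in request.node.nodeid for name in exclusions):
        pytest.skip("Excluded for the OpenVINO backend")
```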

@gemini-code-assist (Contributor) commented

Summary of Changes

Hello @danielenricocahall, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly expands the functional capabilities of the Keras OpenVINO backend by introducing support for several key neural network operations. These additions, including new activation functions, pooling layers, and specialized loss functions, enhance the compatibility of Keras models with OpenVINO, allowing for a broader range of models to be deployed and optimized using the OpenVINO toolkit.

Highlights

  • New Activation Function: The CELU (Continuously Differentiable Exponential Linear Unit) activation function has been implemented for the OpenVINO backend, allowing for its use in Keras models targeting OpenVINO.
  • Pooling Operations: Support for both max_pool and average_pool operations has been added, enabling these common neural network layers to be executed via the OpenVINO backend.
  • One-Hot Encoding: The one_hot encoding function is now supported, which is crucial for various categorical data processing tasks within the OpenVINO backend.
  • CTC Loss Implementation: The ctc_loss function, commonly used in sequence-to-sequence models like speech recognition, has been implemented for the OpenVINO backend.
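
As a rough illustration of the CTC piece, here is a hedged sketch of how Keras's ctc_loss signature can be mapped onto OpenVINO's CTCLoss operator. The imports and helper names (get_ov_output, OpenVINOKerasTensor) follow the existing backend's pattern but are assumptions here, as is the exact opset call signature; this is not the PR's verbatim code.

```python
# Sketch only: map keras ctc_loss(target, output, ...) onto OpenVINO CTCLoss.
# Helper names and the opset signature are assumed from the backend's usual
# pattern and may differ from the merged implementation.
import numpy as np
import openvino.runtime.opset14 as ov_opset

from keras.src.backend.openvino.core import OpenVINOKerasTensor, get_ov_output


def ctc_loss(target, output, target_length, output_length, mask_index=0):
    # Convert every input to an OpenVINO node; leaving raw arrays unconverted
    # was one of the issues flagged during review.
    labels = get_ov_output(target)
    logits = get_ov_output(output)
    label_length = get_ov_output(target_length)
    logit_length = get_ov_output(output_length)
    # CTCLoss consumes the raw logits; per the review, applying log_softmax
    # beforehand was incorrect and was removed.
    loss = ov_opset.ctc_loss(
        logits,
        logit_length,
        labels,
        label_length,
        blank_index=ov_opset.constant(mask_index, dtype=np.int32),
    )
    return OpenVINOKerasTensor(loss.output(0))
```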
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

Feature              | Command             | Description
Code Review          | /gemini review      | Performs a code review for the current pull request in its current state.
Pull Request Summary | /gemini summary     | Provides a summary of the current pull request in its current state.
Comment              | @gemini-code-assist | Responds in comments when explicitly tagged, both in pull request comments and review comments.
Help                 | /gemini help        | Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

@gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request introduces support for several new functions in the OpenVINO backend, including celu, max_pool, average_pool, one_hot, and ctc_loss. The implementations are mostly correct, but I've identified a few areas for improvement. Specifically, there are opportunities to enhance code readability in celu, fix a copy-paste error in max_pool, improve the implementation of one_hot for correctness and efficiency, and address a critical bug in ctc_loss related to incorrect input processing. My detailed feedback and suggestions are provided in the comments below.

@codecov-commenter commented Oct 31, 2025

Codecov Report

❌ Patch coverage is 7.97101% with 127 lines in your changes missing coverage. Please review.
✅ Project coverage is 74.64%. Comparing base (4d30a7f) to head (c7b59f0).

Files with missing lines         | Patch % | Lines
keras/src/backend/openvino/nn.py | 7.97%   | 127 Missing ⚠️

❗ There is a different number of reports uploaded between BASE (4d30a7f) and HEAD (c7b59f0). Click for more details.

HEAD has 4 uploads less than BASE
Flag           | BASE (4d30a7f) | HEAD (c7b59f0)
keras          | 5              | 3
keras-openvino | 1              | 0
keras-jax      | 1              | 0
Additional details and impacted files
@@            Coverage Diff             @@
##           master   #21803      +/-   ##
==========================================
- Coverage   82.66%   74.64%   -8.03%     
==========================================
  Files         577      577              
  Lines       59477    59614     +137     
  Branches     9329     9335       +6     
==========================================
- Hits        49167    44498    -4669     
- Misses       7907    12919    +5012     
+ Partials     2403     2197     -206     
Flag             | Coverage Δ
keras            | 74.50% <7.97%> (-7.98%) ⬇️
keras-jax        | ?
keras-numpy      | 57.41% <0.00%> (-0.14%) ⬇️
keras-openvino   | ?
keras-tensorflow | 63.99% <7.97%> (-0.13%) ⬇️
keras-torch      | 63.47% <7.97%> (-0.13%) ⬇️

Flags with carried forward coverage won't be shown. Click here to find out more.

☔ View full report in Codecov by Sentry.

danielenricocahall and others added 3 commits October 30, 2025 22:50
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
@danielenricocahall (Contributor, Author) commented

/gemini review

@gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request adds support for several new neural network functions to the OpenVINO backend, including max_pool, average_pool, ctc_loss, one_hot, and celu. The implementations are generally correct and follow the patterns in the existing backend code. I've provided a few suggestions to improve the code. Specifically, I recommend simplifying the celu implementation by using the native OpenVINO operation, refactoring max_pool and average_pool to reduce code duplication, and adding a check in one_hot to handle cases where dtype is None for increased robustness.

@danielenricocahall (Contributor, Author) commented

/gemini review

@gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request adds support for several new neural network functions to the OpenVINO backend, including pooling operations, celu, one_hot, and ctc_loss. The approach of using a shared _pool helper for both max_pool and average_pool is good for code reuse. However, I've identified a few critical issues that could lead to runtime errors in the _pool and one_hot implementations. I've also suggested a simplification for the celu function to use the native OpenVINO operator. Additionally, the PR description mentions support for log_sigmoid, but it is not implemented in the code; it would be great to add this for completeness.

@danielenricocahall (Contributor, Author) commented

/gemini review

@gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request adds support for several neural network functions to the OpenVINO backend, including max_pool, average_pool, ctc_loss, one_hot, celu, and log_sigmoid. The implementations are generally well-structured, and the refactoring to use a shared _pool function for pooling operations is a good approach. However, I've identified a critical issue in the _pool function that will cause a runtime error for max_pool operations. Additionally, the celu function can be simplified by using a native OpenVINO operation. My review includes suggestions to address these points.

danielenricocahall and others added 2 commits October 31, 2025 09:26
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
@danielenricocahall (Contributor, Author) commented

/gemini review

@gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request adds support for several neural network functions to the OpenVINO backend, including pooling operations, CTC loss, one-hot encoding, and various activation functions like CELU and log-sigmoid. The changes are generally good, with a nice refactoring for pooling operations.

I've identified two main issues:

  1. A critical bug in the pooling implementation where an incorrect keyword argument is used, which will lead to a runtime error.
  2. The celu activation function is implemented manually, while a more efficient native OpenVINO operation is available. Using the native op would improve performance and simplify the code.

Details and code suggestions are provided in the specific comments.
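
To make the kwarg point concrete, here is a hedged sketch of a shared _pool helper dispatching to the opset's max_pool/avg_pool factories, which take kernel_shape rather than kernel in the current Python opset API (as the author notes later in the thread). The layout handling, padding mapping, and exact factory signatures are simplified assumptions, not the PR's exact code.

```python
# Simplified sketch of a shared pooling helper (assumes channels_first data and
# that pool_size/strides are tuples); exact opset signatures may vary by version.
import openvino.runtime.opset14 as ov_opset

from keras.src.backend.openvino.core import OpenVINOKerasTensor, get_ov_output


def _pool(inputs, pool_size, strides, padding, pool_type):
    x = get_ov_output(inputs)
    strides = strides or pool_size  # Keras defaults strides to pool_size
    auto_pad = "valid" if padding == "valid" else "same_upper"
    common_kwargs = dict(
        strides=list(strides),
        pads_begin=[0] * len(pool_size),
        pads_end=[0] * len(pool_size),
        kernel_shape=list(pool_size),  # the opset expects `kernel_shape`, not `kernel`
        auto_pad=auto_pad,
    )
    if pool_type == "max":
        node = ov_opset.max_pool(
            x, dilations=[1] * len(pool_size), **common_kwargs
        )
    else:
        node = ov_opset.avg_pool(x, exclude_pad=True, **common_kwargs)
    return OpenVINOKerasTensor(node.output(0))


def max_pool(inputs, pool_size, strides=None, padding="valid"):
    return _pool(inputs, pool_size, strides, padding, "max")


def average_pool(inputs, pool_size, strides=None, padding="valid"):
    return _pool(inputs, pool_size, strides, padding, "avg")
```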

@danielenricocahall (Contributor, Author) commented

/gemini summary

@gemini-code-assist (Contributor) commented

Summary of Changes

This pull request significantly enhances the Keras OpenVINO backend by integrating several fundamental neural network operations. These additions, including new activation functions, pooling layers, and specialized loss functions, broaden the compatibility of Keras models with OpenVINO, allowing for a wider array of models to be deployed and optimized using the OpenVINO toolkit. The changes aim to reduce NotImplementedError instances and improve the functional parity of the OpenVINO backend.

Highlights

  • New Activation Functions: Implemented support for celu (Continuously Differentiable Exponential Linear Unit) and log_sigmoid activation functions within the OpenVINO backend. The silu function was also updated to use the native OpenVINO swish operator (a sketch of log_sigmoid and silu follows after this list).
  • Pooling Operations: Added support for max_pool and average_pool operations, replacing previous NotImplementedError messages. A new internal helper function, _pool, was introduced to consolidate common logic for these pooling operations.
  • One-Hot Encoding: The one_hot encoding function is now supported, including handling for sparse=True (which raises a ValueError as sparse tensors are not supported) and proper dtype conversion for on_value and off_value.
  • CTC Loss: Implemented the ctc_loss function, enabling its use with the OpenVINO backend. This includes correctly converting all input tensors to OpenVINO format.
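
As an illustration of the activation additions above, the sketch below shows how log_sigmoid can be expressed with existing opset primitives (log_sigmoid(x) = -softplus(-x)) and how silu maps onto the native swish operator. The imports and helper names follow the backend's usual pattern and are assumptions here, not the verbatim diff.

```python
# Sketch only: log_sigmoid via -softplus(-x), silu via the native Swish op.
import openvino.runtime.opset14 as ov_opset

from keras.src.backend.openvino.core import OpenVINOKerasTensor, get_ov_output


def log_sigmoid(x):
    x = get_ov_output(x)
    # log(sigmoid(x)) == -softplus(-x), which is the numerically stable form.
    neg_x = ov_opset.negative(x)
    return OpenVINOKerasTensor(
        ov_opset.negative(ov_opset.softplus(neg_x)).output(0)
    )


def silu(x):
    x = get_ov_output(x)
    # Swish with the default beta of 1 is exactly x * sigmoid(x).
    return OpenVINOKerasTensor(ov_opset.swish(x).output(0))
```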
Changelog
  • keras/src/backend/openvino/nn.py
    • Added import for OPENVINO_DTYPES.
    • Implemented the celu activation function.
    • Updated silu implementation to use ov_opset.swish.
    • Implemented the log_sigmoid activation function.
    • Implemented max_pool functionality.
    • Implemented average_pool functionality.
    • Introduced a new private helper function _pool to abstract common pooling logic.
    • Implemented one_hot encoding, including sparse argument validation and dtype handling (see the sketch after this list).
    • Implemented ctc_loss function, ensuring all inputs are correctly converted to OpenVINO tensors.
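
For the one_hot item above, a hedged sketch of that shape of implementation is shown below. The fallback dtype, the constant construction, and the helper names are illustrative assumptions; in particular the real backend likely derives the default dtype from backend.floatx() and converts the indices to an integer type.

```python
# Sketch only: one_hot via OpenVINO's OneHot, with the sparse/dtype handling
# described above. Default dtype and helper usage are illustrative assumptions.
import numpy as np
import openvino.runtime.opset14 as ov_opset

from keras.src.backend.openvino.core import (
    OPENVINO_DTYPES,
    OpenVINOKerasTensor,
    get_ov_output,
)


def one_hot(x, num_classes, axis=-1, dtype=None, sparse=False):
    if sparse:
        # OpenVINO has no sparse tensor support.
        raise ValueError("Unsupported value `sparse=True` with openvino backend")
    dtype = dtype or "float32"  # fallback when the caller passes dtype=None
    ov_type = OPENVINO_DTYPES[dtype]
    indices = get_ov_output(x)  # assumed to already carry an integer type here
    depth = ov_opset.constant(num_classes, dtype=np.int32)
    on_value = ov_opset.constant(1, dtype=ov_type)
    off_value = ov_opset.constant(0, dtype=ov_type)
    node = ov_opset.one_hot(indices, depth, on_value, off_value, axis)
    return OpenVINOKerasTensor(node.output(0))
```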
Activity
  • An initial summary of changes was provided by gemini-code-assist[bot], highlighting new activation functions, pooling, one-hot encoding, and CTC loss.
  • A Codecov report was posted by codecov-commenter, indicating low patch coverage (6.38%) for keras/src/backend/openvino/nn.py.
  • The author, danielenricocahall, requested /gemini review multiple times throughout the PR's activity.
  • Critical feedback from gemini-code-assist[bot] on ctc_loss regarding incorrect log_softmax application and missing OpenVINO tensor conversions was addressed by the author.
  • Medium priority suggestions from gemini-code-assist[bot] for celu variable naming were incorporated by the author.
  • Medium priority suggestion from gemini-code-assist[bot] to create a _pool helper function for pooling operations was implemented by the author.
  • Critical and high priority feedback from gemini-code-assist[bot] on one_hot regarding sparse=True handling, dtype=None default, and input tensor conversion was addressed by the author.
  • Critical feedback from gemini-code-assist[bot] on the _pool helper function regarding kernel_shape vs kernel and strides=None handling was partially addressed; strides=None is handled, but the author noted that the current OpenVINO API uses kernel_shape for the pooling functions.
  • Repeated suggestions from gemini-code-assist[bot] to use a native ov_opset.celu operator were met with the author's response that opset14 does not contain such a function, leading to the manual implementation being retained.
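
For reference, a manual CELU can be composed from elementwise opset primitives as celu(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1)). The sketch below is a hedged reconstruction under that formula, using assumed backend helpers; it is not the PR's exact code.

```python
# Sketch only: manual CELU from elementwise ops, since opset14 has no native
# Celu operator. Helper names and imports follow the backend's usual pattern.
import numpy as np
import openvino.runtime.opset14 as ov_opset

from keras.src.backend.openvino.core import OpenVINOKerasTensor, get_ov_output


def celu(x, alpha=1.0):
    x = get_ov_output(x)
    # Match constant types to x's element type to avoid dtype mismatches.
    alpha_node = ov_opset.convert_like(
        ov_opset.constant(float(alpha), dtype=np.float32), x
    )
    zero = ov_opset.convert_like(ov_opset.constant(0.0, dtype=np.float32), x)
    one = ov_opset.convert_like(ov_opset.constant(1.0, dtype=np.float32), x)
    # celu(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
    positive_part = ov_opset.maximum(zero, x)
    negative_part = ov_opset.minimum(
        zero,
        ov_opset.multiply(
            alpha_node,
            ov_opset.subtract(
                ov_opset.exp(ov_opset.divide(x, alpha_node)), one
            ),
        ),
    )
    return OpenVINOKerasTensor(
        ov_opset.add(positive_part, negative_part).output(0)
    )
```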

@fchollet (Collaborator) left a comment

Thanks for the PR! Please also enable OpenVINO tests for these operations.

@danielenricocahall (Contributor, Author) commented

Thanks for the PR! Please also enable OpenVINO tests for these operations.

Done! I had to rework the testing structure to enable them, but they're now running in CI.

@danielenricocahall (Contributor, Author) commented

/gemini review

@gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request adds support for several new neural network operations to the OpenVINO backend, including pooling, CTC loss, and various activation functions. The changes are well-structured, and enabling the nn_test.py suite is a great step towards ensuring correctness.

I've found a couple of issues. There's a critical bug in the dot_product_attention implementation related to data type handling that could cause runtime errors. Additionally, the celu function can be simplified by using the native OpenVINO operation. I've also suggested a minor improvement to a test message.

Overall, this is a solid contribution to expanding OpenVINO backend support. Addressing these points will make it even better.
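
For context on the dtype point, below is a hedged sketch of a manual scaled dot-product attention built from matmul/softmax primitives; the merged code may instead use the native ScaledDotProductAttention op, and bias/mask handling is omitted here. The helper names, the simplified signature, and the static-shape assumption are all illustrative, not the PR's exact implementation.

```python
# Sketch only: manual scaled dot-product attention. The detail illustrated is
# dtype handling: the scale constant is converted to the query's element type
# before the multiply, avoiding the mismatch flagged in review.
import numpy as np
import openvino.runtime.opset14 as ov_opset

from keras.src.backend.openvino.core import OpenVINOKerasTensor, get_ov_output


def dot_product_attention(query, key, value, scale=None):
    q = get_ov_output(query)
    k = get_ov_output(key)
    v = get_ov_output(value)
    if scale is None:
        scale = 1.0 / np.sqrt(query.shape[-1])  # assumes a static head dim
    # ConvertLike casts the constant to q's element type (e.g. float16).
    scale_node = ov_opset.convert_like(
        ov_opset.constant(float(scale), dtype=np.float32), q
    )
    scores = ov_opset.matmul(q, k, transpose_a=False, transpose_b=True)
    scores = ov_opset.multiply(scores, scale_node)
    weights = ov_opset.softmax(scores, axis=-1)
    out = ov_opset.matmul(weights, v, transpose_a=False, transpose_b=False)
    return OpenVINOKerasTensor(out.output(0))
```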

danielenricocahall and others added 2 commits November 10, 2025 20:54
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>