Please make sure that this is a feature request. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:feature_template
System information
TensorFlow version (you are using): 2.3.1
Are you willing to contribute it (Yes/No): Yes, when able and available
Describe the feature and the current behavior/state.
TensorFlow provides the tf.GradientTape API for automatic differentiation. To differentiate automatically, TensorFlow needs to remember what operations happen, and in what order, during the forward pass. Then, during the backward pass, TensorFlow traverses this list of operations in reverse order to compute gradients.
Details about this feature can be found in the official TensorFlow documentation for Gradient Tapes.
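To make the record-then-reverse mechanism concrete, here is a minimal, hypothetical sketch of a gradient tape in plain Python. The names `Tape` and `Var` are illustrative only and are not part of any TensorFlow API; the point is that each forward operation appends a backward function, and gradients are computed by replaying those functions in reverse order (reverse-mode autodiff).

```python
class Var:
    """A scalar value with an accumulated gradient (illustrative only)."""
    def __init__(self, value):
        self.value = value
        self.grad = 0.0

class Tape:
    """Records backward functions in forward order, replays them in reverse."""
    def __init__(self):
        self.ops = []  # backward functions, in the order ops were executed

    def add(self, x, y):
        out = Var(x.value + y.value)
        def backward():  # d(out)/dx = 1, d(out)/dy = 1
            x.grad += out.grad
            y.grad += out.grad
        self.ops.append(backward)
        return out

    def mul(self, x, y):
        out = Var(x.value * y.value)
        def backward():  # d(out)/dx = y, d(out)/dy = x
            x.grad += y.value * out.grad
            y.grad += x.value * out.grad
        self.ops.append(backward)
        return out

    def gradient(self, target):
        target.grad = 1.0
        for backward in reversed(self.ops):  # traverse in reverse order
            backward()

tape = Tape()
x = Var(3.0)
y = tape.add(tape.mul(x, x), x)  # y = x*x + x
tape.gradient(y)
print(x.grad)  # dy/dx = 2x + 1 = 7.0
```

The real tf.GradientTape does the same bookkeeping for tensors and TensorFlow ops; the request here is to expose equivalent eager-mode recording from Java.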
Will this change the current API? How?
Yes, I think it will add a new feature to the tensorflow-core module.
Who will benefit from this feature?
Anyone who requires very low-level control over the training and evaluation of a deep learning model, as well as everyone already familiar with TF/Keras.
Any other info. The tf.GradientTape API is needed when writing a training loop from scratch, as described here.
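A from-scratch training loop has the same shape regardless of framework: forward pass, gradient computation, parameter update. The sketch below shows that shape in plain Python on hypothetical toy data (fitting y = w * x to points on y = 2x), with the gradient derived by hand where a tf.GradientTape call would normally go; none of the names here come from the TensorFlow API.

```python
# Toy data on the line y = 2x (hypothetical, for illustration only)
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0    # single trainable parameter
lr = 0.05  # learning rate

for epoch in range(100):
    for x, y in data:
        pred = w * x                 # forward pass
        # loss = (pred - y)**2, so dloss/dw = 2 * (pred - y) * x
        grad = 2.0 * (pred - y) * x  # hand-derived gradient; a tape automates this step
        w -= lr * grad               # parameter update

print(round(w, 3))  # converges toward 2.0
```

The "low-level control" mentioned above is exactly the freedom to write this loop yourself: a tape replaces only the hand-derived gradient line, leaving the rest of the loop in the user's hands.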
It's worth noting that we already have support for gradients when operating in Graph mode, so you can train models in TF-Java today. What we don't currently support is gradients in eager mode, as that is mostly implemented in the Python layer of core TensorFlow. The TensorFlow team is refactoring the gradient support and pushing more of it down into the C++ layer, where we may be able to access it from Java, but that refactor is still in progress and we don't know when it will be done.