Layers phase 1 #318
Conversation
Sync with master tensorflow on upstream
Merge main branch to local branch
Update after losses merge
Fix Javadoc errors (tensorflow#152)
pull type def
Metrics Phase 1 (tensorflow#180)
Pull latest tensorflow master
Merge with latest
Resync with origin/master
Sync with tensorflow/java master
Moved tf.raw.nn Ops to tf.nn. Changed generation to generate SoftmaxCrossEntropyWithLogits and SparseSoftmaxCrossEntropyWithLogits to core NNOps (tf.nn).
Added NnOps and SetOps as groups. Fixed MetricsHelper and Losses to use the new FrameworkOps. Moved SetsOps to framework.op.
Sync with Metrics Phase 2
Sync with master
Sync with Regularizers
…Logits and sparseSoftmaxCrossEntropyWithLogits
…l on the AssertThats. This change is unrelated to this PR, but the bug showed up here.
@Craigacp FYI: I had to change
Add softmax test, fixed bugs in framework softmax
…dded NNhelper for the transpose code, change softmax to use the transpose code and added logSoftmax to do the same thing. Added test cases for logSoftmax.
@karllessard I am replacing this branch with a new branch based on the latest version of master. This branch is out of sync with master and won't successfully rebase. Therefore, I am creating a new branch and will be replacing this PR with a new one.
This PR is being closed because the code became too stale with respect to master. A new PR is being created to replace this one.
This is the first set of model layers: fairly simple ones that do not involve back propagation.
I have also revamped `TestSession`, `GraphTestSession`, and `EagerTestSession` to use lambdas for operations common to the Graph and Eager modes. This will need to be updated with #272, "Tensor.dataToString", when that is merged. This PR supersedes #65, "Adding layers (based on keras) supporting multiple outputs".
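A minimal sketch of the lambda idea, with hypothetical names (this is not the PR's actual `TestSession` code): the session stores the mode-specific comparison as a lambda, so the shared assertion logic is written once and each mode supplies only the part that differs.

```java
import java.util.function.BiPredicate;

// Hypothetical sketch: a test session that delegates the mode-specific
// comparison step to a lambda supplied at construction time.
public class TestSessionSketch {
    private final BiPredicate<Float, Float> closeEnough;

    public TestSessionSketch(BiPredicate<Float, Float> closeEnough) {
        this.closeEnough = closeEnough;
    }

    // Shared assertion logic, identical for Graph and Eager modes.
    public void evaluate(float expected, float actual) {
        if (!closeEnough.test(expected, actual)) {
            throw new AssertionError("expected " + expected + " but got " + actual);
        }
    }

    public static void main(String[] args) {
        // An eager-style session compares concrete values with a tolerance;
        // a graph-style session would pass a lambda that runs the graph first.
        TestSessionSketch eager = new TestSessionSketch((e, a) -> Math.abs(e - a) < 1e-6f);
        eager.evaluate(1.0f, 1.0f); // passes
        System.out.println("ok");   // prints ok
    }
}
```

The design choice is that subclasses (or call sites) no longer duplicate the evaluate/assert plumbing; only the lambda varies between modes.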
This PR is dependent on PRs #255, "Framework Ops", and #191, "TensorFormat enum". I temporarily created a copy of TensorFormat until #191 is merged.
The layers framework supports multiple inputs and outputs, but some layers (e.g. `Dense`) are restricted to a single input/output by design.
`SequentialLayersTest` tests chaining layers together, as in the Sequential model.
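The chaining being tested can be sketched in plain Java (illustrative names only, not the PR's actual layer API): each single-input/single-output layer is a function whose output feeds the next layer's input.

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.UnaryOperator;

// Hypothetical sketch of sequential chaining: fold the input through a list
// of layers, where each layer is modeled as a function on a tensor-like array.
public class SequentialSketch {
    public static double[] applyChain(List<UnaryOperator<double[]>> layers, double[] input) {
        double[] out = input;
        for (UnaryOperator<double[]> layer : layers) {
            out = layer.apply(out); // output of one layer is input to the next
        }
        return out;
    }

    public static void main(String[] args) {
        // Two toy "layers": scale by 2, then add 1, elementwise.
        UnaryOperator<double[]> scale = x -> {
            double[] r = x.clone();
            for (int i = 0; i < r.length; i++) r[i] *= 2;
            return r;
        };
        UnaryOperator<double[]> bias = x -> {
            double[] r = x.clone();
            for (int i = 0; i < r.length; i++) r[i] += 1;
            return r;
        };
        double[] result = applyChain(List.of(scale, bias), new double[]{1.0, 2.0});
        System.out.println(Arrays.toString(result)); // prints [3.0, 5.0]
    }
}
```

A multi-input/multi-output layer would not fit this single-chain shape, which is why the PR restricts some layers (such as Dense) to one input and one output.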