Layers phase 1 #318

Closed · wants to merge 49 commits
Conversation

JimClarke5 (Contributor)

This is the first set of model layers: fairly simple ones, without back propagation.

I have also revamped TestSession, GraphTestSession, and EagerTestSession to use lambdas for operations common to the Graph and Eager modes. This will need to be updated with #272, "Tensor.dataToString", when that is merged.
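As a rough illustration of the lambda approach (a minimal sketch; `TestSession.createTestSession`, `getTF`, and `evaluate` follow the revamped test utilities described above, but the exact signatures here are assumptions):

```java
// Minimal sketch, assuming the revamped TestSession API: the test body is
// written once, and the mode-specific execution (Graph fetches vs. Eager
// results) is supplied behind lambdas inside evaluate().
try (TestSession session = TestSession.createTestSession(TestSession.Mode.GRAPH)) {
  Ops tf = session.getTF();
  session.evaluate(3.0f, tf.math.add(tf.constant(1.0f), tf.constant(2.0f)));
}
```

Switching `Mode.GRAPH` to `Mode.EAGER` would rerun the same body without changes.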

This PR supersedes #65, "Adding layers (based on keras) supporting multiple outputs".

This PR depends on PRs #255, "Framework Ops", and #191, "TensorFormat enum". I have temporarily created a copy of TensorFormat until #191 is merged.

The layers framework supports multiple inputs and outputs, but some layers (e.g. Dense) are restricted to a single input and output by design.
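For example, the call shapes differ roughly as follows (a hypothetical sketch; the actual `Layer.call` signatures in this PR may differ):

```java
// Hypothetical call shapes: a general layer maps a list of inputs to a list
// of outputs, while a Dense layer is deliberately one-in/one-out.
List<Operand<TFloat32>> outputs = concat.call(Arrays.asList(inputA, inputB), TFloat32.class);
Operand<TFloat32> output = dense.call(input, TFloat32.class);
```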

SequentialLayersTest tests chaining layers together, as in the Sequential model.
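In that style, chaining looks roughly like this (names and constructor arguments are illustrative only, not the exact API in this PR):

```java
// Chaining layers as a Sequential model would: each layer's output feeds the
// next layer's input.
Dense<TFloat32> hidden = new Dense<>(tf, 64, TFloat32.class);
Dense<TFloat32> logits = new Dense<>(tf, 10, TFloat32.class);
Operand<TFloat32> result = logits.call(hidden.call(input, TFloat32.class), TFloat32.class);
```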

Sync with master tensorflow on upstream
Merge main branch to local branch
Update after losses merge
Pull latest tensorflow master
Resync with origin/master
Sync with tensorflow/java master
Moved tf.raw.nn Ops to tf.nn.
Changed generation to generate SoftmaxCrossEntropyWithLogits and SparseSoftmaxCrossEntropyWithLogits to core NNOps (tf.nn).
Added NnOps and SetOps as groups.
Fixed MetricsHelper and Losses to use the new FrameworkOps.
Moved SetsOps to framework.op.
Sync with Metrics Phase 2
@JimClarke5 (Contributor, Author)

@Craigacp FYI: I had to change tf.nn.raw.softmaxCrossEntropyWithLogits to tf.nn.softmaxCrossEntropyWithLogits in GradientDescentTest to match the new Framework Ops. tf only contains the raw ops; the higher-level ops are in org.tensorflow.framework.op.
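Concretely, the split looks roughly like this (a hedged sketch; `FrameworkOps.create` and the exact framework method signature are assumptions based on this PR's description):

```java
import org.tensorflow.framework.op.FrameworkOps;
import org.tensorflow.op.Ops;

// Core Ops keeps only the raw kernel; the higher-level wrapper lives in the
// framework package.
Ops tf = Ops.create(graph);
FrameworkOps fops = FrameworkOps.create(tf);
Operand<TFloat32> loss = fops.nn.softmaxCrossEntropyWithLogits(labels, logits, -1);
```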

Add softmax test, fixed bugs in framework softmax, added NNhelper for the transpose code

Changed softmax to use the transpose code and added logSoftmax to do the same thing. Added test cases for logSoftmax.
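The transpose trick amounts to roughly the following (a sketch of the usual approach, not the exact NNhelper code; `swapToLast` is a hypothetical helper that builds the permutation):

```java
// Softmax along an arbitrary axis: move that axis to the last position, apply
// the last-axis kernel, then move it back. Swapping two axes is its own
// inverse, so the same permutation restores the original layout.
Operand<TInt32> perm = swapToLast(tf, rank, axis);           // hypothetical helper
Operand<TFloat32> moved = tf.linalg.transpose(logits, perm);
Operand<TFloat32> soft = tf.nn.softmax(moved);
Operand<TFloat32> result = tf.linalg.transpose(soft, perm);
```

logSoftmax follows the same pattern with `tf.nn.logSoftmax` as the kernel.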
@JimClarke5 (Contributor, Author)

@karllessard I am replacing this branch with a new branch based on the latest version of master. This branch is out of sync with master and won't successfully rebase, so I am creating a new branch and will be replacing this PR with a new one.

@JimClarke5 (Contributor, Author)

This PR is being closed because the code became too stale with respect to master. A new PR is being created to replace this one.

@JimClarke5 JimClarke5 closed this Jun 10, 2021
@JimClarke5 JimClarke5 deleted the layers_phase_1 branch June 15, 2021 23:30