
Fix bug in recurrent branch #6

Open · wants to merge 26 commits into master
Conversation

cvondrick

longjon and others added 26 commits on March 13, 2015 at 13:12
With layers whose backward passes accumulate gradients, this effectively
decouples the computational batch from the SGD minibatch: each iteration
accumulates gradients over iter_size batches, and only then are the
parameters updated.
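The scheme above can be sketched as plain host-side solver logic. This is a minimal illustration, not Caffe's actual solver API; the function and variable names (`sgd_step_with_accumulation`, `diff`, `batch_grads`) are hypothetical.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical sketch: accumulate gradients over iter_size computational
// batches, then apply one SGD update for the effective (larger) minibatch.
void sgd_step_with_accumulation(std::vector<float>& w,
                                std::vector<float>& diff,  // accumulated gradient
                                const std::vector<std::vector<float>>& batch_grads,
                                float lr, int iter_size) {
  std::fill(diff.begin(), diff.end(), 0.0f);   // zero accumulated gradients
  for (int i = 0; i < iter_size; ++i)          // one computational batch each
    for (size_t j = 0; j < w.size(); ++j)
      diff[j] += batch_grads[i][j];            // backward ACCUMULATES into diff
  for (size_t j = 0; j < w.size(); ++j)        // single parameter update for the
    w[j] -= lr * diff[j] / iter_size;          // iter_size-times-larger minibatch
}
```

Normalizing by `iter_size` makes the update equivalent in scale to running one batch that is `iter_size` times larger, which is the point of decoupling the computational batch from the SGD minibatch.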
(The double-precision implementation is taken from the NVIDIA developer
docs; the float implementation is built into CUDA as "atomicAdd".)
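For reference, the double-precision implementation the commit refers to is the compare-and-swap loop given in the NVIDIA CUDA C Programming Guide, reproduced here as a sketch:

```cuda
// Double-precision atomicAdd built on atomicCAS, per the NVIDIA CUDA C
// Programming Guide, for devices that lack a native double atomicAdd
// (compute capability < 6.0; sm_60+ provides this intrinsic natively).
__device__ double atomicAdd(double* address, double val) {
  unsigned long long int* address_as_ull = (unsigned long long int*)address;
  unsigned long long int old = *address_as_ull, assumed;
  do {
    assumed = old;
    // Reinterpret the bits as double, add, and CAS the result back;
    // retry if another thread modified *address in the meantime.
    old = atomicCAS(address_as_ull, assumed,
                    __double_as_longlong(val + __longlong_as_double(assumed)));
  } while (assumed != old);
  return __longlong_as_double(old);  // old value, matching atomicAdd's contract
}
```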
@dxj19831029

I think your target should be the recurrent branch?

5 participants