[Do not merge] Fix nondeterminism issue wrt sentiment analysis chapter #11133

Closed
wants to merge 5 commits

Conversation


@sidgoyal78 (Contributor) commented Jun 3, 2018

As mentioned in #11057 we get non-deterministic outputs for a couple of examples.

I tried debugging the sentiment analysis model (https://github.com/sidgoyal78/Paddle/blob/a801e7bcb2f4f1e131f6a640ecd84a03d21588ff/test_sa_conv.py) by saving the weights after each iteration and printing the inputs and outputs of individual CUDA kernels. Here are some findings I wanted to share:

  • The sparse SGD kernel uses CUDA atomicAdd, so the floating-point values are accumulated in a nondeterministic order.
  • The im2colOCF kernel (used by the sequence_convolution layer) is also implemented with atomicAdd, which is most likely the cause of the inconsistent results.

To get consistent results, I had to change the launch configuration of both kernels to use only one thread.

So, in summary, the issue is with CUDA atomics. Greg pointed this out in a recent discussion, and there is a related comment here: tensorflow/tensorflow#3103 (comment)
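
For reference, here is a minimal standalone sketch (not taken from Paddle) of why atomicAdd on floats is nondeterministic across runs: float addition is not associative, and the order in which threads win the atomic is unspecified, so the rounded sum can differ from run to run.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each thread adds one element into a single accumulator with atomicAdd.
// The accumulation order depends on thread scheduling, and float addition
// is not associative, so the rounded result can vary between runs.
__global__ void atomic_sum(const float* x, int n, float* out) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        atomicAdd(out, x[i]);
    }
}

int main() {
    const int n = 1 << 20;
    float* h = new float[n];
    // A wide spread of magnitudes amplifies order-dependent rounding.
    for (int i = 0; i < n; ++i) h[i] = 1.0f / static_cast<float>(i + 1);

    float *d_x, *d_out;
    cudaMalloc(&d_x, n * sizeof(float));
    cudaMalloc(&d_out, sizeof(float));
    cudaMemcpy(d_x, h, n * sizeof(float), cudaMemcpyHostToDevice);

    for (int run = 0; run < 3; ++run) {
        cudaMemset(d_out, 0, sizeof(float));
        atomic_sum<<<(n + 255) / 256, 256>>>(d_x, n, d_out);
        float result = 0.0f;
        cudaMemcpy(&result, d_out, sizeof(float), cudaMemcpyDeviceToHost);
        std::printf("run %d: %.7f\n", run, result);  // low-order digits may differ
    }

    cudaFree(d_x);
    cudaFree(d_out);
    delete[] h;
    return 0;
}
```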

With these changes, we get consistent results but at the cost of performance.

Moreover, if we switch the optimizer to Adam in the above example, the results are still inconsistent, so there is probably a similar problem in the Adam optimizer as well.

Note: Launching just one thread does not produce correct results, since only part of the actual work gets done (as the failing unit tests show). This PR is only meant to highlight the consistency problem, not to fix it.
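
To make that caveat concrete, here is a hypothetical sketch (not the actual Paddle kernel) of why a <<<1, 1>>> launch is only a diagnostic: in a one-element-per-thread kernel, a single thread touches a single element, so the output is deterministic but incomplete.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One-element-per-thread kernel: each thread writes exactly one output slot.
__global__ void double_elems(const float* x, float* y, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = 2.0f * x[i];
}

int main() {
    const int n = 8;
    float h_x[n] = {1, 2, 3, 4, 5, 6, 7, 8};
    float h_y[n] = {0};
    float *d_x, *d_y;
    cudaMalloc(&d_x, n * sizeof(float));
    cudaMalloc(&d_y, n * sizeof(float));
    cudaMemcpy(d_x, h_x, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemset(d_y, 0, n * sizeof(float));

    // Serialized launch: only thread (0, 0) runs, so only y[0] is written.
    double_elems<<<1, 1>>>(d_x, d_y, n);

    cudaMemcpy(h_y, d_y, n * sizeof(float), cudaMemcpyDeviceToHost);
    for (int i = 0; i < n; ++i) std::printf("%g ", h_y[i]);  // prints "2 0 0 0 0 0 0 0"
    std::printf("\n");

    cudaFree(d_x);
    cudaFree(d_y);
    return 0;
}
```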

@dzhwinter @chengduoZH : What do you guys think?

@sidgoyal78 (Contributor, Author) commented:

Closing PR.

@sidgoyal78 closed this on Sep 6, 2018