First. Thank you very much for this implementation. Great work!
I'm hitting a problem immediately in training: it looks like invalid label values are being fed into the cross-entropy loss.
Any ideas?...
```
Receptive Field: 2048 samples
pad value: 0
Start new training....
/home/rig/speech/fftnet/FFTNet/utils/__init__.py:67: RuntimeWarning: invalid value encountered in log1p
  return np.log1p(x) if isnumpy or isscalar else tf.log1p(x)
Traceback (most recent call last):
  File "/home/rig/.conda/envs/fftn/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1322, in _do_call
    return fn(*args)
  File "/home/rig/.conda/envs/fftn/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1307, in _run_fn
    options, feed_dict, fetch_list, target_list, run_metadata)
  File "/home/rig/.conda/envs/fftn/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1409, in _call_tf_sessionrun
    run_metadata)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Received a label value of -2147483648 which is outside the valid range of [0, 256). Label values:
328 327 327 326 326 327 326 325 326 327 327 328 331 333 333 334 335 335 336 340 343 344 347 350 352 354 360
-2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648
-2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648
-2147483648 -2147483648 ...
-2147483648 -2147483648 -2147483648 363 361 362 361 362 364 363 363 362 360 362 361 360 363 364 364 363 361
361 358 356 356 356 356 358 358 361 364 364 -2147483648 -2147483648 365 -2147483648 -2147483648 -2147483648
-2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648
-2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648
-2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648 -2147483648
  [[Node: model/loss/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits = SparseSoftmaxCrossEntropyWithLogits[T=DT_FLOAT, Tlabels=DT_INT32, _device="/job:localhost/replica:0/task:0/device:CPU:0"](model/loss/SparseSoftmaxCrossEntropyWithLogits/Reshape, model/loss/SparseSoftmaxCrossEntropyWithLogits/Reshape_1)]]
```
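For what it's worth, the `RuntimeWarning` from `np.log1p` plus the `-2147483648` labels suggest a plausible cause: `log1p(x)` returns NaN for `x < -1`, and casting NaN to `int32` yields the platform minimum (`-2147483648`) on most systems, which is exactly the invalid label value in the traceback. This is a hypothetical sketch of a mu-law encoder (not the repo's actual code) that clips the input first so the labels stay in `[0, 256)`:

```python
import numpy as np

def mulaw_encode(x, mu=255):
    """Quantize audio in [-1, 1] to mu-law labels in [0, mu].

    Hypothetical helper: inputs outside [-1, 1] would make log1p
    produce NaN (for x < -1), and NaN cast to int32 becomes
    -2147483648 on most platforms -- matching the bad labels above.
    """
    x = np.clip(x, -1.0, 1.0)  # keep log1p well-defined
    y = np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)
    # map [-1, 1] -> [0, mu] and round to integer class labels
    return ((y + 1) / 2 * mu + 0.5).astype(np.int32)
```

Clipping (or asserting the input range during preprocessing) keeps every label valid regardless of stray out-of-range samples in the source audio.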
I have updated the repo; can you test it again?