Different results in caffe-android-lib and caffe-lib #28
Comments
@Syndrome777 Did you use Eigen to build Caffe on your computer? Also, dropout is not applied during the test phase.
@sh1r0 Thank you for your response. No, I use Caffe's default BLAS (OpenBLAS/CUDA). Is Eigen the reason for the different results? This difference seems unacceptable to me.
Oh, it could be related to the issue mentioned in BVLC/caffe#2619 (comment). I think my wrappers in mkl_alternate.hpp and math_functions.cpp might be buggy somewhere, sorry for that. If you want more precise results, you can refer to this to get a pre-built OpenBLAS, which is much slower than Eigen, as a workaround.
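As a side note, here is a minimal NumPy sketch (added for illustration only; the dimensions, weights, and perturbation size are all made up and nothing below is Caffe code) showing why small per-element differences between BLAS backends do not cancel out: a tiny relative perturbation after each GEMM carries through the layers into the final softmax probabilities.

    import numpy as np

    def softmax(z):
        # Numerically stable softmax.
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()

    rng = np.random.default_rng(0)
    dim, classes, layers = 1024, 4, 3

    # Hypothetical input and weights for a small stack of fully connected layers.
    x = rng.standard_normal(dim).astype(np.float32)
    Ws = [0.05 * rng.standard_normal((dim, dim)).astype(np.float32) for _ in range(layers)]
    W_out = 0.05 * rng.standard_normal((dim, classes)).astype(np.float32)

    def forward(x, rel_err=0.0, seed=1):
        noise = np.random.default_rng(seed)
        h = x
        for W in Ws:
            h = np.maximum(h @ W, 0.0)  # GEMM + ReLU
            # Model backend rounding/implementation differences as a small
            # relative perturbation applied after each GEMM.
            h = h * (1.0 + rel_err * noise.standard_normal(h.shape).astype(np.float32))
        return softmax(h @ W_out)

    p_ref = forward(x)                # one backend
    p_alt = forward(x, rel_err=1e-3)  # same math with ~0.1% relative rounding noise
    print("max abs diff in softmax output:", float(np.abs(p_ref - p_alt).max()))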
@Syndrome777 I tested with:

    ./build/examples/cpp_classification/classification.bin \
      models/bvlc_reference_caffenet/deploy.prototxt \
      models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel \
      data/ilsvrc12/imagenet_mean.binaryproto \
      data/ilsvrc12/synset_words.txt \
      examples/images/cat.jpg

and the results show no difference between the two builds. How did you test on your desktop?
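For anyone comparing the two builds the same way, a small helper like the following could diff the printed confidences instead of eyeballing them. This is only a sketch: it assumes the output of classification.bin was redirected to a text file for each build, and that each prediction line starts with the confidence value in the example's usual `score - "label"` format; the file arguments are whatever you saved.

    import re
    import sys

    def scores(path):
        # Pull the leading confidence value from each prediction line,
        # e.g. lines of the form: 0.3134 - "n02123045 tabby, tabby cat"
        vals = []
        with open(path) as f:
            for line in f:
                m = re.match(r"\s*([0-9]+\.[0-9]+)", line)
                if m:
                    vals.append(float(m.group(1)))
        return vals

    desktop = scores(sys.argv[1])   # e.g. output captured from the desktop build
    android = scores(sys.argv[2])   # e.g. output captured from the Android build
    diffs = [abs(a - b) for a, b in zip(desktop, android)]
    print("per-prediction differences:", diffs)
    print("max difference:", max(diffs) if diffs else 0.0)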
@Syndrome777 Finally, OpenBLAS is fully (?) supported; please refer to the latest master branch (3ac46b0).
Hi @sh1r0, thank you for your help.
@sh1r0 Hello, could you tell me how you got this result? Was it run on an Android phone, and is there a script for it?

    ---------- Prediction for /sdcard/caffe_mobile/cat.jpg ----------

Thank you!
Hi all,
I'm sorry to report that I get different results when using caffe-android-lib.
My task is classification. The softmax output on my Android device is:

    Label 0,    Label 1,    Label 2,    Label 3
    0.00697519, 0.0600692,  0.904507,   0.0284484

But the softmax output from caffe-lib on my computer is:

    Label 0,    Label 1,    Label 2,    Label 3
    0.008724,   0.04197951, 0.93302566, 0.01627085

Both use the same model and the same test image.
I'm also not sure whether the dropout layer should be applied during the test phase.
Is this a bug? Does anyone else have the same problem?
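For reference, a minimal NumPy sketch (added here for illustration) that quantifies the gap between the two outputs posted above:

    import numpy as np

    # Softmax outputs posted above: Android (caffe-android-lib) vs. desktop (caffe-lib).
    android = np.array([0.00697519, 0.0600692, 0.904507, 0.0284484])
    desktop = np.array([0.008724, 0.04197951, 0.93302566, 0.01627085])

    print("max absolute difference:", np.abs(android - desktop).max())  # ~0.029, on Label 2
    print("top-1 label (android):", int(android.argmax()))              # 2
    print("top-1 label (desktop):", int(desktop.argmax()))              # 2

Both builds agree on the top-1 label for this image; the individual probabilities differ by up to about 0.03, which is larger than plain float32 rounding noise and matches the BLAS-backend differences discussed above.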