Hello.
I noticed that the model always says "this image has a defect" when testing. You can see a commented-out part of the code where the author calculates the confusion matrix.
This is what I get for the category hazelnut:
Note that the 40 means that the model classified 40 samples as defective instead of 'good'.
My args:
I am testing with the latest version of PyTorch (v11). Is there something wrong?
mjack3 changed the title from "Wrong results in the nominal examples?" to "Wrong results in the nominal test examples?" on Jan 10, 2022
mjack3 changed the title from "Wrong results in the nominal test examples?" to "Confusion matrix shows errors classifying all the nominal (non-defective) samples" on Jan 11, 2022
The last parameter of the confusion matrix is the threshold. The lower the threshold, the more samples are classified as positive. The threshold left in the comment is far too low; I set it to 2.0 for my own data and the confusion matrix looks much more reasonable.
However, I am not sure a binary positive/negative report is that useful for this method, since it already gives the sample amap image with localization information. Maybe that is why that code was commented out?
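To illustrate what I mean about the threshold (a minimal sketch, not the repository's actual code; the variable names, the example scores, and the use of scikit-learn are my own assumptions), thresholding per-image anomaly scores before building the confusion matrix could look like this:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Hypothetical per-image anomaly scores and ground-truth labels
# (1 = defective, 0 = good); not taken from the repository.
scores = np.array([0.8, 1.5, 3.2, 2.7, 0.9])
labels = np.array([0, 0, 1, 1, 0])

threshold = 2.0  # with a much lower threshold, every image gets flagged as defective
preds = (scores >= threshold).astype(int)

# Rows are true classes, columns are predicted classes.
print(confusion_matrix(labels, preds))
```

With a threshold well below the typical score range, `preds` is all ones and every nominal sample lands in the false-positive cell, which matches the behaviour reported above.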