
Confusion matrix shows errors classifying all the nominal (non-defective) samples #36

Open
mjack3 opened this issue Jan 10, 2022 · 1 comment


@mjack3

mjack3 commented Jan 10, 2022

Hello.

I noticed that the model always says "this image has a defect" when testing. You can see a commented-out part of the code where the author calculates the confusion matrix.
This is what I get for the category hazelnut:

Total pixel-level auc-roc score :
0.8770563697278964
Total image-level auc-roc score :
0.7642857142857142
test_epoch_end
[[ 0 40]
 [ 0 70]]
false positive
['005', '001', '025', '029', '036', '002', '033', '018', '004', '015', '024', '035', '010', '021', '039', '019', '022', '031', '032', '003', '012', '020', '014', '006', '023', '000', '009', '008', '037', '007', '038', '034', '030', '017', '013', '011', '027', '028', '016', '026']

Note that the 40 means the model classified all 40 nominal samples as defective objects instead of 'good'.
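
For reference, here is a minimal sketch of how an image-level confusion matrix like the one above is typically produced, assuming the commented-out code follows the usual sklearn pattern; the names gt_labels, img_scores, and threshold are placeholders, not identifiers from this repository:

    import numpy as np
    from sklearn.metrics import confusion_matrix

    # 40 nominal ('good') test images and 70 defective ones, as in the output above
    gt_labels = np.array([0] * 40 + [1] * 70)
    img_scores = np.random.rand(110) * 5   # placeholder image-level anomaly scores

    # Every image whose score exceeds the threshold is predicted as defective
    threshold = 0.5
    preds = (img_scores > threshold).astype(int)

    # Rows are true classes, columns are predictions:
    # [[TN FP]
    #  [FN TP]]
    print(confusion_matrix(gt_labels, preds))

With a threshold lower than every nominal score, the first row becomes [0 40], exactly as reported.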

My args:

    parser = argparse.ArgumentParser(description='ANOMALYDETECTION')
    parser.add_argument('--phase', type=str, choices=['train','test'], default='train')
    parser.add_argument('--dataset_path', type=str, default='../../datasets/mvtec_anomaly_detection/') # 'D:\Dataset\mvtec_anomaly_detection')#
    parser.add_argument('--category', type=str, default='hazelnut')
    parser.add_argument('--num_epochs', type=int, default=1)
    parser.add_argument('--batch_size', type=int, default=32)
    parser.add_argument('--load_size', type=int, default=256) # 256
    parser.add_argument('--input_size', type=int, default=224)
    parser.add_argument('--coreset_sampling_ratio', type=float, default=0.005)
    parser.add_argument('--project_root_path', type=str, default='test') # 'D:\Project_Train_Results\mvtec_anomaly_detection\210624\test') #
    parser.add_argument('--save_src_code', type=bool, default=True)
    parser.add_argument('--save_anomaly_map', type=bool, default=True)
    parser.add_argument('--n_neighbors', type=int, default=9)
    args = parser.parse_args()

I am testing with the latest version of PyTorch (v11). Is there something wrong?

@mjack3 mjack3 changed the title Wrong results in the nominal examples? Wrong results in the nominal test examples? Jan 10, 2022
@mjack3 mjack3 changed the title Wrong results in the nominal test examples? Confusion matrix shows errors classifying all the nominal (non-defective) samples Jan 11, 2022
@akasavation

The last parameter of the confusion-matrix code is the threshold. The lower the threshold, the more samples are classified as positive. The threshold left in the comment is way too low. I set it to 2.0 for my own data and the confusion matrix looks much more normal.
However, I guess reporting a positive/negative decision is not that helpful for this method; it gives you the sample's amap (anomaly map) image with localization information anyway. Maybe that is why that code was commented out?
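
Rather than hand-tuning a value like 2.0, one common alternative is to derive the threshold from the same ROC curve that produces the image-level AUROC score. A minimal sketch, assuming the image-level scores and ground-truth labels are available (gt_labels and img_scores are placeholders, not names from the repository):

    import numpy as np
    from sklearn.metrics import roc_curve

    def best_threshold(gt_labels, img_scores):
        # Pick the ROC operating point that maximizes Youden's J statistic (TPR - FPR)
        fpr, tpr, thresholds = roc_curve(gt_labels, img_scores)
        return thresholds[np.argmax(tpr - fpr)]

This keeps the confusion matrix meaningful without manually re-tuning the threshold for every category.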
