DNN-TIP: Common Test Input Prioritizers Library


Implemented Approaches

  • Surprise Adequacies
    • Distance-based Surprise Adequacy (DSA)
    • Likelihood-based Surprise Adequacy (LSA)
    • MultiModal-Likelihood-based Surprise Adequacy (MLSA)
    • Mahalanobis-based Surprise Adequacy (MDSA)
    • abstract MultiModal Surprise Adequacy
  • Surprise Coverage
    • Neuron-Activation Coverage (NAC)
    • K-Multisection Neuron Coverage (KMNC)
    • Neuron Boundary Coverage (NBC)
    • Strong Neuron Activation Coverage (SNAC)
    • Top-k Neuron Coverage (TKNC)
  • Utilities
    • APFD calculation
    • Coverage-Added and Coverage-Total Prioritization Methods (CAM and CTM)
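For orientation, the sketch below shows how two of the metrics listed above are commonly computed: DSA as defined by Kim et al. (ISSTA 2019) and the standard APFD formula. It is a minimal NumPy illustration under assumed array shapes and function names; it is not the dnn-tip API (see the documentation linked below for the actual interfaces).

```python
# Illustrative sketch only; function names and array shapes are assumptions,
# not the dnn-tip API.
import numpy as np


def dsa(train_ats, train_labels, test_ats, test_preds):
    """Distance-based Surprise Adequacy (Kim et al., ISSTA 2019).

    train_ats    : (n_train, d) activation traces of the training inputs
    train_labels : (n_train,)   ground-truth classes of the training inputs
    test_ats     : (n_test, d)  activation traces of the test inputs
    test_preds   : (n_test,)    classes predicted for the test inputs
    Returns one score per test input; higher means more surprising.
    """
    scores = np.empty(len(test_ats))
    for i, (at, c) in enumerate(zip(test_ats, test_preds)):
        same = train_ats[train_labels == c]
        other = train_ats[train_labels != c]
        # Nearest training trace of the predicted class ...
        d_same = np.linalg.norm(same - at, axis=1)
        x_a = same[np.argmin(d_same)]
        dist_a = d_same.min()
        # ... and the nearest trace of any *other* class to that neighbor.
        dist_b = np.linalg.norm(other - x_a, axis=1).min()
        scores[i] = dist_a / dist_b
    return scores


def apfd(misclassified, ranking):
    """Average Percentage of Fault Detection of a prioritized test order.

    misclassified : boolean array, True where the model mispredicts an input
    ranking       : indices of the test inputs in prioritized execution order
    """
    n = len(ranking)
    # 1-based positions at which the misclassified inputs are executed.
    positions = np.flatnonzero(misclassified[ranking]) + 1
    m = len(positions)
    return 1.0 - positions.sum() / (n * m) + 1.0 / (2 * n)
```

In a typical prioritization experiment, test inputs are ranked by descending surprise (e.g., DSA) and the resulting order is scored with APFD against the inputs the model misclassifies.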

If you are looking for the uncertainty metrics we also tested (including DeepGini), head over to the sister repository uncertainty-wizard.

If you want to reproduce our exact experiments, a reproduction package and Docker setup are available at testingautomated-usi/simple-tip.

Installation

It's as easy as pip install dnn-tip.

Documentation

Find the documentation at https://testingautomated-usi.github.io/dnn-tip/.

Citation

Here's the reference to the paper as part of which this library was released:

@inproceedings{10.1145/3533767.3534375,
author = {Weiss, Michael and Tonella, Paolo},
title = {Simple Techniques Work Surprisingly Well for Neural Network Test Prioritization and Active Learning (Replicability Study)},
year = {2022},
isbn = {9781450393799},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3533767.3534375},
doi = {10.1145/3533767.3534375},
booktitle = {Proceedings of the 31st ACM SIGSOFT International Symposium on Software Testing and Analysis},
pages = {139–150},
numpages = {12},
keywords = {neural networks, Test prioritization, uncertainty quantification},
location = {Virtual, South Korea},
series = {ISSTA 2022}
}
