The safety checker is a model used in the Stable Diffusion pipeline to identify NSFW images. Here is the official description.
This project extracts the safety checker into an independent function, providing a convenient way to detect NSFW images with a deep neural network.
See test_imgs.py to get started.
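For reference, this is roughly how the upstream checker that this project extracts is invoked through `diffusers`. The `StableDiffusionSafetyChecker` class and the model IDs are the real upstream ones; the input file name and the surrounding script are purely illustrative:

```python
# Illustrative sketch of invoking the upstream Stable Diffusion safety checker;
# the model IDs are the standard Hugging Face checkpoints, the rest is an example.
import numpy as np
from PIL import Image
from transformers import CLIPImageProcessor
from diffusers.pipelines.stable_diffusion.safety_checker import (
    StableDiffusionSafetyChecker,
)

safety_checker = StableDiffusionSafetyChecker.from_pretrained(
    "CompVis/stable-diffusion-safety-checker"
)
processor = CLIPImageProcessor.from_pretrained("openai/clip-vit-large-patch14")

image = Image.open("example.png").convert("RGB")  # hypothetical input file
clip_input = processor([image], return_tensors="pt")

# The checker returns the (possibly blacked-out) images and a per-image flag.
checked_images, has_nsfw = safety_checker(
    images=[np.array(image)], clip_input=clip_input.pixel_values
)
print("NSFW detected:", has_nsfw[0])
```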
- Add a standalone safety checker implementation that depends on PyTorch only, with a configurable detection strength if possible (a possible approach is sketched after this list).
- Add FastAPI integration (a minimal endpoint sketch follows below).
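One way the strength config could work is a minimal sketch like the following. It assumes the internals of diffusers' `StableDiffusionSafetyChecker` (`vision_model`, `visual_projection`, `concept_embeds`, `concept_embeds_weights`); the `strength` parameter is hypothetical and mirrors the `adjustment` variable in the upstream forward pass:

```python
# Hypothetical strength-configurable check built on the checker's internals.
# `strength` shifts the per-concept thresholds: > 0 flags more images, < 0 fewer.
import torch
import torch.nn.functional as F

@torch.no_grad()
def nsfw_flags(safety_checker, clip_pixel_values, strength=0.0):
    pooled = safety_checker.vision_model(clip_pixel_values)[1]  # pooled output
    image_embeds = safety_checker.visual_projection(pooled)

    # Cosine similarity between each image embedding and each NSFW concept embedding.
    img = F.normalize(image_embeds)
    concepts = F.normalize(safety_checker.concept_embeds)
    cos = img @ concepts.t()

    # A concept fires when its similarity exceeds the learned threshold minus strength.
    scores = cos - safety_checker.concept_embeds_weights + strength
    return (scores > 0).any(dim=1)  # per-image NSFW flag
```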
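For the FastAPI item, a minimal sketch of what an endpoint might look like, wired directly to the upstream checker for self-containment. The `/check` route and response shape are placeholders, not a committed API:

```python
# Hypothetical FastAPI wrapper; the endpoint path and response are illustrative.
import io

import numpy as np
from fastapi import FastAPI, File, UploadFile
from PIL import Image
from transformers import CLIPImageProcessor
from diffusers.pipelines.stable_diffusion.safety_checker import (
    StableDiffusionSafetyChecker,
)

app = FastAPI()

# Load the upstream checker once at startup.
safety_checker = StableDiffusionSafetyChecker.from_pretrained(
    "CompVis/stable-diffusion-safety-checker"
)
processor = CLIPImageProcessor.from_pretrained("openai/clip-vit-large-patch14")

@app.post("/check")
async def check(file: UploadFile = File(...)):
    image = Image.open(io.BytesIO(await file.read())).convert("RGB")
    clip_input = processor([image], return_tensors="pt")
    _, has_nsfw = safety_checker(
        images=[np.array(image)], clip_input=clip_input.pixel_values
    )
    return {"nsfw": bool(has_nsfw[0])}
```

Saved as `app.py`, this could be served with `uvicorn app:app`.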