This repository contains WIP tools to detect Nightshade-protected ("shaded") images in training datasets. It is intended as a data preprocessing step that removes shaded images which would otherwise disrupt a model's learned representations. This approach lets artists retain ownership of their work while allowing model developers to train unaffected models that respect copyright.
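The intended workflow might look like the sketch below, which filters a directory of images through a detector before training. Note that `is_shaded` is a hypothetical placeholder for the detector under development here, not an existing API in this repository.

```python
from pathlib import Path
import shutil

def is_shaded(image_path: Path) -> bool:
    """Placeholder detector: swap in a real Nightshade classifier."""
    return False  # assumption: treat everything as clean until a detector is wired in

def filter_dataset(src_dir: Path, clean_dir: Path) -> None:
    """Copy images that are not flagged as shaded into clean_dir."""
    clean_dir.mkdir(parents=True, exist_ok=True)
    for image_path in src_dir.glob("*"):
        if image_path.is_file() and not is_shaded(image_path):
            shutil.copy2(image_path, clean_dir / image_path.name)

if __name__ == "__main__":
    # Hypothetical directory names for illustration only.
    filter_dataset(Path("raw_images"), Path("clean_images"))
```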
While Glaze is intended as a tool to defend art styles against mimicry attacks carried out by finetuning text-to-image models, Nightshade is a data poisoning attack that disrupts the representations of models trained on shaded images.