Nice, small, and fast realtime CNN-based upscaler. Trained on visual novel screenshots/CG.
Currently very new and immature 😭.
Supports exporting to an mpv memeshader!
And now a Magpie effect!
mpv shaders are found inside the `mpv/` directory.
Metric-focused variants are found inside the `results/` directory.
Magpie effects are found inside the `magpie/` directory.
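A shader from `mpv/` can be loaded like any other mpv user shader. This is a minimal sketch; the shader filename and path are assumptions, so substitute whichever variant you actually use:

```sh
# One-off: load a CuNNy shader for a single playback session.
# The shader path/filename below is an assumption -- use the variant you chose.
mpv --glsl-shaders="~/.config/mpv/shaders/CuNNy-4x16.glsl" video.mkv

# Or persist it by adding a line like this to mpv.conf:
#   glsl-shaders="~~/shaders/CuNNy-4x16.glsl"
```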
The order of best quality -> worst quality is sorted by the second number first, then the first number, so `8x32` > `4x32` > `4x16`.
Conversely, the order of fastest -> slowest is the reverse, with `4x16` being the fastest and `8x32` being the slowest.
The `CuNNy-fast` shader for mpv sits between the `2x8` and `2x12` shaders. This is the recommended shader if you have a relatively slow machine.
Variants:
- No suffix: Trained to upscale anime fairly neutrally.
- `DS`: Trained to upscale, denoise, and sharpen anime.
- `NVL`: Trained to upscale visual novels.
- `RGB`: Trained to upscale RGB instead of luminance.
There are versions of the mpv shaders that use 8-bit `dp4a` instructions. They can be many times faster than the standard upscaling shaders, depending on whether your hardware supports accelerated `dp4a` instructions. They require `vo=gpu-next` with `gpu-api=vulkan` and can be found inside the `dp4a/` subdirectories.
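As a minimal sketch (the shader path is an assumption), a `dp4a` shader would be loaded with those options set:

```sh
# dp4a shaders require the gpu-next video output with the Vulkan backend.
# The shader path/filename below is an assumption.
mpv --vo=gpu-next --gpu-api=vulkan \
    --glsl-shaders="~/.config/mpv/shaders/dp4a/CuNNy-4x16.glsl" video.mkv
```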
Tested training with PyTorch nightly. If any errors arise try using nightly.
Prepare data by running `sh scripts/build.sh`, then `sh scripts/split.sh <input-folder> <output-128-grids>`, then `py scripts/proc.py <128-grids> <out>`.
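Put together, a data-preparation run might look like the following sketch; the folder names are assumptions, and `py` can be replaced with `python` on systems without the launcher:

```sh
# Hypothetical example paths: images/ holds source screenshots,
# grids-128/ receives the 128px grids, data/ receives the processed set.
sh scripts/build.sh
sh scripts/split.sh images/ grids-128/
py scripts/proc.py grids-128/ data/
```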
To train, run `py train.py <data> <N> <D>`, where `N` is the number of internal convolutions and `D` is the number of feature layers.
Convert the resulting model to an mpv shader by running `py mpv.py <models/model.pt>`.
Convert the resulting model to a Magpie effect by running `py magpie.py <models/model.pt>`.
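For example, exporting one trained checkpoint to both targets (the checkpoint name under `models/` is whatever your training run produced):

```sh
# Export the same checkpoint as an mpv shader and as a Magpie effect.
py mpv.py models/model.pt
py magpie.py models/model.pt
```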
Trains very fast on my machine.
See `results/`.
LGPL v3