Releases: Teriks/dgenerate
v3.1.0 release
New Features v3
- diffusers 0.23.1
- Breaking library changes and config language changes
- Improved syntax for top level templates in config
- New library interface for handling render loop events, an iterable render loop event stream which is more streamlined than using a bunch of callbacks
- Improved and robustified URI parsing
- User specifiable UNet pipeline component via --unet
- Latent Consistency Model support, LCMScheduler, LCM LoRA & UNet, for super fast inference
- ConsistencyDecoderVAE VAE support, the DALL-E 3 contributed decoder
- --post-processor option
- --model-cpu-offload, --model-sequential-offload, --sdxl-refiner-cpu-offload, and --sdxl-refiner-sequential-offload options for manually enabling VRAM optimizations (see the example after this list)
- --sub-command option and new sub-command image-process with accompanying config directive \image_process, which can apply dgenerate's image processor implementations to any image, animated image, or video that you want. Reusable library interface included.
- A dozen new image processors, including new edge detectors, depth detection, and segmentation. Includes the upscaler image processor, which can upscale / restore images using models from: https://openmodeldb.info/
- Improved generic plugin system. Added config plugins, including the ability to implement config directives.
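For orientation, a hypothetical SDXL invocation using the new offload options might look like the following; the Hugging Face model names and the --prompts value are illustrative placeholders rather than anything prescribed by these notes:

    dgenerate stabilityai/stable-diffusion-xl-base-1.0 --model-type torch-sdxl \
        --sdxl-refiner stabilityai/stable-diffusion-xl-refiner-1.0 \
        --model-cpu-offload --sdxl-refiner-cpu-offload \
        --prompts "a scenic mountain lake at sunset"

In diffusers, CPU offload generally trades some generation speed for lower peak VRAM use by keeping pipeline components on the CPU until they are needed.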
v3.0.2 release
Patch Release
- Disable full exception trace output for config script runner
- Single file torch VAE load fix, missing attribute reference
New Features v3
- diffusers 0.23.1
- Breaking library changes and config language changes
- Improved syntax for top level templates in config
- New library interface for handling render loop events, an iterable render loop event stream which is more streamlined than using a bunch of callbacks
- Improved and robustified URI parsing
- User specifiable UNet pipeline component via --unet
- Latent Consistency Model support, LCMScheduler, LCM LoRA & UNet, for super fast inference
- ConsistencyDecoderVAE VAE support, the DALL-E 3 contributed decoder
- --post-processor option
- --model-cpu-offload, --model-sequential-offload, --sdxl-refiner-cpu-offload, and --sdxl-refiner-sequential-offload options for manually enabling VRAM optimizations
- --sub-command option and new sub-command image-process with accompanying config directive \image_process, which can apply dgenerate's image processor implementations to any image, animated image, or video that you want. Reusable library interface included.
- A dozen new image processors, including new edge detectors, depth detection, and segmentation. Includes the upscaler image processor, which can upscale / restore images using models from: https://openmodeldb.info/
- Improved generic plugin system. Added config plugins, including the ability to implement config directives.
v3.0.0 release
New Features
- diffusers 0.23.1
- Breaking library changes and config language changes
- Improved syntax for top level templates in config
- New library interface for handling render loop events, an iterable render loop event stream which is more streamlined than using a bunch of callbacks
- Improved and robustified URI parsing
- User specifiable UNet pipeline component via --unet
- Latent Consistency Model support, LCMScheduler, LCM LoRA & UNet, for super fast inference
- ConsistencyDecoderVAE VAE support, the DALL-E 3 contributed decoder
- --post-processor option
- --model-cpu-offload, --model-sequential-offload, --sdxl-refiner-cpu-offload, and --sdxl-refiner-sequential-offload options for manually enabling VRAM optimizations
- --sub-command option and new sub-command image-process with accompanying config directive \image_process, which can apply dgenerate's image processor implementations to any image, animated image, or video that you want. Reusable library interface included.
- A dozen new image processors, including new edge detectors, depth detection, and segmentation. Includes the upscaler image processor, which can upscale / restore images using models from: https://openmodeldb.info/
- Improved generic plugin system. Added config plugins, including the ability to implement config directives.
v2.1.1 release
Patch Release:
- Fix missing transitive dependencies pytorch-lightning, lightning-utilities, and lightning-fabric in the Python environment created by the Windows installer, which caused issues loading certain models from disk.
v2.1.0 release
New Features
- bump huggingface diffusers to v0.22.3 (soon to be v0.23.0, needs testing)
- Allow for specifying multiple LoRA models with --lora/--loras
- Add --clip-skips and --sdxl-refiner-clip-skips scatter gun arguments for specifying CLIP skip values, which are now implemented by the diffusers library for SD and SDXL. These are usable with --model-type torch and --model-type torch-sdxl in all configurations; they are not available for other model types (see the example after this list).
- Fix minor bug with reporting a usage error diagnostic twice
- Fix bug with --output-configs/--output-metadata using incorrect names for --sdxl-refiner-prompts and related SDXL refiner prompt options in generated configurations.
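To illustrate the scatter gun behavior, a hypothetical invocation might enumerate several CLIP skip values in a single run; the model name and the --prompts value below are placeholders, not values taken from these notes:

    dgenerate stabilityai/stable-diffusion-2-1 --model-type torch \
        --clip-skips 0 1 2 \
        --prompts "an oil painting of a lighthouse"

Each listed CLIP skip value would produce its own set of output images, in the same way other scatter gun arguments enumerate combinations.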
v2.0.0 release
New Features
- Massive refactor in order to have a usable public API and a looser dependency spec; dgenerate can now be used as a library.
- Full library documentation located at: https://dgenerate.readthedocs.io/en/v2.0.0/
- Basic library usage examples: https://github.com/Teriks/dgenerate/tree/v2.0.0/examples/library_usage
- dgenerate is now a package on pypi: https://pypi.org/project/dgenerate/2.0.0/
- Generate multiple images on a GPU simultaneously with --batch-size, optionally create image grids with --batch-grid-size (works for animations also; see the sketch after this list)
- New --image-seeds syntax for specifying multiple control guidance images when using multiple --control-nets.
- New --control-image-preprocessors syntax for describing which preprocessor chain applies to which control guidance image.
- Support for APNG (Animated PNG) inputs via --image-seeds
- Support for PSD (Photoshop), TGA (Targa), BMP, and JPEG-2000 (.jp2, .jpx, .j2k) image formats for static image input via --image-seeds
- Multi-process safe caching of files downloaded from URLs while multiple dgenerate processes are alive (ref counted file cache).
- Configurable web cache directory location.
- Multi-process safe writes to the same directory when overwrite avoidance is enabled (default).
- Support for Deep Floyd IF pipelines torch-if, torch-ifs (IF SuperScaler), and torch-ifs-img2img, with the new supporting --image-seeds keyword argument floyd for specifying stage 1 images for img2img and inpainting mode. See: https://github.com/Teriks/dgenerate/tree/v2.0.0/examples/deepfloyd
- Disable aspect correct resizing of --image-seeds images with the new --no-aspect option and the --image-seeds boolean keyword argument aspect when --output-size is used with --image-seeds; the new option allows resizing to an exact dimension, including skewed ones.
- frame-start and frame-end keyword arguments for --image-seeds URIs, allowing animation slicing settings per image seed specification.
- Batch processing script overhaul, includes the ability to save and restore diffusers pipeline modules to specific dgenerate invocations, and a few breaking syntax changes. See: https://github.com/Teriks/dgenerate/tree/v2.0.0#batch-processing-from-stdin
- More extensive upfront argument validation and helpful error messages
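As a rough sketch of the new batch options, a hypothetical invocation could look like the following; the model name, the 2x2 grid value, and the --prompts option are assumptions about argument formats rather than exact syntax from these notes:

    dgenerate stabilityai/stable-diffusion-2-1 \
        --batch-size 4 --batch-grid-size 2x2 \
        --prompts "a watercolor painting of a fox"

This would produce four images on the GPU at once and, with the grid option, also combine them into a single 2x2 grid image.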
v1.1.0 with Windows Installer
New Features
- Support for multiple simultaneous ControlNet models with timestep offsetting via --control-nets, and new supporting syntax for specifying control images with --image-seeds that is backwards compatible with 1.0.0 (see the sketch after this list).
- Batch enumeration capability over all advanced SDXL parameters in SDXL mode, such as second-prompt, target-size, original-size, crop-coords, aesthetic-score, etc., for both the main model and the refiner independently; see the new --sdxl-* and --sdxl-refiner-* args in --help output and the README.
- Batch enumeration of guidance-rescale on supported pipelines, see: --guidance-rescales.
- Extensible image preprocessor system and chainable image preprocessing, with new associated options --plugin-modules, --image-preprocessor-help, --seed-image-preprocessors, --mask-image-preprocessors, and --control-image-preprocessors.
- Canny edge detect preprocessor (name: canny) with configurability, includes auto thresholding algorithms: otsu, triangle, and median via OpenCV.
- OpenPose human character rig generation image preprocessor (name: openpose) with configurability via controlnet-aux.
- PIL ImageOps preprocessors (flip, mirror, grayscale, invert, posterize, solarize).
- New --model-type values ("torch-pix2pix", "torch-sdxl-pix2pix"), Pix2Pix support for torch and torch-sdxl, and new supporting option: --image-guidance-scales.
- Improved memory efficiency of media input during --image-seeds use. All video data is now read sequentially from disk without the whole of it ever existing in memory at once.
- Configurable temporary on-disk caching for --image-seeds files downloaded from a URL by dgenerate (at ~/.cache/dgenerate/web by default). Repeat usage of a URL during batch processing results in a cache read.
- VAE tiling and slicing capability through --vae-tiling and --vae-slicing, helps with GPU memory use during the generation of huge images.
- Improved in-memory model caching behavior during batch input, fine-grained caching of main / refiner models, VAEs, and ControlNet models.
- GPU memory optimizations for batch processing, forcing models that are not in use back to the CPU.
- Somewhat extensive debugging output via the -v/--verbose option, includes information about diffusion pipeline wrapper and underlying pipeline call parameters, caching behaviors, image preprocessor operations, and a few other things.
- Argument injection into batch process configurations, e.g. dgenerate -v --output-configs < my-config.txt is now possible and will add those arguments to the end of every invocation in the config.
- More extensive upfront argument validation.
- Code maintainability overhaul.
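As a rough sketch of a ControlNet invocation with a chained control image preprocessor, something like the following could work; the Hugging Face model names, the input image, and the --prompts option are placeholders and assumptions rather than syntax confirmed by these notes:

    dgenerate runwayml/stable-diffusion-v1-5 \
        --control-nets lllyasviel/sd-controlnet-canny \
        --image-seeds house-photo.png \
        --control-image-preprocessors canny \
        --prompts "a cottage in the woods, detailed, photorealistic"

Here the canny preprocessor would run edge detection on the guidance image before it is handed to the ControlNet.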
v1.0.0 release
New Features
A new syntax for specifying LoRA finetunes with --lora has been added in order to future proof for support of multiple LoRAs; this is a breaking change. Options involving LoRA loading from Hugging Face repositories have been removed as program switches and encapsulated within the new syntax for --lora path specification.

The new syntax mentioned above has also been adopted for --sdxl-refiner, and extraneous command line arguments relating to SDXL have been removed.
- Added support for multiple Textual Inversion models with Stable Diffusion (--model-type torch) via the new option --textual-inversions, with new encapsulated model loading syntax.
- Added optional per-image configuration output via --output-configs, useful for reproducing images.
- Added the ability to write that configuration data to PNG metadata using --output-metadata.
- File overwrites are now avoided; a file name suffix is generated for duplicates unless --output-overwrite is used.
- Passing "help" to --scheduler with a model specified will list the compatible schedulers for that model type without generating images.
- Added support for x2 and x4 Stable Diffusion image upscaler models through the new --model-type values torch-upscaler-x2 and torch-upscaler-x4 and the associated option --upscaler-noise-levels (see the sketch after this list).
- Added partial Jinja2 templating to batch processing from STDIN in order to allow advanced referencing of filenames generated by previous invocations of dgenerate in the input configuration file. This is useful for creating pipelines which run images through multiple img2img, in-paint, or upscaler models.
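For orientation, a hypothetical x4 upscaler invocation might look like the following; the Hugging Face model name, the input image, the noise level value, and the --prompts option are illustrative assumptions rather than values taken from these notes:

    dgenerate stabilityai/stable-diffusion-x4-upscaler --model-type torch-upscaler-x4 \
        --image-seeds low-res-photo.png \
        --upscaler-noise-levels 20 \
        --prompts "a detailed photograph"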
Bug Fixes
- Fix missing mimetype recognition for .webp files, which resulted in them failing to be read correctly.
- Correctly handle broken GIFs and .webp files which do not specify a frame duration in the same way web browsers do; set FPS to 10.
- Fix bug with aligning in-paint masks to 8 pixels, which caused an occasional dimension mismatch error when rendering animations with a mask.
v0.18.2 release
First release with Windows installer, only tested on Windows 10.
Due to the size of the packaged Python environment, the installer is distributed as a multi-part zip file.
The multi-part zip can be extracted using 7-Zip: https://www.7-zip.org/
Download both dgenerate_installer.zip.001 and dgenerate_installer.zip.002 to a folder.
Unzip dgenerate_installer.zip.001 to a directory (Right click, 7-Zip -> Extract to "dgenerate_installer") and then run dgenerate_installer\dgenerate.msi to install.
dgenerate will be installed under "C:\Program Files\dgenerate" by default with an isolated python environment provided.
The install directory will be added to PATH, and dgenerate will be available from the command line.