VMAF is a perceptual video quality assessment algorithm developed by Netflix. This software package includes a stand-alone C library `libvmaf` and its wrapper Python library. The Python library also provides a set of tools that allows a user to train and test a custom VMAF model.
Read this tech blog post for an overview, this post for best-practice tips, and this post for our latest efforts on speed optimization, a new API design, and the introduction of a codec-evaluation-friendly NEG mode.
Also included in `libvmaf` are implementations of several other metrics: PSNR, PSNR-HVS, SSIM, MS-SSIM, and CIEDE2000.
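As a quick sketch of how the `vmaf` command-line tool is typically invoked (file names here are placeholders; see the usage guide for the full set of options):

```shell
# Compare a distorted video against its reference and write the result as JSON.
# ref.y4m / dist.y4m are placeholder input files; both must share the same
# resolution and pixel format.
vmaf \
  --reference ref.y4m \
  --distorted dist.y4m \
  --output vmaf_scores.json
```

The tool also accepts raw YUV input, in which case the width, height, and pixel format must be supplied explicitly on the command line.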
- (2021-12-15) We have added to CAMBI the `full_ref` input parameter to allow running CAMBI as a full-reference metric, taking into account the banding that was already present on the source. Check out the usage page.
- (2021-12-1) We have added to CAMBI the `max_log_contrast` input parameter to allow capturing banding with higher contrasts than the default. We have also sped up CAMBI (e.g., around 4.5x for 4K). Check out the usage page.
- (2021-10-7) We are open-sourcing CAMBI (Contrast Aware Multiscale Banding Index) - Netflix's detector for banding (aka contouring) artifacts. Check out the tech blog for an overview and the technical paper published in PCS 2021 (note that the paper describes an initial version of CAMBI that no longer matches the code exactly, but it is still a good introduction). Also check out the usage page.
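A sketch of how these CAMBI parameters might be passed on the command line. The `--feature` option of the `vmaf` tool enables additional feature extractors; the exact `key=value` option syntax shown here is an assumption — consult the usage page for the authoritative form:

```shell
# Assumed invocation: enable the CAMBI feature extractor in full-reference
# mode so banding already present in the source is accounted for.
# The "full_ref=true" option syntax is an assumption; see the usage page.
vmaf \
  --reference ref.y4m \
  --distorted dist.y4m \
  --feature cambi=full_ref=true \
  --output cambi_scores.json
```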
- (2020-12-7) Check out our latest tech blog on speed optimization, new API design and the introduction of a codec evaluation-friendly NEG mode.
- (2020-12-3) We are releasing `libvmaf v2.0.0`. It has a new fixed-point and x86 SIMD-optimized (AVX2, AVX-512) implementation that achieves a 2x speed-up compared to the previous floating-point version. It also has a new API that is more flexible and extensible.
- (2020-7-13) We have created a memo to share our thoughts on VMAF's property in the presence of image enhancement operations, its impact on codec evaluation, and our solutions. Accordingly, we have added a new mode called No Enhancement Gain (NEG).
- (2020-2-27) We have changed VMAF's license from Apache 2.0 to BSD+Patent, a more permissive license compared to Apache that also includes an express patent grant.
An overview of the documentation, with links to specific pages, covers FAQs, available models and metrics, software usage guides, and a list of resources.
The software package offers a number of ways to interact with the VMAF implementation.
- The command-line tool `vmaf` provides a complete algorithm implementation, such that one can easily deploy VMAF in a production environment. Additionally, the `vmaf` tool provides a number of auxiliary metrics such as PSNR, SSIM, and MS-SSIM.
- The C library `libvmaf` provides an interface to incorporate VMAF into your code, and tools to integrate other feature extractors into the library.
- The Python library offers a full array of wrapper classes and scripts for software testing, VMAF model training and validation, dataset processing, data visualization, etc.
- VMAF is now included as a filter in FFmpeg, and can be enabled using `./configure --enable-libvmaf`. Refer to the Using VMAF with FFmpeg page.
- VMAF Dockerfile generates a Docker image from the Python library. Refer to this document for detailed usage.
- To build VMAF on Windows, follow these instructions.
- AOM CTC: AOM has specified `vmaf` as the standard implementation of the metrics tool under the AOM common test conditions (CTC). Refer to this page for usage compliant with AOM CTC.
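For the FFmpeg route above, a minimal invocation sketch follows (file names are placeholders; in FFmpeg's `libvmaf` filter the first input is treated as the distorted video and the second as the reference — double-check the filter documentation for your FFmpeg version):

```shell
# Score dist.mp4 against ref.mp4 using an FFmpeg build configured with
# --enable-libvmaf. The VMAF score is printed in the filter's log output;
# "-f null -" discards the decoded frames since no output file is needed.
ffmpeg -i dist.mp4 -i ref.mp4 -lavfi libvmaf -f null -
```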
Refer to the contribution page. Also refer to this slide deck for an overview of the contribution guide.