Normalize image color spaces before comparison #665
base: main
Conversation
Use the sRGB-converted snapshot when doing the perceptual comparison. This reduces the chances of failures when comparing snapshots in different color spaces.
Hey. Should we merge this PR after the rebase? Looks like it was approved, but never merged.
@stephencelis Please, is there a plan to merge this?
# Conflicts:
#   Sources/SnapshotTesting/Snapshotting/NSImage.swift
#   Sources/SnapshotTesting/Snapshotting/UIImage.swift
#   Tests/SnapshotTestingTests/SnapshotTestingTests.swift
@stephencelis I will bump this topic because there has been no answer for a few months. Could you provide information on what is blocking you from merging it into the main branch? It would be a nice feature/fix that could reduce lots of issues. Thanks!
+1 please merge :)
After a while I figured out that, running on macOS, the difference between my CI machine and my dev machine was the color space.
Overview
Use the sRGB converted snapshot when doing the perceptual comparison. This reduces the chances of failures when comparing snapshots using different color spaces.
This colorspace normalization technique was originally introduced in #446 and this PR extends it to both the reference and new images when performing perceptual image comparison.
Unit tests were added to verify that images in the P3 and sRGB color spaces match after colorspace normalization.
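The normalization step described above can be sketched as follows. This is not the PR's actual code, only a minimal illustration of the technique: redraw a `CGImage` into an sRGB bitmap context so that two snapshots captured in different color spaces (e.g. Display P3 on a dev machine vs. sRGB on CI) are compared on equal footing. The helper name `normalizedToSRGB` is hypothetical.

```swift
import CoreGraphics

// Hypothetical helper: redraws an image into the sRGB color space.
// Applying this to both the reference and the new snapshot before the
// perceptual comparison removes color-space differences as a source of
// spurious mismatches.
func normalizedToSRGB(_ image: CGImage) -> CGImage? {
  guard
    let srgb = CGColorSpace(name: CGColorSpace.sRGB),
    let context = CGContext(
      data: nil,
      width: image.width,
      height: image.height,
      bitsPerComponent: 8,
      bytesPerRow: 0,  // let CoreGraphics choose the stride
      space: srgb,
      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
    )
  else { return nil }

  // Drawing into the sRGB context performs the color conversion.
  context.draw(
    image,
    in: CGRect(x: 0, y: 0, width: image.width, height: image.height)
  )
  return context.makeImage()
}
```

In a comparison routine, both images would be passed through this conversion first, and the pixel-level (perceptual) diff would then run on the two sRGB copies.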
Related Issues