Improve APNG support #668

Open · wants to merge 6 commits into master
75 changes: 75 additions & 0 deletions src/apng.rs
@@ -0,0 +1,75 @@
use std::io::Write;

use crate::{
error::PngError,
headers::{read_be_u16, read_be_u32},
PngResult,
};

#[derive(Debug, Clone)]
/// Animated PNG frame
pub struct Frame {
/// Width of the frame
pub width: u32,
/// Height of the frame
pub height: u32,
/// X offset of the frame
pub x_offset: u32,
/// Y offset of the frame
pub y_offset: u32,
/// Frame delay numerator
pub delay_num: u16,
/// Frame delay denominator
pub delay_den: u16,
/// Frame disposal operation
pub dispose_op: u8,
/// Frame blend operation
pub blend_op: u8,
/// Frame data, from fdAT chunks
pub data: Vec<u8>,
}

impl Frame {
/// Construct a new Frame from the data in a fcTL chunk
pub fn from_fctl_data(byte_data: &[u8]) -> PngResult<Frame> {
if byte_data.len() < 26 {
return Err(PngError::TruncatedData);
}
Ok(Frame {
width: read_be_u32(&byte_data[4..8]),
height: read_be_u32(&byte_data[8..12]),
x_offset: read_be_u32(&byte_data[12..16]),
y_offset: read_be_u32(&byte_data[16..20]),
delay_num: read_be_u16(&byte_data[20..22]),
delay_den: read_be_u16(&byte_data[22..24]),
dispose_op: byte_data[24],
blend_op: byte_data[25],
data: vec![],
})
}

/// Construct the data for a fcTL chunk using the given sequence number
#[must_use]
pub fn fctl_data(&self, sequence_number: u32) -> Vec<u8> {
let mut byte_data = Vec::with_capacity(26);
byte_data.write_all(&sequence_number.to_be_bytes()).unwrap();
byte_data.write_all(&self.width.to_be_bytes()).unwrap();
byte_data.write_all(&self.height.to_be_bytes()).unwrap();
byte_data.write_all(&self.x_offset.to_be_bytes()).unwrap();
byte_data.write_all(&self.y_offset.to_be_bytes()).unwrap();
byte_data.write_all(&self.delay_num.to_be_bytes()).unwrap();
byte_data.write_all(&self.delay_den.to_be_bytes()).unwrap();
byte_data.push(self.dispose_op);
byte_data.push(self.blend_op);
byte_data
}

/// Construct the data for a fdAT chunk using the given sequence number
#[must_use]
pub fn fdat_data(&self, sequence_number: u32) -> Vec<u8> {
let mut byte_data = Vec::with_capacity(4 + self.data.len());
byte_data.write_all(&sequence_number.to_be_bytes()).unwrap();
byte_data.write_all(&self.data).unwrap();
byte_data
}
}
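As a standalone sketch of the 26-byte fcTL layout that `Frame::from_fctl_data` parses (field offsets follow the APNG specification; the helper below mirrors the PR's logic without oxipng's internal `read_be_*` helpers and is illustrative only):

```rust
// Parse the fields of a 26-byte fcTL chunk body. Offsets follow the APNG
// spec: sequence number (4 bytes, skipped here), width, height, x/y offset
// (4 bytes each), delay numerator/denominator (2 bytes each), then the
// single-byte dispose and blend operations.
fn parse_fctl(b: &[u8]) -> Option<(u32, u32, u32, u32, u16, u16, u8, u8)> {
    if b.len() < 26 {
        return None; // too short; the PR returns PngError::TruncatedData here
    }
    let be32 = |s: &[u8]| u32::from_be_bytes(s.try_into().unwrap());
    let be16 = |s: &[u8]| u16::from_be_bytes(s.try_into().unwrap());
    Some((
        be32(&b[4..8]),   // width
        be32(&b[8..12]),  // height
        be32(&b[12..16]), // x_offset
        be32(&b[16..20]), // y_offset
        be16(&b[20..22]), // delay_num
        be16(&b[22..24]), // delay_den
        b[24],            // dispose_op
        b[25],            // blend_op
    ))
}

fn main() {
    let mut chunk = vec![0u8; 26];
    chunk[4..8].copy_from_slice(&64u32.to_be_bytes());   // width = 64
    chunk[20..22].copy_from_slice(&1u16.to_be_bytes());  // delay_num = 1
    chunk[22..24].copy_from_slice(&30u16.to_be_bytes()); // delay_den = 30
    let (w, _h, _x, _y, num, den, _d, _b) = parse_fctl(&chunk).unwrap();
    assert_eq!((w, num, den), (64, 1, 30));
    println!("width={w}, delay={num}/{den}s");
}
```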
2 changes: 2 additions & 0 deletions src/error.rs
@@ -9,6 +9,7 @@ pub enum PngError {
TimedOut,
NotPNG,
APNGNotSupported,
APNGOutOfOrder,
Collaborator Author:

Hm, is this a BC break? APNGNotSupported should have been removed in v9 but it wasn’t.

Collaborator:

Yes, it technically is, as it's a publicly viewable enum.

cargo semver-checks is a really useful tool for answering this sort of question:

[screenshot: cargo semver-checks output]
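For context on why this counts as a BC break: downstream crates may match exhaustively on a public enum, so any new variant stops their code from compiling. A hypothetical, simplified sketch (not oxipng's actual definition) of how `#[non_exhaustive]` sidesteps this:

```rust
// Hypothetical, simplified public error enum (oxipng's real enum has more
// variants). Adding a variant to a plain public enum is semver-breaking
// because downstream exhaustive matches stop compiling; #[non_exhaustive]
// forces downstream matches to carry a wildcard arm, so future additions
// like APNGOutOfOrder are no longer breaking changes.
#[non_exhaustive]
#[derive(Debug)]
pub enum PngError {
    NotPNG,
    TruncatedData,
    APNGOutOfOrder,
}

fn describe(e: &PngError) -> &'static str {
    match e {
        PngError::NotPNG => "not a PNG",
        // Downstream crates must keep a wildcard arm here, which is what
        // makes adding new variants semver-compatible.
        _ => "other error",
    }
}

fn main() {
    assert_eq!(describe(&PngError::NotPNG), "not a PNG");
    assert_eq!(describe(&PngError::APNGOutOfOrder), "other error");
    println!("{:?}", PngError::TruncatedData);
}
```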

andrews05 (Collaborator Author), Jan 17, 2025:

Nice, I've added APNGNotSupported back in and it's no longer showing that issue. However, it is showing a bunch of issues related to the recently-added #[must_use] on a number of functions; should we worry about those?
I wonder if we should add this tool to our workflow...

Collaborator:

Personally, I'd ignore the lints related to #[must_use], as they are too pedantic to matter in practice: they only apply under non-default linter configurations that are a bit of a bad idea to blindly set up. Nevertheless, it's interesting that the tool finds these and lets us know about them, just in case my opinion is wrong 🙂

Adding this tool to our workflow would be a good idea. I haven't read much about how it computes a baseline for the semver checks, though, so making sure that it always picks something sane in a workflow (even if it runs from a PR, or from a release tag...) would be great.

Collaborator Author:

Great, that’s kinda what I figured. Are we good to go with this PR then?

AlexTMjugador (Collaborator), Jan 20, 2025:

Before that, I want to give your seemingly good changes a more proper, in-depth look. Please allow a bit of time until I get around to doing it 😄

Collaborator Author:

Ah sorry, I just assumed you already had. No urgency, I appreciate the scrutiny 🙂

InvalidData,
TruncatedData,
ChunkMissing(&'static str),
@@ -33,6 +34,7 @@ impl fmt::Display for PngError {
f.write_str("Missing data in the file; the file is truncated")
}
PngError::APNGNotSupported => f.write_str("APNG files are not (yet) supported"),
PngError::APNGOutOfOrder => f.write_str("APNG chunks are out of order"),
PngError::ChunkMissing(s) => write!(f, "Chunk {s} missing or empty"),
PngError::InvalidDepthForType(d, ref c) => {
write!(f, "Invalid bit depth {d} for color type {c}")
38 changes: 18 additions & 20 deletions src/headers.rs
@@ -134,7 +134,7 @@ fn parse_jumbf_box(data: &[u8]) -> Option<(&[u8], &[u8])> {
return None;
}
let (len, rest) = data.split_at(4);
let len = u32::from_be_bytes(len.try_into().unwrap()) as usize;
let len = read_be_u32(len) as usize;
if len < 8 || len > data.len() {
return None;
}
@@ -153,32 +153,25 @@ pub fn parse_next_chunk<'a>(
.get(*byte_offset..*byte_offset + 4)
.ok_or(PngError::TruncatedData)?,
);
if byte_data.len() < *byte_offset + 12 + length as usize {
return Err(PngError::TruncatedData);
}
*byte_offset += 4;

let chunk_start = *byte_offset;
let chunk_name = byte_data
.get(chunk_start..chunk_start + 4)
.ok_or(PngError::TruncatedData)?;
let chunk_name = &byte_data[chunk_start..chunk_start + 4];
if chunk_name == b"IEND" {
// End of data
return Ok(None);
}
*byte_offset += 4;

let data = byte_data
.get(*byte_offset..*byte_offset + length as usize)
.ok_or(PngError::TruncatedData)?;
let data = &byte_data[*byte_offset..*byte_offset + length as usize];
*byte_offset += length as usize;
let crc = read_be_u32(
byte_data
.get(*byte_offset..*byte_offset + 4)
.ok_or(PngError::TruncatedData)?,
);
let crc = read_be_u32(&byte_data[*byte_offset..*byte_offset + 4]);
*byte_offset += 4;

let chunk_bytes = byte_data
.get(chunk_start..chunk_start + 4 + length as usize)
.ok_or(PngError::TruncatedData)?;
let chunk_bytes = &byte_data[chunk_start..chunk_start + 4 + length as usize];
if !fix_errors && crc32(chunk_bytes) != crc {
return Err(PngError::new(&format!(
"CRC Mismatch in {} chunk; May be recoverable by using --fix",
@@ -202,13 +195,13 @@ pub fn parse_ihdr_chunk(
0 => ColorType::Grayscale {
transparent_shade: trns_data
.filter(|t| t.len() >= 2)
.map(|t| u16::from_be_bytes([t[0], t[1]])),
.map(|t| read_be_u16(&t[0..2])),
},
2 => ColorType::RGB {
transparent_color: trns_data.filter(|t| t.len() >= 6).map(|t| RGB16 {
r: u16::from_be_bytes([t[0], t[1]]),
g: u16::from_be_bytes([t[2], t[3]]),
b: u16::from_be_bytes([t[4], t[5]]),
r: read_be_u16(&t[0..2]),
g: read_be_u16(&t[2..4]),
b: read_be_u16(&t[4..6]),
}),
},
3 => ColorType::Indexed {
@@ -245,7 +238,12 @@ fn palette_to_rgba(
}

#[inline]
fn read_be_u32(bytes: &[u8]) -> u32 {
pub fn read_be_u16(bytes: &[u8]) -> u16 {
u16::from_be_bytes(bytes.try_into().unwrap())
}

#[inline]
pub fn read_be_u32(bytes: &[u8]) -> u32 {
u32::from_be_bytes(bytes.try_into().unwrap())
}
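For readers outside the crate, the now-public helpers above are equivalent to this standalone sketch; note that both panic if the slice length is not exactly 2 or 4 bytes (the `try_into` conversion fails), which is why callers slice exact ranges first:

```rust
// Standalone equivalents of oxipng's read_be_u16/read_be_u32 helpers.
// Both convert a big-endian byte slice to an integer and panic on a
// slice of the wrong length, so callers must bounds-check beforehand.
fn read_be_u16(bytes: &[u8]) -> u16 {
    u16::from_be_bytes(bytes.try_into().unwrap())
}

fn read_be_u32(bytes: &[u8]) -> u32 {
    u32::from_be_bytes(bytes.try_into().unwrap())
}

fn main() {
    // PNG stores multi-byte integers big-endian: most significant byte first.
    assert_eq!(read_be_u16(&[0x01, 0x02]), 0x0102);
    assert_eq!(read_be_u32(&[0x00, 0x00, 0x01, 0x02]), 258);
    println!("{}", read_be_u16(&[0xFF, 0x00]));
}
```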

83 changes: 45 additions & 38 deletions src/lib.rs
@@ -59,6 +59,7 @@ pub use crate::{
options::{InFile, Options, OutFile},
};

mod apng;
mod atomicmin;
mod colors;
mod deflate;
@@ -166,7 +167,7 @@ impl RawImage {
.filter(|c| opts.strip.keep(&c.name))
.cloned()
.collect();
postprocess_chunks(&mut png, opts, deadline, &self.png.ihdr);
postprocess_chunks(&mut png, opts, &self.png.ihdr);

Ok(png.output())
}
Expand Down Expand Up @@ -364,9 +365,11 @@ fn optimize_png(
if let Some(new_png) = optimize_raw(raw.clone(), &opts, deadline.clone(), max_size) {
png.raw = new_png.raw;
png.idat_data = new_png.idat_data;
png.filter = new_png.filter;
}

postprocess_chunks(png, &opts, deadline, &raw.ihdr);
postprocess_chunks(png, &opts, &raw.ihdr);
recompress_frames(png, &opts, deadline)?;

let output = png.output();

@@ -530,6 +533,8 @@ fn optimize_raw(
raw: png,
idat_data,
aux_chunks: Vec::new(),
frames: Vec::new(),
filter: Some(filter),
};
if image.estimated_output_size() < max_size.unwrap_or(usize::MAX) {
debug!("Found better combination:");
@@ -549,6 +554,8 @@
raw: result.image,
idat_data: result.idat_data,
aux_chunks: Vec::new(),
frames: Vec::new(),
filter: Some(result.filter),
};
if image.estimated_output_size() < max_size.unwrap_or(usize::MAX) {
debug!("Found better combination:");
@@ -656,12 +663,7 @@ fn report_format(prefix: &str, png: &PngImage) {
}

/// Perform cleanup of certain chunks from the `PngData` object, after optimization has been completed
fn postprocess_chunks(
png: &mut PngData,
opts: &Options,
deadline: Arc<Deadline>,
orig_ihdr: &IhdrData,
) {
fn postprocess_chunks(png: &mut PngData, opts: &Options, orig_ihdr: &IhdrData) {
if let Some(iccp_idx) = png.aux_chunks.iter().position(|c| &c.name == b"iCCP") {
// See if we can replace an iCCP chunk with an sRGB chunk
let may_replace_iccp = opts.strip != StripChunks::None && opts.strip.keep(b"sRGB");
@@ -716,38 +718,43 @@
!invalid
});
}
}

// Find fdAT chunks and attempt to recompress them
// Note if there are multiple fdATs per frame then decompression will fail and nothing will change
let mut fdat: Vec<_> = png
.aux_chunks
.iter_mut()
.filter(|c| &c.name == b"fdAT")
.collect();
if opts.idat_recoding && !fdat.is_empty() {
let buffer_size = orig_ihdr.raw_data_size();
fdat.par_iter_mut()
.with_max_len(1)
.enumerate()
.for_each(|(i, c)| {
if deadline.passed() || c.data.len() <= 4 {
return;
}
if let Ok(mut data) = deflate::inflate(&c.data[4..], buffer_size).and_then(|data| {
let max_size = AtomicMin::new(Some(c.data.len() - 5));
opts.deflate.deflate(&data, &max_size)
}) {
debug!(
"Recompressed fdAT #{:<2}: {} ({} bytes decrease)",
i,
c.data.len(),
c.data.len() - 4 - data.len()
);
c.data.truncate(4);
c.data.append(&mut data);
}
});
/// Recompress the additional frames of an APNG
fn recompress_frames(png: &mut PngData, opts: &Options, deadline: Arc<Deadline>) -> PngResult<()> {
if !opts.idat_recoding || png.frames.is_empty() {
return Ok(());
}
// Use the same filter chosen for the main image
// No filter means we failed to optimise the main image and we shouldn't bother trying here
let Some(filter) = png.filter else {
return Ok(());
};
png.frames
.par_iter_mut()
.with_max_len(1)
.enumerate()
.try_for_each(|(i, frame)| {
if deadline.passed() {
return Ok(());
}
let mut ihdr = png.raw.ihdr.clone();
ihdr.width = frame.width;
ihdr.height = frame.height;
let image = PngImage::new(ihdr, &frame.data)?;
let filtered = image.filter_image(filter, opts.optimize_alpha);
let max_size = AtomicMin::new(Some(frame.data.len() - 1));
if let Ok(data) = opts.deflate.deflate(&filtered, &max_size) {
debug!(
"Recompressed fdAT #{:<2}: {} ({} bytes decrease)",
i,
data.len(),
frame.data.len() - data.len()
);
frame.data = data;
}
Ok(())
})
}
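The delay fields that each `Frame` carries through recompression are a fraction of a second; per the APNG specification, a `delay_den` of 0 is to be interpreted as 100. A standalone sketch of the effective delay computation (helper name is illustrative, not part of this PR):

```rust
// Effective per-frame delay in seconds from fcTL's delay_num/delay_den.
// Per the APNG spec, a denominator of 0 is interpreted as 100.
fn frame_delay_secs(delay_num: u16, delay_den: u16) -> f64 {
    let den = if delay_den == 0 { 100 } else { u32::from(delay_den) };
    f64::from(delay_num) / f64::from(den)
}

fn main() {
    assert_eq!(frame_delay_secs(1, 30), 1.0 / 30.0); // ~33 ms per frame
    assert_eq!(frame_delay_secs(5, 0), 5.0 / 100.0); // den 0 means den 100
    println!("{}", frame_delay_secs(1, 10));
}
```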

/// Check if an image was already optimized prior to oxipng's operations