how to downscale, compress non-knitr images #51

Closed
maxheld83 opened this issue Jan 31, 2017 · 4 comments
maxheld83 commented Jan 31, 2017

This isn't really a feature request; I just wanted to document a way I've been going about the problem.

Maybe @yihui will find this worthwhile to pursue as a blogdown feature; if so, I'll be happy to try to develop it into a PR, though I'm not sure where in the stack this "belongs", or whether it's a wise idea to do this in R at all.


In more general use cases of blogdown, involving a lot of non-knitr assets (knitr can deal with resolution and compression itself), people (like myself) might want to programmatically reduce the file size of assets in static/img.

Hugo itself doesn't (yet?) offer this kind of thing, and there is specialised software out there that a lot of people seem to recommend, such as hugulp.

I was looking for a simple solution that would work with just R and on Travis CI (for deployment), without additional dependencies or tools I don't know how to use.

So I hacked together a little script to do this exclusively in R, to be found here.

If you add this to some compression.R at your project root:

# resize and compress images ===
library(magick)
library(pbapply)  # just so we have a progress bar, because this thing takes a while

resize_n_compress <- function(file_in, file_out, xmax = 1920, quality = 70, cutoff = 100000) {
  # xmax: maximum width in pixels; height is scaled to retain the aspect ratio
  # quality: JPEG quality on magick's 0-100 scale, passed on to magick::image_write()
  #          (note: NOT the 0-1 scale of jpeg::writeJPEG())
  # cutoff: files smaller than this (in bytes) will not be touched
  if (file.size(file_in) < cutoff) {  # in this case, just copy the file
    if (file_in != file_out) {
      file.copy(from = file_in, to = file_out, overwrite = TRUE)
    }
  } else {  # larger than cutoff: magick workflow
    image_raw <- image_read(path = file_in)
    if (image_info(image_raw)$width > xmax) {  # only resize if wider than xmax
      image_resized <- image_scale(image = image_raw, geometry = as.character(xmax))
    } else {
      image_resized <- image_raw
    }
    image_write(image = image_resized, path = file_out, format = "jpeg", quality = quality)
  }
}
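For files below the cutoff, the function simply copies them through unchanged. Here is a quick, self-contained illustration of that branch in base R (no magick needed; the temp file is a dummy byte blob standing in for a small image, not part of the original script):

```r
# demonstrate the pass-through branch for files smaller than `cutoff`
cutoff <- 100000
file_in  <- tempfile(fileext = ".jpg")
file_out <- tempfile(fileext = ".jpg")
writeBin(as.raw(rep(0L, 1000)), file_in)  # 1 kB stand-in for a small image
if (file.size(file_in) < cutoff && file_in != file_out) {
  file.copy(from = file_in, to = file_out, overwrite = TRUE)
}
file.size(file_out) == file.size(file_in)  # TRUE: small files are copied verbatim
```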

find_large_files <- function(dir_original, dir_scaled) {
  # find only the files which actually NEED to be rescaled;
  # otherwise we would end up rescaling all files on every build, which is pretty bad
  # dir_original <- "static/img/"
  # dir_scaled <- "public/img/"
  all_original <- list.files(path = dir_original, pattern = "\\.jpg$", recursive = TRUE)
  needs_work <- vapply(all_original, function(f) {
    scaled <- paste0(dir_scaled, f)
    # process if no scaled copy exists yet, or if it is no smaller than the original
    !file.exists(scaled) || file.size(scaled) >= file.size(paste0(dir_original, f))
  }, FUN.VALUE = logical(1))
  return(all_original[needs_work])
}
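The skip rule (reprocess a file only when its scaled copy is missing or no smaller than the original) can be sanity-checked without any actual images. A hedged sketch using throwaway temp files; the `needs_work` helper below is hypothetical and just restates that rule in isolation:

```r
# hypothetical helper restating the skip rule
needs_work <- function(f, dir_original, dir_scaled) {
  scaled <- paste0(dir_scaled, f)
  !file.exists(scaled) || file.size(scaled) >= file.size(paste0(dir_original, f))
}

dir_orig <- paste0(tempfile("orig"), "/"); dir.create(dir_orig)
dir_out  <- paste0(tempfile("out"),  "/"); dir.create(dir_out)
writeBin(as.raw(rep(0L, 2000)), paste0(dir_orig, "a.jpg"))  # no scaled copy yet
writeBin(as.raw(rep(0L, 2000)), paste0(dir_orig, "b.jpg"))
writeBin(as.raw(rep(0L, 500)),  paste0(dir_out,  "b.jpg"))  # already compressed
needs_work("a.jpg", dir_orig, dir_out)  # TRUE  (never scaled)
needs_work("b.jpg", dir_orig, dir_out)  # FALSE (scaled copy is smaller)
```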

reduce_large_files <- function(dir_original, dir_scaled, xmax = 1920, quality = 70, cutoff = 100000) {
  large_files <- find_large_files(dir_original = dir_original, dir_scaled = dir_scaled)
  pbapply::pboptions(type = "txt")  # force a text progress bar, even in non-interactive sessions
  pblapply(X = large_files, FUN = function(x) {
    resize_n_compress(file_in = paste0(dir_original, x), file_out = paste0(dir_scaled, x), quality = quality, xmax = xmax, cutoff = cutoff)
  })
}

# now let's do this
invisible(reduce_large_files(dir_original = "static/img/", dir_scaled = "public/img/", quality = 50))

and then, for Travis fun, amend your _build.sh like so:

#!/bin/sh

Rscript -e "blogdown::install_hugo()"
Rscript -e "blogdown::build_site()"
Rscript compression.R

Travis will then compress and rescale all *.jpg files bigger than the specified cutoff.

For now, only the maximum width (xmax) can be specified; the height is scaled accordingly to retain the aspect ratio.

PS: one important addition would be to avoid re-scaling/compressing knitr assets. That would be wasteful.

yihui (Member) commented Jan 31, 2017

Thanks for sharing! I guess this, along with other related issues you posted to knitr and bookdown, will remain fairly low priority for at least a few months.

Anyway, if I were to crack this problem, I'd just use a shell script that calls convert (ImageMagick) or the R package magick to process images.

maxheld83 (Author) commented:
Thanks for the pointer @yihui – I moved over to magick, and the whole process is now a lot faster (updated above).
(The script now also avoids re-compressing already compressed images, though the solution is quite clumsy).

Not sure how I missed that fantastic package in the first place.
I'm still a little scared of the shell, so I'm sticking to R for now.
Understood that this is going to take a while (if ever) before this makes it into knitr and/or blogdown; I just wanted to share my hack job in case someone else might be interested in a quick fix.

jsonbecker commented:

This is pretty cool. I personally use an imageoptim git hook that compresses all *.png and *.jpg files the way I want upon checking them into source control.

yihui (Member) commented Aug 10, 2017

I prefer not to solve this specific issue, but to provide a completely general solution instead: blogdown will execute the script R/build.R if it exists. You can do whatever you want in this script.
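For reference, the general hook described here could pick up the compression script from this thread; a minimal sketch of an R/build.R (the compression.R filename is the one assumed earlier in the thread):

```r
# R/build.R -- blogdown runs this script, if it exists, when building the site.
# Here we just hand off to the image-compression script at the project root.
if (file.exists("compression.R")) {
  source("compression.R")
}
```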

@yihui yihui closed this as completed Aug 10, 2017
@yihui yihui added this to the v0.1 milestone Aug 10, 2017