Automagically optimize your images on S3 with the magic of AWS Lambda.

Optim is a super-simple Lambda function that listens to an S3 bucket for uploads and runs everything it can through imagemin.
- Clone this repo
- Run `npm install`
- Fill in `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` in `.env` with a set of credentials that can create Lambda functions (alternatively, have these already in your environment)
- Create an IAM role for Optim to use. It needs the following permissions on all the S3 buckets you want to use (allowing these operations on ARN `*` is easiest to start with):
  - `getObject`
  - `putObject`
  - `putObjectAcl`
- Find the ARN for this role. It looks something like `arn:aws:iam::1234567890:role/rolename`.
- Fill in `AWS_ROLE_ARN` in `.env`
- Run `npm run deploy`
- Hurrah, your Lambda function is now deployed! It'll be created with the name `optim-production` unless you changed values in `.env`
- You can now hook this function up to any S3 bucket you like in the management console. The easiest way is to follow AWS's guide.
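The IAM policy attached to that role might look something like the sketch below. This is an assumption about the policy shape, not a file shipped with the project; the `"Resource": "*"` matches the permissive starting point suggested above, and you can narrow it to specific bucket ARNs later:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": "*"
    }
  ]
}
```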
There are two sets of configuration here. The `.env` file contains configuration related to setup and deployment.

In `.env`:

- `AWS_ACCESS_KEY_ID`: the AWS access key used to deploy the Lambda function
- `AWS_SECRET_ACCESS_KEY`: the corresponding secret access key
- `AWS_ROLE_ARN`: the role the Lambda function will be executed with
- `AWS_REGION`: which region to deploy to
- `AWS_FUNCTION_NAME` and `AWS_ENVIRONMENT`: control the naming of the Lambda function created
- `AWS_MEMORY_SIZE`: the amount of memory given to your Lambda. This also determines how much CPU share it gets; since optimizing images is fairly intensive, it's probably best to keep this high
- `AWS_TIMEOUT`: the runtime timeout for the Lambda in seconds, up to 5 minutes. Again, image optimization is fairly intensive, so you'll probably want to leave this at the maximum of 300
- `EXCLUDE_PREFIX`: skip optimizing images whose keys start with this prefix
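Put together, a filled-in `.env` might look like this. The values are placeholders (the keys are AWS's documented example credentials), and the split of the deployed name `optim-production` into `AWS_FUNCTION_NAME` and `AWS_ENVIRONMENT` is an assumption about how the two combine:

```
AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRkiCwMYEXAMPLEKEY
AWS_ROLE_ARN=arn:aws:iam::1234567890:role/rolename
AWS_REGION=us-east-1
AWS_FUNCTION_NAME=optim
AWS_ENVIRONMENT=production
AWS_MEMORY_SIZE=1024
AWS_TIMEOUT=300
EXCLUDE_PREFIX=thumbnails/
```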
In `event_sources.json`:

- `Bucket`: the bucket to listen to
After configuring, deploy the Lambda with `npm run deploy`.
This project can also optimize all the existing images in a bucket. Run `npm run package`, then `node dist/optimizeAll.js`. It will optimize images using all the CPUs on your machine. I've used this to successfully optimize a bucket with 490k images (90% JPG, 9% PNG, 1% other) in less than 2 hours on a 16-core EC2 instance.