This is a small collection of shell scripts that handles a few image and S3 tasks. It was written on macOS and has not been tested on other operating systems, but it should work elsewhere with slight adjustments.
- `brew install imagemagick`, or the like (link to the ImageMagick page).
  - ImageMagick is the most popular image CLI library; it is written mostly in C.
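Since `brew` is macOS-only, a quick portability check like the following can confirm the ImageMagick tools are on your `PATH`, whatever OS you are on (a sketch; `have_imagemagick` is a hypothetical helper, and the assumption is that the scripts call `convert` — ImageMagick 7 also ships a `magick` entry point):

```shell
# Check whether an ImageMagick CLI entry point is installed.
have_imagemagick() {
  command -v convert >/dev/null 2>&1 || command -v magick >/dev/null 2>&1
}

if have_imagemagick; then
  echo "ImageMagick is installed"
else
  echo "ImageMagick is missing: try 'brew install imagemagick' (macOS) or 'apt-get install imagemagick' (Debian/Ubuntu)"
fi
```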
- Optionally, hook up an AWS account to your CLI (only needed for the S3 scripts).
- Node, if you use the utility script `GetJSONData.sh`; otherwise you don't need it.
- Run this so you do not have to use `sudo` (or run with `sudo` if you prefer):

  ```shell
  cd Scripts && chmod +x * && cd .. && cd Javascript && chmod +x * && cd ..
  ```

  - This grants execute permission on the scripts; they do not touch anything outside this directory. All that happens is image conversion and pushing to S3, as mentioned before.
  - If you use `sudo`, you'll just have to type your computer's password several times.
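The same permissions can be granted with a single `find` invocation, which also catches files in nested folders. A sketch, assuming it is run from the repository root (`make_executable` is a hypothetical helper, not part of the repo):

```shell
# Mark every regular file under the given directories as executable.
make_executable() {
  find "$@" -type f -exec chmod +x {} +
}

# Usage, from the repo root:
#   make_executable Scripts Javascript
```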
- Clone this repository and store it wherever is fitting. You can do several things here:
- Create images in next-gen formats (as Chrome (Google) and WebKit (Apple) are optimised for specific image formats)
  - WebP for Google
  - JPEG 2000 (`jp2`) for Apple
- Create placeholders for your images (for lazy loading)
- Push this content to an AWS S3 bucket
- After cloning, you have several files to work with. They are all shell scripts; think of them as utility files. Below is a quick rundown of each one.
  - Only run the scripts from the root directory.
  - Other stuff to think of in the future...?
### createS3Bucket.sh

Creates an S3 bucket with public read access. Public read access means anyone can fetch an image from its S3 URL, but nobody can publicly `POST`, `PUT`, or `DELETE`; only `GET`.

Do not put sensitive data in the bucket. AWS strongly recommends against granting public read access, as sensitive data such as voter and military information has been left publicly accessible this way. That should not matter here, since this bucket is solely for images.
**What to do before calling it**

- Set up an AWS account and add credentials to your terminal if you have not. Other than that, think of a good bucket name.

**How to call it**

```shell
bash ./Scripts/createS3Bucket.sh
```

**Options**

- `bucket="somebucketname"`
  - Required
  - See the requirements for naming an S3 bucket
- `region=some-region-1`
  - Not required
  - Default: `us-east-1`
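Bucket names have strict rules: 3 to 63 characters; lowercase letters, digits, dots, and hyphens only; starting and ending with a letter or digit. A rough pre-flight check can be sketched in plain shell (`valid_bucket_name` is a hypothetical helper, not part of the repo, and it does not cover every rule, e.g. the ban on IP-address-like names):

```shell
# Return 0 if the name passes the basic S3 bucket-naming rules, 1 otherwise.
valid_bucket_name() {
  # 3-63 characters
  [ ${#1} -ge 3 ] && [ ${#1} -le 63 ] || return 1
  # only lowercase letters, digits, dots, and hyphens
  case "$1" in *[!a-z0-9.-]*) return 1 ;; esac
  # must start and end with a letter or digit
  case "$1" in [a-z0-9]*[a-z0-9]) return 0 ;; *) return 1 ;; esac
}

# valid_bucket_name "somebucketname"  -> exit status 0
# valid_bucket_name "Bad_Name"        -> exit status 1
```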
**What you'll see in the console**

- `{"Location": "/somebucketname"}`
  - This isn't important; it comes automatically from the AWS CLI.
- Then this: `upload: Images/duckJPG.jpg to s3://somebucketname/duckJPG.jpg`
  - Again, output from the AWS CLI.
- A string that says: `Check if it works by clicking this url: https://somebucketname.s3.amazonaws.com/duckJPG.jpg`
  - Copied into your browser, that URL should show a duck image, as that is the test image. Take a peek in your AWS console as well.

**What to know**

- You now have this bucket in your AWS account. It is not optimised for security or the like, but it works and is accessible. Best of all, you can now push images to it.
### ConvertFiles.sh

Converts a group of images to next-gen formats.
**What to do before calling it**

- Add all the images you want to convert to the `Images` folder in the root directory; that's your main folder for images. After conversion, all WebP images will be in a `WebpFiles` folder, all JPEG 2000 (`jp2`) images in a `JP2Files` folder, and placeholders in the `Placeholders` folder.
- You should also resize your images beforehand. The fewer bytes an image contains, the faster it loads for the client, and images usually do not need to be large for high-quality rendering.
**How to call it**

```shell
bash ./Scripts/ConvertFiles.sh
```

**Options**

- None

**What you'll see in the console**

- Nothing at the moment.

**What to know**

- You now have `jp2`, `webp`, and placeholder versions of your previous images.
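The per-image work can be pictured with a dry-run sketch that only prints the ImageMagick commands it would run. The output folders come from this README; `convert_plan` and the 3% placeholder resize are assumptions for illustration, not the script's actual settings:

```shell
# Print, without executing, the ImageMagick conversions for each image passed in.
convert_plan() {
  for img in "$@"; do
    base=${img##*/}    # strip the directory
    base=${base%.*}    # strip the extension
    echo "convert $img WebpFiles/$base.webp"
    echo "convert $img JP2Files/$base.jp2"
    echo "convert $img -resize 3% Placeholders/$base.jpg"
  done
}

# e.g. convert_plan Images/*.jpg
```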
### MoveImagesToS3.sh

Moves your images to the S3 bucket of your choice.
**How to call it**

```shell
bash ./Scripts/MoveImagesToS3.sh bucket="somebucketname"
```

**Options**

- `bucket="somebucketname"`
  - Required

**What you'll see in the console**

- A line like this for each file written to S3 (it comes from the AWS CLI):
  `upload: ./duck.jp2 to s3://somebucketname/jp2Images/duck.jp2`
- A message on how to check the images, with your URL:
  - `This is the url to see the images`
  - `https://somebucketname.s3.amazonaws.com/some-extension`
  - `some-extension is the path and filename, something like fallback/someImage.jpg.`
  - `ie. https://somebucketname.s3.amazonaws.com/fallback/someImage.jpg`
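That URL format is simple enough to build yourself; a minimal sketch (`s3_url` is a hypothetical helper, not part of the repo):

```shell
# Build the public URL for an object from a bucket name and an object key.
s3_url() {
  echo "https://$1.s3.amazonaws.com/$2"
}

# s3_url somebucketname fallback/someImage.jpg
# -> https://somebucketname.s3.amazonaws.com/fallback/someImage.jpg
```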
### GetJSONData.sh

Creates a file with the corresponding URLs for your images, for easy reference: no need to use the AWS console.
**How to call it**

```shell
bash ./Scripts/GetJSONData.sh bucket="somebucketname"
```

**Options**

- `bucket="somebucketname"`
  - Required

**What you'll see in the console**

- `The file has been saved! It is stored as imageUrls.json in the home directory.`
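If you would rather skip the Node helper, the same idea can be sketched in plain shell: turn a bucket name and a list of object keys into a JSON array of public URLs. The exact shape of `imageUrls.json` is an assumption here, and `keys_to_json` is a hypothetical helper, not part of the repo:

```shell
# Print a JSON array of public URLs for the given bucket and object keys.
keys_to_json() {
  bucket=$1
  shift
  printf '['
  sep=''
  for key in "$@"; do
    printf '%s"https://%s.s3.amazonaws.com/%s"' "$sep" "$bucket" "$key"
    sep=','
  done
  printf ']\n'
}

# keys_to_json somebucketname WebpFiles/duck.webp jp2Images/duck.jp2
```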
- Change the scripts to whatever you want; these options are not set in stone. Read more about the AWS S3 CLI or the ImageMagick CLI to see additional possibilities.