Cloud file resource loading

It is possible to use one of these solutions in the project. The cloud controller also works with Google: it is enough to set cloud.provider=google and change the parameters in application.properties that start with google (google.credentialsPath, google.projectId, etc.). To choose the provider before starting, set cloud.provider=google or cloud.provider=aws. File upload from the frontend looks like the following:

(UML diagram of the file upload flow)
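A minimal sketch of how such a property-driven provider switch could be wired in Spring Boot; the interface and bean names below are hypothetical illustrations, not the project's actual classes:

```java
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical sketch: exactly one uploader bean is created,
// depending on the cloud.provider property value.
@Configuration
public class CloudProviderConfig {

    // Minimal stand-in for whatever abstraction the cloud controller calls.
    public interface SignedFormProvider {
        String providerName();
    }

    @Bean
    @ConditionalOnProperty(name = "cloud.provider", havingValue = "google")
    public SignedFormProvider googleProvider() {
        return () -> "google";
    }

    @Bean
    @ConditionalOnProperty(name = "cloud.provider", havingValue = "aws")
    public SignedFormProvider awsProvider() {
        // Also used for Selectel and IBM, which share the S3-compatible API.
        return () -> "aws";
    }
}
```

With this kind of wiring, only the bean matching the configured cloud.provider value is instantiated at startup.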

Pricing

If one of the free solutions is going to be used, nginx can be used to cache static files and decrease the number of downloads from the cloud. Example nginx configuration: https://github.com/coopernurse/nginx-s3-proxy

AWS

https://aws.amazon.com/free/

Offers the following for 12 months:

  • 5GB free space
  • 2000 monthly put (upload and bucket list) requests
  • 20000 monthly get (download) requests

When the quota is exceeded, the overage is automatically charged to the registered card.

There is no usage-capping feature to prevent exceeding the quota and being charged, but it is possible to create alerts, such as sending an email when 80% of the bucket is full or 80% of the get-request limit is used.

Selectel

Looks cheaper than the other options; we are still waiting for a response to our request for a free quota dedicated to this project.

IBM

https://www.ibm.com/cloud/free/

No credit card is required to create an account. The free account offers the following:

  • 25GB free space
  • 2000 monthly put (upload and bucket list) requests
  • 20000 monthly get (download) requests

Google

https://cloud.google.com/free/docs/gcp-free-tier

A credit card is required. Offers the following for 12 months, plus a free $300 credit:

  • 5GB free space
  • 5000 monthly put (upload and bucket list) requests
  • 50000 monthly get (download) requests

After the 12 months end or the $300 credit is used up, there is a 1-month period during which the services cannot be used, but the data remaining in the cloud can still be saved.

Configuring Backend

Google

Google has its own implementation for cloud upload. The following properties should be set correctly for Google:

https://cloud.google.com/storage/docs/access-control/signing-urls-with-helpers#storage-signed-url-object-java

cloud.provider=google

# The credentials JSON can be downloaded from https://console.cloud.google.com/apis/credentials -> Create Credentials -> Service Account -> Grant Users Access to This Service Account -> Create Key -> JSON

google.credentialsPath=C:\\Users\\User_Name\\pragmatic-port-267812-27d76242250a.json

google.projectId=pragmatic-port-267812

google.bucketName=somebucket
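For reference, a minimal sketch of producing a V4 signed upload URL with the Google Cloud Storage Java client, following the helper documentation linked above; the object name and expiry below are placeholder choices, not values taken from the project:

```java
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.HttpMethod;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

import java.io.FileInputStream;
import java.net.URL;
import java.util.concurrent.TimeUnit;

public class GoogleSignedUrlExample {

    public static void main(String[] args) throws Exception {
        // Values mirror the google.* properties above; adjust to your own project.
        String credentialsPath = "C:\\Users\\User_Name\\pragmatic-port-267812-27d76242250a.json";
        String projectId = "pragmatic-port-267812";
        String bucketName = "somebucket";

        Storage storage = StorageOptions.newBuilder()
                .setProjectId(projectId)
                .setCredentials(GoogleCredentials.fromStream(new FileInputStream(credentialsPath)))
                .build()
                .getService();

        // V4 signed URL that lets the frontend PUT the file directly to the bucket.
        BlobInfo blobInfo = BlobInfo.newBuilder(bucketName, "folder/test.jpeg").build();
        URL signedUrl = storage.signUrl(
                blobInfo,
                15, TimeUnit.MINUTES,
                Storage.SignUrlOption.httpMethod(HttpMethod.PUT),
                Storage.SignUrlOption.withV4Signature());

        System.out.println(signedUrl);
    }
}
```

The returned URL can be handed to the frontend, which then uploads the file directly to the bucket with an HTTP PUT.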

AWS, Selectel, IBM

AWS, Selectel, and IBM use the same API and properties for file upload.

If one of these is going to be used, set the following in application.properties:

cloud.provider=aws

To provide credentials, either set these two properties as JVM parameters:

-Daws.accessKeyId="someAccessKeyId" -Daws.secretAccessKey="someSecretKey"

or set aws.credentialsPath=someFolder/credentials.properties in application.properties and create a file at someFolder/credentials.properties containing these properties:

aws.accessKeyId=someAccessKeyId
aws.secretAccessKey=someSecretKey
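A sketch of how the backend could resolve these credentials, assuming the AWS SDK v2 is on the classpath; the SDK version and the helper class here are assumptions, not taken from the project:

```java
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.auth.credentials.SystemPropertyCredentialsProvider;

import java.io.FileInputStream;
import java.util.Properties;

public class CredentialsResolver {

    // Hypothetical helper: prefer the -Daws.accessKeyId / -Daws.secretAccessKey
    // JVM parameters, otherwise fall back to the file referenced by aws.credentialsPath.
    static AwsCredentialsProvider resolve(String credentialsPath) throws Exception {
        if (System.getProperty("aws.accessKeyId") != null) {
            // Reads aws.accessKeyId and aws.secretAccessKey from system properties.
            return SystemPropertyCredentialsProvider.create();
        }
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream(credentialsPath)) {
            props.load(in);
        }
        return StaticCredentialsProvider.create(AwsBasicCredentials.create(
                props.getProperty("aws.accessKeyId"),
                props.getProperty("aws.secretAccessKey")));
    }
}
```

SystemPropertyCredentialsProvider reads exactly the aws.accessKeyId and aws.secretAccessKey system properties mentioned above.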

Cloud specific properties

AWS

https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-post-example.html

# access key id and secret access key are visible in https://console.aws.amazon.com/iam/home#/home -> users -> user -> security credentials

In credentials.properties

aws.accessKeyId=AKIAI7KLKATWVCMEKGPA

aws.secretAccessKey=AuME0IWCfv2fcHuA6vk*********************

In application.properties

aws.region=us-east-2

aws.bucketName=testbucket

aws.bucketLink=http://${aws.bucketName}.s3.amazonaws.com
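The sigv4-post-example page linked above walks through signing the browser-based POST upload policy. A minimal sketch of that signing step in Java, assuming the backend computes the signature itself rather than using an SDK helper (the class and method names are illustrative):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class PostPolicySigner {

    static byte[] hmacSha256(byte[] key, String data) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(key, "HmacSHA256"));
        return mac.doFinal(data.getBytes(StandardCharsets.UTF_8));
    }

    static String hex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    /**
     * Signs a browser-based POST upload policy as described in the
     * sigv4-post-example page. dateStamp is yyyyMMdd (UTC), policyJson is the
     * policy document with the expiration and conditions for the upload form.
     */
    static String sign(String secretAccessKey, String dateStamp, String region, String policyJson)
            throws Exception {
        String base64Policy = Base64.getEncoder()
                .encodeToString(policyJson.getBytes(StandardCharsets.UTF_8));
        // AWS4 signing key derivation chain.
        byte[] kDate = hmacSha256(("AWS4" + secretAccessKey).getBytes(StandardCharsets.UTF_8), dateStamp);
        byte[] kRegion = hmacSha256(kDate, region);
        byte[] kService = hmacSha256(kRegion, "s3");
        byte[] kSigning = hmacSha256(kService, "aws4_request");
        // The signature over the base64-encoded policy.
        return hex(hmacSha256(kSigning, base64Policy));
    }
}
```

The base64-encoded policy goes into the form's policy field and the returned hex string into x-amz-signature.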

Selectel

https://kb.selectel.ru/54789216.html

In credentials.properties

# access key id and secret access key are visible in https://my.selectel.ru/storage/users

aws.accessKeyId=111023_Someuser

aws.secretAccessKey=w6vbCXbHYV

In application.properties

aws.bucketName=epmbrn

aws.region=ru-1a

aws.bucketLink=https://s3.selcdn.ru/${aws.bucketName}
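Because Selectel (and IBM below) expose an S3-compatible API, the same AWS client can be pointed at their endpoints. A sketch assuming the AWS SDK v2; the endpoint is derived from aws.bucketLink above, and path-style access is enabled to match the https://s3.selcdn.ru/&lt;bucket&gt; URL form:

```java
import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.S3Configuration;

import java.net.URI;

public class S3CompatibleClientFactory {

    // Builds an S3 client against a non-AWS, S3-compatible endpoint
    // (e.g. https://s3.selcdn.ru for Selectel).
    static S3Client build(String endpoint, String region, AwsCredentialsProvider credentials) {
        return S3Client.builder()
                .endpointOverride(URI.create(endpoint))
                .region(Region.of(region))
                .credentialsProvider(credentials)
                // Path-style addressing: https://<endpoint>/<bucket>/<key>
                .serviceConfiguration(S3Configuration.builder()
                        .pathStyleAccessEnabled(true)
                        .build())
                .build();
    }
}
```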

IBM

https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-post-example.html

In credentials.properties

# access key id and secret are available in https://cloud.ibm.com/iam/serviceids -> service id -> API Keys

aws.accessKeyId=8f638ab50e424d5ebd999edd32876013

aws.secretAccessKey=193ebd53794d29e1517401**************************

In application.properties

aws.region=us-south

aws.bucketName=cloud-object-storage-gg-cos-standard-koy

aws.bucketLink=https://s3.${aws.region}.cloud-object-storage.appdomain.cloud/${aws.bucketName}

Manual Testing

  1. Start the project locally
  2. Start Postman and import the collection https://www.getpostman.com/collections/a2d0d67b65da48479f0c
  3. In Postman, run the request titled "get signed form", pointing the URL to the local backend
  4. With the response parameters and a selected file, run the Postman request "create object"
  5. Run the request "list" to see that the file has been uploaded (public access should be allowed)
    1. Example bucket list response from AWS:

```xml
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Name>bucketname</Name>
  <Prefix></Prefix>
  <Marker></Marker>
  <MaxKeys>1000</MaxKeys>
  <IsTruncated>false</IsTruncated>
  <Contents>
    <Key>123/</Key>
    <LastModified>2020-02-14T09:02:56.000Z</LastModified>
    <ETag>"d41d8cd98f00b204e9800998ecf8427e"</ETag>
    <Size>0</Size>
    <StorageClass>STANDARD</StorageClass>
  </Contents>
  <Contents>
    <Key>123/456/</Key>
    <LastModified>2020-02-14T09:03:02.000Z</LastModified>
    <ETag>"d41d8cd98f00b204e9800998ecf8427e"</ETag>
    <Size>0</Size>
    <StorageClass>STANDARD</StorageClass>
  </Contents>
  <Contents>
    <Key>123/test.jpeg</Key>
    <LastModified>2020-02-14T09:10:48.000Z</LastModified>
    <ETag>"7ae9191463771fc777ce0e85b276b2be"</ETag>
    <Size>119709</Size>
    <StorageClass>STANDARD</StorageClass>
  </Contents>
  <Contents>
    <Key>test.jpeg</Key>
    <LastModified>2020-02-14T09:02:47.000Z</LastModified>
    <ETag>"7ae9191463771fc777ce0e85b276b2be"</ETag>
    <Size>119709</Size>
    <StorageClass>STANDARD</StorageClass>
  </Contents>
</ListBucketResult>
```

    2. Example bucket list response from Google Cloud:

```json
{
  "kind": "storage#objects",
  "items": [
    {
      "kind": "storage#object",
      "id": "bucketname/folder/Screenshot (9).png/1581512882946872",
      "selfLink": "https://www.googleapis.com/storage/v1/b/bucketname/o/bucketname%2FScreenshot%20(9).png",
      "mediaLink": "https://storage.googleapis.com/download/storage/v1/b/bucketname/o/bucketname%2FScreenshot%20(9).png?generation=1581512882946872&alt=media",
      "name": "folder/Screenshot (9).png",
      "bucket": "bucketname",
      "generation": "1581512882946872",
      "metageneration": "1",
      "contentType": "image/png",
      "storageClass": "STANDARD",
      "size": "266509",
      "md5Hash": "D/IBt1dA5nnBcsXQyHA7JA==",
      "crc32c": "/6us+g==",
      "etag": "CLiG4syKzOcCEAE=",
      "timeCreated": "2020-02-12T13:08:02.946Z",
      "updated": "2020-02-12T13:08:02.946Z",
      "timeStorageClassUpdated": "2020-02-12T13:08:02.946Z"
    },
    {
      "kind": "storage#object",
      "id": "bucketname/testfile.jpg/1581611602964030",
      "selfLink": "https://www.googleapis.com/storage/v1/b/bucketname/o/testfile.jpg",
      "mediaLink": "https://storage.googleapis.com/download/storage/v1/b/bucketname/o/testfile.jpg?generation=1581611602964030&alt=media",
      "name": "testfile.jpg",
      "bucket": "bucketname",
      "generation": "1581611602964030",
      "metageneration": "1",
      "contentType": "image/png",
      "storageClass": "STANDARD",
      "size": "266509",
      "md5Hash": "D/IBt1dA5nnBcsXQyHA7JA==",
      "crc32c": "/6us+g==",
      "etag": "CL7cka76zucCEAE=",
      "timeCreated": "2020-02-13T16:33:22.963Z",
      "updated": "2020-02-13T16:33:22.963Z",
      "timeStorageClassUpdated": "2020-02-13T16:33:22.963Z"
    }
  ]
}
```

Allowing Public Access

AWS

  • This will allow anyone with the bucket link to list the contents and download files, which might increase resource usage

To allow public access for bucket listing and file downloads:

  1. Change the Block Public Access setting at https://s3.console.aws.amazon.com/s3/buckets/bucketname/?region=us-east-2&tab=permissions
  2. Add the following to the Bucket Policy setting at https://s3.console.aws.amazon.com/s3/buckets/bucketname/?region=us-east-2&tab=permissions (more information: https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html#example-bucket-policies-use-case-2)

{ "Version": "2012-10-17", "Statement": [ { "Sid": "PublicRead", "Effect": "Allow", "Principal": "*", "Action": [ "s3:GetObject" ], "Resource": [ "arn:aws:s3:::bucketname/*" ] } ] }

  3. Add a CORS policy to allow file upload:

```xml
<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedHeader>*</AllowedHeader>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
  </CORSRule>
</CORSConfiguration>
```
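The same two settings can also be applied programmatically instead of through the console. A sketch assuming the AWS SDK v2 is available, equivalent to the policy and CORS rule shown above:

```java
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.CORSConfiguration;
import software.amazon.awssdk.services.s3.model.CORSRule;
import software.amazon.awssdk.services.s3.model.PutBucketCorsRequest;
import software.amazon.awssdk.services.s3.model.PutBucketPolicyRequest;

public class PublicAccessSetup {

    static void allowPublicRead(S3Client s3, String bucket) {
        // Same policy as above: anyone may GET objects in the bucket.
        String policy = "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Sid\":\"PublicRead\","
                + "\"Effect\":\"Allow\",\"Principal\":\"*\",\"Action\":[\"s3:GetObject\"],"
                + "\"Resource\":[\"arn:aws:s3:::" + bucket + "/*\"]}]}";
        s3.putBucketPolicy(PutBucketPolicyRequest.builder()
                .bucket(bucket)
                .policy(policy)
                .build());

        // Same CORS rule as above, so the browser can upload directly.
        s3.putBucketCors(PutBucketCorsRequest.builder()
                .bucket(bucket)
                .corsConfiguration(CORSConfiguration.builder()
                        .corsRules(CORSRule.builder()
                                .allowedOrigins("*")
                                .allowedHeaders("*")
                                .allowedMethods("PUT", "GET", "POST")
                                .build())
                        .build())
                .build());
    }
}
```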

Google Cloud

https://cloud.google.com/storage/docs/access-control/making-data-public
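The page above describes making bucket data public by granting allUsers the Storage Object Viewer role. A minimal sketch of that IAM change with the Java client, assuming application-default credentials; the bucket name is a placeholder:

```java
import com.google.cloud.Identity;
import com.google.cloud.Policy;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import com.google.cloud.storage.StorageRoles;

public class MakeBucketPublic {

    // Grants allUsers the objectViewer role on the bucket, which is the
    // bucket-level "making data public" approach from the page above.
    static void makePublic(String bucketName) {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        Policy policy = storage.getIamPolicy(bucketName);
        storage.setIamPolicy(bucketName, policy.toBuilder()
                .addIdentity(StorageRoles.objectViewer(), Identity.allUsers())
                .build());
    }
}
```

As with AWS, anyone with the link can then download objects, so the resource-usage caveat above applies here as well.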