On your local machine:
- Docker, with Docker Engine running
- A Docker Hub account, logged in via the CLI
- npm
- AWS CLI installed and configured
On AWS:
- An EC2 instance (type: Amazon Linux 2)
In your local machine's terminal, within the Bento root folder (created during the Bento pipeline deployment process), or any other folder, run:
git clone https://github.com/bento-video/bento-dashboard-backend.git && cd bento-dashboard-backend
Within the Dockerfile (found in the bento-dashboard-backend folder), the following environment variables require values:
ENV START_BUCKET
ENV END_BUCKET
Enter the following command to view all of your bucket names:
aws s3api list-buckets --query "Buckets[].Name"
There will be a bucket with bento-prod-videouploadbucket in its name. Use this bucket's full name for the value of ENV START_BUCKET.
There will be a bucket with bento-prod-processedvideosbucket in its name. Use the full bucket name for the value of ENV END_BUCKET.
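Picking the two bucket names out of that listing can be scripted; a minimal sketch, using hypothetical bucket names (the `-abc123`/`-def456` suffixes are placeholders for the random suffixes your deployment generates):

```shell
# Hypothetical output of:
#   aws s3api list-buckets --query "Buckets[].Name" --output text
buckets="bento-prod-videouploadbucket-abc123 bento-prod-processedvideosbucket-def456 other-bucket"

# Filter for the upload and processed buckets by their fixed name fragments
start_bucket=$(echo "$buckets" | tr ' ' '\n' | grep 'videouploadbucket')
end_bucket=$(echo "$buckets" | tr ' ' '\n' | grep 'processedvideosbucket')
echo "START_BUCKET=$start_bucket"
echo "END_BUCKET=$end_bucket"
```

The values printed are what you would paste into the Dockerfile's ENV START_BUCKET and ENV END_BUCKET lines.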
ENV RECORD_UPLOAD_LAMBDA
ENV EXECUTOR_LAMBDA
These variables reference the ARNs of the recordUpload and executor Lambdas. The following commands list the properties of these Lambdas:
aws lambda get-function --function-name recordUpload
aws lambda get-function --function-name executor
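The ARN is the FunctionArn field in that output. A minimal sketch of extracting it from an abridged, hypothetical response (the account ID and region are placeholders); alternatively, pass `--query "Configuration.FunctionArn" --output text` to the AWS CLI to get the ARN directly:

```shell
# Abridged, hypothetical output of `aws lambda get-function --function-name recordUpload`
json='{"Configuration": {"FunctionName": "recordUpload", "FunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:recordUpload"}}'

# Pull out the FunctionArn value
arn=$(echo "$json" | sed -n 's/.*"FunctionArn": "\([^"]*\)".*/\1/p')
echo "$arn"
```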
- ENV REGION: your AWS region
- ENV AWS_ACCESS_KEY_ID: your AWS access key
- ENV AWS_SECRET_ACCESS_KEY: your AWS secret access key
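Filled in, the ENV section of the Dockerfile would look something like the sketch below; every value is a placeholder to be replaced with the names, ARNs, region, and keys you gathered above:

```dockerfile
# All values below are placeholders -- substitute your own
ENV START_BUCKET=bento-prod-videouploadbucket-abc123
ENV END_BUCKET=bento-prod-processedvideosbucket-def456
ENV RECORD_UPLOAD_LAMBDA=arn:aws:lambda:us-east-1:123456789012:function:recordUpload
ENV EXECUTOR_LAMBDA=arn:aws:lambda:us-east-1:123456789012:function:executor
ENV REGION=us-east-1
ENV AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
ENV AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```

Because these keys end up baked into the image, the image must never be pushed to a public repository (see the warning below).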
docker build -t yourhubusername/bentobackend .
IMPORTANT: immediately after pushing this image, log in to Docker Hub and configure your settings to make this repo private, as the image contains your AWS keys.
docker push yourhubusername/bentobackend
2. Install Docker on Amazon Linux 2
Connect to your EC2 instance in your terminal and enter the following commands:
sudo yum update -y &&
sudo amazon-linux-extras install docker &&
sudo service docker start &&
sudo usermod -a -G docker ec2-user &&
sudo chmod 666 /var/run/docker.sock
Log in to Docker Hub (docker login --username=yourhubusername) within your EC2 terminal, then enter the following command:
docker run --rm -d -v ${PWD}:/app -v /app/node_modules -v /app/package.json -p 3001:3001 yourhubusername/bentobackend
Within the AWS web console, modify the inbound rules for your EC2 instance:
Type: Custom TCP
Protocol: TCP
Port range: 3001
Source: My IP (or any you want to authorize to interact with your Bento pipeline)
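The same inbound rule can be added from the CLI; a sketch, where `sg-0123456789` is a placeholder for your instance's security group ID and the IP is a hypothetical public address (in practice you could fetch it with `curl -s https://checkip.amazonaws.com`):

```shell
# Hypothetical public IP of the machine you want to authorize
my_ip="203.0.113.7"
cidr="${my_ip}/32"

# The equivalent CLI call (sg-0123456789 is a placeholder):
#   aws ec2 authorize-security-group-ingress \
#     --group-id sg-0123456789 --protocol tcp --port 3001 --cidr "$cidr"
echo "$cidr"
```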
In your local machine's terminal, within the Bento root folder, or any other folder, run:
git clone https://github.com/bento-video/bento-dashboard.git && cd bento-dashboard
The following variable references the public endpoint of your EC2 instance:
REACT_APP_API_ENDPOINT
Change the hostname to your EC2 instance's public IP or DNS name; both values are returned in the output of the following command:
aws ec2 describe-instances
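The endpoint value is the public hostname plus the Express app's port (3001). A sketch, using a hypothetical DNS name (in practice you could get it directly with `aws ec2 describe-instances --query "Reservations[].Instances[].PublicDnsName" --output text`):

```shell
# Hypothetical public DNS name of your EC2 instance
public_dns="ec2-203-0-113-7.compute-1.amazonaws.com"

# REACT_APP_API_ENDPOINT should point at the Express app's port (3001)
endpoint="http://${public_dns}:3001"
echo "REACT_APP_API_ENDPOINT=${endpoint}"
```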
npm install && npm run build
Within the AWS S3 web console:
- create a new S3 bucket
- remove the Block all public access selection
- move all the files (and folder) within bento-dashboard/build to this bucket:
aws s3 sync build/ s3://your-bucket-name --acl public-read
- navigate to the Properties tab and select Static website hosting
- select Use this bucket to host a website, with Index document: index.html
- copy Endpoint; this is your endpoint for accessing the Bento Dashboard front-end from your browser
- add a policy (Permissions -> Bucket Policy) to this bucket to enable GET requests to the objects (files) of this bucket. Note: this will allow anyone with the above endpoint to access these static React files, but access to the content of your pipeline is still secured by the entry IP address you configured for the Express app port on EC2 in step 4:
{
"Version":"2008-10-17",
"Statement":[{
"Sid":"AllowPublicRead",
"Effect":"Allow",
"Principal": {
"AWS": "*"
},
"Action":["s3:GetObject"],
"Resource":["arn:aws:s3:::yourbucketname/*"]
}]
}
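The policy can also be applied from the CLI instead of the web console; a sketch that writes the policy above to a file and then (in the comment) applies it, where `yourbucketname` is the placeholder from the policy:

```shell
# Save the bucket policy to a file
cat > /tmp/bento-dashboard-policy.json <<'EOF'
{
  "Version": "2008-10-17",
  "Statement": [{
    "Sid": "AllowPublicRead",
    "Effect": "Allow",
    "Principal": { "AWS": "*" },
    "Action": ["s3:GetObject"],
    "Resource": ["arn:aws:s3:::yourbucketname/*"]
  }]
}
EOF

# Then apply it (replace yourbucketname with your dashboard bucket):
#   aws s3api put-bucket-policy --bucket yourbucketname \
#     --policy file:///tmp/bento-dashboard-policy.json
grep -c '"Sid"' /tmp/bento-dashboard-policy.json
```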