From 29a9567bce56860d3c4266f1e3f547f974eaac90 Mon Sep 17 00:00:00 2001
From: mattwr18
Date: Mon, 6 Apr 2020 14:39:00 +0200
Subject: [PATCH 1/2] chore: Update docs for spaces backups

---
 deployment/digital-ocean/README.md | 12 ++++++++++++
 1 file changed, 12 insertions(+)

diff --git a/deployment/digital-ocean/README.md b/deployment/digital-ocean/README.md
index 12c2726918..424e7e0eae 100644
--- a/deployment/digital-ocean/README.md
+++ b/deployment/digital-ocean/README.md
@@ -24,3 +24,15 @@
 Digital Ocean kubernetes clusters don't have a graphical interface, so I suggest to setup the [kubernetes dashboard](./dashboard/README.md) as a next step.
 
 Configuring [HTTPS](./https/README.md) is bit tricky and therefore I suggest to do this as a last step.
+
+## Spaces
+
+We are storing our images in the s3-compatible [DigitalOcean Spaces](https://www.digitalocean.com/docs/spaces/).
+
+We still want to take backups of our images in case something happens to the images in the cloud. See these [instructions](https://www.digitalocean.com/docs/spaces/resources/s3cmd-usage/) about getting set up with `s3cmd` to take a copy of all images in a `Spaces` namespace, i.e. `human-connection-uploads`.
+
+After configuring `s3cmd` with your credentials, etc. I was able to make a backup with this command.
+
+```sh
+s3cmd get --recursive s3://human-connection-uploads --skip-existing
+```
\ No newline at end of file

From 5ff7bf993f508f8df1f1be1302ce0222c81a8c97 Mon Sep 17 00:00:00 2001
From: mattwr18
Date: Mon, 6 Apr 2020 18:32:54 +0200
Subject: [PATCH 2/2] fix(grammar): avoid first person in docs

---
 deployment/digital-ocean/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/deployment/digital-ocean/README.md b/deployment/digital-ocean/README.md
index 424e7e0eae..a71eed9960 100644
--- a/deployment/digital-ocean/README.md
+++ b/deployment/digital-ocean/README.md
@@ -31,7 +31,7 @@ We are storing our images in the s3-compatible [DigitalOcean Spaces](https://www.digitalocean.com/docs/spaces/).
 
 We still want to take backups of our images in case something happens to the images in the cloud. See these [instructions](https://www.digitalocean.com/docs/spaces/resources/s3cmd-usage/) about getting set up with `s3cmd` to take a copy of all images in a `Spaces` namespace, i.e. `human-connection-uploads`.
 
-After configuring `s3cmd` with your credentials, etc. I was able to make a backup with this command.
+After configuring `s3cmd` with your credentials, etc. you should be able to make a backup with this command.
 
 ```sh
 s3cmd get --recursive s3://human-connection-uploads --skip-existing
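
For readers following along, the `s3cmd` setup that the linked instructions walk through ends up in `~/.s3cfg`. A minimal sketch is shown below; the region (`fra1`) and the key values are placeholder assumptions, not values taken from this repository:

```ini
# ~/.s3cfg — minimal s3cmd configuration for a DigitalOcean Space
# (access/secret keys and the fra1 region are placeholders; substitute your own)
[default]
access_key = YOUR_SPACES_ACCESS_KEY
secret_key = YOUR_SPACES_SECRET_KEY
host_base = fra1.digitaloceanspaces.com
host_bucket = %(bucket)s.fra1.digitaloceanspaces.com
```

With this in place, the backup command from the patch (`s3cmd get --recursive s3://human-connection-uploads --skip-existing`) downloads every object in the Space and skips files that already exist locally, so repeated runs behave as incremental backups.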