streamlined deployment
Brian Hannafious committed Feb 16, 2018
1 parent f8f02b6 commit 47d1e5d
Showing 18 changed files with 833 additions and 56 deletions.
4 changes: 4 additions & 0 deletions .gitignore
@@ -47,3 +47,7 @@ npm-debug.log
/environment.local

/smoketest-*.tmp

# Terraform deployment configuration
/deployment/*
!/deployment/Makefile
14 changes: 11 additions & 3 deletions Makefile
@@ -51,16 +51,24 @@ smoketest:

deploy: deploy-chalice deploy-daemons

deploy-infra:
	for comp in `scripts/dss_deployment.py components`; do \
	    $(MAKE) -C deployment apply COMPONENT=$$comp; \
	done
	curl https://dss.dev.data.humancellatlas.org/internal/application_secrets > application_secrets.json
	scripts/dss_deployment.py create_gcp_service_account_credentials
	scripts/dss_deployment.py enable_gcp_dss_services

deploy-chalice:
	$(MAKE) -C chalice deploy
	source environment && $(MAKE) -C chalice deploy

deploy-daemons: deploy-daemons-serial deploy-daemons-parallel

deploy-daemons-serial:
	$(MAKE) -j1 -C daemons deploy-serial
	source environment && $(MAKE) -j1 -C daemons deploy-serial

deploy-daemons-parallel:
	$(MAKE) -C daemons deploy-parallel
	source environment && $(MAKE) -C daemons deploy-parallel

release_staging:
	scripts/release.sh master staging
74 changes: 24 additions & 50 deletions README.md
@@ -34,15 +34,6 @@ The tests require certain node.js packages. They must be installed using `npm`,

Tests also use data from the data-bundle-examples subrepository. Run: `git submodule update --init`

#### Environment Variables

Environment variables are required for test and deployment. The required environment variables and their default values
are in the file `environment`. To customize the values of these environment variables:

1. Copy `environment.local.example` to `environment.local`
2. Edit `environment.local` to add custom entries that override the default values in `environment`

Run `source environment` now and whenever these environment files are modified.

#### Configuring cloud-specific access credentials

@@ -51,47 +42,14 @@ Run `source environment` now and whenever these environment files are modified.
1. Follow the instructions in http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html to get the
`aws` command line utility.

2. Create an S3 bucket that you want DSS to use and in `environment.local`, set the environment variable `DSS_S3_BUCKET`
to the name of that bucket. Make sure the bucket region is consistent with `AWS_DEFAULT_REGION` in
`environment.local`.

3. Repeat the previous step for

* DSS_S3_CHECKOUT_BUCKET
* DSS_S3_CHECKOUT_BUCKET_TEST
* DSS_S3_CHECKOUT_BUCKET_TEST_FIXTURES

4. If you wish to run the unit tests, you must create two more S3 buckets, one for test data and another for test
fixtures, and set the environment variables `DSS_S3_BUCKET_TEST` and `DSS_S3_BUCKET_TEST_FIXTURES` to the names of
those buckets.

Hint: To create S3 buckets from the command line, use `aws s3 mb --region REGION s3://BUCKET_NAME/`.

##### GCP

1. Follow the instructions in https://cloud.google.com/sdk/downloads to get the `gcloud` command line utility.

2. In the [Google Cloud Console](https://console.cloud.google.com/), select the correct Google user account on the top
right and the correct GCP project in the drop down in the top center. Go to "IAM & Admin", then "Service accounts",
then click "Create service account" and select "Furnish a new private key". Under "Roles" select "Project – Owner",
"Project – Service Account Actor" and "Cloud Functions – Cloud Function Developer". Create the account and download
the service account key JSON file.

3. In `environment.local`, set the environment variable `GOOGLE_APPLICATION_CREDENTIALS` to the path of the service
account key JSON file.

4. Choose a region that has support for Cloud Functions and set `GCP_DEFAULT_REGION` to that region. See
https://cloud.google.com/about/locations/ for a list of supported regions.

5. Run `gcloud auth activate-service-account --key-file=/path/to/service-account.json`.

6. Run `gcloud config set project PROJECT_ID` where PROJECT_ID is the ID, not the name (!) of the GCP project you
2. Run `gcloud config set project PROJECT_ID` where PROJECT_ID is the ID, not the name (!) of the GCP project you
selected earlier.

7. Enable required APIs: `gcloud service-management enable cloudfunctions.googleapis.com`; `gcloud service-management
enable runtimeconfig.googleapis.com`

8. Generate OAuth application secrets to be used for your instance:
3. Generate OAuth application secrets to be used for your instance:

1) Go to https://console.developers.google.com/apis/credentials (you may have to select Organization and Project
again)
@@ -109,14 +67,30 @@ Hint: To create S3 buckets from the command line, use `aws s3 mb --region REGION

7) Place the downloaded JSON file into the project root as `application_secrets.json`

9. Create a Google Cloud Storage bucket and in `environment.local`, set the environment variable `DSS_GS_BUCKET` to the
name of that bucket. Make sure the bucket region is consistent with `GCP_DEFAULT_REGION` in `environment.local`.
#### Terraform

Some cloud assets are managed by Terraform, including the storage and test buckets, and the Elasticsearch domain.

1. Follow the instructions in https://www.terraform.io/intro/getting-started/install.html to get the
`terraform` command line utility.

2. Run `terraform init` in the data-store root directory.

3. Run `configure.py` to prepare the Terraform input variables, which will be stored in `dss_variables.tf`.

10. If you wish to run the unit tests, you must create two more buckets, one for test data and another for test
fixtures, and set the environment variables `DSS_GS_BUCKET_TEST` and `DSS_GS_BUCKET_TEST_FIXTURES` to the names of
those buckets.
4. Run `terraform apply` to deploy the cloud assets.

5. Run `scripts/dss_deployment.py create_gcp_service_account_credentials`

6. Run `scripts/dss_deployment.py enable_gcp_dss_services` to enable the required APIs:
   `cloudfunctions.googleapis.com` and `runtimeconfig.googleapis.com` (a rough sketch of this helper is shown below).
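
`scripts/dss_deployment.py` itself is not part of this diff, so the snippet below is only a rough sketch of how its `components` and `enable_gcp_dss_services` subcommands might behave. It assumes that components correspond to the per-component Terraform directories under `deployment/active` (as the new `deployment/Makefile` suggests) and that service enablement shells out to `gcloud service-management enable`, as the manual step it replaces did; the actual helper may differ.

```python
#!/usr/bin/env python
"""Hypothetical sketch of scripts/dss_deployment.py -- not the actual implementation."""

import os
import subprocess
import sys

# Assumed layout: each subdirectory of deployment/active is one Terraform component.
DEPLOYMENT_ROOT = os.path.join(os.path.dirname(os.path.abspath(__file__)), "..", "deployment", "active")

# The two APIs the README says enable_gcp_dss_services turns on.
GCP_DSS_SERVICES = ["cloudfunctions.googleapis.com", "runtimeconfig.googleapis.com"]


def components():
    """List the Terraform components, one per directory under deployment/active."""
    return sorted(name for name in os.listdir(DEPLOYMENT_ROOT)
                  if os.path.isdir(os.path.join(DEPLOYMENT_ROOT, name)))


def enable_gcp_dss_services():
    """Enable the required GCP APIs, as the removed manual gcloud step did."""
    for service in GCP_DSS_SERVICES:
        subprocess.check_call(["gcloud", "service-management", "enable", service])


if __name__ == "__main__":
    command = sys.argv[1]
    if command == "components":
        print("\n".join(components()))
    elif command == "enable_gcp_dss_services":
        enable_gcp_dss_services()
    else:
        sys.exit(f"unknown command: {command}")
```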

#### Environment Variables

Environment variables are required for test and deployment. The required environment variables and their default values
are in the file `environment`. To customize the values of these environment variables, run `configure.py`.

Hint: To create GCS buckets from the command line, use `gsutil mb -c regional -l REGION gs://BUCKET_NAME/`.
Run `source environment` now and whenever `configure.py` is executed.
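
Note that the deploy targets in the top-level Makefile also source `environment` before invoking their sub-makes (e.g. `source environment && $(MAKE) -C chalice deploy`), so re-running `source environment` in your shell matters mainly for commands you run by hand.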

##### Azure

59 changes: 59 additions & 0 deletions configure.py
@@ -0,0 +1,59 @@
#!/usr/bin/env python
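"""Prepare Terraform input variables for a DSS deployment stage.

Verifies the active AWS and GCP accounts, collects the variables via
dss_terraform.TFTree (the README notes they are stored in dss_variables.tf),
and marks the chosen stage as active.
"""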

import os
import sys
import click
import subprocess
import dss_terraform


pkg_root = os.path.abspath(os.path.dirname(__file__)) # noqa


def run(*args):
    out = subprocess.run(args, stdout=subprocess.PIPE, stderr=subprocess.PIPE, encoding='utf-8')
    try:
        out.check_returncode()
    except subprocess.CalledProcessError:
        print(f'\t{out.stderr}')
    return out.stdout.strip()


def verify_gcp_account():
    print("GCP Account:")
    print(run('gcloud', 'auth', 'list'))
    print(run('gcloud', 'config', 'configurations', 'list'))

    if not click.confirm('Continue?'):
        sys.exit(1)


def verify_aws_account():
    print("AWS Account:")
    print(run('aws', 'configure', 'list'))

    if not click.confirm('Continue?'):
        sys.exit(1)


@click.command()
@click.option('--stage', prompt=True)
@click.option('--accept-defaults', is_flag=True, default=False)
def main(stage, accept_defaults):
    if not accept_defaults:
        verify_aws_account()
        verify_gcp_account()
        print('Current stages:', ', '.join(dss_terraform.current_stages()))

    tf_tree = dss_terraform.TFTree(stage)

    if not tf_tree.vars.get('GCP_PROJECT', None):
        tf_tree.vars['GCP_PROJECT'] = run('gcloud', 'config', 'get-value', 'project')

    tf_tree.get_user_input(accept_defaults=accept_defaults)
    tf_tree.write()
    dss_terraform.set_active_stage(stage)


if __name__ == "__main__":
    main()
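
A typical invocation would be `./configure.py --stage dev` (the stage name here is only an example); `--stage` is prompted for if omitted, and `--accept-defaults` skips the AWS/GCP account confirmation and, presumably, the interactive variable prompts handled by `dss_terraform`.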
1 change: 1 addition & 0 deletions daemons/Makefile
@@ -26,6 +26,7 @@ $(SERIAL_AWS_DAEMONS) $(PARALLEL_AWS_DAEMONS):
	git clean -df $@/domovoilib $@/vendor
	shopt -s nullglob; for wheel in $@/vendor.in/*/*.whl; do unzip -q -o -d $@/vendor $$wheel; done
	cp -R ../dss ../dss-api.yml $@/domovoilib
	chmod -R ugo+r $@/domovoilib
	cp "$(GOOGLE_APPLICATION_CREDENTIALS)" $@/domovoilib/gcp-credentials.json
	./build_deploy_config.sh $@ $(DSS_DEPLOYMENT_STAGE)
	cd $@; domovoi deploy --stage $(DSS_DEPLOYMENT_STAGE)
19 changes: 19 additions & 0 deletions deployment/Makefile
@@ -0,0 +1,19 @@
STAGE=active
COMPONENT=

init:
	cd $(STAGE)/$(COMPONENT); terraform init

plan: init
	cd $(STAGE)/$(COMPONENT); terraform plan

apply: init
	cd $(STAGE)/$(COMPONENT); terraform apply

destroy: init
	cd $(STAGE)/$(COMPONENT); terraform destroy

clean:
	cd $(STAGE)/$(COMPONENT); rm -rf .terraform

.PHONY: init plan apply destroy clean
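
The top-level `deploy-infra` target drives this Makefile once per component, running `$(MAKE) -C deployment apply COMPONENT=$$comp` for each name reported by `scripts/dss_deployment.py components`; the same targets can also be run by hand, e.g. `make -C deployment plan COMPONENT=<component>` to preview a single component's changes before applying.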