first pass at renaming paths
frankhereford committed Sep 16, 2024
1 parent aa42ed5 commit 2b13f81
Showing 8 changed files with 61 additions and 49 deletions.
6 changes: 3 additions & 3 deletions .github/workflows/apply_db_migrations_and_metadata.yml
@@ -1,15 +1,15 @@
#
- # Applies database migrations to staging (master) and production
+ # Applies database migrations to staging (main) and production
#
name: "Applies the migrations to the database"

on:
push:
branches:
- - master
+ - main
- production
paths:
- "atd-vzd/**"
- "database/**"
- ".github/workflows/apply_db_migrations_and_metadata.yml"
- ".github/workflows/migration-helper.sh"
workflow_dispatch:
2 changes: 1 addition & 1 deletion .github/workflows/aws-vz-api-helper.sh
@@ -6,7 +6,7 @@ case "${BRANCH_NAME}" in
export WORKING_STAGE="production";
;;

"master")
"main")
export WORKING_STAGE="staging";
;;

28 changes: 14 additions & 14 deletions .github/workflows/build_docker_images.yml
@@ -5,22 +5,22 @@ on:
# and any updates to the atd-etl scripts
push:
branches:
- - master
+ - main
- production
paths:
- ".github/workflows/build_docker_images.yml"
- "atd-etl/afd_ems_import/**"
- "atd-etl/cris_import/**"
- "atd-etl/socrata_export/**"
- "etl/afd_ems_import/**"
- "etl/cris_import/**"
- "etl/socrata_export/**"
pull_request:
branches:
- - master
+ - main
- production
paths:
- ".github/workflows/build_docker_images.yml"
- "atd-etl/afd_ems_import/**"
- "atd-etl/cris_import/**"
- "atd-etl/socrata_export/**"
- "etl/afd_ems_import/**"
- "etl/cris_import/**"
- "etl/socrata_export/**"
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:

@@ -45,11 +45,11 @@ jobs:
base: ${{ github.ref }}
filters: |
afd_ems:
- - 'atd-etl/afd_ems_import/**'
+ - 'etl/afd_ems_import/**'
cris:
- - 'atd-etl/cris_import/**'
+ - 'etl/cris_import/**'
socrata_export:
- - 'atd-etl/socrata_export/**'
+ - 'etl/socrata_export/**'
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
@@ -59,7 +59,7 @@ jobs:
uses: docker/build-push-action@v4
with:
platforms: linux/amd64,linux/arm64
- context: atd-etl/afd_ems_import
+ context: etl/afd_ems_import
push: true
tags: atddocker/vz-afd-ems-import:${{ github.ref == 'refs/heads/production' && 'production' || 'latest' }}

@@ -68,7 +68,7 @@ jobs:
uses: docker/build-push-action@v4
with:
platforms: linux/amd64,linux/arm64
- context: atd-etl/cris_import
+ context: etl/cris_import
push: true
tags: atddocker/vz-cris-import:${{ github.ref == 'refs/heads/production' && 'production' || 'latest' }}

@@ -77,6 +77,6 @@ jobs:
uses: docker/build-push-action@v4
with:
platforms: linux/amd64,linux/arm64
- context: atd-etl/socrata_export
+ context: etl/socrata_export
push: true
tags: atddocker/vz-socrata-export:${{ github.ref == 'refs/heads/production' && 'production' || 'development' }}
2 changes: 1 addition & 1 deletion .github/workflows/migration-helper.sh
@@ -53,7 +53,7 @@ function run_migration() {
# Controls the migration process
#
function run_migration_process() {
- cd ./atd-vzd;
+ cd ./database;
echo "Running migration process @ ${PWD}"
export_hasura_env_vars;
run_migration;
5 changes: 3 additions & 2 deletions .gitignore
@@ -63,7 +63,8 @@ build-iPhoneSimulator/
.idea/
.vscode/
# node_modules
- /atd-vze/node_modules
+ /viewer/node_modules
+ /editor/node_modules


.DS_Store
@@ -78,7 +79,7 @@ build-iPhoneSimulator/
atd-events/**/*.json
atd-events/**/*.zip
atd-events/**/package
- atd-etl/app/**/*.json
+ etl/app/**/*.json

# ignore `env` because it may be lying around in legacy ETL directories
env
55 changes: 33 additions & 22 deletions README.md
@@ -8,46 +8,47 @@ This folder hosts our API that securely downloads a private file from S3. It is

[more info](./atd-cr3-api/README.md)

- ## atd-etl (Extract-Transform-Load)
+ ## etl (Extract-Transform-Load)

Our current method for extracting data from the TxDOT C.R.I.S. data system uses a python library called [Splinter](https://splinter.readthedocs.io/en/latest/) to request, download and process data. It is deployed as a Docker container.

- For step-by-step details on how to prepare your environment and how to execute this process, please refer to the documentation in the [atd-etl folder.](https://github.com/cityofaustin/atd-vz-data/tree/master/atd-etl)
+ For step-by-step details on how to prepare your environment and how to execute this process, please refer to the documentation in the [etl folder.](https://github.com/cityofaustin/atd-vz-data/tree/master/etl)

- [more info](./atd-etl/README.md)
+ [more info](./etl/README.md)

- ## atd-vzd (Vision Zero Database)
+ ## database (Vision Zero Database)

VZD is our name for our Hasura GraphQL API server that connects to our Postgres RDS database instances.

- [more info](./atd-vzd/README.md)
+ [more info](./database/README.md)

Production site: http://vzd.austinmobility.io/
Staging site: https://vzd-staging.austinmobility.io/

- ## atd-vze (Vision Zero Editor)
+ ## editor (Vision Zero Editor)

VZE is our front end application built in React.js with CoreUI that allows a trusted group of internal users to edit and improve the data quality of our Vision Zero data. It consumes data from Hasura/VZD.

- [more info](./atd-vze/README.md)
+ [more info](./editor/README.md)

Production site: https://visionzero.austin.gov/editor/
Staging site: https://visionzero-staging.austinmobility.io/editor/

- ## atd-vzv (Vision Zero Viewer)
+ ## viewer (Vision Zero Viewer)

VZV is our public-facing home for visualizations, maps, and dashboards that help make sense of and aggregate trends in our Vision Zero Database.

- [more info](./atd-vzv/README.md)
+ [more info](./viewer/README.md)

Production site: https://visionzero.austin.gov/viewer/
Staging site: https://visionzero-staging.austinmobility.io/viewer/

- ## atd-toolbox
+ ## toolbox

Collection of utilities related to maintaining data and other resources related to the Vision Zero Data projects.

## Local Development

The suite has a Python script which can be used to run and populate a local development instance of the stack. The script is found in the root of the repository and is named `vision-zero`. It's recommended to create a virtual environment in the root of the repo; if you name it `venv`, it will be ignored by the `.gitignore` file in place. VS Code will automatically source the activation script if you start a terminal from within it to interface with the stack.

The `vision-zero` program is a light wrapper around the functionality provided by `docker compose`. By inspecting the `docker-compose.yml` file, you can find the definitions of the services in the stack, and you can use the `docker compose` command to bring up, stop, and attach terminals to the running containers and execute one-off commands. This can give you access to containers to install Node.js libraries, use Postgres' supporting programs (`psql`, `pg_dump`), and other lower-level utilities.
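
As a rough sketch of that workflow (assuming the wrapper is run as `./vision-zero` from the repository root; the `graphql-engine` service name comes from `docker-compose.yml`):

```
# create and activate the recommended virtual environment
python3 -m venv venv
source ./venv/bin/activate

# bring up core services through the wrapper
./vision-zero db-up
./vision-zero graphql-engine-up

# or inspect the same containers directly with docker compose
docker compose ps
docker compose logs graphql-engine
```
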
@@ -56,7 +57,7 @@ Ideally, you should be able to operate the entire vision zero suite and access a

### `vision-zero` command auto-completion

The `vision-zero` application is able to generate auto-completion scripts via the `shtab` python library. For example, `zsh` users may use the following to enable this feature. `bash` and `csh` users will have similar steps to follow particular to their shell of choice.

```
mkdir ~/.zsh_completion_functions;
@@ -70,40 +71,48 @@ source ./venv/bin/active;

Note: There is a flag which ends up being observed for any of the following commands which start the postgres database:

`-r / --ram-disk` will cause the database to back its "storage" on a RAM disk instead of non-volatile storage. This has the upside of being much faster as there is essentially no limit to the IOPS available to the database, but the data won't be able to survive a restart and will require being `replicate-db`'d back into place.

The default is to use the disk in the host to back the database, which is the operation our team is most familiar with, so if you don't need or want the RAM disk configuration, you can ignore this option.
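
For example (a minimal sketch using the flag described above; the long and short forms are assumed to be interchangeable):

```
# back the database "storage" with a RAM disk: fast, but volatile
./vision-zero db-up --ram-disk

# the data will not survive a restart, so replicate it back into place afterwards
./vision-zero replicate-db
```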

#### `vision-zero build`

Rebuild the stack's images based on the Dockerfiles found in the repository. They are built with the `--no-cache` flag which will make the build process slower, but avoid any stale image layers that have inadvertently cached out-of-date apt resource lists.

#### `vision-zero db-up` & `vision-zero db-down`

Start and stop the postgres database

#### `vision-zero graphql-engine-up` & `vision-zero graphql-engine-down`

Start and stop the Hasura graphql-engine software

#### `vision-zero vze-up` & `vision-zero vze-down`

Start and stop the Vision Zero Editor

#### `vision-zero vzv-up` & `vision-zero vzv-down`

Start and stop the Vision Zero Viewer

#### `vision-zero psql`

Start a `psql` postgreSQL client connected to your local database

#### `vision-zero tools-shell`

Start a `bash` shell on a machine with supporting tooling

#### `vision-zero stop`

Stop the stack

#### `vision-zero replicate-db`
- * Download a snapshot of the production database
- * Store the file in `./atd-vzd/snapshots/visionzero-{date}-{with|without}-change-log.sql
- * Drop local `atd_vz_data` database
- * Create and repopulate the database from the snapshot

+ - Download a snapshot of the production database
+ - Store the file in `./database/snapshots/visionzero-{date}-{with|without}-change-log.sql
+ - Drop local `atd_vz_data` database
+ - Create and repopulate the database from the snapshot

Note: the `-c / --include-change-log-data` flag can be used to include the data of past change log events; the schema is created either way.
Note: the `-f / --filename` flag can optionally be used to point to a specific data dump `.sql` file to restore from.
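
A sketch of those options in use (the dump filename is illustrative, following the naming pattern described under `dump-local-db` below; exact argument syntax may vary):

```
# default: restore today's snapshot without change log data
./vision-zero replicate-db

# include past change log events in the restore
./vision-zero replicate-db --include-change-log-data

# restore from a specific .sql dump instead of a downloaded snapshot
./vision-zero replicate-db --filename ./database/dumps/visionzero-2024-09-16-103000.sql
```
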
@@ -112,11 +121,13 @@ The way the snapshots are dated means that one will only end up downloading
one copy of the data per day, both with and without change log data.

#### `vision-zero dump-local-db`
- * pg_dump the current local database
- * Stores the file in `./atd-vzd/dumps/visionzero-{date}-{time}.sql

+ - pg_dump the current local database
+ - Stores the file in `./database/dumps/visionzero-{date}-{time}.sql

#### `vision-zero remove-snapshots`

Remove snapshot files. This can be done to save space and clean up old snapshots, but it's also useful to cause a new copy of the day's data to be downloaded if an upstream change is made.
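
For instance, to force a fresh copy of the day's snapshot to be fetched (a small sketch of the flow described above):

```
./vision-zero remove-snapshots
./vision-zero replicate-db
```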

## Technology Stack

10 changes: 5 additions & 5 deletions docker-compose.yml
@@ -2,7 +2,7 @@ services:
graphql-engine:
image: hasura/graphql-engine:v2.40.2
volumes:
- - ./atd-vzd/graphql-engine-metadata:/metadata
+ - ./viewer/graphql-engine-metadata:/metadata
container_name: visionzero-graphql-engine
ports:
- 8084:8080
@@ -14,13 +14,13 @@ services:
container_name: visionzero_download_db_data
logging:
driver: none
- build: atd-toolbox/download-db-data
+ build: toolbox/download-db-data
command: tail -f /dev/null
hostname: db-tools
env_file:
- .env
volumes:
- - ./atd-vzd/snapshots:/snapshots
+ - ./database/snapshots:/snapshots
vze:
tty: true
container_name: visionzero-vze
@@ -30,7 +30,7 @@ services:
hostname: vze
build: atd-vze
volumes:
- - ./atd-vze:/root/atd-vze
+ - ./editor:/root/atd-vze
vzv:
tty: true
container_name: visionzero-vzv
@@ -40,4 +40,4 @@ services:
hostname: vzv
build: atd-vzv
volumes:
- - ./atd-vzv:/root/atd-vzv
+ - ./viewer:/root/atd-vzv
2 changes: 1 addition & 1 deletion viewer/README.md
@@ -2,7 +2,7 @@

This project is for a public-facing interactive web app showing crash data related to Vision Zero. Users can view crash data by different categories, including transportation mode, demographic groups impacted, time of day, and location.

- Crash data is sourced from TxDOT's Crash Records Information System (CRIS) database. [Vision Zero Editor](https://github.com/cityofaustin/atd-vz-data/tree/production/atd-vze) provides tools for City of Austin Transportation Department staff to enrich crash data with additional attributes, as well as correct any erroneous or missing data.
+ Crash data is sourced from TxDOT's Crash Records Information System (CRIS) database. [Vision Zero Editor](https://github.com/cityofaustin/atd-vz-data/tree/production/editor) provides tools for City of Austin Transportation Department staff to enrich crash data with additional attributes, as well as correct any erroneous or missing data.

For resources and updates, see the [Vision Zero Crash Data System](https://github.com/cityofaustin/atd-data-tech/issues/255) project index.
