
Merge branch 'master' into tailwindcss
dekkerglen committed Oct 14, 2024
2 parents f186854 + ecf0a36 commit 6c264be
Showing 14 changed files with 173 additions and 39 deletions.
12 changes: 7 additions & 5 deletions .env_EXAMPLE
@@ -1,10 +1,12 @@
AWS_ACCESS_KEY_ID=""
AWS_LOG_GROUP=""
AWS_PROFILE="localstack"
AWS_ACCESS_KEY_ID="test"
AWS_LOG_GROUP="local"
AWS_LOG_STREAM=""
AWS_REGION=""
AWS_SECRET_ACCESS_KEY=""
AWS_REGION="us-east-1"
AWS_SECRET_ACCESS_KEY="test"
AWS_ENDPOINT=http://localhost:4566
CUBECOBRA_VERSION="x.x.x"
DATA_BUCKET=""
DATA_BUCKET="local"
DOMAIN="localhost:8080"
DOWNTIME_ACTIVE="false"
DYNAMO_PREFIX="LOCAL"
161 changes: 131 additions & 30 deletions README.md
@@ -12,31 +12,110 @@ If you are interested in contributing towards Cube Cobra, please read the [Contr

You will need to install NodeJS, Redis, and an IDE of your preference (I recommend VSCode). You can find the necessary resources here:

### NodeJS

Node 20

NodeJS: https://nodejs.org/en/download/

### Redis

Redis Server:

- Windows: https://github.com/microsoftarchive/redis
- Mac: `brew install redis`
- Linux: `apt-get install redis`

After installing Redis, start the server. On Mac, a shortcut to do this is `brew services start redis`. You can see the status with `brew services list`.

### Localstack

[LocalStack][localstack] provides a local emulation of the AWS services required to run CubeCobra, including S3, DynamoDB, and CloudWatch.

You may follow the installation guidelines from the localstack site. The recommended setup involves running localstack in a docker container, which requires [Docker Desktop][docker] as well.

- Windows: Download and install the binary from localstack
- Mac: `brew install localstack/tap/localstack-cli`
- Linux: Use the `curl` command from localstack

[localstack]: https://docs.localstack.cloud/getting-started/installation/
[docker]: https://docs.docker.com/desktop/install/mac-install/

Once localstack is installed, you can start the server in the background with the CLI: `localstack start --detached`. You can see the status with `localstack status`.

### Code Editor (IDE)

VSCode (strongly recommended, but not required): https://code.visualstudio.com/
ESLint Extension for VSCode: https://marketplace.visualstudio.com/items?itemName=dbaeumer.vscode-eslint
Prettier Extension for VSCode: https://marketplace.visualstudio.com/items?itemName=esbenp.prettier-vscode

VSCode (with the ESLint and Prettier extensions) is the recommended environment. When using this setup, make sure that your selected workspace is the root folder that you have cloned; this ensures that the ESLint plugin can work with our linting rules. Prettier will automatically apply standard formatting to your code. Using these plugins will make adhering to the linting and code formatting rules significantly easier.

### Initial Setup

For the first setup, you will need to run:

```sh
yarn install && yarn build
yarn setup:local
```

This will:
- install dependencies
- build the application code to run setup scripts
- run setup scripts to:
- create a .env file with values for running the application locally already set
- setup localstack w/ s3 bucket
  - setup local files for application persistence
- setup localstack dynamodb tables (ex. Users, Cubes, Cards, etc.)
- download bulk card data from scryfall, persist to files and load it to localstack s3
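
The `.env` bootstrap portion of this setup boils down to copying the example file into place. A minimal, self-contained sketch of that idea (the `/tmp/*.demo` names are illustrative stand-ins, not the repository's real paths — the actual script simply runs `cp .env_EXAMPLE .env`):

```shell
# Sketch of the setup:local:env step: seed a config file from an example file.
# The /tmp/*.demo names are illustrative; the real script runs: cp .env_EXAMPLE .env
printf 'AWS_REGION="us-east-1"\nDATA_BUCKET="local"\n' > /tmp/env_EXAMPLE.demo
cp /tmp/env_EXAMPLE.demo /tmp/env.demo
grep -c '=' /tmp/env.demo   # counts the KEY="value" lines
```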

If you are on Windows, you will need to set bash as your script shell:

You will need to make sure you have `bash` installed somewhere and run the following command [with your `bash` path in place of the path below].

```sh
yarn config set script-shell "C:\\Program Files\\git\\bin\\bash.exe"
```

### Running CubeCobra

Then you can start the program like so:

```sh
yarn start:dev
```

This script will:
- ensure localstack is running
- ensure nearley parsers for card filters have been compiled
- compile & watch scss (bootstrap) styles
- compile & watch server javascript w/ nodemon
- run & watch webpack dev server

You can now open up a browser and connect to the app through: http://localhost:8080.

VSCode (with the ESLint extension) is the recommended environment. When using this setup, make sure that your selected workspace is the root folder that you have cloned, this will ensure that the ESLint plugin can work with our linting rules. Using this plugin will make adhering to the linting rules significantly easier.
(Although the logs say node is running on port 5000, you should use port 8080 to connect.)

### Environment Variables
Nodemon will restart the application anytime there is a change to a source file.

After accessing the application locally you will need to create a new user account using the "Register" link on the nav bar.

### Environment Variables & Connecting to AWS

Environment variables are populated from the `.env` file. There is no `.env` file checked in, so the first thing you need to do is copy `.env_EXAMPLE` to `.env` and fill out the values. Cube Cobra uses several AWS resources, including S3, DynamoDB, and Cloudwatch. For development purposes, you will need to create an AWS account and insert your credentials into the `.env` file.
Environment variables are populated from the `.env` file. There is no `.env` file checked in, so the setup script copies `.env_EXAMPLE` to `.env`, with default values already set to support running CubeCobra backed by LocalStack.

You can run a local instance of Cube Cobra against real AWS resources rather than LocalStack, if desired. After setting up S3, DynamoDB, and Cloudwatch using your AWS account, you can insert your credentials into the `.env` file.
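
The LocalStack-versus-AWS switch hinges on `AWS_ENDPOINT`. A sketch of the pattern the codebase applies when configuring the AWS SDK (here pulled into a helper, `awsConfigFromEnv`, which is illustrative — the real code inlines these fields into each `AWS.config.update` or client constructor call):

```javascript
// When AWS_ENDPOINT is set (e.g. http://localhost:4566), the SDK is pointed
// at LocalStack and S3 path-style addressing is forced; otherwise it falls
// back to hosted AWS with a default region.
function awsConfigFromEnv(env) {
  return {
    endpoint: env.AWS_ENDPOINT || undefined,
    s3ForcePathStyle: !!env.AWS_ENDPOINT,
    accessKeyId: env.AWS_ACCESS_KEY_ID,
    secretAccessKey: env.AWS_SECRET_ACCESS_KEY,
    region: env.AWS_REGION || 'us-east-2',
  };
}

// LocalStack-style environment, as produced by the setup script:
console.log(awsConfigFromEnv({
  AWS_ENDPOINT: 'http://localhost:4566',
  AWS_ACCESS_KEY_ID: 'test',
  AWS_SECRET_ACCESS_KEY: 'test',
  AWS_REGION: 'us-east-1',
}).s3ForcePathStyle); // true
```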

Here is a table on how to fill out the env vars:

| Variable Name | Description | Required? |
| ---------------------- | -------------------------------------------------------------------------------------------- | --------- |
| AWS_ACCESS_KEY_ID | The AWS access key for your account. | Yes |
| AWS_ENDPOINT | The base endpoint to use for AWS. Used to point to localstack rather than hosted AWS. | |
| AWS_LOG_GROUP | The name of the AWS CloudWatch log group to use. | Yes |
| AWS_LOG_STREAM | The name of the AWS CloudWatch log stream to use. | |
| AWS_REGION | The AWS region to use. | Yes |
| AWS_REGION | The AWS region to use (default: us-east-2). | Yes |
| AWS_SECRET_ACCESS_KEY | The AWS secret access key for your account. | Yes |
| CUBECOBRA_VERSION | The version of Cube Cobra. | |
| DATA_BUCKET | The name of the AWS S3 bucket to use. You will need to create this bucket in your account. | Yes |
@@ -64,47 +143,69 @@ Here is a table on how to fill out the env vars:
| AUTOSCALING_GROUP | The name of the autoscaling group this instance is run in, used for the distributed cache. | |
| CACHE_SECRET | The secret for the distributed cache. | |

### Initial Setup
### Updating Card Definitions and Analytics

For the first setup, you will need to run:
In the initial setup scripts, `yarn update-cards` is what creates the card definitions. Running this script will pull the latest data from scryfall.

If you want card analytics, you can run the following script:

```sh
yarn install && yarn build
node one_shot_scripts/create_local_files.js
node --max-old-space-size=4096 one_shot_scripts/createTables.js
node --max-old-space-size=4096 jobs/update_cards.js
yarn update-all
```

If you are on Windows, you will need to set bash as your script shell:
You will need to make sure you have `bash` installed somewhere and run the following command [with your `bash` path in place of the path below].

yarn config set script-shell "C:\\Program Files\\git\\bin\\bash.exe"

Then you can start the program like so:
This will, in sequence:
- update draft history
- update cube history
- update metadata dictionary
- update cards

yarn devstart

You can now open up a browser and connect to the app through: http://localhost:8080. Despite the fact that node says it is running on port 5000, you should use port 8080 to connect.
# Concepts

Nodemon will restart the application anytime there is a change to a source file.
## Backend

### Updating Card Definitions and Analytics
### API & Template Rendering

From the previous script, `jobs/update_cards` is what creates the card definitions. Running this script will pull the latest data from scryfall. If you want card analytics, you'll need to run the following scripts in this order:
[Express 4][express] provides a minimalist web framework to support both template rendering with [PugJS 3][pug] and definition of JSON-based API endpoints. HTML templates are mainly used to render a minimal page for React to bootstrap itself into with initial props injected from the server.
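
The bootstrap handoff can be sketched without Express: the server renders a minimal HTML shell and embeds the initial props as JSON for the React bundle to read. This is a simplified illustration of the idea only — the real app renders Pug templates, and the element ids and bundle path below are hypothetical:

```javascript
// Simplified illustration of the template-rendering handoff: render a minimal
// HTML shell with the initial props serialized into it, so client-side React
// can bootstrap itself with the same data the server had.
function renderPage(reactProps) {
  const json = JSON.stringify(reactProps);
  return [
    '<div id="react-root"></div>',
    `<script type="application/json" id="initial-props">${json}</script>`,
    '<script src="/js/bundle.js"></script>',
  ].join('\n');
}

const html = renderPage({ user: { username: 'gwen' } });
console.log(html.includes('"username":"gwen"')); // true
```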

```sh
node --max-old-space-size=4096 jobs/update_draft_history.js
node --max-old-space-size=4096 jobs/update_cube_history.js
node --max-old-space-size=4096 jobs/update_metadata_dict.js
node --max-old-space-size=4096 jobs/update_cards.js
```
[express]: https://expressjs.com/en/4x/api.html

# Concepts
[pug]: https://pugjs.org/api/getting-started.html

### Cards

We keep all card definitions in large pre-processed files, so that nodes in production just need to download and load the files, and can fetch the latest files from S3 when they're ready. We do this because it's much faster to read from memory than to have to make requests to some other service anytime we need card data. An external process is responsible for updating the card definitions, and uploading to S3. This same process is also responsible for updating the card analytics, and data exports.
We keep all card definitions in large pre-processed files, so that nodes in production just need to download and load the files, and can fetch the latest files from S3 when they're ready. We do this because it's much faster to read from memory than to have to make requests to some other service anytime we need card data.

An external process is responsible for updating the card definitions, and uploading to S3. This same process is also responsible for updating the card analytics, and data exports.

### Multiplayer Drafting

We use Redis for concurrency control for multiplayer drafting. All Redis operations are handled in `multiplayerDrafting.js`.
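
The concurrency-control idea can be illustrated with an in-memory stand-in for the Redis operations. This is only an analogy — the real implementation in `multiplayerDrafting.js` relies on Redis primitives so the claim is atomic across server instances, which a local `Map` cannot provide:

```javascript
// In-memory analogy for the Redis-based concurrency control: only the first
// drafter to claim a given pick succeeds; later claims are rejected. With
// Redis this would be an atomic set-if-absent operation instead of a Map.
const claimedPicks = new Map();

function tryClaimPick(draftId, pickIndex, seat) {
  const key = `${draftId}:${pickIndex}`;
  if (claimedPicks.has(key)) return false; // someone already took this pick
  claimedPicks.set(key, seat);
  return true;
}

console.log(tryClaimPick('draft1', 0, 'seat-a')); // true
console.log(tryClaimPick('draft1', 0, 'seat-b')); // false: already claimed
```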

### Scheduled jobs

Each instance of the express server runs a job using node-schedule on a nightly basis to update the in-memory carddb from S3.

Bash scripts (`jobs/definition`) are executed periodically on AWS to run hourly, daily & weekly jobs.

### Card Filters

Card filters are defined so that they can be used by both the frontend and backend. [Nearley][nearly] is a Node.js parser toolkit used to generate code that defines filters which can be applied to the card database.

[nearly]: https://nearley.js.org/
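
A toy version of what the generated filter code provides — turning a filter string into a predicate over card objects. This hand-rolled parser handles only a single numeric comparison and is purely illustrative; the real Nearley grammar supports a far richer filter language:

```javascript
// Toy illustration of a compiled card filter: parse "field op number"
// (e.g. "cmc>=3") into a predicate function over card objects.
function compileFilter(src) {
  const m = /^(\w+)(<=|>=|=|<|>)(\d+)$/.exec(src);
  if (!m) throw new Error(`unsupported filter: ${src}`);
  const [, field, op, rawValue] = m;
  const value = Number(rawValue);
  const ops = {
    '=': (a) => a === value,
    '<': (a) => a < value,
    '>': (a) => a > value,
    '<=': (a) => a <= value,
    '>=': (a) => a >= value,
  };
  return (card) => ops[op](card[field]);
}

const isExpensive = compileFilter('cmc>=3');
console.log([{ cmc: 1 }, { cmc: 4 }].filter(isExpensive)); // [ { cmc: 4 } ]
```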

## Frontend

### Typescript

[TypeScript 5.5][typescript] is gradually being rolled out to replace vanilla JS components that use PropTypes.

[typescript]: https://www.typescriptlang.org/docs/handbook/release-notes/typescript-5-5.html

### Components & Styling

Components are provided by [Reactstrap 9][reactstrap], which is powered by [Bootstrap v5.1][bootstrap] and uses SCSS.

Components typically directly use bootstrap classes for additional styling, but may also use global classnames defined in public CSS files.

[reactstrap]: https://reactstrap.github.io/
[bootstrap]: https://getbootstrap.com/docs/5.1/getting-started/introduction/
3 changes: 2 additions & 1 deletion app.js
@@ -86,9 +86,10 @@ app.use(
writeCapacityUnits: 10,
},
dynamoConfig: {
endpoint: process.env.AWS_ENDPOINT || undefined,
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: process.env.AWS_REGION,
region: process.env.AWS_REGION || 'us-east-2',
},
keepExpired: false,
touchInterval: 30000,
2 changes: 2 additions & 0 deletions dynamo/cache.js
@@ -54,6 +54,8 @@ const AWS = require('aws-sdk');

// Set the region
AWS.config.update({
endpoint: process.env.AWS_ENDPOINT || undefined,
s3ForcePathStyle: !!process.env.AWS_ENDPOINT,
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: process.env.AWS_REGION,
3 changes: 2 additions & 1 deletion dynamo/client.js
@@ -4,8 +4,9 @@ require('dotenv').config();
// Load the AWS SDK for Node.js
const AWS = require('aws-sdk');

// Set the region
AWS.config.update({
endpoint: process.env.AWS_ENDPOINT || undefined,
s3ForcePathStyle: !!process.env.AWS_ENDPOINT,
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: process.env.AWS_REGION,
2 changes: 2 additions & 0 deletions dynamo/documentClient.js
@@ -6,6 +6,8 @@ const AWS = require('aws-sdk');

// Set the region
AWS.config.update({
endpoint: process.env.AWS_ENDPOINT || undefined,
s3ForcePathStyle: !!process.env.AWS_ENDPOINT,
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: process.env.AWS_REGION,
2 changes: 2 additions & 0 deletions dynamo/s3client.js
@@ -8,6 +8,8 @@ const { get, put, invalidate } = require('./cache');

// Set the region
AWS.config.update({
endpoint: process.env.AWS_ENDPOINT || undefined,
s3ForcePathStyle: !!process.env.AWS_ENDPOINT,
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: process.env.AWS_REGION,
3 changes: 3 additions & 0 deletions jobs/download_model.js
@@ -5,8 +5,11 @@ const AWS = require('aws-sdk');

const downloadFromS3 = async () => {
const s3 = new AWS.S3({
endpoint: process.env.AWS_ENDPOINT || undefined,
s3ForcePathStyle: !!process.env.AWS_ENDPOINT,
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: process.env.AWS_REGION || 'us-east-2',
});

// list all from s3 under s3://cubecobra/model
3 changes: 3 additions & 0 deletions jobs/update_cards.js
@@ -872,8 +872,11 @@ const downloadFromScryfall = async (metadatadict, indexToOracle) => {
};

const s3 = new AWS.S3({
endpoint: process.env.AWS_ENDPOINT || undefined,
s3ForcePathStyle: !!process.env.AWS_ENDPOINT,
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: process.env.AWS_REGION || 'us-east-2',
});

const uploadStream = (key) => {
9 changes: 8 additions & 1 deletion package.json
@@ -40,7 +40,14 @@
"upload-exports": "aws s3 sync temp/export s3://cubecobra",
"exports": "node --max-old-space-size=8192 jobs/export_cubes.js && node --max-old-space-size=8192 jobs/export_decks.js && node --max-old-space-size=8192 jobs/export_simple_card_dict.js && yarn run upload-exports",
"rotate-queue": "node --max-old-space-size=8192 jobs/rotate_featured.js",
"sync-podcasts": "node --max-old-space-size=8192 jobs/update_podcasts.js"
"sync-podcasts": "node --max-old-space-size=8192 jobs/update_podcasts.js",
"start:localstack": "./scripts/local/start_localstack.sh",
"setup:local": "yarn start:localstack && yarn setup:local:env && yarn setup:local:localstack && yarn setup:local:files && yarn setup:local:db && yarn update-cards",
"setup:local:localstack": "./scripts/init_localstack.sh",
"setup:local:env": "cp .env_EXAMPLE .env",
"setup:local:files": "node one_shot_scripts/create_local_files.js",
"setup:local:db": "node --max-old-space-size=8192 one_shot_scripts/createTables.js",
"start:dev": "yarn start:localstack && yarn devstart"
},
"author": "Gwen Dekker",
"license": "ISC",
3 changes: 3 additions & 0 deletions scripts/local/init_localstack.sh
@@ -0,0 +1,3 @@
#!/bin/bash

awslocal s3 mb s3://local
3 changes: 3 additions & 0 deletions scripts/local/start_localstack.sh
@@ -0,0 +1,3 @@
#!/bin/bash

localstack start --detached
3 changes: 2 additions & 1 deletion serverjs/cloudwatch.js
@@ -9,7 +9,8 @@ const {
const uuid = require('uuid');

const client = new CloudWatchLogsClient({
region: process.env.AWS_REGION,
endpoint: process.env.AWS_ENDPOINT || undefined,
region: process.env.AWS_REGION || 'us-east-2',
credentials: {
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
3 changes: 3 additions & 0 deletions serverjs/updatecards.js
@@ -5,8 +5,11 @@ const carddb = require('./carddb');

const downloadFromS3 = async (basePath = 'private') => {
const s3 = new AWS.S3({
endpoint: process.env.AWS_ENDPOINT || undefined,
s3ForcePathStyle: !!process.env.AWS_ENDPOINT,
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: process.env.AWS_REGION || 'us-east-2',
});

await Promise.all(
