feat(observer): connect to spark_stats database (#123)
Also, modify spark-stats to accept two connection strings:
- DATABASE_URL pointing to spark_stats database
- EVALUATE_DB_URL pointing to spark_evaluate database

For now, spark-stats is reading data from the spark_evaluate 
database only.

Signed-off-by: Miroslav Bajtoš <oss@bajtos.net>
bajtos authored May 30, 2024
1 parent a7b062e commit 1e52665
Showing 19 changed files with 202 additions and 37 deletions.
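The spark-stats side of this change — accepting the two connection strings described in the commit message — lives in files not rendered on this page. A minimal sketch of what that configuration plausibly looks like, mirroring the `observer/lib/config.js` added below (the `stats/lib/config.js` path and the fallback values are assumptions, not part of the visible diff):

```js
// stats/lib/config.js (hypothetical path, not shown in this diff)
export const {
  // DATABASE_URL points to the spark_stats database managed by this monorepo
  DATABASE_URL = 'postgres://localhost:5432/spark_stats',
  // EVALUATE_DB_URL points to the spark_evaluate database managed by spark-evaluate
  EVALUATE_DB_URL = 'postgres://localhost:5432/spark_evaluate'
} = process.env
```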
10 changes: 6 additions & 4 deletions .github/workflows/ci.yml
@@ -12,7 +12,7 @@ jobs:
       postgres:
         image: postgres:latest
         env:
-          POSTGRES_DB: postgres
+          POSTGRES_DB: spark_stats
           POSTGRES_PASSWORD: postgres
           POSTGRES_USER: postgres
         ports:
@@ -23,9 +23,11 @@ jobs:
           --health-timeout 5s
           --health-retries 5
     env:
-      DATABASE_URL: postgres://postgres:postgres@localhost:5432/postgres
+      DATABASE_URL: postgres://postgres:postgres@localhost:5432/spark_stats
+      EVALUATE_DB_URL: postgres://postgres:postgres@localhost:5432/spark_evaluate
       NPM_CONFIG_WORKSPACE: stats
     steps:
+      - run: psql "${DATABASE_URL}" -c "CREATE DATABASE spark_evaluate"
       - uses: actions/checkout@v4
       - uses: actions/setup-node@v4
         with:
@@ -40,7 +42,7 @@ jobs:
       postgres:
         image: postgres:latest
         env:
-          POSTGRES_DB: postgres
+          POSTGRES_DB: spark_stats
           POSTGRES_PASSWORD: postgres
           POSTGRES_USER: postgres
         ports:
@@ -51,7 +53,7 @@ jobs:
           --health-timeout 5s
           --health-retries 5
     env:
-      DATABASE_URL: postgres://postgres:postgres@localhost:5432/postgres
+      DATABASE_URL: postgres://postgres:postgres@localhost:5432/spark_stats
       NPM_CONFIG_WORKSPACE: observer
     steps:
       - uses: actions/checkout@v4
1 change: 1 addition & 0 deletions Dockerfile
@@ -29,6 +29,7 @@ COPY --link package-lock.json package.json ./

 # We cannot use a wildcard until `COPY --parents` is stabilised
 # See https://docs.docker.com/reference/dockerfile/#copy---parents
+COPY --link migrations/package.json ./migrations/
 COPY --link stats/package.json ./stats/
 COPY --link observer/package.json ./observer/

22 changes: 15 additions & 7 deletions README.md
@@ -46,7 +46,7 @@ Set up [PostgreSQL](https://www.postgresql.org/) with default settings:
 - Port: 5432
 - User: _your system user name_
 - Password: _blank_
-- Database: spark_public
+- Database: spark_stats

 Alternatively, set the environment variable `$DATABASE_URL` with
 `postgres://${USER}:${PASS}@${HOST}:${PORT}/${DATABASE}`.
@@ -60,12 +60,18 @@ You can also run the following command to set up the PostgreSQL server via Docker
 docker run -d --name spark-db \
   -e POSTGRES_HOST_AUTH_METHOD=trust \
   -e POSTGRES_USER=$USER \
-  -e POSTGRES_DB=spark_public \
+  -e POSTGRES_DB=spark_stats \
   -p 5432:5432 \
   postgres
 ```

-Finally, run database schema migration scripts from spark-evaluate.
+Next, you need to create the `spark_evaluate` database.
+
+```bash
+psql postgres://localhost:5432/ -c "CREATE DATABASE spark_evaluate"
+```
+
+Finally, run database schema migration scripts.

 ```bash
 npm run migrate
@@ -80,7 +86,7 @@ npm test
 ### Run the `spark-stats` service

 ```sh
-npm start --workspace stats
+npm start -w stats
 ```

 You can also run the service against live data in Spark DB running on Fly.io.
@@ -97,14 +103,16 @@ You can also run the service against live data in Spark DB running on Fly.io.
 2. Start the service and configure the database connection string to use the proxied connection.
    Look up the user and the password in our shared 1Password vault.

-   ```
-   DATABASE_URL="postgres://user:password@localhost:5455/spark_public" npm start
+   ```bash
+   DATABASE_URL="postgres://user:password@localhost:5455/spark_stats" \
+   EVALUATE_DB_URL="postgres://user:password@localhost:5455/spark_evaluate" \
+   npm start -w stats
    ```

 ### Run the `spark-observer` service

 ```sh
-npm start --workspace observer
+npm start -w observer
 ```

 ## Deployment
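Putting the updated README steps together, a fresh local setup now looks like this (a sketch assuming the Dockerized Postgres from the snippet above):

```bash
# Start Postgres with the spark_stats database pre-created
docker run -d --name spark-db \
  -e POSTGRES_HOST_AUTH_METHOD=trust \
  -e POSTGRES_USER=$USER \
  -e POSTGRES_DB=spark_stats \
  -p 5432:5432 \
  postgres

# Create the second database that spark-stats reads from
psql postgres://localhost:5432/ -c "CREATE DATABASE spark_evaluate"

# Apply schema migrations, then start the stats service
npm run migrate
npm start -w stats
```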
1 change: 1 addition & 0 deletions migrations/001.do.sql
@@ -0,0 +1 @@
SELECT now();
43 changes: 43 additions & 0 deletions migrations/index.js
@@ -0,0 +1,43 @@
import { dirname, join } from 'node:path'
import { fileURLToPath } from 'node:url'
import pg from 'pg'
import Postgrator from 'postgrator'

const migrationsDirectory = join(
  dirname(fileURLToPath(import.meta.url)),
  '..',
  'migrations'
)

/**
 * @param {pg.ClientConfig} pgConfig
 */
export const migrateWithPgConfig = async (pgConfig) => {
  const client = new pg.Client(pgConfig)
  await client.connect()
  try {
    await migrateWithPgClient(client)
  } finally {
    await client.end()
  }
}

/**
 * @param {pg.Client} client
 */
export const migrateWithPgClient = async (client) => {
  const postgrator = new Postgrator({
    migrationPattern: join(migrationsDirectory, '*'),
    driver: 'pg',
    execQuery: (query) => client.query(query)
  })
  console.log(
    'Migrating DB schema from version %s to version %s',
    await postgrator.getDatabaseVersion(),
    await postgrator.getMaxVersion()
  )

  await postgrator.migrate()

  console.log('Migrated DB schema to version', await postgrator.getDatabaseVersion())
}
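Note that `migrateWithPgClient` only ever calls `client.query(...)`, so although the JSDoc names `pg.Client`, a `pg.Pool` satisfies the same interface — `observer/bin/spark-observer.js` below relies on exactly that. A minimal usage sketch (the connection string is an example value):

```js
import pg from 'pg'
import { migrateWithPgClient } from '@filecoin-station/spark-stats-db-migrations'

// A pool works here too: the migrator only needs an object with query()
const pool = new pg.Pool({ connectionString: 'postgres://localhost:5432/spark_stats' })
await migrateWithPgClient(pool)
await pool.end()
```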
23 changes: 23 additions & 0 deletions migrations/package.json
@@ -0,0 +1,23 @@
{
  "name": "@filecoin-station/spark-stats-db-migrations",
  "version": "1.0.0",
  "type": "module",
  "main": "index.js",
  "private": true,
  "scripts": {
    "lint": "standard",
    "test": "mocha"
  },
  "devDependencies": {
    "standard": "^17.1.0"
  },
  "dependencies": {
    "pg": "^8.11.5",
    "postgrator": "^7.2.0"
  },
  "standard": {
    "env": [
      "mocha"
    ]
  }
}
5 changes: 5 additions & 0 deletions observer/bin/migrate.js
@@ -0,0 +1,5 @@
import { DATABASE_URL } from '../lib/config.js'
import { migrateWithPgConfig as migrateStatsDB } from '@filecoin-station/spark-stats-db-migrations'

console.log('Migrating spark_stats database')
await migrateStatsDB({ connectionString: DATABASE_URL })
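With the `migrate` script added to `observer/package.json` below, this file can be invoked through npm workspaces, e.g.:

```bash
npm run migrate --workspace observer
```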
33 changes: 32 additions & 1 deletion observer/bin/spark-observer.js
@@ -1,8 +1,11 @@
-import { ethers } from 'ethers'
 import * as SparkImpactEvaluator from '@filecoin-station/spark-impact-evaluator'
+import { migrateWithPgClient } from '@filecoin-station/spark-stats-db-migrations'
+import { ethers } from 'ethers'
 import pg from 'pg'

+// TODO: move this to a config.js file
 const {
+  DATABASE_URL = 'postgres://localhost:5432/spark_stats',
   RPC_URLS = 'https://api.node.glif.io/rpc/v0',
   GLIF_TOKEN
 } = process.env
@@ -31,6 +34,34 @@ const ieContract = new ethers.Contract(
   provider
 )

+const pgPool = new pg.Pool({
+  connectionString: DATABASE_URL,
+  // allow the pool to close all connections and become empty
+  min: 0,
+  // these values should correlate with the service concurrency hard_limit configured in fly.toml
+  // and must take into account the connection limit of our PG server, see
+  // https://fly.io/docs/postgres/managing/configuration-tuning/
+  max: 100,
+  // close connections that haven't been used for one second
+  idleTimeoutMillis: 1000,
+  // automatically close connections older than 60 seconds
+  maxLifetimeSeconds: 60
+})
+
+pgPool.on('error', err => {
+  // Prevent crashing the process on idle client errors; the pool will recover
+  // by itself. If all connections are lost, the process will still crash.
+  // https://github.com/brianc/node-postgres/issues/1324#issuecomment-308778405
+  console.error('An idle client has experienced an error', err.stack)
+})
+
+await migrateWithPgClient(pgPool)
+
+// Check that we can talk to the database
+await pgPool.query('SELECT 1')
+
+console.log('Listening for impact evaluator events')
+
 ieContract.on('Transfer', (to, amount, ...args) => {
   /** @type {number} */
   const blockNumber = args.pop()
4 changes: 4 additions & 0 deletions observer/lib/config.js
@@ -0,0 +1,4 @@
export const {
  // DATABASE_URL points to `spark_stats` database managed by this monorepo
  DATABASE_URL = 'postgres://localhost:5432/spark_stats'
} = process.env
2 changes: 2 additions & 0 deletions observer/package.json
@@ -3,6 +3,7 @@
   "type": "module",
   "private": true,
   "scripts": {
+    "migrate": "node bin/migrate.js",
     "start": "node bin/spark-observer.js",
     "lint": "standard",
     "test": "mocha"
@@ -13,6 +14,7 @@
   },
   "dependencies": {
     "@filecoin-station/spark-impact-evaluator": "^1.1.1",
+    "@filecoin-station/spark-stats-db-migrations": "^1.0.0",
     "@sentry/node": "^8.7.0",
     "debug": "^4.3.4",
     "ethers": "^6.12.1",
17 changes: 17 additions & 0 deletions observer/test/smoke.test.js
@@ -1,5 +1,22 @@
 // TODO: remove this file once we have real tests in place

+import { DATABASE_URL } from '../lib/config.js'
+import { migrateWithPgClient } from '@filecoin-station/spark-stats-db-migrations'
+import pg from 'pg'
+
+describe('spark-observer', () => {
+  /** @type {pg.Pool} */
+  let pgPool
+
+  before(async () => {
+    pgPool = new pg.Pool({ connectionString: DATABASE_URL })
+    await migrateWithPgClient(pgPool)
+  })
+
+  after(async () => {
+    await pgPool.end()
+  })
+
 it('works', async () => {
   await import('../index.js')
 })
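To run this smoke test locally — a sketch assuming a local `spark_stats` database like the one the CI workflow above provisions:

```bash
DATABASE_URL=postgres://localhost:5432/spark_stats npm test --workspace observer
```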