**WANTED**: Cover Image
Web scraper for crypto prices with a simple API. WebSocket feeds pull live prices from the supported exchanges (WIP) and aggregate them into minute candles. A minutely cron job fills in any missing data via each exchange's historical candlestick API. Candlestick data is stored in InfluxDB and can be accessed through the API.
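As a rough illustration of the aggregation step, the sketch below rolls individual trades into one-minute OHLCV candles. The `Trade` and `Candle` types and field names are hypothetical stand-ins, not the project's actual internals.

```go
package main

import (
	"fmt"
	"time"
)

// Trade and Candle are illustrative types, not the repo's real models.
type Trade struct {
	Price, Size float64
	Time        time.Time
}

type Candle struct {
	Open, High, Low, Close, Volume float64
}

// aggregate buckets trades into minute candles keyed by the truncated timestamp.
func aggregate(trades []Trade) map[time.Time]*Candle {
	candles := map[time.Time]*Candle{}
	for _, t := range trades {
		bucket := t.Time.Truncate(time.Minute)
		c, ok := candles[bucket]
		if !ok {
			candles[bucket] = &Candle{Open: t.Price, High: t.Price, Low: t.Price, Close: t.Price, Volume: t.Size}
			continue
		}
		if t.Price > c.High {
			c.High = t.Price
		}
		if t.Price < c.Low {
			c.Low = t.Price
		}
		c.Close = t.Price // assumes trades arrive in time order
		c.Volume += t.Size
	}
	return candles
}

func main() {
	now := time.Now()
	trades := []Trade{
		{Price: 100.0, Size: 0.5, Time: now},
		{Price: 101.5, Size: 0.2, Time: now.Add(5 * time.Second)},
	}
	for minute, c := range aggregate(trades) {
		fmt.Printf("%s %+v\n", minute.Format(time.RFC3339), *c)
	}
}
```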
Supported Exchanges:
- Binance
- CoinbasePro
- KuCoin
- FTX
- OKEX
Requirements:
- golang: `go version go1.16.6 linux/amd64` (other versions not tested)
- docker & docker-compose
Setup:
- Clone the repo: `git clone git@github.com:chain-bot/prices.git`
- Create the `.env` file from the template: `cp env_example.txt .env`
- Variables with a value of `<...>` need to be filled in by the user: `cat .env | grep '<...>'`
- Install project packages: `go get -u ./... -v`
- Run postgres & influxdb: `docker-compose --file ./build/docker-compose.yaml --env-file ../.env up`
- Run the scraper app: `go run app/cmd/scraper/main.go`
- Run the prices API server: `go run app/cmd/server/main.go`
At this point you should see debug logs from the running scraper in the console; if you don't, please file an issue.
The project is bundled as two docker images, `prices-scraper` and `prices-server`, with the server running on port 8080.
```bash
# External Dependencies (psql, influxdb)
docker-compose --file ./build/docker-compose.yaml --env-file ../.env up -d

# Build app docker images
docker image build -t prices-server -f build/server.dockerfile .
docker image build -t prices-scraper -f build/scraper.dockerfile .

# Run the images
docker run -d --rm --env-file ./.env --network="host" prices-server
docker run -d --rm --env-file ./.env --network="host" prices-scraper
```
```
├── app
│   ├── cmd
│   ├── configs      // Handles secrets resolution (secrets, passwords, etc.)
│   ├── internal     // Scraper code + influx/psql interface code
│   └── pkg          // All API interface code
├── build
│   ├── docker-compose.yaml
│   └── dockerfile
├── docs
│   ├── chronograph
│   ├── cover.html
│   ├── env_example.txt
│   └── images
├── scripts
│   ├── run-test-with-coverage.sh    // Code coverage script (run before making a PR)
```
- Database models are generated from the database schema via sqlboiler
- sqlboiler introspects the database schema and creates the model files
- Before generating the models, the database needs to be running and the migrations need to have been executed:
```bash
docker-compose --file ./build/docker-compose.yaml --env-file ../.env up -d
./scripts/run-database-migrations.sh
./scripts/generate-database-models.sh
```
- Note: Running `main.go` will automatically run the relevant migrations
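For context, generated sqlboiler models are queried through sqlboiler's standard query-mod API. In the sketch below, the `models` import path and the `Candles` model are hypothetical stand-ins; the real names depend on your schema and where `generate-database-models.sh` writes the files.

```go
package main

import (
	"context"
	"database/sql"
	"log"

	_ "github.com/lib/pq"
	"github.com/volatiletech/sqlboiler/v4/queries/qm"

	// Hypothetical path to the generated models.
	"github.com/chain-bot/prices/app/internal/models"
)

func main() {
	db, err := sql.Open("postgres", "postgres://user:pass@localhost:5432/prices?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	// models.Candles is a hypothetical generated model; sqlboiler emits one
	// such constructor per table it finds in the introspected schema.
	candles, err := models.Candles(qm.Limit(10)).All(context.Background(), db)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("fetched %d candles", len(candles))
}
```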
- Create a migration file under `data/psql/migrations` and name it appropriately (ex. `2_new_migration.up.sql`)
- Run the migration script: `./scripts/run-database-migrations.sh # run from the root of the repo`
- Generate the database models: `./scripts/generate-database-models.sh # run from the root of the repo`
- Create a new package under `app/pkg/api` with the name of the data source
- Create an `api_client.go` and an `api_wrapper.go` and have them implement the `ExchangeAPIClient` interface
- Create a wrapper method for the new data source in `app/pkg/api/module.go`, ex. `func NewCoinbaseProAPIClient(...) ExchangeClientResult {...}`
- Add the wrapper method to `GetAPIProviders` (this makes it available to the app via uber.fx dependency injection); a rough sketch of these steps follows this list
- Run the test file `app/pkg/api/exchange_client_test.go`
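The sketch below shows what the wrapper and fx wiring might look like. `ExchangeAPIClient`, `ExchangeClientResult`, and `GetAPIProviders` are names from the repo, but their shown shapes, the group tag, and the `kraken` example client are illustrative guesses, not the actual definitions in `app/pkg/api`.

```go
package api

import "go.uber.org/fx"

// Illustrative subset of the interface; the real method set lives in app/pkg/api.
type ExchangeAPIClient interface {
	Name() string
}

// ExchangeClientResult groups a client using fx's result-object convention so
// all exchange clients can be collected together (shape assumed, not copied).
type ExchangeClientResult struct {
	fx.Out
	Client ExchangeAPIClient `group:"exchange_clients"`
}

// krakenAPIClient is a hypothetical client for a new data source.
type krakenAPIClient struct{}

func (k *krakenAPIClient) Name() string { return "kraken" }

// NewKrakenAPIClient is the wrapper method, analogous to NewCoinbaseProAPIClient.
func NewKrakenAPIClient() ExchangeClientResult {
	return ExchangeClientResult{Client: &krakenAPIClient{}}
}

// GetAPIProviders exposes every wrapper to the app via fx dependency injection.
func GetAPIProviders() fx.Option {
	return fx.Provide(
		NewKrakenAPIClient,
		// ... existing providers (NewCoinbaseProAPIClient, etc.)
	)
}
```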
- The server runs on port 8080 by default
- Routes are defined and registered in `pkg/server/`
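For orientation, registering a new route might look roughly like the following. The handler name, response shape, and use of the standard-library `http.ServeMux` are assumptions; the actual router setup lives in `pkg/server/` and may use a different library.

```go
package server

import (
	"encoding/json"
	"net/http"
)

// getHealth is a hypothetical handler; real handlers would query influx/psql.
func getHealth(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(map[string]string{"status": "ok"})
}

// RegisterRoutes wires handlers onto the mux, mirroring how routes are
// defined and registered in pkg/server/.
func RegisterRoutes(mux *http.ServeMux) {
	mux.HandleFunc("/health", getHealth)
}
```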
- Run tests and update the `README.md` via the following script
- The script will run all tests inside the `app` folder (excluding sqlboiler-generated files)

```bash
./scripts/run-test-with-coverage.sh # run from the root of the repo
```
There is an example dashboard under `docs/chronograph/dashboard`.