SQL scripts that demonstrate various features and concepts.
This project contains a collection of SQL learning materials aimed at different levels of experience and covering a variety of topics. It focuses on writing `SELECT` statements, so there are very few resources for anything else.
Jump into https://bilbottom.github.io/sql-learning-materials/ to see the summary of what's covered in this project, and continue below for instructions on how to set up the databases.
The data used in this project is from a couple of sources.
The SQL Server instance will load the ubiquitous AdventureWorks databases (the transactional one and the analytical one), which are available from various Microsoft pages:
- https://learn.microsoft.com/en-us/sql/samples/adventureworks-install-configure
- https://github.com/Microsoft/sql-server-samples/releases/download/adventureworks
The PostgreSQL instance will load the similarly ubiquitous Sakila database, which is available from the following GitHub repo:
All credit for the data goes to the respective owners, and these sources should be consulted for any documentation you need around the data.
The docs are built using MkDocs and the Material for MkDocs theme, with several plugins to make the experience more interactive. I cannot express the love I have for the maintainers of these tools!
This project uses Python with Poetry to manage the dependencies, and Docker to spin up the databases.
To install these, follow the instructions on their websites:
- https://python-poetry.org/docs/#installation
- https://www.python.org/downloads/
- https://docs.docker.com/get-docker/
After installing the prerequisites and cloning this repo, just run the `resources` package to download the files needed to feed into the SQL Server and PostgreSQL databases before running Docker's `compose` command:
```bash
poetry install --sync  # --with dev,test,docs
python -m resources
docker compose up --detach
mkdocs build
mkdocs serve
docker compose down --volumes  # When you're finished
```
This will take a little while to run since there's a fair bit of data to churn through.
You can connect to the databases using any of your favourite SQL clients.
The credentials for the databases are not sensitive and are defined in the `docker-compose.yml` file. For reference, the credentials are:
| Database   | Username   | Password     | Host        | Port   |
|------------|------------|--------------|-------------|--------|
| SQL Server | `SA`       | `Test@12345` | `localhost` | `1433` |
| MySQL      | `root`     | `Test@12345` | `localhost` | `3306` |
| PostgreSQL | `postgres` | `Test@12345` | `localhost` | `5432` |
| ClickHouse |            |              | `localhost` | `8123` |
Note that ClickHouse can be connected to without any authentication.
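As an example, here's a minimal sketch of connecting to the PostgreSQL instance from Python using the credentials above. It assumes the `psycopg2-binary` package is installed (it is not one of this project's dependencies) and that the containers are already running; connecting to the default `postgres` database name is also an assumption.

```python
# Minimal sketch: connect to the PostgreSQL container started by
# `docker compose up`. Assumes `psycopg2-binary` is installed (not a
# dependency of this project) and that the default `postgres` database
# exists on the instance.
import psycopg2

connection = psycopg2.connect(
    host="localhost",
    port=5432,
    user="postgres",
    password="Test@12345",
    dbname="postgres",
)
with connection.cursor() as cursor:
    cursor.execute("SELECT version();")
    print(cursor.fetchone()[0])  # e.g. "PostgreSQL ..."
connection.close()
```

Any other client or driver will work the same way with the host, port, and credentials from the table.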
The SQLite and DuckDB databases are just files, so using Docker for these is overkill -- when you run `python -m resources`, the files for these databases will be created in your file system. The file locations are defined in the `resources.toml` config file (you can override them there if you want) and are:
| Database | File location                         |
|----------|---------------------------------------|
| SQLite   | `src/resources/data/sqlite/loan.db`   |
| DuckDB   | `src/resources/data/duckdb/duckdb.db` |
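As a quick check that the files were created, here's a minimal sketch of querying both file-based databases from Python. It assumes the files are at the default locations above and that the `duckdb` Python package is available in your environment (install it with `pip` if it isn't); `sqlite3` comes with the Python standard library.

```python
# Minimal sketch: open the file-based databases created by
# `python -m resources`. Paths assume the default locations from
# `resources.toml`; the `duckdb` package is assumed to be installed.
import sqlite3

import duckdb

# List the tables in the SQLite database.
sqlite_conn = sqlite3.connect("src/resources/data/sqlite/loan.db")
print(
    sqlite_conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table';"
    ).fetchall()
)
sqlite_conn.close()

# List the tables in the DuckDB database.
duckdb_conn = duckdb.connect("src/resources/data/duckdb/duckdb.db")
print(duckdb_conn.execute("SHOW TABLES;").fetchall())
duckdb_conn.close()
```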
The Metabase instance will be launched on `localhost:3000`, and you will have to configure your own login details.
If you're on a Mac with Apple silicon, you will have to make sure that you have enabled the virtualisation framework and Rosetta for amd64 support; see the following GitHub issue and comment: