This repository shows ASP.NET Core 9 logging in action; it also serves as a path for learning, experimenting with, and teaching .NET, Azure Pipelines, and other technologies and tools.
❗ This web API currently uses JSON Web Tokens (JWT) for authentication and authorization. For the moment, the mechanism used for generating these tokens has been greatly simplified, to the point of being naive, since my focus is on other topics; I do intend to provide a more realistic implementation in the near future.
This project has several posts associated with it:
- Structured logging in ASP.NET Core using Serilog and Seq
- Use Docker Compose when running integration tests with Azure Pipelines
- Use Docker when running integration tests with Azure Pipelines
- Build an ASP.NET Core application using Azure Pipelines
- Logging HTTP context in ASP.NET Core
Build Server | Operating System | Status |
---|---|---|
Azure Pipelines | Linux | |
Azure Pipelines | macOS | |
Azure Pipelines | Windows | |
Provider | Badge |
---|---|
Codacy | |
FOSSA | |
SonarCloud | |
In order to run this application locally, you need to set up a few things first: run PostgreSQL and pgAdmin via Docker Compose, create a PostgreSQL database using EF Core database migrations, etc.
This ASP.NET Core web API uses PostgreSQL as persistent storage and pgAdmin as database manager, all running locally via Docker Compose.
These volumes are needed to store data outside the Docker containers running the PostgreSQL databases and their manager.
- Volume used by the local development database
docker volume create --name=aspnet-core-logging-db-for-local-dev_data
- Volume used by the integration tests when run locally
docker volume create --name=aspnet-core-logging-db-for-integration-tests_data
- Volume used by the acceptance tests when run locally
docker volume create --name=aspnet-core-logging-db-for-acceptance-tests_data
- Volume used by pgAdmin tool
docker volume create --name=pgadmin_data
- Volume used by Seq tool
docker volume create --name=seq_data
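The volume-creation steps above can be double-checked with a short loop (a sketch, assuming a POSIX shell; the Docker CLI is expected on PATH, and every volume is reported as missing when it is not):

```shell
# Check that every expected volume exists; prints OK or MISSING per volume.
for volume in aspnet-core-logging-db-for-local-dev_data \
              aspnet-core-logging-db-for-integration-tests_data \
              aspnet-core-logging-db-for-acceptance-tests_data \
              pgadmin_data \
              seq_data; do
  if docker volume inspect "$volume" > /dev/null 2>&1; then
    echo "OK: $volume"
  else
    echo "MISSING: $volume"
  fi
done
```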
The .env file is used by Docker Compose to avoid storing sensitive data inside the docker-compose.yml file.
Create a new file named .env inside the folder where you have checked out this git repository and add the following lines:
# Environment variables used by the local dev DB compose service
DB_LOCAL_POSTGRES_USER=<DB_LOCAL_USERNAME>
DB_LOCAL_POSTGRES_PASSWORD=<DB_LOCAL_PASSWORD>
# Environment variables used by the integration tests DB compose service
DB_INTEGRATION_TESTS_POSTGRES_USER=<DB_INTEGRATION_TESTS_USERNAME>
DB_INTEGRATION_TESTS_POSTGRES_PASSWORD=<DB_INTEGRATION_TESTS_PASSWORD>
# Environment variables used by the acceptance tests DB compose service
DB_ACCEPTANCE_TESTS_POSTGRES_USER=<DB_ACCEPTANCE_TESTS_USERNAME>
DB_ACCEPTANCE_TESTS_POSTGRES_PASSWORD=<DB_ACCEPTANCE_TESTS_PASSWORD>
# Environment variables used by DB client compose service
PGADMIN_DEFAULT_EMAIL=<PGADMIN_EMAIL>
PGADMIN_DEFAULT_PASSWORD=<PGADMIN_PASSWORD>
Make sure you replace all of the above <DB_LOCAL_USERNAME>, <DB_LOCAL_PASSWORD>, ..., <PGADMIN_PASSWORD> tokens with the appropriate values.
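A quick way to catch forgotten placeholders (a sketch, assuming a POSIX shell with grep available) is to search the .env file for any remaining <...> tokens:

```shell
# Print any line still containing a <PLACEHOLDER> token; no output from grep
# means every token was replaced. The check is skipped when no .env file exists.
if [ -f .env ]; then
  grep -En '<[A-Z_]+>' .env || echo 'All placeholder tokens have been replaced.'
else
  echo 'No .env file found in the current folder.'
fi
```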
All of the commands below must be run from the folder where you have checked out this git repository; this folder contains a docker-compose.yml file describing the aforementioned compose services.
# The -d flag instructs Docker Compose to run services in the background
docker compose up -d
docker compose stop
docker compose start
# The -f flag instructs Docker Compose to display and follow the log entries of the 'pgadmin' service
docker compose logs -f pgadmin
The command below will not delete the Docker volumes!
docker compose down
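To see which compose services are currently up, together with their state and mapped ports, `docker compose ps` can be used (the fallback message below is just for machines without Docker):

```shell
# List compose services with their state and port mappings (Docker Compose v2).
docker compose ps || echo 'Docker Compose is not available on this machine.'
```

Note that `docker compose down --volumes`, unlike plain `docker compose down`, deletes the named volumes as well, so use it with care.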
Once the services have been started using the docker compose up command, the pgAdmin UI is ready to be used.
Open your browser and navigate to http://localhost:8080.
In order to start using pgAdmin, you need to authenticate - use the PGADMIN_DEFAULT_EMAIL and PGADMIN_DEFAULT_PASSWORD properties found in your .env file to log in.
When asked about a PostgreSQL server to register, populate the fields found inside the Connection tab as below:
- Host name/address = aspnet-core-logging-dev - the compose service name and not the container name (the Docker Compose networking page is a little bit misleading, as it mentions the container name; that's why the services found inside the docker-compose.yml file are named differently than their containers)
- Port = 5432 - the Docker internal port
- Username = the value of the ${DB_LOCAL_POSTGRES_USER} property from the local .env file
- Password = the value of the ${DB_LOCAL_POSTGRES_PASSWORD} property from the local .env file
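Connectivity can also be checked from the host, outside pgAdmin (a sketch; it assumes the postgresql-client package, which provides pg_isready, is installed locally):

```shell
# Ask the PostgreSQL instance whether it accepts connections; host port 5432
# is the one mapped for the local development database in docker-compose.yml.
pg_isready --host=localhost --port=5432 \
  || echo 'pg_isready is not installed, or the database is not accepting connections.'
```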
Since storing sensitive data inside configuration files put under source control is not a good idea, the following environment variables must be defined on your local development machine:
Name | Value | Description |
---|---|---|
TODO_WEB_API_BY_SATRAPU_CONNECTIONSTRINGS__APPLICATION | Server=localhost; Port=5432; Database=aspnet-core-logging-dev; Username=satrapu; Password=***; | The connection string pointing to the local development database |
TODO_WEB_API_BY_SATRAPU_CONNECTIONSTRINGS__INTEGRATIONTESTS | Server=localhost; Port=5433; Database=aspnet-core-logging-integrationtests; Username=satrapu; Password=***; | The connection string pointing to the integration tests database |
TODO_WEB_API_BY_SATRAPU_CONNECTIONSTRINGS__ACCEPTANCETESTS | Server=localhost; Port=5434; Database=aspnet-core-logging-acceptancetests; Username=satrapu; Password=***; | The connection string pointing to the acceptance tests database |
TODO_WEB_API_BY_SATRAPU_GENERATEJWT__SECRET | <YOUR_JWT_SECRET> | The secret used for generating JSON web tokens for experimenting purposes only |
The connection strings above use the same username and password pairs found in the local .env file.
The port in each connection string represents the host port declared inside the local docker-compose.yml file - see more about ports here.
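On Linux or macOS, these variables can be set for the current shell session as sketched below (the values are placeholders taken from the table above; on Windows, use setx or the System Properties UI instead):

```shell
# Define the variables for the current terminal session only; replace *** and
# <YOUR_JWT_SECRET> with the real values before starting the application.
export TODO_WEB_API_BY_SATRAPU_CONNECTIONSTRINGS__APPLICATION='Server=localhost; Port=5432; Database=aspnet-core-logging-dev; Username=satrapu; Password=***;'
export TODO_WEB_API_BY_SATRAPU_CONNECTIONSTRINGS__INTEGRATIONTESTS='Server=localhost; Port=5433; Database=aspnet-core-logging-integrationtests; Username=satrapu; Password=***;'
export TODO_WEB_API_BY_SATRAPU_CONNECTIONSTRINGS__ACCEPTANCETESTS='Server=localhost; Port=5434; Database=aspnet-core-logging-acceptancetests; Username=satrapu; Password=***;'
export TODO_WEB_API_BY_SATRAPU_GENERATEJWT__SECRET='<YOUR_JWT_SECRET>'
```

Variables set this way only live as long as the terminal session, which is convenient while experimenting but means they must be defined again (or added to your shell profile) for new sessions.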
In order to run the application locally, you need a running PostgreSQL database whose schema is up-to-date. The database is started using the aforementioned Docker Compose commands, while its schema is updated via one of the options below.
In order to create and update the local development database, you need to install the EF Core CLI tools; the reference documentation can be found here. I also recommend reading about database migrations here. All of the commands below should be executed from the folder where you have checked out this git repository.
- Install dotnet-ef
dotnet tool install dotnet-ef --global
❗ Please restart the terminal after running the above command to ensure the following dotnet ef commands do not fail.
- Update dotnet-ef to latest version, if requested to do so
dotnet tool update dotnet-ef --global
- Add a new database migration
dotnet ef migrations add <MIGRATION_NAME> --startup-project ./Sources/Todo.WebApi --project ./Sources/Todo.Persistence
- List existing database migrations
dotnet ef migrations list --startup-project ./Sources/Todo.WebApi --project ./Sources/Todo.Persistence
- Update the database to the latest migration
dotnet ef database update --startup-project ./Sources/Todo.WebApi --project ./Sources/Todo.Persistence
- Drop existing database
dotnet ef database drop --startup-project ./Sources/Todo.WebApi --project ./Sources/Todo.Persistence
Ensure the MigrateDatabase configuration property is set to true.
See more about applying EF Core migrations at runtime here.
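As an alternative to applying migrations directly (a sketch; the output file name below is just an example), the migrations can be rendered into a single idempotent SQL script, which can be reviewed before being run against the database:

```shell
# Generate an idempotent SQL script covering all migrations; such a script is
# safe to run repeatedly, since each migration checks whether it was applied.
dotnet ef migrations script --idempotent \
    --startup-project ./Sources/Todo.WebApi \
    --project ./Sources/Todo.Persistence \
    --output ./todo-migrations.sql \
  || echo 'Could not generate the script - is the dotnet-ef tool installed?'
```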
In order to inspect application log events generated via Serilog, navigate to http://localhost:8888, which will open Seq UI.
In order to inspect application traces, navigate to http://localhost:16686/search, which will open Jaeger UI. To see Jaeger metrics, navigate to http://localhost:14269/metrics. To see Jaeger health status, navigate to http://localhost:14269/.