renatomh/api-onlibrary
OnLibrary

💡 Project's Idea

This project was developed to create an online library platform for customers.

🔍 Features

  • Sign up and log in to the application;
  • Update the user's profile;
  • Multi-language support;
  • Support for different time zones;
  • Thumbnail creation for files;

💹 Extras

  • API pagination for better performance;
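As an illustration of the pagination idea (this sketch is not the project's actual implementation; names are hypothetical), offset-based pagination can be as simple as slicing the result set and returning metadata alongside it:

```python
# Minimal sketch of offset/limit pagination, as commonly used in REST APIs.
# The function and field names here are hypothetical, not from the project.

def paginate(items, page=1, per_page=10):
    """Return one page of 'items' plus simple pagination metadata."""
    total = len(items)
    start = (page - 1) * per_page
    end = start + per_page
    return {
        "data": items[start:end],
        "meta": {"page": page, "per_page": per_page, "total": total},
    }

result = paginate(list(range(25)), page=2, per_page=10)
print(result["data"])  # second page: items 10..19
print(result["meta"])
```

In a real endpoint, `page` and `per_page` would typically come from query-string parameters, and the slice would be done at the database level (e.g. with LIMIT/OFFSET) rather than in memory.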

🛠 Technologies

During the development of this project, the following technologies were used:

💻 Project Configuration

First, create a new virtual environment on the root directory

$ python -m venv env

Activate the created virtual environment

$ .\env\Scripts\activate # On Windows machines
$ source ./env/bin/activate # On MacOS/Unix machines

Install the required packages/libs

(env) $ pip install -r requirements.txt

Internationalization (i18n) and Localization (l10n)

In order to provide results according to the specified languages set in the request headers (Accept-Language), we make use of Flask-Babel. Here are a few commands for its use:

(env) $ pybabel extract -F babel.cfg -k _l -o messages.pot . # Gets list of texts to be translated
(env) $ pybabel init -i messages.pot -d app/translations -l pt # Creates file (messages.po) with 'pt' translations (replace 'pt' with required language code)
(env) $ pybabel update -i messages.pot -d app/translations -l pt # Updates file (messages.po) with 'pt' translations (replace 'pt' with required language code)
(env) $ python translate_texts.py # Optional: auto translate the entries from the '.po' translation files
(env) $ pybabel compile -d app/translations # Compiles the translation files

It's important to compile the translation files before running the application, so that it provides the correct translations to the system's users.

🌐 Setting up config files

Create an .env file on the root directory, with all needed variables, credentials and API keys, according to the sample provided (.env.example).
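As a purely illustrative sketch (the variable names below are hypothetical; the actual ones are listed in .env.example), such a file could look like:

```
# Hypothetical variables for illustration only; follow .env.example.
DATABASE_URL=mysql://user:password@localhost:3306/onlibrary
SECRET_KEY=change-me
TZ=America/Sao_Paulo
```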

Microsoft SQL Server

When using Microsoft SQL Server, you must also download and install the ODBC Driver for SQL Server; otherwise, it won't be possible to connect to the server.

Also, in order to install pyodbc on Linux, it might be necessary to install unixodbc-dev with the command below:

$ sudo apt-get install unixodbc-dev

Note: some Docker images have trouble installing this dependency. Hence, we avoid using SQL Server when deploying the application with Docker.

MySQL Server

When using MySQL Server, you must choose a default charset that won't conflict with the data lengths of some model fields. The 'utf8/utf8_general_ci' charset should work.
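For instance, the database could be created with that charset like so (the database name is a placeholder):

```sql
-- Placeholder database name; adjust to your setup.
CREATE DATABASE onlibrary CHARACTER SET utf8 COLLATE utf8_general_ci;
```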

FFmpeg

In order to create thumbnail images for video files, FFmpeg must be installed on the machine, and the path to the FFmpeg executable must be set in the .env file.
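As an illustration (file names are placeholders), a thumbnail can be extracted from a video with a command such as:

```shell
# Grab the frame at the 1-second mark and scale it to a 320px-wide JPEG.
$ ffmpeg -i input.mp4 -ss 00:00:01 -vframes 1 -vf scale=320:-1 thumbnail.jpg
```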

Firebase Cloud Messaging

To send push notifications to mobile applications, Firebase Cloud Messaging is currently used. Aside from setting up the .env file, you must also place your service account JSON credentials file in the app's root folder.

Time Zones

Since the application supports different time zones, when defining the TZ variable in the .env file it's advisable to use the same time zone as the machine where the application is running, since internal database functions (used for creating columns like created_at and updated_at) usually fall back to the system's time zone when one is not set manually.

Also, when migrating servers, the database backup could come from a machine with a different time zone. In this case, it might be necessary to convert the datetime records to the new machine's time zone, or to set the new machine's time zone to match the previous one.
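As a sketch of such a conversion, using only the Python standard library (the zone names and column value below are illustrative, not from the project):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Suppose a created_at value was stored using the old server's local time zone.
old_tz = ZoneInfo("America/Sao_Paulo")
new_tz = ZoneInfo("UTC")

created_at = datetime(2023, 5, 1, 12, 0, tzinfo=old_tz)

# Convert the record to the new machine's time zone.
converted = created_at.astimezone(new_tz)
print(converted.isoformat())  # 2023-05-01T15:00:00+00:00
```

In practice this would be run over every datetime column in the backup, or applied once as an UPDATE in the database itself.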

Black Formatter

The project uses the Black Python code formatter, provided by the Black Formatter VS Code extension (ms-python.black-formatter). It's a good practice to install it on your VS Code environment, so the code formatting will be consistent.

⏱ Setting up services

In order to execute tasks, jobs, and scripts periodically, we can set up services on the machine where the application runs.

The sample crontab configuration provided (crontab-script example) shows how to set tasks to be executed at the specified dates and times on a Unix machine, using crontab.

These must be set in order to run the jobs created for the application (from the folder app/jobs).
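As a hedged illustration (paths and job module names below are placeholders; the repository's sample crontab script is authoritative), such entries typically look like:

```
# Run a job from app/jobs every day at 03:00.
0 3 * * * cd /path/to/api-onlibrary && ./env/bin/python -m app.jobs.daily_job
# Run another job every 15 minutes.
*/15 * * * * cd /path/to/api-onlibrary && ./env/bin/python -m app.jobs.frequent_job
```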

💾 Database Migrations

Once the SQL server is ready (MySQL, PostgreSQL, etc.) and the required credentials to access it are present in the .env file, you can run the migrations with the command:

(env) $ alembic upgrade head

You can also downgrade the migrations with the following command:

(env) $ alembic downgrade base

Alternatively, you can migrate up or down by a specific number of revisions, or to a specific revision:

(env) $ alembic upgrade +2 # Migrating up 2 revisions
(env) $ alembic downgrade -1 # Migrating down 1 revision
(env) $ alembic upgrade db9257fac0e2 # Migrating to a specific revision

When there are changes to the application models, new revisions for the migrations can be generated with the command below (where you can provide a custom short description for the update):

(env) $ alembic revision --autogenerate -m "revision description"

⏯️ Running

To run the project in a development environment, execute the following command on the root directory, with the virtual environment activated.

(env) $ python run.py

In order to leave the virtual environment, you can simply execute the command below:

(env) $ deactivate

🔨 Production Server

In order to execute the project in a production server, you must make use of a Web Server Gateway Interface (WSGI), such as uWSGI for Linux or waitress for Windows.

💻 Windows

In Windows, you could run the wsgi.py file, like so:

(env) $ python wsgi.py

After that, a Windows Task can be created to restart the application, activating the virtual environment and calling the script, whenever the machine is booted.
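Such a task could be registered with the schtasks utility, for instance (the task name and paths are placeholders):

```bat
REM Hypothetical task name and paths; adjust to your environment.
schtasks /create /tn "OnLibraryAPI" /sc onstart /tr "C:\path\to\env\Scripts\python.exe C:\path\to\wsgi.py"
```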

⌨ Linux

In Linux systems, you can use the following command to check if the server is working, changing the port number to the one you're using in the app:

(env) $ gunicorn --worker-class eventlet --bind 0.0.0.0:8080 wsgi:app --reload

The api-onlibrary.service file must be updated and placed in the '/etc/systemd/system/' directory. After that, you should execute the following commands to enable and start the service:

$ sudo systemctl daemon-reload
$ sudo systemctl enable api-onlibrary
$ sudo systemctl start api-onlibrary
$ sudo systemctl status api-onlibrary
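If you need a starting point for that unit file, a minimal sketch could look like the following (the user, paths, and command are placeholders; the api-onlibrary.service file in the repository is authoritative):

```ini
[Unit]
Description=OnLibrary Flask API
After=network.target

[Service]
User=ubuntu
WorkingDirectory=/path/to/api-onlibrary
ExecStart=/path/to/api-onlibrary/env/bin/gunicorn --worker-class eventlet --bind 0.0.0.0:8080 wsgi:app
Restart=always

[Install]
WantedBy=multi-user.target
```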

In order to serve the application with Nginx, it can be configured like so (adjusting the paths, server name, etc.):

# Flask Server
server {
    listen 80;
    server_name api.domain.com.br;

    location / {
        include proxy_params;
        proxy_pass http://localhost:8080;
        client_max_body_size 16M;
    }

    location /socket.io {
        include proxy_params;
        proxy_http_version 1.1;
        proxy_buffering off;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "Upgrade";
        proxy_pass http://localhost:8080/socket.io;
    }
}

📜 SSL/TLS

You can also add a security layer with SSL/TLS, used by the HTTPS protocol. One option is to use free Let's Encrypt certificates.

For this, you must install the Certbot package and use its Nginx plugin, with the following commands (again, adjusting the server name):

$ sudo apt install snapd # Installs snapd
$ sudo snap install core; sudo snap refresh core # Ensures snapd version is up to date
$ sudo snap install --classic certbot # Installs Certbot
$ sudo ln -s /snap/bin/certbot /usr/bin/certbot # Prepares the Certbot command
$ sudo certbot --nginx -d api.domain.com.br
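Certbot typically sets up automatic renewal on its own; you can verify that renewal will work with a dry run:

```shell
$ sudo certbot renew --dry-run # Simulates certificate renewal without changing the installed certificates
```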

🐳 Docker

There's also the option to deploy the application using Docker. In this case, we have a Dockerfile for the Flask Application.

If you want to deploy the application with an already existing database, you just need the Flask Dockerfile; otherwise, you can use Docker Compose to deploy both the application and the database server on your machine.

To build a container image for the Flask application, we can run the following command on the app's root folder:

$ docker build -t api-onlibrary .
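For reference, a minimal Dockerfile for a Flask application typically looks something like the sketch below; the repository's own Dockerfile is authoritative and may differ:

```dockerfile
# Illustrative sketch only; the repository's Dockerfile takes precedence.
FROM python:3.10-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8080
CMD ["python", "wsgi.py"]
```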

A few useful Docker commands are listed below:

$ docker image ls # Shows available images
$ docker ps --all # Shows available containers
$ docker run --name <container-name> -p 8080:8080 -it <image-name> # Runs a container from an image with specified options
$ docker exec -it <container> bash # Access the Docker container's shell

After starting the container, you should add the environment and credentials files to it, in order for it to work correctly. You can do it with the following commands:

$ docker cp ./.env <container>:/app/.env
$ docker cp ./service-credentials.json <container>:/app/service-credentials.json

If you want to create containers for both the Flask application and the database server, you can use the following Docker Compose command:

$ docker compose up

This will first try to pull existing images to create the containers. If they're not available, it'll build the images and then run the containers.

Finally, a Makefile is provided to help run some of the commands listed above in a simpler way.
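As an illustration only (the actual targets live in the repository's Makefile), such a Makefile might wrap commands like these; note that recipe lines must be indented with tabs:

```makefile
# Hypothetical targets; the repository's own Makefile is authoritative.
build:
	docker build -t api-onlibrary .

up:
	docker compose up

test:
	pytest
```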

🧪 Testing

In order to make sure that the application's main features are working as expected, some tests were created to assert the functionalities.

To allow the execution of the tests, first the required dependencies must be installed:

(env) $ pip install -r requirements-test.txt

The tests can then be run:

(env) $ pytest # Optionally, you can add options like '-W ignore::DeprecationWarning' to suppress specific warnings or '-o log_cli=true' to show logs outputs

Also, you can generate HTML test coverage reports with the commands below:

(env) $ coverage run -m pytest
(env) $ coverage html

🏗️ Infrastructure as Code (IaC) with Terraform

To make it easier to provision infrastructure on cloud providers, you can make use of the Terraform template provided.

First, you'll need to install Terraform on your machine; then, since we're using AWS for the specified resources, you'll need to install the AWS CLI as well.

After that, you must set up an IAM user with permissions to manage resources, create an access key for that user, and configure the AWS CLI with the following command (entering the access key ID, secret access key, default region, and output format):

$ aws configure

Once these steps are done, you can use the Terraform commands to create, update and delete resources.

$ terraform init # Downloads the necessary provider plugins and sets up the working directory
$ terraform plan # Creates the execution plan for the resources
$ terraform apply # Executes the actions proposed in a Terraform plan
$ terraform destroy # Destroys all remote objects managed by a particular Terraform configuration

If you want to provide the required variables for Terraform automatically when executing the script, you can create a file called prod.auto.tfvars on the root directory, with all needed variables, according to the sample provided (auto.tfvars).


📄 License

This project is under the MIT license. For more information, access LICENSE.

About

Flask (Python) API for online library platform.
