# Apache Airflow Tutorial Series (YouTube)

## Updated Tutorial Episodes

  1. Introduction and Local Installation
  2. Get Airflow running in Docker
  3. Airflow Core Concepts in 5 mins
  4. Airflow Task Lifecycle and Basic Architecture
  5. Airflow DAG with BashOperator
  6. Airflow DAG with PythonOperator and XComs
  7. Airflow TaskFlow API
  8. Airflow Catchup and Backfill
  9. Schedule Airflow DAG with Cron Expression
  10. Airflow Connection and PostgresOperator
  11. Add Python Dependencies via Airflow Docker Image Extending and Customizing
  12. AWS S3 Key Sensor Operator
  13. Airflow Hooks S3 PostgreSQL

## Running Apache Airflow 2.0 in Docker with the LocalExecutor

Follow these steps to get Airflow 2.0 running with Docker on your machine.

1. Clone this repo.
2. Create the `dags`, `logs`, and `plugins` folders inside the project directory:
   ```bash
   mkdir ./dags ./logs ./plugins
   ```
3. Set user permissions for Airflow to your current user:
   ```bash
   echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env
   ```
4. Install the Docker Desktop application if you don't already have Docker running on your machine.
5. Launch Airflow with docker-compose:
   ```bash
   docker-compose up -d
   ```
6. Check the running containers:
   ```bash
   docker ps
   ```
7. Open your browser and go to http://0.0.0.0:8080 to reach the Airflow webserver. Once the UI is up, you can verify the setup with the example DAG sketched below.
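
To confirm that the scheduler picks up DAGs from the `dags` folder, here is a minimal sketch of a test DAG in the spirit of episode 5 (BashOperator). The DAG id `hello_airflow`, the file name, and the schedule are illustrative choices, not part of this repo.

```python
# dags/hello_airflow.py - minimal test DAG (illustrative, not part of this repo)
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="hello_airflow",            # hypothetical DAG id; any unique name works
    start_date=datetime(2021, 1, 1),   # must be in the past for the scheduler to create runs
    schedule_interval="@daily",        # cron expressions also work (see episode 9)
    catchup=False,                     # skip backfilling past runs (see episode 8)
) as dag:
    hello = BashOperator(
        task_id="hello",
        bash_command="echo 'Airflow is up and running'",
    )
```

After saving the file, the DAG should appear in the web UI within the scheduler's scan interval; trigger it manually and check the task log for the echoed message.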