Apache Airflow Providers containing Deferrable Operators & Sensors from Astronomer.
Install and update using pip:

```shell
pip install astronomer-providers
```

This installs only the core provider's dependencies. To install all dependencies, run:

```shell
pip install 'astronomer-providers[all]'
```

To install only the dependencies for a specific provider, pass the integration name as an extra. For example, to install the Kubernetes provider's dependencies, run:

```shell
pip install 'astronomer-providers[cncf.kubernetes]'
```
| Extra Name | Installation Command | Dependencies |
|---|---|---|
| all | `pip install 'astronomer-providers[all]'` | All |
| amazon | `pip install 'astronomer-providers[amazon]'` | Amazon |
| apache.hive | `pip install 'astronomer-providers[apache.hive]'` | Apache Hive |
| apache.livy | `pip install 'astronomer-providers[apache.livy]'` | Apache Livy |
| cncf.kubernetes | `pip install 'astronomer-providers[cncf.kubernetes]'` | Cncf Kubernetes |
| databricks | `pip install 'astronomer-providers[databricks]'` | Databricks |
| dbt.cloud | `pip install 'astronomer-providers[dbt.cloud]'` | Dbt Cloud |
| google | `pip install 'astronomer-providers[google]'` | Google |
| http | `pip install 'astronomer-providers[http]'` | Http |
| microsoft.azure | `pip install 'astronomer-providers[microsoft.azure]'` | Microsoft Azure |
| openlineage | `pip install 'astronomer-providers[openlineage]'` | Openlineage |
| snowflake | `pip install 'astronomer-providers[snowflake]'` | Snowflake |
This repo is structured the same way as Apache Airflow's source code, so, for example, you can import async operators as follows:

```python
from astronomer.providers.amazon.aws.sensors.s3 import S3KeySensorAsync as S3KeySensor

waiting_for_s3_key = S3KeySensor(
    task_id="waiting_for_s3_key",
    bucket_key="sample_key.txt",
    wildcard_match=False,
    bucket_name="sample-bucket",
)
```
Example DAGs for each provider are within the respective provider's folder. For example, the Kubernetes provider's DAGs are in the `astronomer/providers/cncf/kubernetes/example_dags` folder.
We only create async operators for the "sync-version" of operators that do some level of polling (i.e., take more than a few seconds to complete). For example, we won't create an async operator for `BigQueryCreateEmptyTableOperator`, but we will create one for `BigQueryInsertJobOperator`, which actually runs queries and, in the worst case, can take hours to complete.
To create an async operator, inherit from the corresponding Airflow sync operator. If a sync version isn't available, inherit from Airflow's `BaseOperator`. Likewise, to create an async sensor, inherit from the corresponding sync sensor; if a sync version isn't available, inherit from Airflow's `BaseSensorOperator`.
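The deferral pattern behind these async sensors can be sketched as follows. This is a simplified, self-contained illustration: `TaskDeferred`, `BaseTrigger`, and the sensor classes below are minimal stand-ins for the real Airflow classes, not the actual Airflow API.

```python
# Simplified sketch of the async-sensor deferral pattern.
# All classes here are illustrative stand-ins for Airflow's
# TaskDeferred, BaseTrigger, and sensor base classes.


class TaskDeferred(Exception):
    """Raised to hand the task off to the triggerer process."""

    def __init__(self, trigger, method_name):
        self.trigger = trigger
        self.method_name = method_name


class BaseTrigger:
    """A trigger polls asynchronously in the triggerer, freeing the worker."""


class S3KeyTrigger(BaseTrigger):
    def __init__(self, bucket_name, bucket_key):
        self.bucket_name = bucket_name
        self.bucket_key = bucket_key


class S3KeySensor:
    """Stand-in for the sync sensor the async version inherits from."""

    def __init__(self, bucket_name, bucket_key):
        self.bucket_name = bucket_name
        self.bucket_key = bucket_key


class S3KeySensorAsync(S3KeySensor):
    def execute(self, context):
        # Instead of blocking a worker slot in a poke() loop, defer to a
        # trigger; the worker is released until the trigger fires an event.
        raise TaskDeferred(
            trigger=S3KeyTrigger(self.bucket_name, self.bucket_key),
            method_name="execute_complete",
        )

    def execute_complete(self, context, event):
        # Called on a worker once the trigger fires; finish or fail here.
        if event.get("status") == "error":
            raise RuntimeError(event.get("message"))
        return event.get("status")


sensor = S3KeySensorAsync(bucket_name="sample-bucket", bucket_key="sample_key.txt")
try:
    sensor.execute(context={})
except TaskDeferred as deferral:
    print(deferral.method_name)  # the callback invoked after the trigger fires
```

The key idea is that `execute` never polls: it immediately defers, and `execute_complete` resumes the task only after the trigger reports a result.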
We follow Semantic Versioning for releases. Check CHANGELOG.rst for the latest changes.
All contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome.
A detailed overview on how to contribute can be found in the Contributing Guide.
As contributors and maintainers to this project, you are expected to abide by the Contributor Code of Conduct.
- Our focus is on speed of iteration and development at this stage of the project, so we want to be able to iterate quickly with our community members and customers and cut releases as necessary.
- Airflow providers are separate packages from the core `apache-airflow` package, and we would like to avoid further bloating the Airflow repo.
- We want users and the community to be able to easily track features and the roadmap for the individual providers we develop.
- We would love to see Airflow community members create, maintain, and share their own providers to build an ecosystem of providers.
- In Airflow, sensors have a `mode` param, which can be `poke` or `reschedule`. In async sensors, this param has no effect, since tasks get deferred to the triggerer.