Description
Apache Airflow version
3.0.4
If "Other Airflow 2 version" selected, which one?
No response
What happened?
We’re running Airflow on Kubernetes with both the Celery and Local executors, deployed from the official Helm chart 1.18.0.
In Airflow 3.0.0 – 3.0.4, the secret masker doesn’t seem to be working.
We use HashiCorp Vault to store our secrets. When retrieving a connection/variable or simply calling the mask_secret function with a string value, the sensitive values are not being hidden in logs or task Rendered Template fields.
I tried reproducing this issue on Airflow 2.4 – 2.11, and masking worked fine there, so the problem appears to have been introduced in version 3.
Environment:
Docker image: apache/airflow:3.0.4-python3.12
What you think should happen instead?
All sensitive secret values should be masked with ***
How to reproduce
Run Airflow using the chart with these values:

```yaml
---
airflow:
  defaultAirflowTag: "3.0.4"
  airflowVersion: "3.0.4"
  useStandardNaming: true
  webserver:
    defaultUser:
      enabled: false
  executor: "LocalExecutor"
  config:
    secrets:
      backend: airflow.providers.hashicorp.secrets.vault.VaultBackend
      backend_kwargs: |
        {"connections_path": "dapa/airflow/environment/connections", "variables_path": "dapa/airflow/environment/variables", "mount_point": "datas", "token": "vow", "url": "http://vault:8200"}
    core:
      simple_auth_manager_all_admins: true
      auth_manager: airflow.api_fastapi.auth.managers.simple.simple_auth_manager.SimpleAuthManager
```
Run this DAG:

```python
from datetime import datetime

from airflow import DAG
from airflow.hooks.base import BaseHook
from airflow.providers.standard.operators.python import PythonOperator


def print_connection():
    from airflow.sdk.execution_time.secrets_masker import mask_secret

    conn = BaseHook.get_connection("some_connection")
    a = 'private_key'
    mask_secret(a)
    print(f"🔹 Conn ID: {conn.conn_id}")
    print(f"🔹 Login: {conn.login}")
    print(f"🔹 Password: {conn.password}")
    print(f"🔹 password: private_key")


with DAG(
    dag_id="demo_print_connections_variables",
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    task_show_connection = PythonOperator(
        task_id="show_connection",
        python_callable=print_connection,
    )
```
Both `conn.password` and the literal string `private_key` should appear as `***` in the task logs.
Operating System
Debian GNU/Linux 12 (bookworm)
Versions of Apache Airflow Providers
apache-airflow==3.0.4
apache-airflow-providers-google==17.1.0
apache-airflow-providers-mysql==6.3.3
apache-airflow-providers-slack==9.1.3
apache-airflow-providers-vertica==4.1.2
apache-airflow-providers-apache-livy==4.4.2
apache-airflow-providers-ssh==4.1.2
apache-airflow-providers-common-sql==1.27.4
apache-airflow-providers-cncf-kubernetes==10.6.0
apache-airflow-providers-hashicorp==4.3.2
apache-airflow-providers-standard==1.5.0
Deployment
Official Apache Airflow Helm Chart
Deployment details
No response
Anything else?
While debugging, I also noticed that the CLI command `airflow config list` attempts to mask passwords but, in the end, doesn’t actually mask them.
There’s a test for this here:
https://github.com/apache/airflow/blob/main/task-sdk/tests/task_sdk/definitions/test_secrets_masker.py
When I tried running it locally, it doesn’t work unless you explicitly add:

```python
from airflow import settings

settings.MASK_SECRETS_IN_LOGS = True
```
So maybe that’s the root cause: this value is only being set for the task_test and triggerer processes, but it should also be set when running a regular task.
However, if a Connection or Variable is created manually via the Airflow UI, it is masked successfully.
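The suspected flag-gated failure mode can be illustrated with a minimal, self-contained sketch. This is not Airflow’s actual code; `MASK_SECRETS_IN_LOGS`, `mask_secret`, and `redact` below are simplified stand-ins for their Airflow counterparts:

```python
import re

# Simplified stand-in for settings.MASK_SECRETS_IN_LOGS: when it stays
# False, registered secrets are NOT redacted, matching the symptom above.
MASK_SECRETS_IN_LOGS = False
_patterns: set[str] = set()


def mask_secret(value: str) -> None:
    """Register a value that should appear as *** in log output."""
    if value:
        _patterns.add(re.escape(value))


def redact(text: str) -> str:
    """Replace registered secrets with ***, but only if masking is enabled."""
    if not MASK_SECRETS_IN_LOGS or not _patterns:
        return text
    return re.sub("|".join(_patterns), "***", text)


mask_secret("private_key")
print(redact("password: private_key"))  # flag off: secret leaks verbatim
MASK_SECRETS_IN_LOGS = True
print(redact("password: private_key"))  # flag on: prints "password: ***"
```

If the real flag is only initialised for the task_test and triggerer processes, every other process would behave like the first call above: the secret is registered, but the redaction step is effectively a no-op.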
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct