Apache Airflow version: 2.1.2
Apache Airflow Provider versions (please include all providers that are relevant to your bug):
apache-airflow-providers-snowflake==2.0.0
The DAG loads data into Snowflake; I'm not sure whether other providers are relevant, but the installation uses the constraints file.
Kubernetes version (if you are using kubernetes) (use kubectl version): 1.18.20
Environment: Dockerfile running in a k8s deployment; the python:3.8.8-slim-buster Docker image is used as the base.
What happened:
Full error message:
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
    args.func(args)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/airflow/utils/cli.py", line 91, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 237, in task_run
    _run_task_by_selected_method(args, dag, ti)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 64, in _run_task_by_selected_method
    _run_task_by_local_task_job(args, ti)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 120, in _run_task_by_local_task_job
    run_job.run()
  File "/usr/local/lib/python3.8/site-packages/airflow/jobs/base_job.py", line 237, in run
    self._execute()
  File "/usr/local/lib/python3.8/site-packages/airflow/jobs/local_task_job.py", line 147, in _execute
    self.on_kill()
  File "/usr/local/lib/python3.8/site-packages/airflow/jobs/local_task_job.py", line 166, in on_kill
    self.task_runner.on_finish()
  File "/usr/local/lib/python3.8/site-packages/airflow/task/task_runner/base_task_runner.py", line 179, in on_finish
    self._error_file.close()
  File "/usr/local/lib/python3.8/tempfile.py", line 499, in close
    self._closer.close()
  File "/usr/local/lib/python3.8/tempfile.py", line 436, in close
    unlink(self.name)
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmpynk61lsx'
What you expected to happen:
The temp file to still exist when the task runner cleans up, and the DAG not to fail.
How to reproduce it:
Unknown; I haven't found a consistent trigger, but it has been happening for a while, more than a month.
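That said, the failing close at the bottom of the traceback can be reproduced in isolation. A minimal sketch, assuming Python 3.8 tempfile semantics on POSIX; this only demonstrates the failing close(), not whatever race inside Airflow removes the file first:

import os
import tempfile

# NamedTemporaryFile(delete=True) unlinks its backing file on close().
tmp = tempfile.NamedTemporaryFile(delete=True)

# Simulate the file disappearing out from under the process
# (e.g. /tmp cleanup or another process removing it).
os.unlink(tmp.name)

# close() now raises FileNotFoundError: [Errno 2] No such file or
# directory: '/tmp/tmp...', matching the traceback above.
tmp.close()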
Anything else we need to know:
How often does this problem occur? Once? Every time, etc.?
This happens on an almost daily basis; one or two DAGs fail from time to time.
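Until the underlying race is fixed, a defensive guard around the close is one possible mitigation. This is only a sketch: safe_close is a hypothetical helper, not Airflow API, and the real change would belong around the self._error_file.close() call in base_task_runner.py shown in the traceback:

import contextlib
import os
import tempfile

def safe_close(named_tmp):
    # Close a NamedTemporaryFile, tolerating the case where the backing
    # file has already been removed by someone else.
    with contextlib.suppress(FileNotFoundError):
        named_tmp.close()

# Usage: even if the file is already gone, close() no longer raises.
tmp = tempfile.NamedTemporaryFile(delete=True)
os.unlink(tmp.name)
safe_close(tmp)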
Can you please open a new issue, @leiguorui, and describe your case in detail (what command you run, your deployment, etc.)? It looks quite a bit different.