sqlalchemy error when running CLI command airflow tasks test
#34109
Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise a PR to address this issue, please do so, no need to wait for approval.

I am experiencing the same issue.

There is very little value in stating it, @freeduck. But I think there is huge value if you check whether the fix proposed in the linked PR fixes it. Can you please apply the fix in #34120 to your installation and verify that it solves it? That would help a lot in finding out whether the fix is good. Can we count on your help with that, @freeduck?

The issue seems to still exist in v2.7.2.

Yes, we already detected it in the RC (#34830 (comment)).

The issue seems to still exist in v2.7.3.
I believe I have a fix as well as a workaround. Create the following as the plugin file:

```python
"""
Workaround for: https://github.com/apache/airflow/issues/34109

DagRunNote has a foreign key `user_id` (and thus a dependency) to User, but it
seems `airflow.models.dagrun` gets loaded first (at least when running `airflow
tasks test DAG_ID TASK_ID`). Loading User first seems to solve the issue.
"""
from airflow.auth.managers.fab.models import User  # noqa

# UPDATE: don't import the following;
# it'll prematurely import airflow.configuration, triggering additional import side effects
# from airflow.models.dagrun import DagRunNote  # noqa
```

The equivalent fix would be to encode the dependency into the foreign key definition, establishing proper import ordering:

```python
from airflow.auth.managers.fab.models import User

class DagRunNote(Base):
    user_id = Column(
        Integer,
        ForeignKey(User.id, name="dag_run_note_user_fkey"),  # reference User.id instead of using "ab_user.id"
        nullable=True,
    )
```

I can create a PR. I'm not familiar with sqlalchemy though; scrutiny is welcome.
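To see why the import order matters at all, here is a minimal, self-contained sketch of the underlying SQLAlchemy behaviour (toy tables, not the real Airflow models): a string target like `"ab_user.id"` is resolved lazily against the `MetaData`, so forcing resolution before the referenced table exists raises `NoReferencedTableError`, while a `Column` object needs no lookup.

```python
from sqlalchemy import Column, ForeignKey, Integer, MetaData, Table
from sqlalchemy.exc import NoReferencedTableError

metadata = MetaData()

# "dag_run_note" is defined first, with a string-based foreign key.
note = Table(
    "dag_run_note",
    metadata,
    Column("user_id", Integer, ForeignKey("ab_user.id")),  # lazy lookup by name
)

fk = list(note.c.user_id.foreign_keys)[0]
try:
    fk.column  # forces resolution while "ab_user" is not yet registered
    resolved_early = True
except NoReferencedTableError:
    resolved_early = False  # this is the failure mode seen in the issue

# Once the referenced table is registered, the same lookup succeeds.
user = Table("ab_user", metadata, Column("id", Integer, primary_key=True))
print(resolved_early, fk.column is user.c.id)
```

This mirrors the reported error: in Airflow the `MetaData` lookup happens while `airflow.models.dagrun` is being set up, before the FAB `User` model has been imported.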
It seems DagRun gets loaded before User, resulting in a sqlalchemy.exc.NoReferencedTableError (at least when running `airflow tasks test DAG_ID TASK_ID`) when defining DagRun's foreign key to the still-nonexistent User table. Referencing the column object instead of using a string establishes proper import ordering and fixes the issue. Fix: apache#34109
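As a sanity check of the proposed fix, the following standalone sketch (assumed minimal stand-ins for the Airflow models, not the actual classes) shows that referencing `User.id` as a column object lets the schema build regardless of lookup timing, because importing the class that owns the column is a precondition for defining the foreign key:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = "ab_user"
    id = Column(Integer, primary_key=True)
    username = Column(String(64))

class DagRunNote(Base):
    __tablename__ = "dag_run_note"
    id = Column(Integer, primary_key=True)
    # Referencing the Column object means User's table is already registered
    # by the time this foreign key is constructed, so it always resolves.
    user_id = Column(
        Integer,
        ForeignKey(User.id, name="dag_run_note_user_fkey"),
        nullable=True,
    )

engine = create_engine("sqlite://")  # in-memory database
Base.metadata.create_all(engine)  # succeeds: the FK target is already known
```

The design point is that the dependency is now expressed in Python's import graph rather than in a string resolved at an unpredictable later moment.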
In #35614, it was discussed that there is already a feature coming down the line that'll have a proper fix. In the short term, the maintainers would accept a PR that removes the In the immediate term, the workaround should suffice.
I don't see this error anymore on 2.9.3 (using an image based on the airflow:slim-2.9.3-python3.12 docker image). |
Closing then. |
This issue is still present. I use Airflow 2.9.3 and try to run Airflow using a SQLite database for the dag.test() command. The database gets created correctly using airflow db init/migrate, but when running a DAG twice, the error appears again while trying to delete the previous DAG run.
@phyk That's interesting that you still have the issue. Perhaps it's DB-specific? I was having the issue with PostgreSQL on Airflow 2.8.3, but after upgrading to 2.9.3 I don't have it anymore. I used the sample DAG from the "How to reproduce" section of this issue.
FYI, I updated my posted workaround by removing the second import, because it causes premature importing of many more Airflow modules like `airflow.configuration`.
Apache Airflow version
2.7.0
What happened
Calling `airflow tasks test <dag_id> <task_id>` runs the task correctly, but yields the following error after it's done (regardless of failure/success of the task):

What you think should happen instead

This error should never happen. It wasn't happening with version 2.5.3, which I was running before.

How to reproduce
Create this DAG file:
Then run `airflow tasks test my_dag my_task`
Operating System
Debian GNU/Linux 10 (buster)
Versions of Apache Airflow Providers
Deployment
Docker-Compose
Deployment details
No response
Anything else
This error always happens if you run `airflow tasks test my_dag my_task`.
If you pass an execution date, it will yield the error the first time, but then it won't error out again:
I've seen this error both with Postgres and SQLite as the backend.
Are you willing to submit PR?
Code of Conduct