ValidationError: dag_versions should be a valid dictionary or object #52660

@dabeeru

Description

Apache Airflow version

3.0.2

If "Other Airflow 2 version" selected, which one?

No response

What happened?

After upgrading Airflow from 2.10.4 to 3.0.2, we encountered 500 Internal Server Error responses in the UI when attempting to access the following views:

  • /dag_runs/
  • /dags/<dag_id>/runs

These errors are triggered by the API call:

GET /api/v2/dags/~/dagRuns?limit=50&offset=200&order_by=-run_after

The following exception is thrown in the api-server logs:

api-server     |   File "/home/airflow/.local/lib/python3.12/site-packages/airflow/api_fastapi/core_api/routes/public/dag_run.py", line 363, in get_dag_runs
api-server     |     return DAGRunCollectionResponse(
api-server     |            ^^^^^^^^^^^^^^^^^^^^^^^^^
api-server     |   File "/home/airflow/.local/lib/python3.12/site-packages/pydantic/main.py", line 253, in __init__
api-server     |     validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
api-server     |                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
api-server     | pydantic_core._pydantic_core.ValidationError: 1 validation error for DAGRunCollectionResponse
api-server     | dag_runs.13.dag_versions.0
api-server     |   Input should be a valid dictionary or object to extract fields from [type=model_attributes_type, input_value=None, input_type=NoneType]
api-server     |     For further information visit https://errors.pydantic.dev/2.11/v/model_attributes_type
api-server     +------------------------------------
api-server 
api-server During handling of the above exception, another exception occurred:

This indicates that dag_versions contains a None value, which is invalid according to the DAGRunCollectionResponse model.
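
For illustration, here is a minimal sketch (using simplified stand-in models, not the actual Airflow response models) that reproduces the same Pydantic failure mode: when a list-of-models field is validated from ORM attributes and one element is None, Pydantic raises exactly this model_attributes_type error.

from types import SimpleNamespace
from pydantic import BaseModel, ConfigDict, ValidationError

class DagVersionStub(BaseModel):
    # Hypothetical stand-in for the DAG version sub-model.
    model_config = ConfigDict(from_attributes=True)
    id: str
    version_number: int

class DagRunStub(BaseModel):
    # Hypothetical stand-in for DAGRunResponse; only the relevant field is shown.
    model_config = ConfigDict(from_attributes=True)
    dag_versions: list[DagVersionStub]

# Simulate an ORM row whose joined dag_versions collection contains a None entry,
# which appears to be what pre-3.0 dag runs with created_dag_version_id = NULL yield.
orm_like_row = SimpleNamespace(dag_versions=[None])

try:
    DagRunStub.model_validate(orm_like_row)
except ValidationError as exc:
    print(exc)
    # dag_versions.0
    #   Input should be a valid dictionary or object to extract fields from
    #   [type=model_attributes_type, input_value=None, input_type=NoneType]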

A very similar error is observed in the /dags/ view:

File ".../routes/ui/dags.py", line 148, in recent_dag_runs
    dag_run_response = DAGRunResponse.model_validate(dag_run)
...
pydantic_core._pydantic_core.ValidationError: 1 validation error for DAGRunResponse
dag_versions.0
  Input should be a valid dictionary or object...

After investigation, it appears that these errors are related to DAG runs created before the upgrade to Airflow 3.0, when the DAG versioning feature did not exist. These older dag_run records have NULL in the created_dag_version_id column in the database.

Example of DB item:

"id","dag_id","queued_at","logical_date","start_date","end_date","state","run_id","creating_job_id","run_type","data_interval_start","data_interval_end","last_scheduling_decision","log_template_id","updated_at","clear_number","triggered_by","backfill_id","created_dag_version_id","bundle_version","conf","run_after","scheduled_by_job_id","context_carrier","span_status"
12389,"manual_report_generation_playground","2025-06-12 09:24:19.315759+00","2025-06-12 09:24:07.328915+00","2025-06-12 09:24:20.033242+00","2025-06-12 10:02:41.150214+00","failed","manual__2025-06-12T09:24:07.328915+00:00",NULL,"manual","2025-06-12 09:24:07.328915+00","2025-06-12 09:24:07.328915+00","2025-06-12 10:02:40.136303+00",1,"2025-06-12 10:02:41.179041+00",0,NULL,NULL,NULL,NULL,"{}","2025-06-12 09:24:07.328915+00",NULL,NULL,"not_started"

Interestingly, this issue does not affect all DAGs with pre-3.0 DAG runs. Some pre-upgrade DAGs return a valid response with "dag_versions": [], as expected. For example:

{
  "dag_run_id": "manual__2025-06-12T08:23:23.654780+00:00",
  "dag_id": "billing_run_delete_rerun_client_playground",
  ...
  "dag_versions": []
}

I was unable to identify consistent differences in the corresponding dag_run table entries that would explain the divergent behavior.

What you think should happen instead?

For all pre-upgrade DAG runs, an empty array should be returned to the client in the dag_versions field.
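
One way the API could honor that (a hedged sketch with illustrative model names, not the actual Airflow implementation) is a "before" validator that drops None entries from dag_versions:

from pydantic import BaseModel, ConfigDict, field_validator

class DagVersionStub(BaseModel):
    # Illustrative sub-model, not the real DAG version response model.
    model_config = ConfigDict(from_attributes=True)
    id: str
    version_number: int

class DagRunResponseSketch(BaseModel):
    # Illustrative response model; only the relevant field is shown.
    model_config = ConfigDict(from_attributes=True)
    dag_versions: list[DagVersionStub]

    @field_validator("dag_versions", mode="before")
    @classmethod
    def _drop_null_versions(cls, value):
        # Pre-3.0 dag runs can yield [None] here; treat that as "no versions".
        if value is None:
            return []
        return [v for v in value if v is not None]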

How to reproduce

Migrate from Airflow 2.10.4 to Airflow 3.0.2 by following this guide, then browse the UI or query the endpoints mentioned above to retrieve information about DAG runs performed before the migration.
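
For example, a minimal request sketch (base URL and auth header are placeholders and depend on your deployment):

import requests

BASE_URL = "http://localhost:8080"                      # placeholder
HEADERS = {"Authorization": "Bearer <access-token>"}    # placeholder

resp = requests.get(
    f"{BASE_URL}/api/v2/dags/~/dagRuns",
    params={"limit": 50, "offset": 200, "order_by": "-run_after"},
    headers=HEADERS,
)
print(resp.status_code)  # 500 once the page includes an affected pre-upgrade dag run
print(resp.text[:500])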

Operating System

Amazon Linux 2023 (AL2023)

Versions of Apache Airflow Providers

apache-airflow-providers-amazon==9.8.0
apache-airflow-providers-celery==3.11.0
apache-airflow-providers-cncf-kubernetes==10.5.0
apache-airflow-providers-common-compat==1.7.0
apache-airflow-providers-common-io==1.6.0
apache-airflow-providers-common-messaging==1.0.2
apache-airflow-providers-common-sql==1.27.1
apache-airflow-providers-docker==4.4.0
apache-airflow-providers-elasticsearch==6.3.0
apache-airflow-providers-fab==2.2.0
apache-airflow-providers-ftp==3.13.0
apache-airflow-providers-git==0.0.2
apache-airflow-providers-google==15.1.0
apache-airflow-providers-grpc==3.8.0
apache-airflow-providers-hashicorp==4.2.0
apache-airflow-providers-http==5.3.0
apache-airflow-providers-microsoft-azure==12.4.0
apache-airflow-providers-mysql==6.3.0
apache-airflow-providers-odbc==4.10.0
apache-airflow-providers-openlineage==2.3.0
apache-airflow-providers-postgres==6.2.0
apache-airflow-providers-redis==4.1.0
apache-airflow-providers-sendgrid==4.1.0
apache-airflow-providers-sftp==5.3.0
apache-airflow-providers-slack==9.1.0
apache-airflow-providers-smtp==2.1.0
apache-airflow-providers-snowflake==6.3.1
apache-airflow-providers-ssh==4.1.0
apache-airflow-providers-standard==1.2.0

Deployment

Official Apache Airflow Helm Chart

Deployment details

  • Used the official Helm chart, version 1.16.0.
  • Using KubernetesExecutor.
  • api-server configured for high availability.

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct

Labels

area:API, area:core, area:upgrade, kind:bug, priority:high, priority:upgrade_to_airflow3
