Add 'breeze k8s dev' command for hot-reloading DAGs and core sources (#40005) #59747
Conversation
WOWOWOWOWOW !

One comment here (might be a follow-up): if we run airflow commands in the chart with --dev flags, we could also simply synchronize Airflow's source code, not only DAGs. We already have built-in hot-reload in Airflow components when they are run in …
jason810496 left a comment
Yes, exactly. That's why I chose Skaffold: it handles syncing the source code directly to the running containers without rebuilding images. Combined with the --dev flag, this should give us the fast iteration experience we need :)
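For reference, the kind of `skaffold dev` invocation such a setup typically relies on looks roughly like this (a sketch only; the skaffold.yaml location and extra flags wired into `breeze k8s dev` are assumptions, not taken from this PR):

```bash
# A sketch of the kind of invocation `breeze k8s dev` wraps (assumptions: the
# skaffold.yaml path and any extra flags used by the actual command may differ).
# `skaffold dev` watches local sources and copies changed files straight into
# the running containers, so no image rebuild or pod restart is needed.
skaffold dev \
  --filename dev/skaffold.yaml \
  --namespace airflow
```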
Yes, this is a super nice feature for improving k8s test iteration! Much appreciated!
The current PR should also close #44508.
Backport failed to create: v3-1-test. View the failure log: Run details

You can attempt to backport this manually by running: `cherry_picker 9f874c3 v3-1-test`. This should apply the commit to the v3-1-test branch and leave the commit in a conflict state, marking the files that need manual resolution. After you have resolved the conflicts, you can continue the backport process by running: `cherry_picker --continue`
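The manual flow the bot describes looks roughly like this (a sketch; assumes `cherry_picker` is installed and run from an up-to-date airflow checkout):

```bash
# Manual backport flow described by the bot above.
cherry_picker 9f874c3 v3-1-test   # cherry-pick onto v3-1-test; conflicts are left for manual resolution
# ...resolve the conflicts, git add the resolved files, then:
cherry_picker --continue          # finish the backport
# or give up and reset the working tree:
cherry_picker --abort
```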
No backporting needed
…pache#40005) (apache#59747)

* Add 'breeze k8s dev' command for hot-reloading DAGs and core sources (apache#40005)
* fix CI test error
* Add Kubernetes development output files and update documentation
  - Updated output_k8s.txt with new hash value.
  - Added output_k8s_dev.svg for visual representation of the 'k8s dev' command usage.
  - Created output_k8s_dev.txt with corresponding hash for the new SVG output.
* Update documentation for Breeze test commands
* update output files and add log level parameter to Kubernetes commands
#protm

#protm for me as well.
Closes: #40005
Why
Based on my experience resolving k8s-related bugs, it's quite inconvenient to have to build and upload a k8s image when I only make a small change.
How
After some research, I decided to integrate Skaffold into `breeze k8s` to improve the development experience. I added a new command, `breeze k8s dev`, to sync Airflow source code to K8s pods in seconds.
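A rough picture of the intended iteration loop (only `breeze k8s dev` is new; the other subcommands are the existing Breeze k8s setup steps and their exact names/flags may vary between Breeze versions):

```bash
# Rough iteration loop with the new command. Everything except `breeze k8s dev`
# is the pre-existing Breeze k8s setup, listed from memory rather than this PR.
breeze k8s create-cluster      # start a local kind cluster
breeze k8s build-k8s-image     # build the Airflow image once
breeze k8s deploy-airflow      # install the chart into the cluster
breeze k8s dev                 # then iterate: local changes are synced to the pods
```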
What
1. `/dags` folder synchronization

[demo video: airflow_40005_demo_output.mp4]
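A quick way to confirm a changed DAG file actually landed in the pod (a sketch; the in-container DAGs path below is an assumption, not taken from the PR):

```bash
# Check that a new/changed DAG file reached the scheduler pod.
# /opt/airflow/dags is the usual chart default and may differ in your deployment.
kubectl exec -n airflow -c scheduler -it airflow-scheduler-0 -- \
  ls -l /opt/airflow/dags
```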
2. Airflow Core synchronization
2.1. Scheduler
Exec into the pod with `kubectl exec -n airflow -c scheduler -it airflow-scheduler-0 -- /bin/bash`, and modify the log message to "Starting the scheduler [hot-reload]" in:

airflow-core/src/airflow/jobs/scheduler_job_runner.py, line 1184 (at ee5e21b)
Use the following command to verify the synchronization:

`grep -n "Starting the scheduler" airflow-core/src/airflow/jobs/scheduler_job_runner.py`
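Alternatively, the modified message can be spotted from outside the pod by tailing the scheduler logs, mirroring the checks used for the other components below (a sketch; assumes the scheduler re-emits its startup line after the reload):

```bash
# Watch the scheduler logs for the "[hot-reload]" marker added above.
kubectl logs -n airflow -l component=scheduler -c scheduler -f | grep "hot-reload"
```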
2.2. Triggerer

I used `kubectl logs -n airflow -l component=triggerer -c triggerer -f` to observe the logs. After I modified the file at airflow-core/src/airflow/jobs/triggerer_job_runner.py, the logs showed:

2.3. Dag-Processor
I used `kubectl logs -n airflow -l component=dag-processor -c dag-processor -f` to observe the logs. After I modified the file at airflow-core/src/airflow/cli/commands/dag_processor_command.py, the logs showed: