Currently, scheduling DAGs can be accomplished by using the `schedule_interval` kwarg on DAG initialization (link). The documentation for DAG Run describes the custom parametrization as follows (link): "When triggering a DAG from the CLI, the REST API or the UI, it is possible to pass configuration for a DAG Run as a JSON blob." Relatedly, per "Configuring Flask Application for Airflow Webserver": when you initialize the Airflow webserver, predefined configuration is used, based on the webserver section of the airflow.cfg file.

So presumably (and in practice), any scheduled DAG Run will only use the default parameters. An internet search also did not indicate a way of scheduling DAGs with custom parameters.

When managing many similar jobs that run on different schedules, our approach so far has been to create one DAG per job type. In terms of code, we can efficiently create these DAGs using DAG and task factory functions, but DAG management on the Airflow frontend becomes cluttered (although this is mitigated by tags and search features).

Apache Airflow's documentation puts a heavy emphasis on the use of its UI client for configuring DAGs. While the UI is nice to look at, it's a pretty clunky way to manage your pipeline configuration, particularly at deployment time. We have adjusted by using functional DAG creation as described, but this would be a nice feature as well. Scheduling with custom parameters is a feature in, for example, Jenkins. A DAG scheduled with custom parameters would provide an alternate means of managing these types of workflows.

Are you willing to submit a PR?
No response

Code of Conduct
I agree to follow this project's Code of Conduct.

Reply:

The "triggering" time is a bad idea for specifying custom schedules, because the DAGs are already created and scheduled. The DAG Run that you mention is only a single DAG execution, and since you are triggering it, by definition it has no schedule at the point you trigger it: it is just a single "execution" of a DAG (a DAG Run, not a DAG). There is no way we can change the "schedule" by submitting a DAG Run at this point; it's just logically impossible :). It is also simply not the right "place": the UI / API are not supposed to change anything in the DAG definition (at least not yet). For now, the only way you can modify a DAG is via the parsed DAG file coming from the DAG_FOLDER directory (still the case when using DAG factories), but I am not sure if that addresses your problem.

However, as of Airflow 2.2 you have custom timetables that might help you a bit to manage complex schedules. Also, there are some AIPs - Airflow Improvement Proposals (which we are not actively working on right now) - that might better fit your requirements.
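The per-run JSON conf mechanism discussed above can be exercised from the CLI or the stable REST API. A hedged sketch (the DAG id, conf keys, endpoint host, and credentials are all illustrative); note that, as the reply says, this only parametrizes one run and cannot change the DAG's schedule:

```shell
# Attach a per-run JSON conf when manually triggering a run (Airflow 2.x CLI).
# "my_dag" and the conf keys are invented for illustration.
airflow dags trigger my_dag --conf '{"region": "eu"}'

# The same via the stable REST API (assumes a local webserver with basic auth):
curl -X POST "http://localhost:8080/api/v1/dags/my_dag/dagRuns" \
  -H "Content-Type: application/json" \
  --user "admin:admin" \
  -d '{"conf": {"region": "eu"}}'
```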
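The "one DAG per job type via factory functions" approach mentioned above can be sketched in plain Python. This is a hedged stand-in: in a real DAG file the factory would build and return `airflow.DAG` objects placed in module globals so the parser picks them up from the DAG_FOLDER, and every job name, schedule, and parameter here is invented:

```python
# Stand-in sketch of the DAG-factory pattern: one definition, many
# schedules/parameters. Dicts substitute for real airflow.DAG objects.
JOBS = {
    # job_id -> (cron schedule, job-specific parameters); all illustrative
    "ingest_eu": ("0 2 * * *", {"region": "eu"}),
    "ingest_us": ("0 6 * * *", {"region": "us"}),
}

def make_dag(dag_id, schedule, params):
    """Factory stand-in: a real version would construct a DAG and its tasks."""
    return {"dag_id": dag_id, "schedule_interval": schedule, "params": params}

# Module-level loop: this is what makes the frontend show one DAG per job.
dags = {
    job_id: make_dag(job_id, schedule, params)
    for job_id, (schedule, params) in JOBS.items()
}
```

Each generated DAG carries its own schedule and default parameters, which is exactly why the DAG list grows with the number of job variants.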
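The custom timetables available since Airflow 2.2 are the supported way to express schedules in code when the cron presets fall short. A minimal, untested sketch assuming the documented `Timetable` plugin interface (the class and plugin names are invented; a real timetable should also honor catchup and other restrictions per the official docs):

```python
# Hedged sketch of an Airflow >= 2.2 custom timetable plugin (not tested here).
from pendulum import DateTime

from airflow.plugins_manager import AirflowPlugin
from airflow.timetables.base import DagRunInfo, DataInterval, TimeRestriction, Timetable


class DailyIntervalTimetable(Timetable):
    """Illustrative: one run per day, each covering the preceding 24 hours."""

    def infer_manual_data_interval(self, *, run_after: DateTime) -> DataInterval:
        # Manual runs get the 24h window ending at trigger time.
        return DataInterval(start=run_after.subtract(days=1), end=run_after)

    def next_dagrun_info(self, *, last_automated_data_interval, restriction: TimeRestriction):
        if last_automated_data_interval is not None:
            next_start = last_automated_data_interval.end
        elif restriction.earliest is None:
            return None  # no start_date anywhere: never schedule
        else:
            next_start = restriction.earliest
        if restriction.latest is not None and next_start > restriction.latest:
            return None  # past end_date: stop scheduling
        return DagRunInfo.interval(start=next_start, end=next_start.add(days=1))


class DailyIntervalTimetablePlugin(AirflowPlugin):
    name = "daily_interval_timetable_plugin"
    timetables = [DailyIntervalTimetable]
```

A DAG would then pass `timetable=DailyIntervalTimetable()` instead of `schedule_interval`, which keeps complex scheduling logic in code while still living in the DAG file, consistent with the reply's point that the DAG definition is the only place schedules can change.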