Dennis Hinnenkamp
02/16/2022, 10:34 AM
all_successful trigger, whose upstream tasks again consist of the last three import tasks. I hope it is somewhat understandable what exactly I want to achieve 😃
Anna Geller
02/16/2022, 10:42 AM
"Is there a way to merge two or more parallel tasks if successful?"
You can merge the branches using dependencies, e.g. the upstream_tasks keyword, and you can influence what should run based on success/failure using triggers.
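(Editor's note: a minimal sketch of that pattern, assuming Prefect 1.x and hypothetical import task names:)

from prefect import Flow, task
from prefect.triggers import all_successful

@task
def import_a():
    pass

@task
def import_b():
    pass

@task
def import_c():
    pass

# runs only once all three upstream branches have finished successfully
@task(trigger=all_successful)
def merge_step():
    pass

with Flow("merge-branches") as flow:
    a, b, c = import_a(), import_b(), import_c()
    # upstream_tasks sets a pure state dependency; no data is passed
    merge_step(upstream_tasks=[a, b, c])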
Dennis Hinnenkamp
02/16/2022, 10:51 AM
Anna Geller
02/16/2022, 11:07 AM
SomeTask(init_kwargs)(runtime_kwargs)
Also, to run those tasks in parallel, you need to use a (local) Dask executor.
Lastly, as requested, I added the task that runs once all Databricks tasks have finished successfully.
Also, if you are on Prefect Cloud, you can replace the Parameter task that contains the secret with a PrefectSecret task for better security.
Here is the Gist: https://gist.github.com/2033184d2d64b6ad61a4f86dd121d638
LMK if you have any questions about it.
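(Editor's note: a hedged sketch putting these points together — the SomeTask(init_kwargs)(runtime_kwargs) call convention, the local Dask executor, the reduce task, and PrefectSecret. It assumes Prefect 1.x's DatabricksSubmitRun; the job payloads, task names, and secret name are placeholders, not taken from the Gist:)

from prefect import Flow, task
from prefect.executors import LocalDaskExecutor
from prefect.tasks.databricks import DatabricksSubmitRun
from prefect.tasks.secrets import PrefectSecret
from prefect.triggers import all_successful

# SomeTask(init_kwargs)(runtime_kwargs): instantiate the task class once,
# then call the instance inside the Flow block with runtime kwargs
submit_run = DatabricksSubmitRun()

@task(trigger=all_successful, name="All Databricks Tasks Finished")
def reduce_step():
    pass

with Flow("databricks-imports", executor=LocalDaskExecutor()) as flow:
    # PrefectSecret instead of a plain Parameter keeps the token out of flow parameters
    conn = PrefectSecret("DATABRICKS_CONNECTION_STRING")
    # independent task calls can run in parallel under the Dask executor
    run_1 = submit_run(databricks_conn_secret=conn, json={"run_name": "import-1"})
    run_2 = submit_run(databricks_conn_secret=conn, json={"run_name": "import-2"})
    run_3 = submit_run(databricks_conn_secret=conn, json={"run_name": "import-3"})
    # fires only after all three Databricks runs succeed
    reduce_step(upstream_tasks=[run_1, run_2, run_3])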
Dennis Hinnenkamp
02/16/2022, 11:45 AM
Anna Geller
02/16/2022, 11:51 AM
Dennis Hinnenkamp
02/16/2022, 11:56 AM
"or do you want just a nice diamond-shaped flow chart due to visual preferences?"
That is exactly what I want to achieve. I also want to make it clear in the visualisation that the result of the flow depends on the results of the parallel tasks. This makes it easier for our customers to understand the flow even in, say, a year's time.
Anna Geller
02/16/2022, 12:01 PM
@task(trigger=all_successful, name="All Databricks Tasks Finished")
def reduce_step():
    pass
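(Editor's note: inside the Flow block, this trigger-decorated task would then be attached to the parallel tasks via upstream_tasks; the run names are assumed:)

reduce_step(upstream_tasks=[run_1, run_2, run_3])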
Dennis Hinnenkamp
02/16/2022, 12:23 PM