Dennis Hinnenkamp02/16/2022, 10:34 AM
trigger, whose upstream tasks are again the last three import tasks. I hope it is somewhat clear what exactly I want to achieve 😃
Anna Geller02/16/2022, 10:42 AM
"Is there a way to merge two or more parallel tasks if successful?"
You can merge the branches using dependencies, e.g. the upstream_tasks keyword, and you can influence what should run based on success/failure using triggers.
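(The merge-on-success pattern Anna describes can be sketched outside Prefect with plain Python's concurrent.futures; the task names here are made up for illustration. Prefect expresses the same shape declaratively with upstream dependencies and triggers.)

```python
from concurrent.futures import ThreadPoolExecutor

def task_a():
    return "a-result"

def task_b():
    return "b-result"

def merge(results):
    # fan-in "merge" step: only reached when both branches returned
    return tuple(results)

# fan out the two branches in parallel, then fan in
with ThreadPoolExecutor() as pool:
    futures = [pool.submit(task_a), pool.submit(task_b)]
    # .result() re-raises any branch exception, so merge() runs
    # only if every upstream branch succeeded
    merged = merge(f.result() for f in futures)

print(merged)  # ('a-result', 'b-result')
```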
Anna Geller02/16/2022, 11:07 AM
Also, to run those tasks in parallel, you need to use the (local) dask executor. Lastly, as requested, I added the task that runs once all Databricks tasks have finished successfully. Also, if you are on Prefect Cloud, you can replace the Parameter task that contains the secret with the PrefectSecret task for better security. Here is the Gist: https://gist.github.com/2033184d2d64b6ad61a4f86dd121d638 LMK if you have any questions about it.
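(The overall flow shape being discussed here, parallel Databricks tasks, a gate that waits for all of them, then the downstream import tasks, can be sketched in plain Python; all task names below are hypothetical placeholders, and in Prefect the fan-out parallelism is what the LocalDaskExecutor provides.)

```python
from concurrent.futures import ThreadPoolExecutor

def databricks_task(name):
    return f"{name}: done"

def import_task(name):
    return f"imported {name}"

with ThreadPoolExecutor() as pool:
    # fan out: the Databricks tasks run in parallel
    jobs = [pool.submit(databricks_task, n) for n in ("job1", "job2", "job3")]
    results = [j.result() for j in jobs]  # raises if any job failed

# gate: reached only once every Databricks task finished successfully
print("All Databricks Tasks Finished")

# downstream of the gate: the import tasks run afterwards
imports = [import_task(n) for n in ("orders", "customers", "items")]
print(imports)
```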
Dennis Hinnenkamp02/16/2022, 11:45 AM
Anna Geller02/16/2022, 11:51 AM
Dennis Hinnenkamp02/16/2022, 11:56 AM
"or do you want just a nice diamond-shaped flow chart due to visual preferences?"
That is exactly what I want to achieve. I also want to make it clear in the visualisation that the result of the flow depends on the results of the parallel tasks. This makes it easier for our customers to understand the flow, even a year from now, for example.
Anna Geller02/16/2022, 12:01 PM
from prefect import task
from prefect.triggers import all_successful

@task(trigger=all_successful, name="All Databricks Tasks Finished")
def reduce_step():
    pass
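(What the all_successful trigger on that reduce_step task means can be modelled in a few lines of plain Python; this is an illustrative sketch of the semantics, not Prefect's implementation: the gated task only runs when every upstream state is successful.)

```python
# A tiny model of an "all successful" trigger (illustrative only).
def all_upstream_successful(upstream_states):
    return all(state == "Success" for state in upstream_states)

def maybe_run_reduce_step(upstream_states):
    if all_upstream_successful(upstream_states):
        return "All Databricks Tasks Finished"
    return "Trigger failed: not all upstream tasks succeeded"

print(maybe_run_reduce_step(["Success", "Success", "Success"]))
print(maybe_run_reduce_step(["Success", "Failed", "Success"]))
```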
Dennis Hinnenkamp02/16/2022, 12:23 PM