Hey, have a question. I have a flow that I'm currently running with a `DaskTaskRunner` and multiple tasks. Prior to setting a `max_runtime`, the flow would crash after 90 minutes on whatever task it was currently on, but instead of the flow entering the Failed state, it shows as Completed. Is this by design or a bug? If it is by design, is there a way to make the flow enter the Failed state instead, to keep incomplete ETLs from being missed?