Task execution when one of the tasks fails.
If I have a situation like this:
from prefect import task, Flow
from prefect.executors import LocalDaskExecutor

@task(timeout=60)
def task_1():
    x = func1(...)
    return x == 0

@task(timeout=60)
def task_2(r):
    y = func2(...)
    return y == 0

@task(timeout=60)
def task_3():
    z = func3(...)
    return z == 0

def main():
    with Flow("my-flow", executor=LocalDaskExecutor()) as flow:
        r = task_1()
        r = task_2(r)
        r2 = task_3()
If task_1 times out and fails, task_2 will not run. What can I change so that I can still use the LocalDaskExecutor but allow task_2 to run even if task_1 failed?
nicholas
08/25/2021, 10:54 PM
Hi @YD - you can use an always_run trigger on task_2 like this:

from prefect.triggers import always_run
...
@task(timeout=60, trigger=always_run)
def task_2(r):
    return
In your code, though, doesn't task_2 depend on the result of task_1? How can task_2 run without that result? If you force task_2 to run anyway, it may be operating on an erroneous value in r.
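For reference, a minimal end-to-end sketch of this pattern (assuming Prefect 1.x; the raised ValueError is a stand-in for any failure — a task that hits its timeout ends up in a Failed state the same way):

from prefect import task, Flow
from prefect.executors import LocalDaskExecutor
from prefect.triggers import always_run

@task(timeout=60)
def task_1():
    # stand-in for work that fails; a timeout would also
    # leave task_1 in a Failed state
    raise ValueError("task_1 failed")

@task(timeout=60, trigger=always_run)
def task_2(r):
    # always_run (equivalent to all_finished) means this runs once
    # task_1 has finished in any state, but r will not be a normal
    # return value when task_1 failed
    print(f"task_2 received: {r!r}")
    return True

with Flow("trigger-demo", executor=LocalDaskExecutor()) as flow:
    r = task_1()
    task_2(r)

flow.run()  # task_1 fails; task_2 still runs because of its trigger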
YD
08/26/2021, 4:32 PM
Sometimes there are no real dependencies. I just don't want to run all the jobs at the same time, to be mindful of server resources; so if I have 40 tasks in a flow, I might have 4 groups of 10 tasks running in parallel.
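If the goal is only ordering for resource reasons rather than passing data, one possible approach (a sketch, assuming Prefect 1.x; job and the group sizes here are hypothetical) is to chain the groups with state-only upstream_tasks dependencies and cap parallelism with LocalDaskExecutor's num_workers:

from prefect import task, Flow
from prefect.executors import LocalDaskExecutor
from prefect.triggers import always_run

@task(trigger=always_run)
def job(name):
    # hypothetical stand-in for one of the real tasks
    print(f"running {name}")

with Flow("grouped-flow") as flow:
    # first group: 10 tasks with no dependencies, run in parallel
    group_a = [job(f"a{i}") for i in range(10)]
    # second group: upstream_tasks adds ordering without passing data,
    # and always_run lets this group start even if something in
    # group_a failed
    group_b = [job(f"b{i}", upstream_tasks=group_a) for i in range(10)]

# num_workers caps how many tasks execute at once
flow.run(executor=LocalDaskExecutor(num_workers=10))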