Jai P
08/26/2022, 2:32 PM
Is there something like DaskTaskRunner, but for subflows? E.g. if i have a flow that kicks off a bunch of subflows, i want those to be distributed out and not running on the same instance as the parent flow (or... is that already possible and i'm missing something)? Thanks in advance!

Ryan Peden
08/26/2022, 6:19 PM
from prefect import flow
from prefect_dask import DaskTaskRunner  # DaskTaskRunner lives in the prefect-dask collection

@flow(task_runner=DaskTaskRunner(address="http://my-dask-cluster-one"))
def subflow_one():
    print("hello from subflow one!")

@flow(task_runner=DaskTaskRunner(address="http://my-dask-cluster-two"))
def subflow_two():
    print("hello from subflow two!")

@flow
def main_flow():
    subflow_one()
    subflow_two()
There's a bit more info in the composing flows section of the docs - if you scroll down a bit to the 'Subflows or tasks?' callout you'll see a point about using different task runners for each subflow.
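One thing worth noting about the example above: a flow's task runner only distributes the tasks submitted inside that flow; the subflow's function body itself still runs in the process that called it. A minimal sketch of what actually lands on the Dask cluster in that pattern (the scheduler address and the say_hello task are illustrative):

import asyncio
from prefect import flow, task
from prefect_dask import DaskTaskRunner

@task
def say_hello(name):
    print(f"hello from {name}!")

@flow(task_runner=DaskTaskRunner(address="tcp://my-dask-scheduler:8786"))
def subflow_one():
    # .submit() hands the task to the flow's task runner, so this task runs on
    # the Dask cluster; subflow_one itself still runs in the calling process
    say_hello.submit("subflow one")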
Jai P
08/26/2022, 6:38 PM

Jai P
08/26/2022, 6:40 PM
import asyncio
from prefect import flow

@flow
async def my_subflow(arg):
    ...

@flow
async def my_parent_flow(args):
    subflows = [my_subflow(arg) for arg in args]
    await asyncio.gather(*subflows)

are triggered as coroutines on the same instance, but what i actually want is something like:
@flow
async def my_subflow(arg):
    ...

@flow
async def my_parent_flow(args):
    subflows = [my_subflow.submit(arg) for arg in args]
    await asyncio.gather(*subflows)
where the .submit call on the subflow actually triggers an API call to Orion, a task is put on a work queue, and another instance picks it up
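One way to approximate that behaviour with the pieces Prefect 2 already had is to register a deployment for the subflow and have the parent flow create runs of it through the Orion API, so an agent watching the deployment's work queue picks each run up on other infrastructure. A rough sketch, assuming a deployment named my-subflow/remote already exists and the subflow takes a single arg parameter:

import asyncio
from prefect import flow
from prefect.client import get_client

@flow
async def my_parent_flow(args):
    async with get_client() as client:
        # look up the pre-registered deployment of the subflow
        deployment = await client.read_deployment_by_name("my-subflow/remote")
        # each call creates a flow run via the Orion API; the run lands on the
        # deployment's work queue and whichever agent polls that queue picks it up
        runs = await asyncio.gather(
            *[
                client.create_flow_run_from_deployment(
                    deployment.id, parameters={"arg": arg}
                )
                for arg in args
            ]
        )
    return runs

Note that this only schedules the runs; the parent gets back FlowRun objects and would have to poll the API itself if it needs to wait on their results.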
Jai P
08/26/2022, 6:41 PM
my_parent_flow is triggered on instance_1, but my_subflow(arg_1) runs on instance_2, my_subflow(arg_2) runs on instance_4, etc.

Jai P
08/26/2022, 6:42 PM

Ryan Peden
08/26/2022, 6:55 PM