Jai P

03/01/2023, 9:18 PM
Hi folks! Question…is there a way to reuse the same task runner when creating multiple instances of the same subflow? I’m also curious whether this is a possible performance improvement…I’m guessing there is at least some cost to creating a new task runner, so if I could use one that was already created, that would be great
Kind of related to the entire set of issues linked to this one

Taylor Curran

03/01/2023, 10:50 PM
Hi Jai, you can get subflows to behave like tasks by wrapping the `run_deployment` command in a task, like so:
```python
from prefect import task
from prefect.deployments import run_deployment

@task
def subflow_wrapping_task():
    return run_deployment(...)  # returns a FlowRun object
```

Jai P

03/02/2023, 3:11 AM
Hey @Taylor Curran, thanks for the info. The thing is, I want to run my subflows on the same infrastructure. I'm already using separate infrastructure to run many instances of the parent flow in parallel, so the subflows don't need their own infra to run on. What I've been doing so far is:
```python
import asyncio

from prefect import flow

@flow
async def my_subflow():
    ...

@flow
async def my_main_flow():
    subflows = [
        my_subflow()
        for _ in range(100)
    ]
    return await asyncio.gather(*subflows)

if __name__ == "__main__":
    asyncio.run(my_main_flow())
```
but I run into
```
RuntimeError: The task runner is already started!
```
which I can get around by `deepcopy`ing the subflow:
```python
subflows = [
    deepcopy(my_subflow)()
    for _ in range(100)
]
```
but that comes at the cost of (I believe) many, many task runners being started up, which I would guess is unnecessary cost when I could just reuse a single task runner
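To illustrate the failure mode in plain Python (no Prefect involved — `Runner` below is a hypothetical stand-in for a task runner that can only be started once, not Prefect's actual API): sharing one instance across concurrent subflows trips the "already started" error, while `deepcopy`ing before each run gives every subflow its own unstarted copy.

```python
import asyncio
from copy import deepcopy

class Runner:
    """Toy stand-in for a task runner: it can only be started once."""
    def __init__(self):
        self.started = False

    def start(self):
        if self.started:
            raise RuntimeError("The task runner is already started!")
        self.started = True

async def run_all(runners):
    async def subflow(runner):
        runner.start()  # each subflow run starts its runner
    await asyncio.gather(*(subflow(r) for r in runners))

fresh = Runner()

# deepcopy before any run: every subflow gets its own unstarted runner
asyncio.run(run_all([deepcopy(fresh) for _ in range(3)]))
print("independent copies: ok")

# sharing one instance: the second subflow hits the same error
try:
    asyncio.run(run_all([fresh] * 3))
except RuntimeError as e:
    print(e)  # -> The task runner is already started!
```

This also hints at why the deepcopy workaround is costly: it trades the error for one fresh runner per subflow rather than reusing a single one.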

Vishnu Duggirala

03/21/2023, 4:26 PM
@Jai P I tried using `deepcopy` but the issue still persists.