# prefect-community
p
Hi All. I have a dumb design question on how to use Prefect Cloud + ECS Fargate. I have a flow registered to an ECS cluster. Now I want to iterate over a list of values for a parameter (e.g. [1, 2, 3, 4, 5]) that is used inside a task of the registered flow. Example:
import pandas as pd
import prefect
from prefect import task

@task
def make_df(i):
    logger = prefect.context.get("logger")
    logger.info("Hi from Prefect %s", prefect.__version__)
    logger.info("this is the first step")
    data = {'Name': ['Tom', 'Brad', 'Kyle', 'Jerry'],
            'Age': [20, i**2, 2*i, 18*i],
            'Height': [6.1, 5.9, 6.0, 6.1]}
    df = pd.DataFrame(data)
    return df
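(For reference: stripped of the Prefect decorator and logging, each call just builds a small frame from the parameter. A quick sketch of the same logic, assuming only pandas is installed:)

```python
import pandas as pd

def make_df(i):
    # Same frame-building logic as the task above, minus the Prefect pieces.
    data = {'Name': ['Tom', 'Brad', 'Kyle', 'Jerry'],
            'Age': [20, i**2, 2*i, 18*i],
            'Height': [6.1, 5.9, 6.0, 6.1]}
    return pd.DataFrame(data)

df = make_df(2)
print(df['Age'].tolist())  # i=2 gives [20, 4, 4, 36]
```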
How should I design my flow to run 5 parallel tasks on the ECS cluster?
a
We don't have built-in support for that, unless you leverage mapping with Dask Cloud Provider, e.g. via a FargateCluster with Dask -- you could ask more about FargateCluster in the Dask community: https://dask.discourse.group/
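(In Prefect 1.x terms, that suggestion would look roughly like the sketch below: map the task over the parameter list and attach a DaskExecutor that spins up a FargateCluster. This is an untested configuration sketch -- it assumes prefect 1.x and dask-cloudprovider are installed, and the image name, worker count, and flow name are placeholders, not values from the thread.)

```python
from prefect import Flow
from prefect.executors import DaskExecutor

with Flow("make-dfs") as flow:
    # make_df.map fans out one task run per element of the list.
    dfs = make_df.map([1, 2, 3, 4, 5])

# DaskExecutor can create an ephemeral Dask cluster on Fargate;
# cluster_kwargs below are illustrative placeholders.
flow.executor = DaskExecutor(
    cluster_class="dask_cloudprovider.aws.FargateCluster",
    cluster_kwargs={"image": "prefecthq/prefect:latest", "n_workers": 5},
)
```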
p
@Anna Geller: Thanks for the response. Is a workaround to have, for example, 5 flows for 5 parameters and register them to ECS?
a
that's a viable option for sure, yes
if you schedule those at the same time, or trigger them via create_flow_run (the orchestrator pattern), potentially even with mapping, then you can indeed run them in parallel on ECS without the added complexity of a Dask cluster -- great idea!
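(The orchestrator pattern mentioned here could be sketched as a parent flow that maps `create_flow_run` over the parameter list, so each child run lands on ECS as its own flow run. Untested sketch for Prefect 1.x; the flow and project names are hypothetical placeholders.)

```python
from prefect import Flow, unmapped
from prefect.tasks.prefect import create_flow_run

with Flow("orchestrator") as flow:
    # One child flow run per parameter dict; flow_name/project_name
    # stay constant across the map, hence unmapped().
    run_ids = create_flow_run.map(
        parameters=[{"i": x} for x in [1, 2, 3, 4, 5]],
        flow_name=unmapped("make-df-flow"),      # placeholder name
        project_name=unmapped("my-project"),     # placeholder name
    )
```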
p
is there an example on how to use create_flow_run with mapping?
a
check the Prefect Discourse