# prefect-community
s
Hi, will all mapped tasks run in parallel? Can we put some delay between the mapped tasks?
k
They run in parallel if using the LocalDaskExecutor. If you need them to run sequentially, use the LocalExecutor. Yes, you can add a delay in a state handler.
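A minimal sketch of that idea, assuming Prefect 1.x; the `delay_handler`, the `process` task, and the 5-second sleep are all illustrative:

```python
import time

from prefect import Flow, task
from prefect.executors import LocalDaskExecutor, LocalExecutor

def delay_handler(task, old_state, new_state):
    # Sleep just before each task run (including each mapped child) enters Running.
    if new_state.is_running():
        time.sleep(5)
    return new_state

@task(state_handlers=[delay_handler])
def process(x):
    return x * 2

with Flow("delayed-mapping") as flow:
    process.map([1, 2, 3])

# LocalDaskExecutor runs the mapped children in parallel;
# swap in LocalExecutor() to run them sequentially.
flow.run(executor=LocalDaskExecutor())
```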
s
In the UI, a mapped task is shown as a single task. How can I restart or check the status of a child task?
k
Click into the mapped task, then into the mapped runs; from there you can check the status of or restart each child.
s
```python
collate_monthly_lotame = LakeviewRunJobTask(
    cid=CID,
    job="europe-collate-monthly-lotame",
    cluster=CLUSTER,
    poll_interval=300,
    name="collate-monthly-lotame",
    timeout=10800,
    result=S3_RESULT,
)

with Flow(FLOW_NAME) as flow:
    arg_map = generate_arg_map(event_month)
    t4 = collate_monthly_lotame.map(args=arg_map)
```
We have a custom task library and I want to run it as a mapped task, but I am missing something.
k
No, that's all you should need, as long as `arg_map` returns a list. I think you can add logging to take a look at how it ran.
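For example, a sketch of what that could look like, with a hypothetical `generate_arg_map` task and a `run_job` task standing in for the custom library task:

```python
from prefect import Flow, task

@task
def generate_arg_map(event_month):
    # The mapped call needs a list: one element per mapped child.
    countries = ["de", "fr", "uk"]
    return [{"country": c, "month": event_month} for c in countries]

@task
def run_job(args):
    # Each mapped child receives one dict from the list.
    print(f"running {args['country']} for {args['month']}")

with Flow("mapped-custom-task") as flow:
    arg_map = generate_arg_map("2021-09")
    run_job.map(args=arg_map)
```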
s
Ok I will add it
Mapped tasks are showing as collate_monthly_lotame.0 and so on. I want to differentiate them based on a string inside the input list; is that possible?
k
Yes, you can template task run names with inputs. See this.
s
My input is a dict; can I use `"{val["country"]}"`?
k
Inside the task you can, not inside the flow.
Ah sorry, for templating names? I don't think so, but I'm not sure.
s
task_run_name="{args[country]}"
This worked
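A minimal sketch of that pattern, assuming Prefect 1.x and a dict input with a `country` key; the templated name shows up when the flow runs against a backend, and the tasks here are illustrative:

```python
from prefect import Flow, task

@task
def generate_arg_map():
    return [{"country": "de"}, {"country": "fr"}]

# task_run_name is templated with the task's inputs,
# so each mapped child is named after its country value.
@task(task_run_name="{args[country]}")
def collate(args):
    return args["country"]

with Flow("templated-run-names") as flow:
    collate.map(args=generate_arg_map())
```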
It would be nice if the schematic were shown for mapped child tasks; is that available?
k
Oh that’s nice. It is denoted as mapped in the UI schematic and you can click into it and see the mapped runs
s
Got it
```python
from prefect import Flow, task

@task
def show(x):
    print(x)
    return x

with Flow(FLOW_NAME) as flow:
    r = show.map(["A", "B", "C"])
    s = show.map(["A", "B", "C"])
    s.set_upstream(r)
```
In this case, will the `s` child for "A" not trigger if the `r` child for "A" fails?
@Kevin Kho I'm asking because there is no direct data dependency, but the Prefect docs say that if two mapped tasks are linked, the nth task of the upstream is linked to the nth task of the downstream.
I have tested it and all children of `s` fail if one child of `r` fails. I'm not sure how I can get n-to-n mapping without a direct dependency.
k
Yes, that is right. Mapped tasks are connected, so you don't want to set an upstream if there is no dependency, or you need an intermediate task. I think you can force `s` to run with triggers, but you will run into other issues doing this, because the assumption is that they have the same number of elements.
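For reference, a sketch of the n-to-n pairing the docs describe; it falls out naturally when the downstream task maps directly over the upstream's result, with no `set_upstream` needed (the flow name here is made up):

```python
from prefect import Flow, task

@task
def show(x):
    print(x)
    return x

with Flow("pairwise-mapping") as flow:
    r = show.map(["A", "B", "C"])
    # Mapping over r's result links the children element-wise:
    # s[0] depends only on r[0], s[1] on r[1], and so on,
    # so a failure in one child of r only blocks the matching child of s.
    s = show.map(r)
```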