Sharath Chandra
04/20/2022, 5:46 AM
I am using the `LocalExecutor`, with which these mapped jobs are running sequentially. Are there instances where the `LocalExecutor` is not able to track the Spark job running on k8s, and is thus not able to trigger the subsequent tasks in the map?

```python
with Flow("Master", schedule=schedule, run_config=master_run_config) as flow:
    jobs = ["x1", "x2", "xn"]
    task_compute_bots_markers = task_spark_submit_compute.map(
        run_id=jobs
    )
```

Each task is a `spark-submit` executed as a `ShellTask`.
Anna Geller
You can add a `LocalDaskExecutor` to your flow to run the mapped tasks in parallel.

Sharath Chandra
However, in some instances I can see that only a few of the jobs in the map are executing.

Anna Geller
Can you share a full code example and some screenshots or logs of what you see? Where are your Prefect agent and Spark cluster running?
Sharath Chandra
04/20/2022, 2:34 PM
I switched to the `LocalDaskExecutor`. I can see that the jobs are running in parallel, but even here the long-running jobs fail to get any status and are marked as completed (`Succeeded` phase).

Anna Geller
Sharath Chandra
04/21/2022, 3:20 AM
Can I use `LOOP` and then, possibly based on some condition, mark the specific `Task` as `SUCCESS` or `FAIL`? Will the executor then proceed to the next items in the map?
Anna Geller
◦ you can disable the flow heartbeat using the `disable_flow_heartbeat` mutation or from the UI,
◦ you can orchestrate those child flows from a parent flow (i.e. build a flow-of-flows).