Is it possible to stop a task map if any of the calls fail?
My issue: I'm running a map that's around 100k plus, and if one of them fails, I still need to wait until all 100k are done before the flow stops itself
04/26/2022, 1:45 PM
Good question! It's probably a matter of setting triggers and reference tasks. Let me test it with a simple example and get back to you. In the meantime, you can also share your flow code.
I looked more into this and it looks like you can't do that with mapping. The reason is that each child task run spun up via mapping (potentially in parallel) runs in its own process/thread, or even in its own independent pipeline. Child task runs aren't aware of one another, so you can't stop task run D when task run B fails, since they are independent of each other.
I think to stop a flow run if any of those dynamically generated tasks fails, you would need to use looping rather than mapping, but let me confirm with the team.
04/26/2022, 2:09 PM
I think this is a less complicated version of this.
TL;DR: I believe it's very hard to do, but you have some options:
1. Run the mapped elements in batches with create_flow_run. That way you can at least stop at the batch where one fails.
2. Use the GraphQL API to set the flow to Failed in a state handler. This is quite aggressive, but the flow will error out.
3. Give up parallelism and use task looping to make it sequential; that way you can just exit the loop.
4. Persist a flag in the KV store; each task checks it, and if the flag is set, it skips or raises FAIL.
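The batching idea (option 1) and the flag idea (option 4) can be sketched together in plain Python, without any Prefect APIs. Here `process_item`, the item values, and the in-memory `failed` flag are all illustrative stand-ins; in a real flow the flag would live in the KV store and each batch would be a `create_flow_run` call:

```python
# Sketch of options 1 and 4: process items in batches and stop scheduling
# new batches once one has failed. The `failed` variable stands in for a
# flag persisted in the KV store; `process_item` is a hypothetical task.

def process_item(x):
    if x < 0:
        raise ValueError(f"bad item: {x}")
    return x * 2

def run_in_batches(items, batch_size=3):
    results = []
    failed = False                      # stand-in for the KV-store flag
    for start in range(0, len(items), batch_size):
        if failed:                      # later batches see the flag and skip
            break
        batch = items[start:start + batch_size]
        for item in batch:              # in a real flow these would be mapped,
            try:                        # so the rest of the batch still runs
                results.append(process_item(item))
            except ValueError:
                failed = True
    return results, failed

results, failed = run_in_batches([1, 2, 3, -1, 5, 6, 7])
# the batch containing -1 still finishes, but the third batch never starts
```

This doesn't cancel work already in flight, but it caps the wasted work at one batch instead of the full 100k.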
04/26/2022, 2:30 PM
Hmm, thanks for the replies. To be honest, even though it would be a good feature, I guess it goes against the basic design of the flow: it's a DAG, and each instance of a mapped task is its own task, and as mentioned above, the instances aren't related to one another.
Plus my flow is spinning up all those tasks with DB sessions, so I think I should definitely avoid that.
04/26/2022, 2:58 PM
The most productive way of approaching this would be to try it in Prefect 2.0, which lets you run arbitrary Python code in your flow, so you could implement this however you wish.
DB sessions can also be shared between tasks more easily that way.
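Since a Prefect 2.0 flow body is ordinary Python, the early exit becomes a plain loop with a `break`. A minimal sketch of the pattern (`process` and `my_flow` are hypothetical names; in a real flow they would carry the `@task` and `@flow` decorators):

```python
# Sketch of the Prefect 2.0 approach: the flow body is ordinary Python,
# so you can stop at the first failure with a normal loop and break.
# `process` is a hypothetical task function; decorators omitted so the
# sketch runs standalone.

def process(item):
    if item == "bad":
        raise RuntimeError("task failed")
    return len(item)

def my_flow(items):
    results = []
    for item in items:
        try:
            results.append(process(item))
        except RuntimeError:
            break                  # stop the whole run at the first failure
    return results

print(my_flow(["a", "bb", "bad", "cccc"]))  # later items never run
```

This keeps the "stop on first failure" semantics without GraphQL calls or KV-store flags, at the cost of running sequentially unless you submit tasks concurrently yourself.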