Hi there,
I'm currently running my flows on Cloud Run infrastructure. However, Cloud Run jobs have a maximum time limit of 1 h, so the flow needs to finish before that.
Is there an elegant way to stop the running flow early (by measuring either the time elapsed or the "work done") and trigger another one with the same deployment, so that it continues the remaining work in a new flow run / Cloud Run job? And is there an elegant way of passing job metadata (for example, records processed / failed) from one flow run to another?
Thanks in advance!
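One possible pattern, sketched below under stated assumptions: the flow tracks its elapsed time and, when it nears a budget short of the Cloud Run limit, fires a new run of its own deployment via Prefect 2's `run_deployment`, passing the cursor and progress counters as flow parameters. The deployment name `etl/cloud-run` and the `fetch_batch` / `process_record` helpers are illustrative, not from the thread.
```python
import time

from prefect import flow
from prefect.deployments import run_deployment

TIME_BUDGET_S = 50 * 60  # stop well before the 1 h Cloud Run job limit


def fetch_batch(cursor: int) -> list:
    """Hypothetical helper: return the next batch of records, empty when done."""
    return []


def process_record(record) -> None:
    """Hypothetical helper: process a single record."""


@flow
def etl(cursor: int = 0, processed: int = 0, failed: int = 0) -> dict:
    start = time.monotonic()
    while True:
        batch = fetch_batch(cursor)
        if not batch:
            # All work done; final counts end up in this run's return value.
            return {"processed": processed, "failed": failed}
        for record in batch:
            try:
                process_record(record)
                processed += 1
            except Exception:
                failed += 1
        cursor += len(batch)
        if time.monotonic() - start > TIME_BUDGET_S:
            # Hand the remaining work and the running totals to a fresh
            # flow run (i.e. a new Cloud Run job) of the same deployment.
            run_deployment(
                name="etl/cloud-run",  # hypothetical deployment name
                parameters={
                    "cursor": cursor,
                    "processed": processed,
                    "failed": failed,
                },
                timeout=0,  # fire and forget; don't wait for the new run
            )
            return {"resumed_at_cursor": cursor}
```
Because the metadata travels as plain flow parameters, it is also visible in the Prefect UI for each run, which answers the second question without any extra storage.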
Deceivious
06/01/2023, 12:45 PM
Maybe caching task results can help?
Luis Cebrián
06/01/2023, 1:07 PM
> Maybe caching task results can help?
I'll try that for the cursor thing across flows. Thanks
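A minimal sketch of the caching suggestion, assuming Prefect 2's `task_input_hash` and persisted results (the `load_cursor` task and its `source` parameter are illustrative): a task whose cache key is derived from its inputs lets a later flow run reuse the stored result, such as a cursor, instead of recomputing it.
```python
from datetime import timedelta

from prefect import flow, task
from prefect.tasks import task_input_hash


@task(
    cache_key_fn=task_input_hash,        # same inputs -> same cache key
    cache_expiration=timedelta(days=1),
    persist_result=True,                 # needed for reuse across flow runs
)
def load_cursor(source: str) -> int:
    # Expensive lookup; on a cache hit a later flow run skips this body
    # and loads the persisted result instead.
    return 0


@flow
def continue_work(source: str = "orders") -> None:
    cursor = load_cursor(source)
    print(f"resuming {source} from cursor {cursor}")
```
Cache keys are stored on the Prefect backend, so a second flow run calling `load_cursor("orders")` within the expiration window picks up the cached value rather than running the task body again.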