Hi, I’m wondering if it’s possible for subflows to run on different architectures? For example, if I have:
• a master-flow which triggers subflow-1 and subflow-2
• subflow-1 needs more CPU & memory -> packaged in a deployment with more CPU & memory in the worker config for the ECS task
• subflow-2 needs less CPU & memory -> packaged in a deployment with a different (and smaller) amount of CPU & memory in the worker config for the ECS task
If it’s possible, would it be done through a deployment trigger, or simply by triggering the subflows from the master-flow after creating the deployments for the subflows?
Deceivious
07/12/2023, 11:42 AM
You can’t use different infra for a direct subflow call AFAIK.
Deceivious
07/12/2023, 11:43 AM
You will have to deploy SF1 and SF2 and call run_deployment.
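For example, a minimal sketch assuming Prefect 2.x (the deployment names below are hypothetical, use whatever names you registered):

```python
from prefect import flow
from prefect.deployments import run_deployment

@flow
def master_flow():
    # Each call creates a flow run from the named deployment, so each
    # subflow runs on the infrastructure its own deployment specifies
    # (e.g. a larger or smaller ECS task). run_deployment waits for
    # each run to finish by default, so these execute sequentially.
    run_1 = run_deployment(name="subflow-1/big-ecs-task")
    run_2 = run_deployment(name="subflow-2/small-ecs-task")
    return run_1, run_2
```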
Christopher Boyd
07/12/2023, 11:56 AM
If the subflows are inline with your main flow (same file or same thread) then, as @Deceivious mentioned, you can’t split a running flow across different infra mid-flow.
You can, however, register each as its own deployment entirely and call run_deployment as mentioned - this will ensure each gets its own process / job; they would run with whatever the work pool has, or you could override the job parameters in infra_overrides when you define the deployment.
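Roughly like this sketch - the work pool name and the exact cpu / memory keys are assumptions here, since they depend on your ECS work pool’s base job template:

```python
from prefect.deployments import Deployment
from flows import subflow_1  # hypothetical module containing the subflow

deployment = Deployment.build_from_flow(
    flow=subflow_1,
    name="big-ecs-task",
    work_pool_name="ecs-pool",  # assumed ECS work pool name
    # Override the job values for this deployment only; the keys must
    # match variables exposed by the work pool's base job template.
    infra_overrides={"cpu": 2048, "memory": 4096},
)
deployment.apply()
```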
Anh Pham
07/12/2023, 12:45 PM
Thank you, both. I managed to try the run. Anyway, I can see that if the flow run triggered by run_deployment fails, the master-flow run still completes successfully. Is there a simple way to mark the master-flow run as failed when run_deployment fails?
Deceivious
07/12/2023, 12:59 PM
run_deployment returns a FlowRun object. You can check the state of the flow run after it completes. Check the documentation on run_deployment for details.
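Something like this sketch, assuming Prefect 2.x (run_deployment waits for the triggered run to reach a final state by default; the deployment name is hypothetical):

```python
from prefect import flow
from prefect.deployments import run_deployment

@flow
def master_flow():
    # run_deployment blocks until the triggered run finishes, then
    # returns the FlowRun, whose final state we can inspect.
    flow_run = run_deployment(name="subflow-1/big-ecs-task")
    if flow_run.state.is_failed():
        # Raising here marks the master-flow run as Failed as well.
        raise RuntimeError(
            f"Subflow run {flow_run.id} ended in state {flow_run.state.name}"
        )
```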