# ask-community
a
Hey all! I'm having difficulties parallelizing my flows. I have provisioned a push work pool that runs on AWS ECS, and I currently have two flows: 1. A flow that runs once per day and starts a nested flow for every job that needs to be done 2. The nested flow that does the actual work. If I call the nested flow from the scheduled flow, all the jobs get executed in the same container instance, but I would ideally like each nested flow to run inside its own container. I've also looked into turning the nested flow into a task and parallelizing those, but setting up Dask/Ray task runners feels redundant since the existing ECS cluster infrastructure should already handle the parallelization. I can launch the nested flow jobs by running the `prefect deployment run` shell command from the scheduled flow, but this doesn't feel like the right way. Is there a way to use Python code to launch the nested flows in their own containers?
Ok, I figured out that I can use `run_deployment` with `timeout=0` from the Python SDK to achieve what I was trying to do from Python code.
b
Yes Anže, exactly that. A flow-of-deployments pattern should get you the behavior you're looking for 🙌
a
@Bianca Hoch thank you for confirming that I'm on the right track! I have a follow-up question: I now want to write a test for this code. If I use the `prefect_test_harness` context manager, the test fails because the deployment that I run in the application code does not exist. Is there a recommended/easy way to set up the deployments for testing? The docs don't seem to cover this edge case 😞