Hi all, a question about deploying flows to work pools and where the infrastructure configuration is pulled from. My current setup is an AWS ECS service running a worker that listens to a work pool in Prefect Cloud (and creates the pool if it doesn't exist). I'm using the `prefect` and `prefect_aws` libraries in Python to create an "anonymous" `ECSTask` infra block, which is passed to the deployment via `prefect.deployments.Deployment.build_from_flow`. When I trigger the flow via "quick run", the run uses the infrastructure definition from the work pool instead of the one on the deployment (and the pool's is empty/minimal because the worker created it). When I manually update the work pool configuration with the infra details (cluster name, image, etc.), it springs to life and works as expected. Is there a way to have it use the deployment's infra configuration instead, or am I missing something (which is totally possible!)?
alex
10/25/2023, 3:04 PM
Hey @Joey Allison! When you’re using a worker, it will use the configuration from the work pool and ignore the infra block on any deployments. You can think of a worker as an agent + an infra block.
Joey Allison
10/25/2023, 3:05 PM
Ah ha! Okay, is there a different setup for a "serverless" `ECSTask` that uses the infra details from the deployment? Is that advisable, or is it best practice to use work pools going forward?
Joey Allison
10/25/2023, 3:08 PM
Alternatively, is there a way to define work pools in Python?
Joey Allison
10/25/2023, 3:08 PM
(defining & configuring their infra details)
alex
10/25/2023, 3:11 PM
We recommend using work pools and workers moving forward. Since you're using `Deployment.build_from_flow`, you can use `infra_overrides` to define infra details for your deployment (equivalent to `job_variables` in the new `flow.deploy` method). That will allow you to override any of the variables configured in your work pool's base job template.
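A minimal sketch of that suggestion (Prefect 2.x). The override keys below (`cluster`, `image`, `cpu`, `memory`) are illustrative assumptions; the valid keys are whatever variables your ECS work pool's base job template exposes:

```python
# Override values for variables defined in the work pool's base job template.
# Keys and values here are illustrative placeholders.
infra_overrides = {
    "cluster": "my-ecs-cluster",
    "image": "my-registry/my-image:latest",
    "cpu": 512,
    "memory": 1024,
}

def build_with_overrides():
    # Requires `prefect` to be installed and a flow object to deploy.
    from prefect.deployments import Deployment
    from my_project.flows import my_flow  # hypothetical flow module

    return Deployment.build_from_flow(
        flow=my_flow,
        name="ecs-deployment",
        work_pool_name="my-ecs-pool",
        infra_overrides=infra_overrides,
        apply=True,
    )
    # Equivalent with the newer API:
    # my_flow.deploy(name="ecs-deployment", work_pool_name="my-ecs-pool",
    #                job_variables=infra_overrides)
```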
Joey Allison
10/25/2023, 3:13 PM
Fantastic! I'll give that a go. Thank you for all your help!
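On Joey's earlier question about defining work pools in Python: this wasn't answered in the thread, but a hedged sketch using the Prefect 2.x client is below. The pool name, type, and base job template fragment are illustrative assumptions (a real ECS base job template has many more fields), and the call needs a configured Prefect Cloud or server connection to actually run:

```python
# Hypothetical sketch: creating a work pool programmatically with the
# Prefect 2.x client. The template below is a minimal illustrative fragment,
# not a complete ECS base job template.
import asyncio

base_job_template = {
    "job_configuration": {
        "cluster": "{{ cluster }}",
        "image": "{{ image }}",
    },
    "variables": {
        "properties": {
            "cluster": {"type": "string", "default": "my-ecs-cluster"},
            "image": {"type": "string", "default": "my-image:latest"},
        }
    },
}

async def create_pool():
    # Requires `prefect` to be installed and a configured profile.
    from prefect.client.orchestration import get_client
    from prefect.client.schemas.actions import WorkPoolCreate

    async with get_client() as client:
        await client.create_work_pool(
            work_pool=WorkPoolCreate(
                name="my-ecs-pool",
                type="ecs",
                base_job_template=base_job_template,
            )
        )

# asyncio.run(create_pool())  # uncomment with a configured Prefect profile
```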