# prefect-community

I have some questions about `FargateTaskEnvironment`:
1. Can it just run an existing ECS task definition, or does it always register a new task definition based on the specified family/taskDefinition?
2. If it can just run an existing ECS task definition, what would the minimal necessary arguments be — something like just the family/taskDefinition, assuming AWS credentials are accessible to boto3 via the environment?
3. If it always generates/registers a new task definition, what would the minimal set of required arguments be? I could probably figure that out by experimenting until I get the desired task definition running and then translating that into `FargateTaskEnvironment` arguments. Curious if someone has gone through this process already.
The higher-level goal I'm trying to accomplish is to run Fargate containers with the cpu/memory resources I need. The default values are causing task containers to run out of memory, and the Zombie Killer stops them.
So I only use the Fargate Agent to execute the actual flow's container, not the Fargate environment, but I think the environment supports kwargs that get passed to `run_task`.
You should be able to pass `cpu` and `memory` into the task definition that Prefect creates, or create it ahead of time with what you need and pass the `taskDefinition` kwarg.
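As a minimal sketch of the second option, here is what a `run_task` payload reusing an existing task definition with explicit sizing could look like. The keys follow boto3's `ecs.run_task` parameters; the helper function and the container name `"flow"` are illustrative assumptions, not Prefect's actual code:

```python
# Illustrative helper: assemble kwargs for boto3's ecs.run_task so an
# existing task definition is reused with explicit cpu/memory overrides.
# The helper itself and the container name "flow" are assumptions.

def build_run_task_kwargs(cluster, task_definition, cpu, memory):
    """Build an ecs.run_task(**kwargs) payload for a Fargate launch."""
    return {
        "cluster": cluster,
        "taskDefinition": task_definition,  # existing family:revision, no re-register
        "launchType": "FARGATE",
        "overrides": {
            "containerOverrides": [
                {"name": "flow", "cpu": cpu, "memory": memory}
            ]
        },
    }

kwargs = build_run_task_kwargs("prefect", "my-flow-task:3", 1024, 4096)
# boto3.client("ecs").run_task(**kwargs) would then launch the container
```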
Thanks! I need to sort out which of the FargateAgent args are for the agent's container itself and which ones are for the Fargate task containers it spins up. Do you configure your Fargate tasks via the Fargate Agent?
So we added a customization to the Fargate Agent called "external kwargs" because we did not want to have two tasks per flow: https://docs.prefect.io/cloud/agents/fargate.html#external-kwargs This allows you to store a JSON file of kwargs in S3, and the agent will use those when creating task definitions and task runs.
As part of our CI/CD process we give people the ability to define these: we pull them out of the flow file or some other config where the code is stored (GitHub, for example), register the flow, then use the resulting flow ID to store the kwargs in S3.
Then, when the agent flag is turned on for this, the agent will always check whether the flow run has a set of external kwargs, which will overwrite what you have set on the agent.
This way we do not need to use the Fargate environment to control the flow's task properties.
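The override behaviour described above amounts to a simple dict merge. A sketch under that assumption (illustrative, not Prefect's actual implementation; all values made up):

```python
# Sketch of the precedence rule: per-flow external kwargs (fetched from S3)
# overwrite the agent-level defaults. Keys the agent sets but the flow does
# not override are kept as-is.

def merge_kwargs(agent_defaults, external_kwargs):
    """External kwargs win over whatever is set on the agent."""
    merged = dict(agent_defaults)
    merged.update(external_kwargs)
    return merged

defaults = {"cpu": "512", "memory": "1024", "cluster": "prefect"}
external = {"cpu": "1024", "memory": "4096"}
print(merge_kwargs(defaults, external))
# → {'cpu': '1024', 'memory': '4096', 'cluster': 'prefect'}
```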
That's a cool approach that avoids the use of FargateTaskEnvironment. I'm not at the point yet where I have CI/CD set up to generate `<external_kwargs_s3_key>/<slugified_flow_name>/<flow_id[:8]>.json`. Guess I could write a pretty simple script, though.
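Such a script could be as small as this sketch. The key layout follows the pattern above; the prefix, flow name, flow ID, and the simplistic slugify stand-in are all illustrative assumptions:

```python
import json

def external_kwargs_key(prefix, flow_name, flow_id):
    """Build <external_kwargs_s3_key>/<slugified_flow_name>/<flow_id[:8]>.json.
    A real script would use a proper slugify; this stand-in just lowercases
    and hyphenates spaces."""
    slug = flow_name.lower().replace(" ", "-")
    return f"{prefix}/{slug}/{flow_id[:8]}.json"

key = external_kwargs_key("my-kwargs-prefix", "My ETL Flow",
                          "a1b2c3d4-0000-1111-2222-333344445555")
# → "my-kwargs-prefix/my-etl-flow/a1b2c3d4.json"

body = json.dumps({"cpu": "1024", "memory": "4096"})
# boto3.client("s3").put_object(Bucket="my-bucket", Key=key, Body=body)
```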
We run our flows completely in Fargate (including the Dask cluster). We define the Fargate Agent task definition in Terraform and start the agent via the command line. We pass args to the agent using a combination of command-line args and environment variables (for instance, one of the args was tricky to do on the command line, so we set it via env var). Happy to provide more details if it's helpful.
Thanks for the tips, all! I figured out how to use just the FargateAgent to define executor tasks as needed, so I'm not using FargateTaskEnvironment anymore.