# ask-community
a
Hi folks :simple_smile: We are using ECSRun + Docker storage. We’d like to provide the task definition ARN, but we get the following error:
Cannot provide `task_definition_arn` when using `Docker` storage
Looking at the code at https://github.com/PrefectHQ/prefect/blob/master/src/prefect/agent/ecs/agent.py it seems to be a design choice. What are the alternatives for running a flow on ECS while providing the task definition ARN?
a
You can provide a custom task_definition_arn or task_definition_path to the ECSRun run configuration.
a
This is how we pass the task definition ARN:
from prefect.storage import Docker
from prefect.run_configs import ECSRun

storage = Docker(
    image_name="foo",
    image_tag="bar"
)

config = ECSRun(labels=["my_label"], task_definition_arn="<my_task_arn>")
At the flow level, we set:
# f is the flow object
f.run_config = config
Actually, we get the error mentioned above during flow registration
k
What container does your task_definition use?
a
No container actually, the error happens during flow registration
Nope sorry, it doesn’t happen during flow registration, it seems to happen during flow execution
Yes, I confirm it happens during execution. The flow starts, then immediately fails with the error above
k
I guess this is a design choice, maybe because the task_definition_arn might already have a container, or maybe using it spins up the container before running the Flow. I think you need to switch to task_definition or task_definition_path in your Flow.
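A minimal sketch of those two options, assuming Prefect 1.x’s ECSRun signature; the inline container definition, the container name "flow" (the agent’s default), and the file path are illustrative:

from prefect.run_configs import ECSRun

# Option 1: inline task definition; the ECS agent runs the flow in the
# container named "flow" by default.
config = ECSRun(
    labels=["my_label"],
    task_definition={
        "containerDefinitions": [
            {"name": "flow", "image": "foo:bar"}
        ]
    },
)

# Option 2: a task definition template loaded from a file.
config = ECSRun(
    labels=["my_label"],
    task_definition_path="./task-definition.yaml",  # hypothetical path
)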
a
mmh…ok, will try it. We were using task definition ARNs because that way we can create the task definition separately and then provide just the ARN. The advantage is that we can separate the creation of the task definition, which is environment-specific, from the flow (which is not environment-specific)…
k
You might have to build the container separately
I am thinking that you might be able to do
storage = Docker(...)
storage.add_flow(flow)  # bake the flow into the image
storage.build()         # build and push the image
But do not do:
flow.storage = storage
So that the Flow does not use Docker storage. But at the same time, you would need to use Local storage to point to the path inside the Docker container where your Flow exists, so that ECS can find it and run it. Does that make sense?
And then you can put the image name in your task_definition and just provide the task_definition_arn.
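A sketch of that combination, assuming Prefect 1.x. This variant copies the flow script into the image yourself (e.g. in your own Dockerfile) and points Local storage at it, rather than relying on Docker storage’s pickled flow; the paths and names are illustrative:

from prefect import Flow
from prefect.storage import Local
from prefect.run_configs import ECSRun

flow = Flow("my-flow")  # your flow object

# The image referenced by the task definition contains the flow script,
# e.g. via `COPY my_flow.py /opt/prefect/flows/` in your Dockerfile.
flow.storage = Local(
    path="/opt/prefect/flows/my_flow.py",  # hypothetical path inside the container
    stored_as_script=True,
)
flow.run_config = ECSRun(
    labels=["my_label"],
    task_definition_arn="<my_task_arn>",  # task definition points at that image
)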
But honestly, I think the easiest is S3 storage + your own task definition with the image built and specified.
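A sketch of the S3 approach, again assuming Prefect 1.x; the bucket name is hypothetical, and the image in your task definition just needs prefect (with its AWS extras) installed so the flow can be pulled from S3 at run time:

from prefect import Flow
from prefect.storage import S3
from prefect.run_configs import ECSRun

flow = Flow("my-flow")  # your flow object

# The flow is uploaded to S3 at registration time and downloaded by the
# ECS task at run time, so task_definition_arn is allowed here.
flow.storage = S3(bucket="my-flow-bucket")  # hypothetical bucket
flow.run_config = ECSRun(
    labels=["my_label"],
    task_definition_arn="<my_task_arn>",
)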
a
Will think about it together with @grandimk :simple_smile: