Lawrence Lee
01/06/2023, 8:55 AM
1. If using task_definition, containerDefinitions.name must be “prefect”. Otherwise the newly registered task definition will contain two separate containerDefinitions items and runs will fail.
2. I believe the command parameter has to be blank/null; otherwise your flow will not run properly.
3. If using the task_customizations parameter, you must cast your array(dict) as a JsonPatch object, otherwise the pydantic validation will fail. This conflicts with the documentation and the examples in the dataflow-ops repo (@Anna Geller):
task_customizations=JsonPatch([
{
"op": "replace",
"path": "/networkConfiguration/awsvpcConfiguration/assignPublicIp",
"value": "DISABLED",
},
])
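For context, a JsonPatch holds RFC 6902 operations that are applied to a JSON document. The stdlib-only sketch below (the helper function and the task-definition fragment are illustrative stand-ins, not Prefect or jsonpatch code) shows what the replace operation above does to a registered task definition:

```python
import copy


def apply_replace(doc: dict, path: str, value):
    """Apply a single RFC 6902 'replace' operation to a nested dict.

    `path` is a JSON Pointer such as
    "/networkConfiguration/awsvpcConfiguration/assignPublicIp".
    """
    result = copy.deepcopy(doc)
    *parents, leaf = path.lstrip("/").split("/")
    node = result
    for key in parents:
        node = node[key]
    if leaf not in node:
        # 'replace' requires the target to already exist.
        raise KeyError(f"'replace' target {path!r} does not exist")
    node[leaf] = value
    return result


# Illustrative fragment of an ECS task definition (not a complete one).
task_def = {
    "networkConfiguration": {
        "awsvpcConfiguration": {"assignPublicIp": "ENABLED"}
    }
}

patched = apply_replace(
    task_def,
    "/networkConfiguration/awsvpcConfiguration/assignPublicIp",
    "DISABLED",
)
```

The original document is left untouched; only the returned copy carries assignPublicIp = "DISABLED", which is the effect the JsonPatch in the message applies to the task definition before registration.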
Anna Geller
01/06/2023, 3:05 PM

Zanie
01/06/2023, 4:07 PM

Anna Geller
01/06/2023, 4:35 PM
ENTRYPOINT ["prefect", "agent", "start", "-q", "default"]
and I was able to use the same image for my flow run container on the infra block without specifying any command. I guess my misinterpretation was because:
• regardless of what entrypoint is set in the Dockerfile, if some command is explicitly set on the infra block it will be used
• if no command is set, we take this base run command
correct or still wrong?

Zanie
01/06/2023, 4:38 PM
command set on the DockerContainer block

Anna Geller
01/06/2023, 4:39 PM
01/06/2023, 4:39 PM"command": self.command or self._base_flow_run_command()
Zanie
01/06/2023, 4:40 PM