Jane Jeong

05/26/2023, 9:07 PM
Hi team! I've been enjoying using dockerized Prefect jobs, but I found that a flow run gets stuck in a Pending state (and no logs show up in the Prefect Cloud UI!) if the docker command is supplied through the command field of a Cloud Run block. It works fine if the flow is run via a CMD line in the Dockerfile, however. Is this doable and I'm just missing something? If not, this would be awesome as a feature request.
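A minimal sketch of the setup described above, assuming a `CloudRunJob` block from `prefect-gcp`; the block names, image, and region here are hypothetical:

```python
# Sketch: a Cloud Run infrastructure block whose `command` field is
# overridden, as described above (all names hypothetical).
from prefect_gcp import GcpCredentials
from prefect_gcp.cloud_run import CloudRunJob

cloud_run_job = CloudRunJob(
    credentials=GcpCredentials.load("my-gcp-creds"),  # hypothetical saved block
    image="us-docker.pkg.dev/my-project/my-repo/flow-1:latest",  # hypothetical
    region="us-central1",
    command=["python", "flows/flow-1/flow-1.py"],  # replaces Prefect's default command
)
cloud_run_job.save("flow-1-infra", overwrite=True)
```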

Nate

05/27/2023, 12:39 AM
hi @Jane Jeong - the default `command` is responsible for starting the Prefect engine, so depending on what you replaced it with, that might explain the behavior you're seeing. What did you change the `command` field to on your infra block?

Jane Jeong

05/30/2023, 3:38 PM
Hi @Nate, the command was `["python", "{file-entry-point-name.py}"]`

Nate

05/30/2023, 4:29 PM
ah yes, normally Prefect will run `python -m prefect.engine`. I'm pretty sure that bypassing this explains the behaviour you're seeing (here are some docs on what `prefect.engine` is doing). What was your reason for overriding the command?
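A hedged illustration of the default behaviour Nate describes, assuming the same `CloudRunJob` block as above (block name hypothetical):

```python
# If `command` is left unset on an infrastructure block, Prefect falls
# back to the engine entrypoint, roughly ["python", "-m", "prefect.engine"],
# which is what reports state and streams logs back to Prefect Cloud.
from prefect_gcp.cloud_run import CloudRunJob

job = CloudRunJob.load("flow-1-infra")  # hypothetical saved block
print(job.command)  # a custom command like ["python", "flows/flow-1/flow-1.py"]
                    # bypasses the engine, so the run never leaves Pending
```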

Jane Jeong

05/30/2023, 6:04 PM
I see, thanks @Nate! I overrode the command because we have local packages we need to deploy as part of the flow, so we have a Docker image that we deploy to Google Cloud Run for each flow.

Method 1:
• use the same image name in each Google Cloud Run infra block, but with a different command in the `command` field for each flow (see the sketch below)

Method 2:
• use a different image name for each Google Cloud Run infra block

The con of Method 2 over Method 1 is that we have to update multiple images for a single package update. It's not bad, but it would also be nice, for configurability and for other projects that need more flexible `command` overrides, to still have `prefect.engine` running. Or is there a way we can do that?
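A hedged sketch of Method 1 as described (all names hypothetical); note that, per the discussion above, overriding `command` this way is exactly what bypasses `prefect.engine`:

```python
# Method 1: one shared image, one Cloud Run block per flow, differing
# only in the `command` field (which bypasses the Prefect engine).
from prefect_gcp import GcpCredentials
from prefect_gcp.cloud_run import CloudRunJob

SHARED_IMAGE = "us-docker.pkg.dev/my-project/my-repo/flows:latest"  # hypothetical

for name in ("flow-1", "flow-2"):
    CloudRunJob(
        credentials=GcpCredentials.load("my-gcp-creds"),  # hypothetical
        image=SHARED_IMAGE,
        region="us-central1",
        command=["python", f"flows/{name}/{name}.py"],
    ).save(f"{name}-infra", overwrite=True)
```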

alex

06/01/2023, 1:33 PM
I think what you want to do is possible with Prefect, but we might need more information to give a recommendation. Can you share the folder structure of your local packages and flows?

Jane Jeong

06/01/2023, 5:50 PM
```
project-name/
├── dockerfiles/
│   ├── flow-1/
│   │   └── Dockerfile
│   └── flow-2/
│       └── Dockerfile
├── local-package/
│   ├── __init__.py
│   └── dependency.py
└── flows/
    ├── __init__.py
    ├── flow-util.py
    ├── flow-1/
    │   └── flow-1.py
    └── flow-2/
        └── flow-2.py
```
• Dockerfiles stored in GCP Artifact Registry
• flow-1.py and flow-2.py use local-package/dependency.py and flows/flow-util.py
• flow-1/Dockerfile has `CMD ["python", "flows/flow-1/flow-1.py"]` (same thing for flow-2, respectively)
@alex

alex

06/02/2023, 2:40 PM
I think there are a couple of options, depending on your setup:
1. You could build a base image that has your shared dependencies built in, if those dependencies don't change very often. Both flow-1 and flow-2 could use that custom image as their base image to have the dependencies available.
2. Depending on how you store your flow code, you could store and use this entire directory for each deployment. The `entrypoint` for each deployment would be different for the different flows.
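A hedged sketch of option 2, assuming Prefect 2's `Deployment.build_from_flow`; the flow function and import path are hypothetical (hyphenated folders would need underscores to be importable as Python modules), and the block's `command` is left at its default so `prefect.engine` still runs:

```python
# Option 2 sketch: one deployment per flow over a shared image.
# build_from_flow derives each deployment's entrypoint from the flow's
# file path, so flow-1 and flow-2 get different entrypoints automatically,
# while `python -m prefect.engine` still starts each run.
from prefect.deployments import Deployment
from prefect_gcp.cloud_run import CloudRunJob

from flows.flow_1.flow_1 import my_flow  # hypothetical (underscored) import

deployment = Deployment.build_from_flow(
    flow=my_flow,                                     # entrypoint derived from this
    name="flow-1",
    infrastructure=CloudRunJob.load("shared-infra"),  # hypothetical block, command unset
)
deployment.apply()
```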