# ask-community
e
👋🏻 Hi all. Looking for some general guidance - does anyone have any best practices when it comes to deploying many (100+) jobs? For a while I've been using a `prefect.yaml` deployment config for a dozen jobs, which has worked well. Coming up, I'm looking to port a huge pile of older, unmanaged pipelines into Prefect. Right now each deployment config in my YAML is about 13 lines. Feels like there has to be a more efficient way to do this than configuring 1300 lines where each job has a different name, schedule, and ~2 parameters. Maybe a script to take a flattened input and re-create the prefect YAML every time I deploy?
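(A minimal sketch of that last idea - generating the repetitive `prefect.yaml` blocks from a flat, one-line-per-job spec. The field names mirror the YAML shown later in this thread; the flat input format, job names, and `render_deployments` helper are hypothetical:)

```python
# Hypothetical generator: expand a flat, one-line-per-job spec into the
# ~13-line-per-job prefect.yaml deployment blocks, instead of hand-editing
# 1300 lines of YAML.
DEPLOYMENT_TEMPLATE = """\
- name: {name}
  schedule:
    cron: "{cron}"
  entrypoint: {entrypoint}
  parameters:
    tenant_key: "{tenant_key}"
    data_location: "{data_location}"
  work_pool:
    name: ec2-docker
"""

def render_deployments(jobs):
    """jobs: iterable of (name, cron, entrypoint, tenant_key, data_location)."""
    return "\n".join(
        DEPLOYMENT_TEMPLATE.format(
            name=name, cron=cron, entrypoint=entrypoint,
            tenant_key=tenant_key, data_location=data_location,
        )
        for name, cron, entrypoint, tenant_key, data_location in jobs
    )

if __name__ == "__main__":
    # Dummy jobs; in practice this could be read from a CSV or dict
    jobs = [
        ("job-a", "0 6 * * *", "flows/job_a.py:execute", "tenant1", "loc-a"),
        ("job-b", "0 7 * * *", "flows/job_b.py:execute", "tenant2", "loc-b"),
    ]
    print(render_deployments(jobs))
```

The output could be pasted (or written) into the `deployments:` section of `prefect.yaml` before each deploy.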
n
hi @Eric Albanese - do you want to keep everything in YAML? if you're planning on scripting to gather / set up / define deployments, you may want to consider the `.deploy()` method on flows, or instead the `deploy(*many_flows)` util
the downside being that it's not as declarative as the yaml
e
@Nate Pretty flexible on the format! Looking at the docs now... this is the page I'm looking for, right?
e
Gotcha... so this would be a standalone deployment file that handles all that. Then presumably I could set up a loop to read through some config parameters and apply them to the flows. One last Q: I have my flows set up in separate files. I'm thinking I'd just import them and then configure? So maybe something like
```python
from prefect import deploy
from flows import flow1, flow2

if __name__ == "__main__":
    deploy(
        flow1.to_deployment("my-deployment-1"),
        flow2.to_deployment("my-deployment-2"),
    )
```
n
yep, that's exactly what i was imagining based on your description
e
great, giving this a spin now. Appreciate the quick info Nate
n
:catjam:
e
Hmm, getting stuck on this. Does this approach always require creating a new Docker container? I already have a container image that's ready to go. Just trying to get something working; I have this so far:
```python
var_config = {
    "network_mode": "host",
    "volumes": [
        "/var/run/docker.sock:/var/run/docker.sock:ro",
        "/home/ubuntu/.docker/config.json:/root/.docker/config.json:ro",
    ],
}

my_flow_1.deploy(
    name="test-my_flow",
    image="xxxx/prefect:v10284.0",
    push=False,
    work_pool_name="ec2-docker",
    job_variables=var_config,
)
print("Deployed to Prefect Cloud")
```
here's my exact YAML config:
```yaml
- name: poc
  version: null
  tags: null
  description: null
  schedule: {}
  entrypoint: flows/api_trigger_poc.py:execute
  parameters:
    tenant_key: "dummy1"
    data_location: "dummy2"
  work_pool:
    name: ec2-docker
    work_queue_name: null
    job_variables:
      image: "xxxx/prefect:{{ $TAG }}"
      network_mode: "host"
      volumes:
        - "/var/run/docker.sock:/var/run/docker.sock:ro"
        - "/home/ubuntu/.docker/config.json:/root/.docker/config.json:ro"
```
but getting a `RuntimeError: Failed to generate Dockerfile. Dockerfile already exists in the current directory.` message
n
oh i think you want `build=False` - by default we'll try to build you an image on the fly with an autogenerated Dockerfile, but if you already have your `image` then you don't need that
e
ok cool
trying rn
alright, so got that deployed, and it's finding the image just fine. Stuck now on how to define an entrypoint to my container - I see `deploy` has a param for `entrypoint_type`. Also, checking other configs, I have this under Pull Steps that I think I need to migrate over as well:
```json
[
  {
    "prefect.deployments.steps.set_working_directory": {
      "directory": "./"
    }
  }
]
```
n
being a bit pedantic, but i wouldn't mess with the actual entrypoint to the container, since if you're running flows, in most cases you want the prefect engine to handle that. i assume your code is baked into the image? or pulled down at runtime?
e
baked into the image
n
cool, then yeah it's just a matter of setting the right `set_working_directory` args, which in the python way you can do something like
```python
deploy(*[f.from_source(..).to_deployment(...) for f, config in zip(flows, configs)])
```
where `from_source` will get translated to a `set_working_directory` step in the `pull` section if you pass `source=str(Path(...))` and `entrypoint='filename.py:decorated_fn'` like this
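(For reference, that translation should land as a `pull` section equivalent to the JSON pull steps shown above - a sketch of the resulting `prefect.yaml` fragment, with the directory taken from earlier in the thread:)

```yaml
pull:
  - prefect.deployments.steps.set_working_directory:
      directory: ./
```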