# prefect-community
m
Is it still possible to package flows in Docker images? I don't immediately see such an option, and can't seem to figure it out
a
you mean for 2.0? we'll definitely provide recipes to do that
👍 1
subscribe to this category on Discourse, plenty of recipes and example repos with CI templates are planned
k
You can also find how to run a flow in a Docker container in the Infrastructure section of the docs
m
Yeah the thing is, in one of the beta versions I was able to store my flow in a docker container in the following way:
```python
d = Deployment(
    flow=FlowScript(path="./etl.py"),
    name="ETL",
    tags=["orion"],
    infrastructure=KubernetesJob(stream_output=True, namespace="orion"),
    packager=DockerPackager(
        python_environment=PythonEnvironment(python_version="3.8"),
        registry_url="XXX",
    ),
)

d.create(client=get_client())
```
It would be cool if I could deploy my flows this way, or something analogous @Khuyen Tran
a
This is something you can already do even without a packager: just specify docker or kubernetes as `--infra` and it will just work
A packager or CI would only be important if you needed to, e.g., install specific package versions at runtime
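As a rough sketch of what that CLI flow can look like (assuming Prefect 2.x, and that `etl.py` defines a flow function named `etl`; the generated YAML filename may differ in your version):

```shell
# Build a deployment spec for the flow defined in etl.py,
# targeting Kubernetes as the run infrastructure
prefect deployment build ./etl.py:etl \
    --name ETL \
    --tag orion \
    --infra kubernetes-job

# Register the generated deployment spec with the API
prefect deployment apply etl-deployment.yaml
```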
m
And how does the orchestration API know where to look for a flow? This is probably specified in the command argument then? Is it for example possible to customise the job manifest?
a
is this question motivated by some issue? if so, could you explain it? the interface is built in a way that you shouldn't even have to think about modifying any paths
k
@Matthias You mean how the API finds where your flow is when you run a deployment?
m
My use case is actually something I would consider very basic. I have a Docker container that contains everything that is required to run a flow: dependencies, task and subflow modules and my flow script. And I want to make a deployment so that I can run that flow on a daily basis. How do I do that?
Reading up on a different thread, it seems that it is possible, but in a hacky way
a
it's not hacky at all, I talked to our CTO and he confirmed it's a totally valid and even expected approach
👍 2
k
@Matthias you can just run
`prefect deployment build`
from the directory that contains all of your flows and their relative imports. Have you tried that?
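For the daily-run use case above, a hedged sketch (assuming Prefect 2.x, a flow function named `etl` in `etl.py`, and a CLI version that supports the `--cron` flag) might look like:

```shell
# Run from the directory that holds the flow script and its relative imports
cd /path/to/flows

# Build a Kubernetes deployment with a daily schedule (06:00 every day)
prefect deployment build ./etl.py:etl \
    --name ETL \
    --infra kubernetes-job \
    --cron "0 6 * * *"

# Register the deployment so the scheduler picks it up
prefect deployment apply etl-deployment.yaml
```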
r
@Anna Geller if it is an expected approach, then you should update the documentation and CLI so that the local filesystem can be used with K8s. At the moment the docs say you can't use local storage with K8s, and the CLI command `deployment build` fails if you pass a local filesystem with `--infra` set to K8s