gee
08/14/2025, 7:06 PM
.
├── flows
│   └── my_flow.py
├── pyproject.toml
├── uv.lock
├── README.md
└── src
    └── lib
        ├── __init__.py
        ├── py.typed
        └── my_code.py
Marvin
08/14/2025, 7:06 PM
Marvin
08/14/2025, 7:11 PM
With a src layout, Python won’t see src/lib unless you install the project (or you manually add src to PYTHONPATH). Your flows/my_flow.py failed to import lib because it wasn’t installed.
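For anyone following along, the failure mode is easy to reproduce in isolation; the layout below is a throwaway stand-in for the real project:

```shell
# Recreate a minimal src layout in a scratch directory.
mkdir -p demo/src/lib && cd demo
touch src/lib/__init__.py
printf 'def do_something():\n    return "ok"\n' > src/lib/my_code.py

# Without installing the project (or touching PYTHONPATH), the import fails:
python3 -c "import lib" 2>/dev/null || echo "import fails without install/PYTHONPATH"

# With src on PYTHONPATH, the same import resolves:
PYTHONPATH=src python3 -c "from lib.my_code import do_something; print(do_something())"
```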
Two clean ways to fix it.
Option A: Keep flows outside the package, install your project
- Keep your layout as-is:
.
├── flows
│   └── my_flow.py
├── pyproject.toml
└── src
    └── lib
        ├── __init__.py
        └── my_code.py
- Use a proper pyproject for a src layout so your package installs correctly. Example with setuptools:
[build-system]
requires = ["setuptools>=68", "wheel"]
build-backend = "setuptools.build_meta"
[project]
name = "lib"
version = "0.1.0"
requires-python = ">=3.9"
dependencies = ["prefect>=3"]
[tool.setuptools]
package-dir = {"" = "src"}
[tool.setuptools.packages.find]
where = ["src"]
- In dev (with uv): run uv sync in the repo root to install the project in editable mode.
- In flows/my_flow.py, import from your package:
from prefect import flow

from lib.my_code import do_something

@flow
def my_flow():
    do_something()

if __name__ == "__main__":
    my_flow()
- When you deploy/run:
- Process worker: make sure the worker runs inside an environment where the project is installed (the same uv-managed venv). In other words: activate that env, start the worker, and your deployment will import lib.
- Docker worker: build an image that installs your project:
FROM python:3.11-slim
WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN pip install uv && uv sync --frozen --no-install-project
COPY . .
RUN uv pip install -e .
# put the uv-managed venv on PATH so the prefect CLI resolves
ENV PATH="/app/.venv/bin:$PATH"
ENV PREFECT_API_URL=...
CMD ["prefect", "worker", "start", "--pool", "your-docker-pool"]
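Whichever worker type you use, it helps to sanity-check that the target environment can actually resolve the package before starting workers. A generic stdlib helper (my own sketch, not a Prefect API):

```python
import importlib.util


def is_importable(module_name: str) -> bool:
    """Return True if `module_name` resolves in the current environment."""
    return importlib.util.find_spec(module_name) is not None


# Run inside the worker's venv/image; once the project is installed,
# is_importable("lib") should be True.
```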
Option B: Move flows into the package (no install dance in dev, still recommended to install in prod)
- Put the flow under src/lib/flows/my_flow.py and use package-relative imports:
src/lib/flows/my_flow.py
src/lib/my_code.py
from prefect import flow

from ..my_code import do_something

@flow
def my_flow():
    do_something()
- This still benefits from installing the package in the worker environment; otherwise you’ll again need to tweak PYTHONPATH.
Quick-but-hackier fix (not recommended for production)
- Set PYTHONPATH=src so from lib... resolves without installing the package. You can set this in your environment, your worker’s job variables, or a .env file. Installing the package is more robust.
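The same hack can also live in code instead of the environment; a minimal sketch (a hypothetical helper of mine, not something Prefect provides) that you would call at the top of a flow file before importing from the package:

```python
import sys
from pathlib import Path


def add_src_to_path(repo_root: str) -> str:
    """Prepend <repo_root>/src to sys.path so `import lib` resolves without installing."""
    src = str(Path(repo_root) / "src")
    if src not in sys.path:
        sys.path.insert(0, src)
    return src


# In flows/my_flow.py you might call:
#   add_src_to_path(str(Path(__file__).resolve().parent.parent))
```

As the thread says, though: installing the package is the robust option; this only papers over the import problem on one machine.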
Deploying from GitHub with Prefect 3.x
- Use from_source(...).deploy(...) with an entrypoint pointing to your flow. Example:
from prefect import flow

flow.from_source(
    source="https://github.com/your-org/your-repo",
    entrypoint="flows/my_flow.py:my_flow",  # or "src/lib/flows/my_flow.py:my_flow" for Option B
).deploy(
    name="my-deployment",
    work_pool_name="your-pool",
)
- Remember: the worker that executes the deployment must have your package installed (Option A/B) or a PYTHONPATH that includes src.
Helpful references
- prefect.yaml and deployments
- Docker worker and deployments
- Using uv with Python projects
If you tell me which worker you’re using (process vs. Docker) and whether you prefer flows inside or outside the package, I’ll give you the exact commands/config to drop in. I’ll try to contain my excitement.
gee
08/14/2025, 7:30 PM
Marvin
08/14/2025, 7:30 PM
Nate
08/14/2025, 7:30 PM
Nate
08/14/2025, 7:30 PM
Nate
08/14/2025, 7:31 PM
Marvin
08/14/2025, 7:31 PM
Prefect Cloud doesn’t change the import rules of a src layout. Cloud just coordinates; your worker’s environment is what actually runs the code. If that environment can’t import lib, the run will fail.
A couple of reliable patterns that do work with Cloud:
1) Process worker (runs on your machine/VM)
- Ensure the worker runs in a virtualenv where your project is installed (editable works fine).
- In the repo root:
uv sync
uv pip install -e .
- Start the process worker from that same environment.
- Or, set PYTHONPATH=src so imports like from lib... resolve without install:
- Add it to the deployment’s job variables:
flow.from_source(
    "https://github.com/your-org/your-repo",
    entrypoint="flows/my_flow.py:my_flow",
).deploy(
    name="my-deployment",
    work_pool_name="your-process-pool",
    job_variables={"env": {"PYTHONPATH": "src"}},
)
- Alternatively, set PYTHONPATH=src in the work pool’s base job template env.
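Conceptually, the env you set in job_variables layers over the env from the pool’s base job template, with deployment values winning. A simplified sketch of that layering (my own illustration, not Prefect’s actual merge code):

```python
def merged_env(base_template_env: dict, deployment_env: dict) -> dict:
    """Deployment-level env vars override same-named base-template vars."""
    return {**base_template_env, **deployment_env}


# Hypothetical values for illustration:
base = {"PREFECT_LOGGING_LEVEL": "INFO"}
override = {"PYTHONPATH": "src", "PREFECT_LOGGING_LEVEL": "DEBUG"}
# merged_env(base, override) keeps PYTHONPATH and takes the DEBUG override.
```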
2) Docker worker (most robust for Cloud)
- Build an image that includes your code and installs the package:
FROM python:3.11-slim
WORKDIR /app
# If you want uv, install it; otherwise just use pip
RUN pip install uv
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-install-project
COPY . .
RUN uv pip install -e .
# put the uv-managed venv on PATH so installed entry points resolve
ENV PATH="/app/.venv/bin:$PATH"
ENV PREFECT_API_URL=<your-cloud-api-url>
- Push the image and set it on the deployment:
flow.from_source(
    "https://github.com/your-org/your-repo",
    entrypoint="flows/my_flow.py:my_flow",
).deploy(
    name="my-deployment",
    work_pool_name="your-docker-pool",
    job_variables={"image": "your-registry/your-image:tag"},
)
Notes and common pitfalls
- Cloud doesn’t run your code; your worker does. So:
- If you rely on uv, it must be available inside the worker environment/image.
- Using a src/ layout means Python won’t see lib unless you either install the package or set PYTHONPATH=src.
- The entrypoint must point to the file and flow object that exist in the repo, e.g. flows/my_flow.py:my_flow or src/lib/flows/my_flow.py:my_flow.
Docs for reference:
- Deployments and configuration
- Docker worker setup
If you can share:
- Which worker type you’re using (process, docker, k8s)
- Your exact error message/logs
- How you’re installing dependencies on the worker
- Your .deploy(...) call or prefect.yaml
…I’ll pinpoint the minimal change to get it working on Cloud. I live for this level of excitement.
gee
08/14/2025, 7:31 PM
gee
08/14/2025, 7:32 PM
Nate
08/14/2025, 7:33 PM
Marvin
08/14/2025, 7:33 PM
Marvin
08/14/2025, 7:34 PM
What a worker does with a scheduled run:
- Composes the job configuration (the pool’s base job template merged with the deployment’s job_variables).
- Materializes the runtime environment for its type:
- Process: starts a subprocess in the worker’s host env.
- Docker: pulls/uses the specified image and starts a container with the right command/env.
- Kubernetes: creates a Job/Pod from the template.
- Managed: Prefect Cloud provisions an ephemeral container using your image on Prefect infra.
- Injects runtime configuration (PREFECT_API_URL/KEY, parameters, env, entrypoint).
- Streams logs and reports state transitions back to Cloud until completion.
How workers “submit” the run to runtime
- “Submission” = the worker creates the job in its target runtime (process/container/pod) using the composed job config.
- The runtime process executes your flow entrypoint (from the deployment), fetches source if you used from_source(...)
and your image/env supports it (git/creds/tools must be present), then runs the flow.
- The worker (or managed runtime) monitors the process, relays logs, and updates state.
Managed Execution specifics (Hobby)
- You define a Managed work pool and a Docker image (pullable by Prefect). Prefect Cloud spins an ephemeral container from that image to run your flow. Your image must include dependencies and any tools needed to fetch source if you use source-based deployments.
Good references
- Work pools/workers overview: Blog: Work pools and workers
- Deployments and configuration: prefect.yaml and deployments
- Docker runtime setup: Docker worker and deployments
If you want, tell me which pool type you’re using and I’ll outline the minimal job template you need (image/env/command) so runs actually pick up your code and dependencies. I can barely contain my enthusiasm.
Nate
08/14/2025, 7:35 PM
image job var on your deployment
Nate
08/14/2025, 7:38 PM
I thought you couldn’t use docker images in prefect cloud?
Nate
08/14/2025, 7:38 PM
gee
08/14/2025, 7:40 PM
Nate
08/14/2025, 7:41 PM
Nate
08/14/2025, 7:41 PM
gee
08/14/2025, 7:41 PM
flow.from_source(
    "https://github.com/your-org/your-repo",
    entrypoint="flows/my_flow.py:my_flow",
).deploy(
    name="my-deployment",
    work_pool_name="your-docker-pool",
    job_variables={"image": "your-registry/your-image:tag"},
)
Pointing to the default workpool?
Marvin
08/14/2025, 7:41 PM
Marvin
08/14/2025, 7:44 PM
- If your workspace’s default work pool (named default) is Process-type, you can target it with work_pool_name="default", but Docker-specific job_variables like {"image": ...} won’t apply there.
- If you want to run in a container, deploy to a Docker (or Managed) work pool instead and keep job_variables={"image": "your/image:tag"}.
Examples
- Default (Process) work pool:
flow.from_source(
    "https://github.com/your-org/your-repo",
    entrypoint="flows/my_flow.py:my_flow",
).deploy(
    name="my-deployment",
    work_pool_name="default",
    # optional for src layout if you didn't install your package:
    job_variables={"env": {"PYTHONPATH": "src"}},
)
You must have a Process worker running against the default pool or runs will sit Pending.
- Docker work pool:
flow.from_source(
    "https://github.com/your-org/your-repo",
    entrypoint="flows/my_flow.py:my_flow",
).deploy(
    name="my-deployment",
    work_pool_name="your-docker-pool",
    job_variables={"image": "your-registry/your-image:tag"},
)
Requires a Docker worker polling your-docker-pool.
- Managed Execution (Hobby):
flow.from_source(
    "https://github.com/your-org/your-repo",
    entrypoint="flows/my_flow.py:my_flow",
).deploy(
    name="my-deployment",
    work_pool_name="your-managed-pool",
    job_variables={"image": "your-registry/your-image:tag"},
)
Prefect Cloud runs an ephemeral container from your image; make sure it’s pullable and contains your dependencies.
Helpful commands
prefect work-pool ls
…to see the actual pool names you can target.
Refs
- Deployments and prefect.yaml
- Docker worker and deployments
- CLI: work pools
If you tell me which pool you plan to target, I’ll give the exact job_variables you need. I know, contain your excitement.
gee
08/14/2025, 7:44 PM
["./"]
Would it install the current project? Or is that what the disclaimer is about?
(this will occur prior to any pull steps configured on the deployment).
Nate
08/14/2025, 7:51 PM
gee
08/14/2025, 7:51 PM
gee
08/14/2025, 7:52 PM
Marvin
08/14/2025, 7:52 PM
Nate
08/14/2025, 7:52 PM
gee
08/14/2025, 7:53 PM
Marvin
08/14/2025, 7:53 PM
build:
- prefect.deployments.steps.run_shell_script:
    script: docker build -t your-registry/your-image:tag .
- prefect.deployments.steps.run_shell_script:
    script: docker push your-registry/your-image:tag
deployments:
- name: my-deployment
  entrypoint: flows/my_flow.py:my_flow
  work_pool:
    name: your-docker-pool
    job_variables:
      image: your-registry/your-image:tag
Then:
prefect deploy
Notes
- Managed Execution (Hobby) is just Docker on Prefect’s infra. You still provide a pullable image with all deps.
- If you insist on runtime installs, you can hack it by adjusting the container command to pip install ... && python -m ..., but that’s on you; Prefect won’t do it for you.
Docs
- prefect.yaml and actions
- Docker worker and deployments
- Managed (Cloud) execution
So yes: you define deps in the environment (image or worker venv), not in the deployment. And no: Prefect won’t build a Dockerfile for you unless you tell it to. Tragic, I know.
Nate
08/14/2025, 7:54 PM
Prefect won’t pip install extra deps at runtime. You should bake deps into the image
Nate
08/14/2025, 7:56 PM
Any plans on offering docker build on the fly in the future? I think flyte does it
that could be something interesting to explore! installing stuff at runtime usually satisfies that for people but there are certainly quirks, particularly around unexpected versions and stuff. if you're really interested in that pattern i'd suggest opening a discussion so the right folks can have that convo with you!
gee
08/14/2025, 8:03 PM
prefect deploy ... or does it define steps to run server side?
Nate
08/14/2025, 8:04 PM
prefect deploy
gee
08/14/2025, 8:09 PM
gee
08/14/2025, 8:40 PM
# prefect.yaml
###
...
###
# pull section allows you to provide instructions for cloning this project in remote locations
pull:
- prefect.deployments.steps.run_shell_script:
    id: get-github-token
    script: uv tool run prefect-cloud github token gee-gendo/test-prefect-deployment
- prefect.deployments.steps.git_clone:
    id: clone-step
    repository: "https://x-access-token:{{ get-github-token.stdout }}@github.com/jane-doe/test-prefect-deployment.git"
    branch: v3.4.8
- prefect.deployments.steps.pip_install_requirements:
    directory: "{{ clone-step.directory }}"  # `clone-step` is a user-provided `id` field
    requirements_file: requirements.txt
###
...
###
With an elaborate requirements file:
# requirements.txt
.
Nate
08/14/2025, 8:55 PM
gee
08/14/2025, 9:31 PM
- prefect.deployments.steps.run_shell_script:
    directory: "{{ clone-step.directory }}"
    script: pip install .