gee
Marvin  08/15/2025, 12:36 PM
Marvin  08/15/2025, 12:39 PM
Use Flow.from_source(...).deploy(...) to point at your repo (or local path) and Prefect will fetch the code at run time. Example:
from prefect.flows import Flow

flow = Flow.from_source(
    source="https://github.com/your-org/your-repo",
    entrypoint="flows/etl.py:my_flow",  # path to file + :flow function
)

flow.deploy(
    name="prod",
    work_pool_name="your-pool",
)
- For local code, use source=".".
- For remote code, use your repo URL. The worker will retrieve the source when it runs the deployment.
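As an aside, the entrypoint argument is just "path/to/file.py:flow_function"; a minimal pure-Python sketch of how that string decomposes (the helper name is illustrative, not a Prefect API):

```python
# Illustrative only: an "entrypoint" string of the form
# "path/to/file.py:flow_function" is a file path and a function
# name joined by a colon.
def split_entrypoint(entrypoint: str) -> tuple[str, str]:
    path, _, func = entrypoint.rpartition(":")
    return path, func

path, func = split_entrypoint("flows/etl.py:my_flow")
# path == "flows/etl.py", func == "my_flow"
```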
References
- Prefect 3 removed the old build/pull/push steps model; deployments are created directly from source: Prefect 3 release notes
- 2.x used pull steps in `prefect.yaml`: Prefect 2 deploy guide
- Flow.from_source API (for the curious): flows.py
If you’re on Prefect 2.x, pull steps are configured in prefect.yaml. If you’re on 3.x (the default assumption), use from_source. Which version are you on?
gee
08/15/2025, 12:42 PM
# prefect.yaml
###
...
###
# pull section allows you to provide instructions for cloning this project in remote locations
pull:
  - prefect.deployments.steps.run_shell_script:
      id: get-github-token
      script: uv tool run prefect-cloud github token gee-gendo/test-prefect-deployment
  - prefect.deployments.steps.git_clone:
      id: clone-step
      repository: "https://x-access-token:{{ get-github-token.stdout }}@github.com/jane-doe/test-prefect-deployment.git"
      branch: v3.4.8
  - prefect.deployments.steps.pip_install_requirements:
      directory: "{{ clone-step.directory }}"  # `clone-step` is a user-provided `id` field
      requirements_file: requirements.txt
###
...
###
Marvin  08/15/2025, 12:43 PM
Marvin  08/15/2025, 12:44 PM
Those pull steps run when prefect deploy reads your prefect.yaml. With the Python SDK, you specify the code source directly and let the worker fetch it.
How to mirror your YAML in Python
- Code checkout: point your deployment at the repo with Flow.from_source(...).
- Auth: include an access token in the URL (or use env vars/secrets to compose it).
- Dependencies: instead of pip_install_requirements at runtime, bake deps into the image and pass image=... to deploy. It’s far more reliable.
Example that mirrors your flow:
import os

from prefect.flows import Flow

# Option A (recommended): read a token from an env var/variable/secret you control
token = os.environ["GITHUB_TOKEN"]
repo = f"https://x-access-token:{token}@github.com/jane-doe/test-prefect-deployment.git"

# Load the flow from your repo; entrypoint = "path/to/file.py:flow_function_name"
flow = Flow.from_source(
    source=repo,
    entrypoint="flows/etl.py:my_flow",
)

# Pin dependencies by using a prebuilt image (no pip-at-runtime surprises)
flow.deploy(
    name="prod",
    work_pool_name="your-pool",
    image="ghcr.io/jane-doe/prefect:3.4.8",  # prebuilt image with your requirements
)
Notes
- Need to fetch a short-lived token like your prefect-cloud github token … step? You can still shell out before calling from_source and inject it into the URL, but I’d strongly recommend turning that into a variable/secret so you’re not scripting token fetches at deploy time.
- If you must “pin” a specific branch/tag/commit, use a URL that points at the ref (or a storage object that supports refs). If your ref needs more than a simple URL can express, keep using prefect deploy with your YAML for that deployment.
- The Python SDK path intentionally replaces pull/build/push “steps” with “declare the source + use an image” for reproducibility.
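If you do go the shell-out route, here is a hedged sketch of the pattern, mirroring the run_shell_script pull step: run the token command, then splice its stdout into the clone URL. The helper name is made up, and echo stands in for the real prefect-cloud command below.

```python
import subprocess


def authenticated_repo_url(repo: str, token_cmd: list[str]) -> str:
    """Run a token-fetch command and splice its stdout into the clone URL."""
    result = subprocess.run(token_cmd, capture_output=True, text=True, check=True)
    return f"https://x-access-token:{result.stdout.strip()}@{repo}"


# echo is a placeholder for your real token-fetch command
url = authenticated_repo_url(
    "github.com/jane-doe/test-prefect-deployment.git",
    ["echo", "gh_example_token"],
)
```

You would then pass the resulting URL as source= to Flow.from_source(...).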
References
- Prefect 3 changed the deployment model (no build/pull/push pipeline): Prefect 3 release notes
- Flow.from_source and Flow.deploy live here if you want to peek: flows.py
If you tell me which worker you’re using (process, docker, k8s) and whether you truly need to install requirements at runtime, I can tailor this to your setup. Or, yes, you can keep using the prefect.yaml pull steps with `prefect deploy`—they still work in 3.x. I won’t be offended.