Fugo Takefusa
09/24/2025, 1:29 AM

Marvin
09/24/2025, 1:34 AM
Each project has its own prefect.yaml, and there's no built-in include/import mechanism for sharing config across prefect.yaml files.
Options to reduce duplication:
- Within a single file: use YAML anchors/aliases to reuse fragments in that same prefect.yaml.
- Centralize infra in Prefect itself: put shared config into workspace resources (e.g., work pools and blocks) and reference them by name in each project's prefect.yaml. This keeps the YAML light, and the real "shared config" lives in Prefect.
- Use environment variables/templating: Prefect 3 supports Jinja templating in prefect.yaml, so you can inject env vars or computed values. Combine with profiles/`.env` files to standardize values across projects.
- Monorepo: keep a single prefect.yaml at the repo root and define multiple deployments there.
- Generate config: keep a shared template and render each project's prefect.yaml in CI (e.g., a small script or Cookiecutter) before running deploys.
- Define deployments in Python with shared helpers: create a small internal package that encapsulates your common defaults and call it from each repo.
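The anchors/aliases option can look like this inside a single prefect.yaml (a sketch; the deployment names and pool settings below are illustrative, not from your project):

```yaml
definitions:
  work_pools:
    prod_pool: &prod_pool
      name: prod-pool
      job_variables:
        image: ghcr.io/acme/app:latest

deployments:
  - name: flow-a
    entrypoint: flows/a.py:flow_a
    work_pool: *prod_pool      # alias reuses the anchored fragment
  - name: flow-b
    entrypoint: flows/b.py:flow_b
    work_pool: *prod_pool
```

Anchors only work within the one file, which is why they don't help across repos.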
Example of a shared Python helper for consistent deployments:
# common_deploy.py (in an internal package)
from prefect import flow

def deploy_standard(*, repo, entrypoint, name, work_pool="default", tags=None, parameters=None):
    return flow.from_source(
        source=repo,
        entrypoint=entrypoint,
    ).deploy(
        name=name,
        work_pool_name=work_pool,
        tags=tags or [],
        parameters=parameters or {},
    )

# In each repo:
# from common_deploy import deploy_standard
# deploy_standard(
#     repo="https://github.com/acme/my-service",
#     entrypoint="flows/my_flow.py:my_flow",
#     name="my-flow",
#     work_pool="acme-prod-pool",
#     tags=["prod"],
# )
Docs and references:
- prefect.yaml docs
- Settings and profiles
- Managing Prefect deployments with Terraform (if you want to manage deployments centrally via infra-as-code)
If you can share what you're trying to "define once" (e.g., work pool config, image tags, actions/steps, env vars) and whether you're in a mono- or multi-repo setup, I can suggest a concrete pattern.

Fugo Takefusa
09/24/2025, 1:39 AM
> Prefect 3 supports Jinja templating in prefect.yaml, so you can inject env vars or computed values.

Marvin
09/24/2025, 1:41 AM
Two kinds of template variables are available in prefect.yaml:
- env: environment variables — {{ env.MY_VAR }} to require a var, or {{ env.get('MY_VAR', 'default') }} for a default
- steps: outputs from earlier build/push actions (by id). Useful for using command outputs (e.g., git SHA) later in the file.
Common patterns and examples
1) Inject env vars into deployment fields
deployments:
  - name: "my-flow-{{ env.get('ENV', 'dev') }}"
    entrypoint: flows/my_flow.py:my_flow
    tags:
      - "{{ env.get('ENV', 'dev') }}"
      - "team-{{ env.get('TEAM', 'data') }}"
    work_pool:
      name: "{{ env.get('POOL', 'default') }}"
      job_variables:
        image: "ghcr.io/acme/my-image:{{ env.get('IMAGE_TAG', 'latest') }}"
        env:
          APP_ENV: "{{ env.get('ENV', 'dev') }}"
          LOG_LEVEL: "{{ env.get('LOG_LEVEL', 'INFO') }}"
2) Reuse outputs from earlier steps (e.g., git SHA) across the file
build:
  - id: git_sha
    run: git rev-parse --short HEAD
  - run: docker build -t ghcr.io/acme/app:{{ steps.git_sha.stdout }} .
push:
  - run: docker push ghcr.io/acme/app:{{ steps.git_sha.stdout }}
deployments:
  - name: "my-flow"
    entrypoint: flows/my_flow.py:my_flow
    version: "{{ steps.git_sha.stdout }}"
    work_pool:
      name: "prod-pool"
      job_variables:
        image: "ghcr.io/acme/app:{{ steps.git_sha.stdout }}"
3) Lists and maps can contain templated strings
deployments:
  - name: "batch-{{ env.get('ENV', 'dev') }}"
    tags:
      - "{{ env.get('ENV', 'dev') }}"
      - "{{ env.get('TEAM', 'data-eng') }}"
    job_variables:
      env:
        DATASET: "{{ env.get('DATASET', 'default') }}"
        REGION: "{{ env.get('REGION', 'us-east-1') }}"
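To see how these expressions resolve, here's a minimal sketch rendering one such string with jinja2 directly (assuming jinja2 is installed; Prefect's own rendering layer adds more context, but `env.get(...)` is just a dict lookup with a fallback):

```python
import os
from jinja2 import Template

# Simulate the deploy-time environment (hypothetical values)
os.environ["ENV"] = "prod"
os.environ.pop("REGION", None)  # REGION unset, so the default kicks in

tmpl = Template("batch-{{ env.get('ENV', 'dev') }}-{{ env.get('REGION', 'us-east-1') }}")
print(tmpl.render(env=os.environ))  # prints "batch-prod-us-east-1"
```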
Notes and limitations
- Rendering time: Templates are evaluated when you run Prefect commands that read the file (e.g., prefect deploy). They are not re-evaluated at flow-run time. If you want values to vary per run, pass them via parameters, or set env in the worker/job environment.
- Data types: Jinja expressions are embedded in YAML strings, which Prefect then parses. Simple values work best; avoid trying to build complex Python objects in templates.
- Secrets: Be careful injecting secrets via env at deploy time; they’ll be rendered on your machine. Prefer storing credentials in Prefect Blocks or use job/infra env provided to the worker runtime.
- Where env vars come from: Prefect reads your process environment. If you keep values in a .env file, make sure they’re loaded into the shell before running prefect deploy (e.g., direnv, dotenv, or your CI’s env configuration).
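For the .env point above, a minimal shell sketch (the file contents and variable names are made up for the example):

```shell
# Create a demo .env (in practice this file already exists in your repo)
printf 'IMAGE_TAG=abc123\nENV=prod\n' > .env

set -a        # auto-export every variable defined while this flag is on
. ./.env      # source the file; IMAGE_TAG and ENV become exported
set +a

# These are now visible to child processes such as `prefect deploy`
echo "$IMAGE_TAG-$ENV"   # prints "abc123-prod"
```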
Optional: Prefect Variables
- Prefect Variables are workspace-scoped values you can manage via the CLI or UI and read inside flows. If you want to centralize some values there:
  - Set a variable:
    prefect variable set APP_ENV prod
  - Get it later in code:
    from prefect.variables import Variable
    env = Variable.get("APP_ENV")
- Whether Variables can be referenced directly in templated prefect.yaml depends on your workflow and security posture; many teams prefer pulling them at runtime inside the flow or injecting them as job env via your worker configuration.
Docs
- Prefect YAML: prefect.yaml docs
- Customizing job variables (e.g., image, env): Customize job variables
- Settings and profiles (where Prefect reads env/config): Settings and profiles