Manoj Ravi
09/25/2025, 3:03 PMMarvin
09/25/2025, 3:03 PMMarvin
09/25/2025, 3:09 PMOptional vs. required comes from the flow signature: a parameter with a default is Optional (e.g., param: str | None = None or param: int = 5). Parameters without defaults are Required.
- You generally should not edit parameter_openapi_schema
yourself; Prefect computes and stores it per deployment from the flow’s signature. Each deployment gets its own schema.
- Per-deployment default values belong in the deployment’s parameters
field (in prefect.yaml
or in .deploy(...)
). This does not change required vs optional; it just sets the defaults used when launching from that deployment.
Clean pattern to define optional params
1) Define optionality in the flow function signature
from typing import Optional
from prefect import flow, get_run_logger
@flow
def etl(required_date: str, lookback_days: Optional[int] = None, dry_run: bool = False):
logger = get_run_logger()
logger.info(f"{required_date=}, {lookback_days=}, {dry_run=}")
- required_date
is Required (no default)
- lookback_days
and dry_run
are Optional (have defaults). Using Optional[...] = None
is recommended when “no value” is meaningful.
2) Give each deployment its own entrypoint and parameters
- With Python API
from prefect import flow
flow.from_source(
"<https://github.com/org/repo>",
"flows/etl.py:etl",
).deploy(
name="etl-daily",
work_pool_name="k8s",
parameters={"dry_run": False, "lookback_days": 1}, # per-deployment defaults
)
flow.from_source(
"<https://github.com/org/repo>",
"flows/notify.py:send_notifications",
).deploy(
name="notify-critical",
work_pool_name="k8s",
parameters={"severity": "critical"},
)
- With prefect.yaml and CLI
# prefect.yaml
name: my-project
prefect-version: ">=3.0.0"
deployments:
- name: etl-daily
entrypoint: flows/etl.py:etl
work_pool:
name: k8s
parameters:
dry_run: false
lookback_days: 1
- name: notify-critical
entrypoint: flows/notify.py:send_notifications
work_pool:
name: k8s
parameters:
severity: "critical"
Then run:
prefect deploy
Docs: prefect.yaml guide and Deployments concepts
About parameter_openapi_schema showing up “for all deployments”
- Each deployment stores its own parameter_openapi_schema
derived from the specific flow entrypoint’s signature at deployment time. It should not “bleed” to other deployments unless:
- The deployments in your prefect.yaml
all point to the same entrypoint, or
- You are programmatically updating parameter_openapi_schema
on multiple deployments, or
- You have placed parameter settings at the wrong scope (e.g., top-level instead of per-deployment).
- The clean approach is to:
- Ensure each deployment entry references the correct entrypoint
for that flow.
- Do not manually set `parameter_openapi_schema`; let Prefect generate it.
- If a deployment’s schema looks stale, re-deploy that specific deployment so it’s recomputed from the flow signature.
A few tips
- Required vs Optional is controlled by the flow signature only. The parameters
you supply on a deployment just set defaults for that deployment’s runs; they do not make a required param optional.
- Keep enforce_parameter_schema=True
(the default) so type validation and UI rendering stay consistent with your annotations and defaults.
Can you share:
- Your Prefect version (assuming 3.x)?
- How you’re creating deployments (CLI with prefect.yaml
or Python via .deploy
)?
- A snippet of your prefect.yaml
and one flow’s signature?
- Where/how you’re seeing parameter_openapi_schema
updated “for all deployments”?
With that, I can spot the scoping issue and suggest the exact change.Manoj Ravi
09/25/2025, 3:20 PMMarvin
09/25/2025, 3:21 PMMarvin
09/25/2025, 3:24 PMEach deployment stores its own parameter_openapi_schema, generated from that deployment’s flow entrypoint at deploy time.
- Changing parameters for one deployment should not change others’ schemas.
If all deployments now have the same schema, it typically means one of:
- A manual update (via UI or API) wrote the same schema to multiple deployment records.
- Multiple deployments actually point to the same entrypoint (so they legitimately share a schema).
- A script or CI step updated deployments in bulk.
How to diagnose quickly
- Confirm entrypoints and current schemas:
prefect deployment ls
prefect deployment inspect "flow-name/deployment-name"
This will show entrypoint
and the current parameter_openapi_schema
.
Docs: Deployments and CLI help (inspect) output shows the schema.
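If you'd rather check this programmatically, here is a minimal sketch (assuming Prefect 3.x; the "flow-name/deployment-name" string is a placeholder) that reads one deployment with the Python client and prints its entrypoint and stored schema:
import asyncio
import json
from prefect.client.orchestration import get_client

async def show_deployment_schema(name: str = "flow-name/deployment-name"):
    # name is "<flow name>/<deployment name>", the same identifier the CLI inspect command uses
    async with get_client() as client:
        deployment = await client.read_deployment_by_name(name)
        print("entrypoint:", deployment.entrypoint)
        # parameter_openapi_schema is the schema the UI renders for this deployment
        print(json.dumps(deployment.parameter_openapi_schema, indent=2, default=str))

asyncio.run(show_deployment_schema())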
How to fix deployments that have the wrong schema
Pick one of these:
1) Reset to auto-generated in the UI (if you used the form builder)
- In the UI, open the problem deployment -> Run form/Parameters -> switch back to “auto-generated from flow signature” (or equivalent reset), then Save.
- Docs: Form building in Prefect 3
2) Re-deploy each affected deployment so Prefect recomputes the schema from its flow signature
- If you deploy via Python:
from prefect import flow
flow.from_source("<https://github.com/org/repo>", "flows/etl.py:etl").deploy(
name="etl-daily",
work_pool_name="k8s",
parameters={"dry_run": False, "lookback_days": 1},
enforce_parameter_schema=True, # ensure schema is (re)generated
)
- If you deploy with prefect.yaml:
# prefect.yaml
deployments:
- name: etl-daily
entrypoint: flows/etl.py:etl
work_pool:
name: k8s
parameters:
dry_run: false
lookback_days: 1
Then run:
prefect deploy
Docs: prefect.yaml guide
3) Delete and recreate only the incorrect deployments
prefect deployment inspect "flow-a/deploy-1"
prefect deployment delete "flow-a/deploy-1"
prefect deploy # or your .deploy() calls
Preventing a repeat
- Do not manually set parameter_openapi_schema
via API unless you intend a custom form. Let Prefect compute it.
- Ensure each deployment has the correct unique entrypoint
for its flow in prefect.yaml
or your .deploy(...)
calls.
- Keep enforce_parameter_schema=True
(default) during deploys so the schema matches your flow signature.
- Mark parameters optional in your flow signature to control the UI’s Optional tab (e.g., x: int | None = None
or y: str = "default"
). The deployment parameters
field sets per-deployment default values but does not change required vs. optional.
Open questions to help me pinpoint the root cause
- Which Prefect version and environment (Cloud vs. self-hosted)?
- How did you “update” the schema for the one flow — via the UI form builder, API, or code?
- Are all your deployments pointing to unique entrypoints? If possible, paste one or two prefect deployment inspect
outputs showing entrypoint
and the unexpected parameter_openapi_schema
.
If you can share one affected deployment name and how you’re deploying (prefect.yaml vs. Python), I’ll give you exact commands/steps to reset only those deployments.Manoj Ravi
09/25/2025, 3:29 PMMarvin
09/25/2025, 3:30 PMMarvin
09/25/2025, 3:32 PMSo you are seeing this same parameter_openapi_schema on every deployment:
{"type":"object","title":"Parameters","properties":{"edix_custom_config":{"title":"edix_custom_config","default":{},"position":0}},"definitions":{}}
- That schema only defines one property (edix_custom_config
) with a default {}
and no other parameters.
Why that’s a problem
- That JSON is a custom run-form schema. It’s not the auto-generated schema from your flow signature. If Prefect generated the schema from your example flow, it would include:
- tenant_id
(required, no default)
- edix_custom_config
(required if no default, optional if you add a default)
- config
(optional, because Optional[...] = None
)
- The presence of a position
field and a default {}
strongly suggests a custom form was saved somewhere (UI or API) and then applied to multiple deployments. Prefect does not normally propagate a deployment’s schema to other deployments.
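For comparison, here is a minimal sketch of what the auto-generated schema would contain. Run it wherever the flow code is importable (it assumes Prefect 3.x, and the signature below stands in for your real flow):
import json
from typing import Any, Mapping, Optional
from prefect import flow
from prefect.utilities.callables import parameter_schema

@flow
def forecaster_evaluation_flow_run(
    tenant_id: str,                              # required: no default
    edix_custom_config: dict,                    # required: no default
    config: Optional[Mapping[str, Any]] = None,  # optional: has a default
):
    ...

# Same signature inspection that deploy-time schema generation performs
schema = parameter_schema(forecaster_evaluation_flow_run.fn)
# .model_dump() on recent (pydantic v2) releases; older releases expose .dict()
print(json.dumps(schema.model_dump(), indent=2, default=str))
You should see tenant_id and edix_custom_config listed as required and config carrying a null default, which is quite different from the one-property schema above.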
Likely causes
- A script or migration updated parameter_openapi_schema
for many deployments.
- The wrong scope was used when updating via API (e.g., iterating deployments and setting the same schema).
- All deployments share a single edited schema object in code and you’re writing it repeatedly.
- Less likely: all deployments legitimately point to the same entrypoint and you customized the form for that deployment, then redeployed in a way that overwrote others. Still, you’d normally see the full param list even then.
How to fix it
Pick one path and verify with inspect afterward.
1) Reset in the UI (if you used the form builder)
- Open each deployment in the UI, go to the Run form/Parameters section, and reset to “auto-generated from flow signature” (wording may vary). Save.
- Docs: Form building in Prefect 3
2) Re-deploy each deployment so Prefect recomputes the schema from the flow signature
- Python API example:
from prefect import flow
flow.from_source(
"<https://github.com/org/repo>",
"flows/forecaster.py:forecaster_evaluation_flow_run",
).deploy(
name="forecaster-eval",
work_pool_name="k8s",
enforce_parameter_schema=True, # recompute and store schema
)
- Or with `prefect.yaml`:
# prefect.yaml
deployments:
- name: forecaster-eval
entrypoint: flows/forecaster.py:forecaster_evaluation_flow_run
work_pool:
name: k8s
Then run:
prefect deploy
Docs: prefect.yaml guide and Deployments concepts
3) Delete and recreate the affected deployments (only if you prefer a clean slate)
prefect deployment inspect "Forecaster Evaluation Flow/forecaster-eval"
prefect deployment delete "Forecaster Evaluation Flow/forecaster-eval"
prefect deploy # or run your .deploy() code again
Make the flow’s optionality explicit
If edix_custom_config
should be optional, avoid using {}
as a default (mutable defaults). Use None
and handle it inside the flow; also prefer standard typing so the schema generator can infer a good JSON schema:
from typing import Optional, Mapping, Any
from prefect import flow
@flow(name="Forecaster Evaluation Flow", flow_run_name="Forecaster Evaluation Run")
def forecaster_evaluation_flow_run(
tenant_id: str,
edix_custom_config: Optional[Mapping[str, Any]] = None,
config: Optional[Mapping[str, Any]] = None, # consider avoiding DictConfig in the signature
):
...
- Using DictConfig
in the type annotation can confuse schema generation. If you need it, annotate as Mapping[str, Any] | None
and convert internally.
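If you still want DictConfig behavior inside the flow, a minimal sketch (assuming omegaconf is installed; names are illustrative) is to accept a plain mapping and convert it in the body:
from typing import Any, Mapping, Optional
from omegaconf import OmegaConf
from prefect import flow

@flow(name="Forecaster Evaluation Flow")
def forecaster_evaluation_flow_run(
    tenant_id: str,
    edix_custom_config: Optional[Mapping[str, Any]] = None,
    config: Optional[Mapping[str, Any]] = None,
):
    # Convert the plain mapping to a DictConfig here; the schema generator
    # only ever sees the standard Mapping annotation above.
    cfg = OmegaConf.create(dict(config or {}))
    ...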
Verify after the fix
- Before and after, run:
prefect deployment inspect "Forecaster Evaluation Flow/forecaster-eval"
- You should see parameter_openapi_schema
include all three parameters with the right required/optional status.Marvin
09/25/2025, 3:32 PMparameter_openapi_schema
)?
- Are all your deployments created from a script/CI that could be reusing a single hard-coded schema?
If you share how you create deployments (prefect.yaml vs Python) and one prefect deployment inspect
output for an affected deployment (showing its entrypoint
and parameter_openapi_schema
), I can give you exact, minimal steps to reset only those deployments.Manoj Ravi
09/25/2025, 3:40 PMMarvin
09/25/2025, 3:41 PMMarvin
09/25/2025, 3:42 PMPrefect only computes parameter_openapi_schema at deployment time (or when you explicitly edit the run form).
Key points
- Don’t write to the Prefect DB directly for deployments. It bypasses the logic that builds per-deployment schemas from the flow signature.
- Your edited schema includes a default {}
for edix_custom_config
, which makes it optional in the UI even though your flow signature makes it required. If a parameter should be required, do not give it a default in the schema or in the flow signature.
How to fix (clean and reliable)
1) Re-deploy each deployment so Prefect recomputes the schema from the flow’s signature
- If you manage deployments with prefect.yaml
, run this in each project’s repo:
prefect deploy
Docs: prefect.yaml guide
- If you deploy via Python:
from prefect import flow
flow.from_source(
"<https://github.com/org/repo>",
"flows/forecaster.py:forecaster_evaluation_flow_run",
).deploy(
name="forecaster-eval",
work_pool_name="k8s",
enforce_parameter_schema=True, # recompute and store per-deployment schema
)
2) Or, reset per-deployment in the UI
- Open each deployment -> Run form/Parameters -> switch back to auto-generated from flow signature (remove the custom JSON), then Save.
Docs: Advanced form building
3) Verify after the fix
prefect deployment inspect "Forecaster Evaluation Flow/forecaster-eval"
You should see a schema that includes:
- tenant_id
as required (no default)
- edix_custom_config
as required (no default)
- config
as optional (since it’s Optional[...] = None
)
Prevent this going forward
- Avoid updating deployment.parameter_openapi_schema
via SQL. If you need to change the run form, do it in the UI (per deployment) or re-deploy.
- Keep parameters required/optional strictly via the flow signature:
- Required: no default (e.g., edix_custom_config: dict
)
- Optional: give a default (e.g., config: Mapping[str, Any] | None = None
)
- If you previously used a default {}
to force a “mandatory” field, remove it. A default means optional. For required, there must be no default.
If you share how you usually create deployments (prefect.yaml vs Python) and one prefect deployment inspect
output for an affected deployment, I can give you the exact minimal set of steps to fix just those deployments.Manoj Ravi
09/25/2025, 3:44 PMMarvin
09/25/2025, 3:44 PMMarvin
09/25/2025, 3:45 PMPrefect will not regenerate parameter_openapi_schema when you change the DB directly. Whatever is in that column is what the UI uses. Setting it to an empty object won’t trigger regeneration; it just gives you a blank/custom form.
Ways to properly reset the schema so it’s generated per deployment from the flow signature
- Easiest: re-deploy the affected deployments so Prefect recomputes the schema from the entrypoint’s function signature.
- If you use prefect.yaml, run this in each project repo:
prefect deploy
Docs: prefect.yaml guide
- If you deploy via Python:
flow.from_source("<https://github.com/org/repo>", "flows/forecaster.py:forecaster_evaluation_flow_run").deploy(
name="forecaster-eval",
work_pool_name="k8s",
enforce_parameter_schema=True,
)
- UI-based per-deployment reset (no code changes): open each deployment in the UI → Run form/Parameters → switch back to “auto-generated from flow signature” (remove custom JSON) → Save.
Docs: Form building
- If you need a bulk fix without redeploying code: we can script it against the API (see the sketch below)
- For each deployment, compute the schema from the flow entrypoint (using Prefect’s schema generator) and update the deployment via the API.
- The generator you want is prefect.utilities.callables.parameter_schema_from_entrypoint(entrypoint).
- Then update that deployment’s parameter_openapi_schema via the Prefect API. If you want, I can help draft this script once you confirm where your deployments live (single repo vs many) and whether async Python is okay in your environment.
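A rough sketch of that bulk reset (hedged: it assumes Prefect 3.x, that each deployment's entrypoint path resolves on the machine running the script, and that your server version accepts parameter_openapi_schema in a deployment PATCH; verify against a single deployment before running it broadly):
import asyncio
from prefect.client.orchestration import get_client
from prefect.utilities.callables import parameter_schema_from_entrypoint

async def reset_schemas(dry_run: bool = True):
    async with get_client() as client:
        deployments = await client.read_deployments()  # add filters/limits as needed
        for d in deployments:
            if not d.entrypoint:
                continue
            # Recompute the schema from the entrypoint, as deploy-time inference would
            schema = parameter_schema_from_entrypoint(d.entrypoint)
            body = schema.model_dump()  # .dict() on older pydantic v1 releases
            print(f"{d.name}: {d.entrypoint} -> {sorted(body.get('properties', {}))}")
            if not dry_run:
                # Hypothetical update call via the REST API; confirm your server
                # accepts this field on PATCH /deployments/{id} before flipping dry_run
                resp = await client._client.patch(
                    f"/deployments/{d.id}", json={"parameter_openapi_schema": body}
                )
                resp.raise_for_status()

asyncio.run(reset_schemas(dry_run=True))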
What you should see after the reset
- For your example flow:
- tenant_id
— required (no default)
- edix_custom_config
— required (no default). Do not add a default in the schema or in the flow if you want it required.
- config
— optional (Optional[...] = None
)
- Verify with:
prefect deployment inspect "Forecaster Evaluation Flow/forecaster-eval"
This prints the current parameter_openapi_schema
.
Next steps/questions so I can give you the shortest path
- Is re-running prefect deploy
feasible in your setup? If yes, that’s the cleanest fix.
- Roughly how many deployments need resetting? If it’s a lot, I’ll share a small script to iterate deployments, compute parameter_openapi_schema
from their entrypoints, and update them via the API.Manoj Ravi
09/25/2025, 3:48 PMMarvin
09/25/2025, 3:48 PMMarvin
09/25/2025, 3:49 PMGiven this flow signature:
@flow(name="Forecaster Evaluation Flow", flow_run_name="Forecaster Evaluation Run")
def forecaster_evaluation_flow_run(
tenant_id: str,
edix_custom_config: dict,
config: Optional[DictConfig] = None,
) -> pd.DataFrame:
...
Here’s a clean prefect.yaml
that will produce a deployment with an auto-generated schema:
name: forecasting-project
prefect-version: ">=3.0.0"
# Optional: project-level defaults
# build: false
# push: false
deployments:
- name: forecaster-eval
entrypoint: flows/forecaster.py:forecaster_evaluation_flow_run
description: "Run the forecaster evaluation flow"
work_pool:
name: k8s # or your pool
# job_variables: {} # if you need them
# parameters here are per-deployment defaults; they DO NOT change required/optional
# Leave out required parameters so the UI shows them as Required
# parameters:
# config: null # optional param; you can omit entirely if you want it blank
tags:
- forecasting
# Optional schedule examples (pick one or omit)
# schedules:
# - interval: 86400 # seconds
# - cron: "0 6 * * *"
What this does
- On prefect deploy
, Prefect reads your flow signature from entrypoint
and writes a per-deployment parameter_openapi_schema
.
- With your signature:
- tenant_id
will be Required (no default)
- edix_custom_config
will be Required (no default)
- config
will be Optional (has a default of None
)
- Since you’re resetting the DB schema to empty, running prefect deploy
is what triggers regeneration.
Commands to run
# From the project root containing prefect.yaml and the flows/forecaster.py file
prefect deploy
# Verify the regenerated schema
prefect deployment inspect "Forecaster Evaluation Flow/forecaster-eval"
Notes and tips
- Do not put a default for edix_custom_config
in the flow signature or in the custom form; if you add a default in the schema, the UI will treat it as optional.
- The parameters
section in prefect.yaml
just pre-fills defaults for runs launched from that deployment. It does not change required vs. optional.
- The schema generator may not fully understand DictConfig
. If the UI form for config
looks odd, consider annotating it as a standard mapping type and converting internally:
from typing import Optional, Mapping, Any
config: Optional[Mapping[str, Any]] = None
Docs
- Prefect YAML: prefect.yaml guide
- Deployments overview: Deployments concepts
- Form building (if you later want to customize per-deployment forms): Advanced form building
If you share your real entrypoint path (file:func) and work pool name, I can tailor the exact prefect.yaml
for your repo layout.Manoj Ravi
09/25/2025, 3:54 PMMarvin
09/25/2025, 3:54 PMMarvin
09/25/2025, 3:55 PMIf edix_custom_config should be optional, make that clear in the flow signature and avoid a mutable default.
Recommended flow signature
from typing import Optional, Mapping, Any
from prefect import flow
@flow(name="Forecaster Evaluation Flow", flow_run_name="Forecaster Evaluation Run")
def forecaster_evaluation_flow_run(
tenant_id: str,
edix_custom_config: Optional[Mapping[str, Any]] = None, # optional
config: Optional[Mapping[str, Any]] = None, # optional
):
...
Notes:
- Using = None
makes it optional in Prefect’s schema and shows it under the Optional tab in the UI.
- Avoid edix_custom_config: dict = {}
(mutable defaults).
- If you must use DictConfig
, still annotate as Mapping[str, Any] | None
for schema generation, and convert to DictConfig
inside the flow.
prefect.yaml for this flow
name: forecasting-project
prefect-version: ">=3.0.0"
deployments:
- name: forecaster-eval
entrypoint: flows/forecaster.py:forecaster_evaluation_flow_run # update to your real path
description: "Run the forecaster evaluation flow"
work_pool:
name: k8s # update to your real pool
# parameters: {} # omit; let the UI render from the flow signature
tags:
- forecasting
# schedules: # optional; add if needed
# - cron: "0 6 * * *"
After you run your SQL UPDATE (clearing schemas)
- Re-deploy so Prefect regenerates each deployment’s parameter_openapi_schema
from the flow signature:
prefect deploy
- Verify the regenerated schema includes tenant_id
as Required and both edix_custom_config
and config
under Optional:
prefect deployment inspect "Forecaster Evaluation Flow/forecaster-eval"
If you prefer not to re-deploy code
- You can reset per-deployment in the UI: open the deployment → Run form/Parameters → restore to auto-generated from flow signature → Save.
- Or we can script a bulk reset via the API by computing the schema from each deployment’s entrypoint and updating parameter_openapi_schema
for each. I can share that script if you want.
Docs for reference:
- prefect.yaml guide
- Deployments concepts
- Form building and custom run forms
If you share your actual entrypoint path and work pool name, I’ll tailor the YAML exactly for your repo.Manoj Ravi
09/25/2025, 3:57 PMMarvin
09/25/2025, 3:57 PMMarvin
09/25/2025, 3:58 PMtenant_id
required in the flow signature and set it per deployment via parameters. Mark edix_custom_config
optional in the signature so it shows under Optional in the UI.
Flow (make edix_custom_config optional with a safe default)
from typing import Optional, Mapping, Any
from prefect import flow
@flow(name="Forecaster Evaluation Flow", flow_run_name="Forecaster Evaluation Run")
def forecaster_evaluation_flow_run(
tenant_id: str, # required
edix_custom_config: Optional[Mapping[str, Any]] = None, # optional
config: Optional[Mapping[str, Any]] = None, # optional
):
...
prefect.yaml with per-tenant deployments
name: forecasting-project
prefect-version: ">=3.0.0"
deployments:
- name: forecaster-eval-tenant-a
entrypoint: flows/forecaster.py:forecaster_evaluation_flow_run # update path
description: "Forecaster evaluation for tenant A"
work_pool:
name: k8s # update to your pool
parameters:
tenant_id: "tenant-a" # pre-filled per deployment
# edix_custom_config: null # optional; omit or set to null
# config: null # optional; omit or set to null
tags: [forecasting, tenant-a]
- name: forecaster-eval-tenant-b
entrypoint: flows/forecaster.py:forecaster_evaluation_flow_run
description: "Forecaster evaluation for tenant B"
work_pool:
name: k8s
parameters:
tenant_id: "tenant-b"
tags: [forecasting, tenant-b]
What you’ll see in the UI (Custom Run) after you reset the DB column and redeploy
- tenant_id
remains Required (because the flow signature has no default). The deployment pre-fills it with the value you set in parameters
, but it stays under Required.
- edix_custom_config
and config
appear under Optional (because they have = None
).
Steps to regenerate schemas after your SQL reset
1) Run deploy in the repo with the prefect.yaml above:
prefect deploy
2) Verify the schema is correct:
prefect deployment inspect "Forecaster Evaluation Flow/forecaster-eval-tenant-a"
Notes
- Avoid mutable defaults like {}
for edix_custom_config
. Use None
and handle it inside the flow.
- The deployment parameters
section sets per-deployment defaults but does not change required vs optional; only the flow signature controls that.
- If you previously used the UI form builder for a deployment, make sure it’s set to auto-generate from the flow signature (per deployment) rather than a custom JSON form if you want this behavior.
Refs:
- prefect.yaml guide
- Deployments concepts
- Custom run forms and schemaManoj Ravi
09/25/2025, 4:03 PMMarvin
09/25/2025, 4:03 PMMarvin
09/25/2025, 4:04 PMThe deployment is still using the parameter_openapi_schema stored from your DB edits (or an older custom form). The UI only renders what’s in that schema; it won’t regenerate it automatically.
What to do
1) Verify the schema on an affected deployment
prefect deployment inspect "General Forecast Flow Run/<your-deployment-name>"
If you see an empty/custom schema or anything not matching your flow signature, that’s the reason the UI isn’t showing the Optional params.
2) Reset the run form schema per deployment
Pick one:
- UI: Deployment -> Run form/Parameters -> switch to “auto-generated from flow signature” -> Save.
- Or re-deploy so Prefect recomputes from the flow signature.
Your flow (keep optional params with None defaults)
from typing import Optional, Mapping, Any
from prefect import flow
@flow(
name="General Forecast Flow Run",
retries=3,
retry_delay_seconds=10,
)
def general_forecast_flow_run(
tenant_id: str, # required
resolution_type: str, # required
forecast_type: str, # required
parent_run_info: Optional[Mapping[str, Any]] = None, # optional
override_config: Optional[Mapping[str, Any]] = None, # optional
edix_prefect_config: Optional[Mapping[str, Any]] = None, # optional
) -> dict:
...
Notes:
- Avoid mutable defaults like `{}`; use None
.
- Using Mapping[str, Any] | None
helps schema generation more than custom types.
Example prefect.yaml for per-tenant deployments
name: forecasting-project
prefect-version: ">=3.0.0"
deployments:
- name: general-forecast-tenant-a
entrypoint: flows/general_forecast.py:general_forecast_flow_run # update path
description: "General forecast for tenant A"
work_pool:
name: k8s # your pool
parameters:
tenant_id: "tenant-a"
resolution_type: "daily"
forecast_type: "baseline"
tags: [forecasting, tenant-a]
- name: general-forecast-tenant-b
entrypoint: flows/general_forecast.py:general_forecast_flow_run
description: "General forecast for tenant B"
work_pool:
name: k8s
parameters:
tenant_id: "tenant-b"
resolution_type: "daily"
forecast_type: "baseline"
tags: [forecasting, tenant-b]
Then run:
prefect deploy
What you should see after redeploy/reset
- Required tab: tenant_id
, resolution_type
, forecast_type
(because no defaults)
- Optional tab: parent_run_info
, override_config
, edix_prefect_config
(because they default to None)
If you prefer bulk fixing without redeploying code, we can script against the API to compute and write the correct parameter_openapi_schema
for each deployment from its entrypoint. If that’s preferable, tell me roughly how many deployments and how you want to target them (by tag, name pattern, etc.), and I’ll share a targeted script.Manoj Ravi
09/25/2025, 4:10 PMMarvin
09/25/2025, 4:10 PMMarvin
09/25/2025, 4:10 PMEither annotation works for an optional mapping parameter:
- edix_prefect_config: dict | None = None
- edix_prefect_config: dict[str, Any] | None = None
- edix_prefect_config: Mapping[str, Any] | None = None
- Mapping is just a bit more flexible as an input type (accepts any mapping-like object, including things like OmegaConf’s DictConfig), while not committing your API to a concrete dict. But semantically for Prefect parameters, both map to JSON objects and work the same in practice.
What actually controls “Optional” in the UI
- Optional vs Required is determined only by the flow signature’s defaults, not by choosing dict vs Mapping.
- Required: no default (e.g., tenant_id: str
)
- Optional: has a default (e.g., edix_prefect_config: dict | None = None
)
- If your UI is not showing your optional params under Optional, it’s because the deployment’s stored parameter_openapi_schema
is stale or custom — not because you used dict.
Safety tip
- Avoid mutable defaults like `{}`:
# bad: mutable default
edix_prefect_config: dict = {}
# good: optional with safe default
edix_prefect_config: dict | None = None
- If you’re using DictConfig
in your code, prefer annotating as Mapping[str, Any] | None
(or dict[str, Any] | None
) and convert internally — Prefect’s schema generator understands standard types better than custom ones.
If you reset the DB schema and re-deploy, a parameter declared as:
edix_prefect_config: dict | None = None
will appear under Optional in the Custom Run form.
References:
- Form building and schemas: Form building
- Deployments and prefect.yaml: prefect.yaml guideManoj Ravi
09/25/2025, 4:15 PMMarvin
09/25/2025, 4:15 PMMarvin
09/25/2025, 4:16 PMThe cleanest DB-level reset is to set parameter_openapi_schema to NULL. Then re-deploy (or reset per-deployment in the UI) so Prefect regenerates it from the flow signature.
Postgres-safe snippets (pick one)
- Reset ALL deployments
UPDATE deployment
SET parameter_openapi_schema = NULL;
- Reset only rows that currently contain the edix_custom_config-only schema (broad match)
UPDATE deployment
SET parameter_openapi_schema = NULL
WHERE parameter_openapi_schema::text LIKE '%"edix_custom_config"%';
- Reset only rows that exactly match the known bad JSON (exact match with dollar-quoting)
UPDATE deployment
SET parameter_openapi_schema = NULL
WHERE parameter_openapi_schema::text = $json$
{"type":"object","title":"Parameters","properties":{"edix_custom_config":{"title":"edix_custom_config","default":{},"position":0}},"definitions":{}}
$json$;
- Reset for a single deployment by name (plus flow name)
UPDATE deployment d
SET parameter_openapi_schema = NULL
FROM flow f
WHERE d.flow_id = f.id
AND f.name = 'Forecaster Evaluation Flow'
AND d.name = 'forecaster-eval-tenant-a';
Recommended workflow
1) Run one of the UPDATEs above (wrap in a transaction if you like).
2) Re-deploy the affected deployments so Prefect recomputes the schema from the flow signature:
prefect deploy
3) Verify:
prefect deployment inspect "Your Flow Name/your-deployment-name"
Note
- Leaving the column as an empty JSON object will not trigger regeneration; setting it to NULL is the clean reset.
- Optional vs required is still controlled by your flow signature defaults. After redeploy, your Optional params (with = None
) will show under the Optional tab.Manoj Ravi
09/30/2025, 10:40 AMMarvin
09/30/2025, 10:40 AMMarvin
09/30/2025, 10:44 AMIf your prefect.yaml lives in a different repo than the code, you need to ensure the code is importable at deploy time.
How it works in Prefect 3.x
- The parameter schema that appears in the UI is captured when you create/update the deployment (e.g., via prefect deploy
or flow.deploy(...)
).
- Prefect infers parameters by importing the flow function and inspecting its signature. If the flow cannot be imported at deploy time, Prefect cannot infer parameters, and the UI won’t show a structured parameter form.
What this means for your setup
- If repo A has prefect.yaml
and repo B has the flow code:
- If you run prefect deploy
from repo A without a way to import the flow in repo B, Prefect will not infer parameters.
- You have two reliable options to make parameter inference work:
1) Run a Python-based deploy that references the code source and can import the flow:
- Run this in an environment that can fetch/clone or otherwise import the code (and has access tokens for private repos).
- Example:
from prefect.flows import Flow
flow = Flow.from_source(
source="<https://github.com/<owner>/<repo>>",
entrypoint="path/to/flows.py:my_flow",
)
flow.deploy(
name="prod",
work_pool_name="my-pool",
)
This approach will only capture parameters if that script is able to resolve/import my_flow
. If the environment cannot import it, parameters won’t be inferred.
2) Use prefect deploy
but make the code importable first:
- In CI or locally, clone repo B (or install the project as a package), then run prefect deploy
from a project that points to that entrypoint so the CLI can import the flow.
- If your prefect.yaml
is in repo A, consider running the deploy job from a workspace where both repos are available (e.g., check out both repos into the job) or switch to the Python-based approach above.
Notes and limitations
- Private GitHub repos: provide credentials (e.g., GITHUB_TOKEN or SSH key) so the deploy environment can access the code.
- If the flow can’t be imported during deploy, the deployment will still work at runtime (the worker can pull the code), but the UI won’t show a structured parameters form; you may just see an empty/default JSON editor.
- You can still set deployment-level default values in prefect.yaml
under deployments[].parameters
, but this does not replace full schema inference from the flow’s signature.
Helpful links
- prefect.yaml reference
- Deploy a flow
- Store flow code for deployments
- Flow.from_source API
A couple of quick questions to make sure I point you to the best setup:
- How are you creating the deployment today: prefect deploy
from repo A, or a Python script calling flow.deploy()
?
- Is repo B (the code) public or private?
- Are you running this locally or in CI (and can that environment clone/import repo B)?Manoj Ravi
09/30/2025, 10:52 AMMarvin
09/30/2025, 10:52 AMMarvin
09/30/2025, 10:53 AMThe flow code is not present when you run prefect deploy. That means when prefect deploy --all --prefect-file $filename runs (from the repo that just contains the prefect.yaml), it cannot import pipelines/forecaster_pipeline.py:forecaster_training_flow_run, so Prefect cannot infer parameters. The deployment will work at runtime, but the UI won’t have the structured parameter form.
Two reliable ways to fix this in 2.x:
Option A: Add build steps so the deploy process can import the flow
- Clone and install your flow repo in build steps (these run during prefect deploy
), set the working directory so the entrypoint
resolves, then keep your pull steps for runtime.
Example (adjust repo URL, branch, and install commands as needed):
prefect-version: 2.14.12
build:
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.git_clone:
repository: https://github.com/<owner>/<repo>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: main
id: code
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect/{{ code.directory }}
- prefect.deployments.steps.run_shell_script:
script: |
poetry install --no-root
pull:
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.git_clone:
repository: https://github.com/<owner>/<repo>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: main
id: clone
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect/{{ clone.directory }}
- prefect.deployments.steps.run_shell_script:
script: |
poetry install --no-root
deployments:
- name: Inventory Forecaster Pipeline
tags: ["Inventory Intelligence"]
description: "Run the forecaster pipeline"
entrypoint: pipelines/forecaster_pipeline.py:forecaster_training_flow_run
parameters:
# You can set defaults here; schema still comes from the flow signature
tenant_id: null
work_pool:
name: ecs-work-pool
job_variables:
vpc_id: ""
cluster: ""
task_definition_arn: ""
env: {}
schedule:
cron: "0 4 * * *"
timezone: UTC
day_or: true
Notes:
- Ensure Poetry is available in the environment where prefect deploy
runs, otherwise the build step will fail.
- With the working directory set to the cloned repo, your entrypoint
remains pipelines/forecaster_pipeline.py:forecaster_training_flow_run
.
- I removed the empty tag ""
and fixed a stray quote on tenant_id
.
Option B: Checkout the code in your CI before running prefect deploy
- In your workflow, checkout both repos (or just the repo that contains the flow code), then run:
prefect --no-prompt deploy --all --prefect-file $filename
- Since the code is present locally, Prefect can import the flow during deploy and infer the parameter schema for the UI.
What happens if you don’t do either?
- The deployment still works at runtime (because your worker executes the pull steps and can fetch/install the code), but the UI won’t show the flow’s structured parameter form; you’ll see a plain JSON editor instead.
If you’re open to Prefect 3.x in the future, Flow.from_source(...).deploy(...)
makes this pattern more straightforward, but your current 2.14.12 setup will work well with one of the two options above.
Questions to confirm:
- Is the flow repo private (i.e., will the build environment have valid credentials for cloning)?Marvin
09/30/2025, 10:53 AMprefect deploy
?
Helpful reading (3.x pages, but the principle about needing to import the flow to infer parameters applies in 2.x as well):
- prefect.yaml reference
- Store flow code for deploymentsManoj Ravi
09/30/2025, 10:56 AMMarvin
09/30/2025, 10:56 AMMarvin
09/30/2025, 10:58 AMIn Prefect 3.1.0, the cleanest option is a small deploy script built around Flow.from_source(...) that points to your GitHub repo.
Recommended approach (Prefect 3.1.0)
- Use a small deploy script that references your code repo and entrypoint, then calls deploy()
.
- Run this in CI (or locally) with access to the repo (e.g., GITHUB_TOKEN for private repos). This allows Prefect to import the flow during deploy and capture the parameter schema for the UI.
Example deploy script
from prefect.flows import Flow
# Source: your code repo; Entrypoint: path to file and flow function
flow = Flow.from_source(
source="<https://github.com/<owner>/<repo>>",
entrypoint="pipelines/forecaster_pipeline.py:forecaster_training_flow_run",
)
# Create or update the deployment
flow.deploy(
name="inventory-forecaster-prod",
work_pool_name="ecs-work-pool",
tags=["Inventory Intelligence"],
cron="0 4 * * *", # optional schedule
job_variables={
"vpc_id": "",
"cluster": "",
"task_definition_arn": "",
"env": {}, # your env map
},
parameters={
# optional deploy-time defaults; schema comes from the flow signature
"tenant_id": None,
},
)
Key points
- Parameter schema is inferred at deploy time by importing the flow. If the deploy environment can’t access the repo/entrypoint, Prefect cannot infer parameters; the UI will show a plain JSON editor instead of a structured form.
- Using Flow.from_source(...)
records where to fetch code at runtime too, so workers can pull the code when running.
- If your repo is private, ensure your CI has credentials to access it (e.g., GITHUB_TOKEN
).
If you prefer the CLI
- In 3.x, prefect deploy
expects to run from a project that can import the entrypoint. That usually means checking out the code repo in the same job before running prefect deploy
. Since your YAML lives in a different repo, the Python approach above is typically smoother.
Docs
- Deploy a flow
- Flow.from_source API
- Store flow code for deployments
Quick checks
- Is the code repo public or private?
- Will you run the deploy from CI (so we can set up the Git credentials there)?Manoj Ravi
09/30/2025, 11:00 AMMarvin
09/30/2025, 11:00 AMMarvin
09/30/2025, 11:02 AMOption 1: Keep prefect.yaml and add build steps so the code is cloned at deploy time
prefect-version: 3.1.0
build:
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.git_clone:
repository: https://github.com/<owner>/<repo>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: main
id: code
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect/{{ code.directory }}
- prefect.deployments.steps.run_shell_script:
script: |
poetry install --no-root
pull:
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.git_clone:
repository: https://github.com/<owner>/<repo>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: main
id: clone
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect/{{ clone.directory }}
- prefect.deployments.steps.run_shell_script:
script: |
poetry install --no-root
deployments:
- name: Inventory Forecaster Pipeline
tags: ["Inventory Intelligence"]
description: "Run the forecaster pipeline"
entrypoint: pipelines/forecaster_pipeline.py:forecaster_training_flow_run
parameters:
tenant_id: null
work_pool:
name: ecs-work-pool
job_variables:
vpc_id: ""
cluster: ""
task_definition_arn: ""
env: {}
schedule:
cron: "0 4 * * *"
timezone: UTC
day_or: true
Notes:
- Keep running prefect deploy from the directory containing this prefect.yaml. Your CI command can remain nearly the same; if possible, run it from the repo with the yaml to avoid needing special flags.
- Ensure your deploy environment has git and poetry available (build steps run at deploy time).
- Private repos: your access_token
block must be resolvable during deploy.
Option 2: No yaml changes — check out code in CI before deploy
- In your workflow, clone the code repo (the one with pipelines/forecaster_pipeline.py) before running:
prefect --no-prompt deploy --all
- Since the code is present locally during deploy, Prefect can import your flow and infer parameters for the UI. This keeps your prefect.yaml untouched.
Why this matters
- Prefect infers parameters by importing your flow during deployment. If the code is only fetched in pull steps (runtime), the UI won’t show a structured parameter form. Either making the code available in build steps or checking it out in CI solves that.
Helpful docs
- prefect.yaml reference
- Store flow code for deployments
Want me to adapt your exact yaml (with your repo URL and any private token handling) and your CI snippet step-by-step?Manoj Ravi
09/30/2025, 11:04 AMMarvin
09/30/2025, 11:04 AMMarvin
09/30/2025, 11:08 AM1) Build and push the image in CI, then run:
prefect --no-prompt deploy --all --prefect-file prefect.yaml
- Set the resulting image in work_pool.job_variables.image.
2) Build via a build step (shell) in prefect.yaml
- Add a step that runs docker build/push. Your deploy environment must have Docker and registry creds.
Example:
build:
- prefect.deployments.steps.run_shell_script:
script: |
aws ecr get-login-password --region $AWS_REGION | docker login --username AWS --password-stdin <acct>.dkr.ecr.$AWS_REGION.amazonaws.com
docker build -t <repo>:${GIT_SHA} .
docker tag <repo>:${GIT_SHA} <acct>.dkr.ecr.$AWS_REGION.amazonaws.com/<repo>:${GIT_SHA}
docker push <acct>.dkr.ecr.$AWS_REGION.amazonaws.com/<repo>:${GIT_SHA}
deployments:
- name: Inventory Forecaster Pipeline
work_pool:
name: ecs-work-pool
job_variables:
image: "<acct>.dkr.ecr.<region>.<http://amazonaws.com/<repo>:${GIT_SHA}|amazonaws.com/<repo>:${GIT_SHA}>"
env: {}
- Note: This runs at deploy time, not on the worker.
ECS precedence reminder:
- If task_definition_arn
is set, ECS uses it and ignores other container settings like image
in job_variables.
- If no ARN, and image
is set, Prefect’s ECS worker will register/use a task definition for that image.
Docs:
- prefect.yaml reference
- ECS work pool guide
Do you want me to wire up a concrete build step for your registry (ECR/GCR/ACR) and update your yaml accordingly, or will you handle the image build in CI?Manoj Ravi
09/30/2025, 1:12 PMMarvin
09/30/2025, 1:12 PMMarvin
09/30/2025, 1:13 PMOnce the real code is cloned during build, when prefect deploy runs, Prefect will import the actual flow functions from your repo and infer their parameters (including optional defaults) for the UI.
Concretely:
- Remove the part of sync.sh
that writes dummy files like:
from prefect import Flow
@Flow
def <flow>(edix_custom_config=dict()):
...
Those placeholders override your real code and prevent accurate schema inference.
- Add build steps in your prefect.yaml that:
- git clone your code repo
- set the working directory to the cloned repo
- install dependencies (e.g., poetry install)
This ensures that when prefect deploy
runs, the entrypoints in your yaml point to real files, and Prefect can import your real flow functions and see the true parameter signatures.
Example shape (3.1.0):
prefect-version: 3.1.0
build:
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.git_clone:
repository: <https://github.com/<owner>/<repo>>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: main
id: code
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect/{{ code.directory }}
- prefect.deployments.steps.run_shell_script:
script: |
poetry install --no-root
pull:
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.git_clone:
repository: <https://github.com/<owner>/<repo>>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: main
id: clone
- prefect.deployments.steps.set_working_directory:
directory: /opt/prefect/{{ clone.directory }}
- prefect.deployments.steps.run_shell_script:
script: |
poetry install --no-root
deployments:
- name: Inventory Forecaster Pipeline
entrypoint: pipelines/forecaster_pipeline.py:forecaster_training_flow_run
tags: ["Inventory Intelligence"]
description: "Run the forecaster pipeline"
parameters:
# Optional: deployment-level defaults. Schema still comes from the flow signature.
tenant_id: null
work_pool:
name: ecs-work-pool
job_variables:
image: "<your prebuilt image or leave ARN>"
# OR use task_definition_arn if you prefer a fixed TD
env: {}
schedule:
cron: "0 4 * * *"
timezone: UTC
day_or: true
Then your sync.sh
can be simplified to just:
filename=$1
prefect config set PREFECT_API_URL=$PREFECT_API_URL
prefect --no-prompt deploy --all --prefect-file $filename
Requirements for this to work:
- The deploy environment must be able to clone the repo (access token present) and install dependencies (git/poetry available).
- The entrypoint
paths in your yaml must resolve to the actual files in the cloned repo.
- Keep the pull
steps so workers fetch/install the code at runtime as well.
If those conditions are met, Prefect will import your real flows during deploy and the UI will show your actual optional parameters and defaults. Links for reference:
- prefect.yaml reference
- Store flow code for deploymentsManoj Ravi
10/01/2025, 9:20 AMMarvin
10/01/2025, 9:20 AMMarvin
10/01/2025, 9:24 AMWhat looks correct in your script
- prefect --no-prompt ... (verified with prefect --help)
- prefect deploy --prefect-file <file>
- prefect deploy --all
- entrypoint
format should be path/to/file.py:flow_function_name
.
- prefect deploy
should be run from the project root where your prefect.yaml
lives, so the cd
is fine.
What needs attention
- Creating empty Python files for each entrypoint will fail. prefect deploy
imports the entrypoint at deploy time to validate the callable. An empty file without a @flow
function will cause an import/attribute error.
- Your yq
invocation will likely include quotes and split on whitespace. Use raw output and a safe read loop to avoid breakage.
- prefect config set PREFECT_API_URL=$PREFECT_API_URL
is fine, but make sure PREFECT_API_KEY
is set too if you’re targeting Prefect Cloud. Also note this mutates your current profile.
- rm -rf ~/.prefect/*
is destructive. It deletes profiles, settings, and local caches. Only do this in an ephemeral CI environment, not on a dev machine.
Safer version of your script
#!/usr/bin/env bash
set -euo pipefail
PREFECT_FILE="${1:-prefect.yaml}"
cd "$(dirname "$PREFECT_FILE")"
PREFECT_FILE="$(basename "$PREFECT_FILE")"
# Optional: set API URL (and ensure API key is present for Cloud)
if [[ -n "${PREFECT_API_URL:-}" ]]; then
prefect config set "PREFECT_API_URL=${PREFECT_API_URL}"
fi
# Validate entrypoints exist and are importable (avoid creating empty stubs)
if command -v yq >/dev/null 2>&1; then
# Use raw output (-r) to avoid quoted strings
while IFS= read -r entry; do
# Expect format path/to/file.py:flow_func
file="${entry%%:*}"
func="${entry##*:}"
if [[ ! -f "$file" ]]; then
echo "Error: entrypoint file '$file' not found for '$entry'" >&2
exit 1
fi
# Optional: basic importability check
python - <<PY
import importlib.util, sys
from pathlib import Path
p = Path("$file")
spec = importlib.util.spec_from_file_location(p.stem, p)
m = importlib.util.module_from_spec(spec)
spec.loader.exec_module(m) # may raise
assert hasattr(m, "$func"), "Module has no attribute '$func'"
PY
done < <(yq -r '.deployments[].entrypoint' "$PREFECT_FILE")
fi
prefect --no-prompt deploy --all --prefect-file "$PREFECT_FILE"
# Only do this in CI/ephemeral environments
# rm -rf ~/.prefect/*
Notes and alternatives
- If your flows live in a different repo and you’re trying to avoid checking them out locally, don’t create empty files. Either:
- Run prefect deploy
in a checkout that actually contains the flows referenced by entrypoint
, or
- Switch to the Python-first pattern and deploy from source:
- In Prefect 3.x: flow.from_source("<git-url>@<ref>", entrypoint="path/to/file.py:flow").deploy(...)
- prefect.yaml
reference and options:
- Prefect 3.x prefect.yaml
guide: Prefect YAML (3.x)
- Entrypoint type/format: Entrypoint type
- Settings overview: Manage settings
Quick check: Is your goal to deploy from a CI job without having the flow code present at deploy time? If so, I can suggest the cleanest approach for your setup (CLI vs. Python from_source
, and whether you’re using a git pull step in your runtime).Manoj Ravi
10/01/2025, 9:29 AMMarvin
10/01/2025, 9:29 AMMarvin
10/01/2025, 9:34 AMOption A: YAML-first with build and pull steps
- In build: clone the flows repo so prefect deploy can import the entrypoint and infer parameters
- In pull: clone the same repo so workers fetch the code at runtime
- Do not create any placeholder Python files
Example prefect.yaml
name: my-project
prefect-version: ">=3.0.0"
build:
- prefect.deployments.steps.utility.run_shell_script:
script: |
rm -rf flows-repo
git clone --depth 1 https://github.com/your-org/your-flows-repo.git flows-repo
deployments:
- name: my-deployment
entrypoint: flows-repo/path/to/flow.py:my_flow
# Optional: defaults to show in the UI and used when scheduling
parameters:
foo: "bar"
num_rows: 100
work_pool:
name: process-pool
tags: [prod]
# Runtime prep for workers: fetch the same code before execution
pull:
- prefect.deployments.steps.pull.git_clone:
repository: https://github.com/your-org/your-flows-repo.git
branch: main
# or, if private:
# access_token: ${GITHUB_TOKEN}
- prefect.deployments.steps.pull.set_working_directory:
directory: flows-repo
Why this works
- prefect deploy
imports flows-repo/path/to/flow.py:my_flow
during deploy, so it can read the function signature and infer parameters (defaults appear in the UI). Any values you set under parameters:
become the deployment’s defaults.
- The pull
steps ensure your workers clone the same repo at run time so the entrypoint is importable on the agent/worker side as well.
Option B — Python-first with Flow.from_source (avoids cloning during deploy)
- If you’d rather not clone during deploy, define deployments in a small Python script that points to your repo:
from prefect.flows import Flow
flow = Flow.from_source(
"<https://github.com/your-org/your-flows-repo.git@main>",
entrypoint="path/to/flow.py:my_flow",
)
flow.deploy(
name="my-deployment",
work_pool_name="process-pool",
parameters={"foo": "bar", "num_rows": 100},
)
- Run this in CI to create/update the deployment. Prefect will use the source reference and entrypoint without needing the code locally.
- This is the 3.x replacement pattern for old 2.x “build from flow” commands.
Clean up your current script
- Remove the part that creates empty/dummy Python files. That’s what’s breaking parameter inference and entrypoint validation.
- If you keep a sync.sh
, it should only git clone
(and maybe set working directory), nothing else.
Docs for reference
- Prefect YAML (3.x): prefect.yaml guide
- Entrypoint format: Entrypoint type
- Managing settings: Manage settings
Quick check
- Do you prefer to keep the YAML-first approach with prefect deploy
, or would you like a ready-to-run Python from_source().deploy(...)
script for each deployment? I can tailor either to your exact repo paths and work pool.Manoj Ravi
10/01/2025, 9:36 AMMarvin
10/01/2025, 9:36 AMMarvin
10/01/2025, 9:38 AMRight: drop the dummy-file step and let the clone in your build/pull steps provide the real code, so prefect deploy can import the true entrypoints and infer parameters.
Use this minimal, safe sync.sh
#!/usr/bin/env bash
set -euo pipefail
PREFECT_FILE="${1:?Usage: sync.sh path/to/prefect.yaml}"
cd "$(dirname "$PREFECT_FILE")"
PREFECT_FILE="$(basename "$PREFECT_FILE")"
# Optional: set API URL for this run (and ensure PREFECT_API_KEY is set for Cloud)
if [[ -n "${PREFECT_API_URL:-}" ]]; then
prefect config set "PREFECT_API_URL=${PREFECT_API_URL}"
fi
# Do not create dummy files. Build/pull steps in prefect.yaml will fetch the real code.
prefect --no-prompt deploy --all --prefect-file "$PREFECT_FILE"
# Only do this in ephemeral CI containers. Never on a developer machine.
# rm -rf ~/.prefect/*
Make sure your prefect.yaml does the cloning
- Build: clone repo so deploy-time import/parameter inference works
- Pull: clone repo so workers can import at runtime
- Reference the entrypoint path relative to the project root after build (e.g., include the folder name you clone into)
Example prefect.yaml
name: my-project
prefect-version: ">=3.0.0"
build:
- prefect.deployments.steps.utility.run_shell_script:
script: |
rm -rf flows-repo
git clone --depth 1 https://github.com/your-org/your-flows-repo.git flows-repo
deployments:
- name: my-deployment
entrypoint: flows-repo/path/to/flow.py:my_flow
parameters:
foo: "bar"
num_rows: 100
work_pool:
name: process-pool
tags: [prod]
pull:
- prefect.deployments.steps.pull.git_clone:
repository: https://github.com/your-org/your-flows-repo.git
branch: main
# If private:
# access_token: ${GITHUB_TOKEN}
- prefect.deployments.steps.pull.set_working_directory:
directory: flows-repo
Notes
- Remove the loop that writes empty Python files — that’s what breaks parameter inference.
- Ensure the entrypoint path in prefect.yaml matches the location after the build clone (e.g., flows-repo/...
).
- For private repos, pass a token via access_token
in the git_clone
step using an env var (e.g., ${GITHUB_TOKEN}
).
Refs:
- Prefect YAML guide: prefect.yaml (3.x)
- Entrypoint format: Entrypoint type
- Settings: Manage settings
If you share your exact repo URL and entrypoint path, I can double-check the YAML and script for you.Manoj Ravi
10/01/2025, 9:46 AMMarvin
10/01/2025, 9:46 AMMarvin
10/01/2025, 9:47 AMUse the fully qualified step names:
- git_clone: prefect.deployments.steps.pull.git_clone
- set_working_directory: prefect.deployments.steps.pull.set_working_directory
- run_shell_script: prefect.deployments.steps.utility.run_shell_script
- Keep paths consistent between build and pull. If you set_working_directory
into the repo directory in pull, do the same in build and make the entrypoint
path relative to the repo root (no top-level repo dir prefix).
Example build step
build:
- prefect.deployments.steps.pull.git_clone:
repository: https://github.com/your-org/your-flows-repo.git
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.2.0
id: build_clone
- prefect.deployments.steps.pull.set_working_directory:
directory: "{{ build_clone.directory }}"
# Optional: sanity check importability at deploy time
- prefect.deployments.steps.utility.run_shell_script:
script: |
python - <<'PY'
import importlib.util, pathlib, sys
p = pathlib.Path("path/to/flow.py")
assert p.exists(), f"Missing {p}"
spec = importlib.util.spec_from_file_location(p.stem, p)
m = importlib.util.module_from_spec(spec)
spec.loader.exec_module(m)
assert hasattr(m, "my_flow"), "Flow function 'my_flow' not found"
PY
Then your deployment should use an entrypoint relative to the repo root:
deployments:
- name: my-deployment
entrypoint: path/to/flow.py:my_flow
parameters:
foo: bar
num_rows: 100
Your pull step (adjusted to correct step paths) looks like:
pull:
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.pull.git_clone:
repository: https://github.com/your-org/your-flows-repo.git
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.2.0
id: clone
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect/{{ clone.directory }}
- prefect.deployments.steps.utility.run_shell_script:
script: |
poetry install --no-root
Notes
- Ensure poetry
is available in your worker image, or consider using prefect.deployments.steps.utility.pip_install_requirements
if you have a requirements file.
- If the repo is private, the access_token
in both build and pull steps must be valid in those environments.
Docs:
- YAML steps and schema: prefect.yaml (3.x)
- Entrypoint format: Entrypoint typeManoj Ravi
10/01/2025, 9:51 AMMarvin
10/01/2025, 9:51 AMMarvin
10/01/2025, 9:53 AMFor the build steps, let the clone step's output drive the working directory instead of hard-coding /opt/prefect:
build:
- prefect.deployments.steps.pull.git_clone:
repository: https://github.com/your-org/your-flows-repo.git
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.2.0
id: build_clone
- prefect.deployments.steps.pull.set_working_directory:
directory: "{{ build_clone.directory }}"
# optional: sanity check imports
- prefect.deployments.steps.utility.run_shell_script:
script: |
python - <<'PY'
import importlib.util, pathlib
p = pathlib.Path("path/to/flow.py")
assert p.exists(), f"Missing {p}"
spec = importlib.util.spec_from_file_location(p.stem, p)
m = importlib.util.module_from_spec(spec)
spec.loader.exec_module(m)
assert hasattr(m, "my_flow"), "Flow function 'my_flow' not found"
PY
And keep your pull steps using /opt/prefect:
pull:
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.pull.git_clone:
repository: https://github.com/your-org/your-flows-repo.git
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.2.0
id: clone
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect/{{ clone.directory }}
- prefect.deployments.steps.utility.run_shell_script:
script: |
poetry install --no-root
This separation keeps deploy-time logic portable to any CI/host while runtime logic targets your worker filesystem layout.
Docs: prefect.yaml (3.x)Manoj Ravi
10/01/2025, 10:03 AMschedule
key in the deployment has been
deprecated. It will not be available in new releases after Sep 2024. Please use
schedules
instead by renaming the schedule
key to schedules
and providing
a list of schedule objects.
Traceback (most recent call last):
File "<frozen importlib._bootstrap_external>", line 936, in exec_module
File "<frozen importlib._bootstrap_external>", line 1073, in get_code
File "<frozen importlib._bootstrap_external>", line 1130, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/home/runner/work/prefect-manifests/prefect-manifests/environments/dev/dbf2939a-42f1-48a2-aec0-8c2ed9ed3cd0/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py'Marvin
10/01/2025, 10:03 AMMarvin
10/01/2025, 10:04 AM1) Deprecation warning: rename the schedule: key to schedules: and provide a list of schedule objects. Example:
deployments:
- name: my-deployment
schedules:
- cron: "0 2 * * *"
2) FileNotFoundError during deploy
- Prefect is trying to import your entrypoint at deploy time and can’t find the file:
/home/runner/work/prefect-manifests/prefect-manifests/environments/dev/dbf2939a-42f1-48a2-aec0-8c2ed9ed3cd0/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py
This means the entrypoint path in your deployment does not match where the file actually exists after your build step runs. Since you’re cloning the flows repo in build, you need your entrypoint to be relative to the cloned repo root (and the build working directory should be set accordingly).
Do this in your build
build:
- prefect.deployments.steps.pull.git_clone:
repository: <https://github.com/your-org/your-flows-repo.git>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.2.0
id: build_clone
- prefect.deployments.steps.pull.set_working_directory:
directory: "{{ build_clone.directory }}"
# Optional: debug what’s on disk
- prefect.deployments.steps.utility.run_shell_script:
script: |
pwd
ls -R
python -c "import pathlib; p=pathlib.Path('edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py'); print('Exists?', p.exists())"
Then update the entrypoint to match repo layout
- If the file is at repo_root/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py and your build working dir is set to the repo root (as above), use:
deployments:
- name: forecasting branch items - Synthetic
entrypoint: edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py:your_flow_function
Why the error showed environments/dev/...?
- That path is coming from your current entrypoint. It likely still points to a previous local path you used to generate dummy files. Now that dummy-file step is removed, that path doesn’t exist. Point it to the path inside the cloned repo instead.
Sanity checks
- Ensure the file exists at that path in the 1.2.0 branch/tag you’re cloning.
- Keep your pull step aligned with build. You’re already doing:
pull:
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.pull.git_clone:
repository: <https://github.com/your-org/your-flows-repo.git>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.2.0
id: clone
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect/{{ clone.directory }}
- prefect.deployments.steps.utility.run_shell_script:
script: |
poetry install --no-root
- With this, workers will also import the flow at the same relative path at runtime.
If you paste the deployment’s entrypoint:
and the repo’s actual file path, I can confirm the exact string you should use.Manoj Ravi
10/01/2025, 10:29 AMpyproject_toml_table_header
is set in model_config but will be ignored because no PyprojectTomlConfigSettingsSource source is configured. To use this config key, add a PyprojectTomlConfigSettingsSource source to the settings sources via the settings_customise_sources hook.
self._settings_warn_unused_config_keys(sources, self.model_config)
/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/pydantic_settings/main.py425 UserWarning: Config key toml_file
is set in model_config but will be ignored because no TomlConfigSettingsSource source is configured. To use this config key, add a TomlConfigSettingsSource source to the settings sources via the settings_customise_sources hook.
self._settings_warn_unused_config_keys(sources, self.model_config)
Set 'PREFECT_API_URL' to 'http://localhost:4200/api'.
PREFECT_API_URL is also set by an environment variable which will override your
config value. Run unset PREFECT_API_URL
to clear it.
Updated profile 'ephemeral'.
/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/pydantic_settings/main.py425 UserWarning: Config key pyproject_toml_table_header
is set in model_config but will be ignored because no PyprojectTomlConfigSettingsSource source is configured. To use this config key, add a PyprojectTomlConfigSettingsSource source to the settings sources via the settings_customise_sources hook.
self._settings_warn_unused_config_keys(sources, self.model_config)
/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/pydantic_settings/main.py425 UserWarning: Config key toml_file
is set in model_config but will be ignored because no TomlConfigSettingsSource source is configured. To use this config key, add a TomlConfigSettingsSource source to the settings sources via the settings_customise_sources hook.
self._settings_warn_unused_config_keys(sources, self.model_config)
Deploying all flows with an existing deployment configuration...
╭──────────────────────────────────────────────────────────────────────────────╮
│ Deploying forecasting branch items - Synthetic │
╰──────────────────────────────────────────────────────────────────────────────╯
Defining a schedule via the schedule
key in the deployment has been
deprecated. It will not be available in new releases after Sep 2024. Please use
schedules
instead by renaming the schedule
key to schedules
and providing
a list of schedule objects.
Traceback (most recent call last):
File "<frozen importlib._bootstrap_external>", line 936, in exec_module
File "<frozen importlib._bootstrap_external>", line 1073, in get_code
File "<frozen importlib._bootstrap_external>", line 1130, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/home/runner/work/prefect-manifests/prefect-manifests/environments/dev/dbf2939a-42f1-48a2-aec0-8c2ed9ed3cd0/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/flows.py", line 1742, in load_flow_from_entrypoint
flow = import_object(entrypoint)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/utilities/importtools.py", line 205, in import_object
module = load_script_as_module(script_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/utilities/importtools.py", line 168, in load_script_as_module
raise ScriptError(user_exc=exc, path=path) from exc
prefect.exceptions.ScriptError: Script at 'edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py' encountered an exception: FileNotFoundError(2, 'No such file or directory')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/cli/_utilities.py", line 42, in wrapper
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/cli/_types.py", line 153, in sync_fn
return asyncio.run(async_fn(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/asyncio/runners.py", line 190, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/cli/deploy.py", line 438, in deploy
await _run_multi_deploy(
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/cli/deploy.py", line 877, in _run_multi_deploy
await _run_single_deploy(deploy_config, actions, prefect_file=prefect_file)
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/client/utilities.py", line 103, in with_injected_client
return await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/cli/deploy.py", line 501, in _run_single_deploy
flow = load_flow_from_entrypoint(deploy_config["entrypoint"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/flows.py", line 1751, in load_flow_from_entrypoint
flow = safe_load_flow_from_entrypoint(entrypoint)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/flows.py", line 2001, in safe_load_flow_from_entrypoint
func_def, source_code = _entrypoint_definition_and_source(entrypoint)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/flows.py", line 2235, in _entrypoint_definition_and_source
source_code = Path(path).read_text()
^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/pathlib.py", line 1058, in read_text
with self.open(mode='r', encoding=encoding, errors=errors) as f:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/pathlib.py", line 1044, in open
return io.open(self, mode, buffering, encoding, errors, newline)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: 'edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py'
An exception occurred.Marvin
10/01/2025, 10:29 AMManoj Ravi
10/01/2025, 10:31 AMpyproject_toml_table_header
is set in model_config but will be ignored because no PyprojectTomlConfigSettingsSource source is configured. To use this config key, add a PyprojectTomlConfigSettingsSource source to the settings sources via the settings_customise_sources hook.
self._settings_warn_unused_config_keys(sources, self.model_config)
/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/pydantic_settings/main.py425 UserWarning: Config key toml_file
is set in model_config but will be ignored because no TomlConfigSettingsSource source is configured. To use this config key, add a TomlConfigSettingsSource source to the settings sources via the settings_customise_sources hook.
self._settings_warn_unused_config_keys(sources, self.model_config)
Set 'PREFECT_API_URL' to 'http://localhost:4200/api'.
PREFECT_API_URL is also set by an environment variable which will override your
config value. Run unset PREFECT_API_URL
to clear it.
Updated profile 'ephemeral'.
/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/pydantic_settings/main.py425 UserWarning: Config key pyproject_toml_table_header
is set in model_config but will be ignored because no PyprojectTomlConfigSettingsSource source is configured. To use this config key, add a PyprojectTomlConfigSettingsSource source to the settings sources via the settings_customise_sources hook.
self._settings_warn_unused_config_keys(sources, self.model_config)
/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/pydantic_settings/main.py425 UserWarning: Config key toml_file
is set in model_config but will be ignored because no TomlConfigSettingsSource source is configured. To use this config key, add a TomlConfigSettingsSource source to the settings sources via the settings_customise_sources hook.
self._settings_warn_unused_config_keys(sources, self.model_config)
Deploying all flows with an existing deployment configuration...
╭──────────────────────────────────────────────────────────────────────────────╮
│ Deploying forecasting branch items - Synthetic │
╰──────────────────────────────────────────────────────────────────────────────╯
Defining a schedule via the schedule
key in the deployment has been
deprecated. It will not be available in new releases after Sep 2024. Please use
schedules
instead by renaming the schedule
key to schedules
and providing
a list of schedule objects.
Traceback (most recent call last):
File "<frozen importlib._bootstrap_external>", line 936, in exec_module
File "<frozen importlib._bootstrap_external>", line 1073, in get_code
File "<frozen importlib._bootstrap_external>", line 1130, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/home/runner/work/prefect-manifests/prefect-manifests/environments/dev/dbf2939a-42f1-48a2-aec0-8c2ed9ed3cd0/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py'
The actual step in deployment is
- name: Syncing Dev changed files
run: |
# Loop through each changed file
envdir='environments/dev'
for FILE in $(git diff --name-only HEAD^ HEAD); do
is_temp=$(echo $FILE | grep Template | wc -l)
if [[ "$is_temp" -eq 0 && "$FILE" == "$envdir/"* ]]; then
echo "syncing file: $FILE"
PREFECT_API_URL="http://localhost:${PREFECT_PORT}/api" ./environments/sync-file.sh $FILE
fi
doneMarvin
10/01/2025, 10:31 AMMarvin
10/01/2025, 10:32 AM1) Add a build section that clones the flows repo and sets the working directory, so the entrypoint can be imported at deploy time:
build:
- prefect.deployments.steps.pull.git_clone:
repository: <https://github.com/your-org/your-flows-repo.git>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.2.0
id: build_clone
- prefect.deployments.steps.pull.set_working_directory:
directory: "{{ build_clone.directory }}"
# Optional: debug to ensure your file is present
- prefect.deployments.steps.utility.run_shell_script:
script: |
echo "CI build PWD:"
pwd
echo "Repo tree:"
ls -R | head -n 200
python - <<'PY'
import pathlib
p = pathlib.Path("edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py")
print("Exists?", p.exists(), "->", p.resolve())
PY
2) Update the deployment entrypoint to be relative to the cloned repo root
- If your file is at repo_root/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py and the flow function is my_flow:
deployments:
- name: forecasting branch items - Synthetic
entrypoint: edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py:my_flow
# replace schedule with schedules (list)
schedules:
- cron: "0 2 * * *"
# keep any parameters you need here as defaults
- Do not include environments/dev/... in the entrypoint — that was the old placeholder path and won’t exist after build.
Align your pull steps (runtime) with the same repo layout
- Your pull section is good conceptually; just ensure you use fully-qualified step names:
pull:
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.pull.git_clone:
repository: <https://github.com/your-org/your-flows-repo.git>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.2.0
id: clone
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect/{{ clone.directory }}
- prefect.deployments.steps.utility.run_shell_script:
script: |
poetry install --no-root
Why we don’t set /opt/prefect in build
- Build runs on your CI host; /opt/prefect is a path inside your worker image. Use the cloned repo directory in build; use /opt/prefect in pull.
Action items for you
- Add the build section above.
- Change the entrypoint to the repo-relative path.
- Replace schedule with schedules as shown.
- Re-run deploy. If it still fails, paste the pwd/ls output from the build step and your exact entrypoint so I can pinpoint the path.
Refs
- prefect.yaml (3.x)
- Entrypoint formatManoj Ravi
10/01/2025, 10:35 AMMarvin
10/01/2025, 10:36 AMMarvin
10/01/2025, 10:36 AM1) Use fully-qualified step names:
- git_clone: prefect.deployments.steps.pull.git_clone
- set_working_directory: prefect.deployments.steps.pull.set_working_directory
- run_shell_script: prefect.deployments.steps.utility.run_shell_script
- Indentation: the list items under pull: must be indented as a list (your snippet shows them at the root). Same idea for build:.
2) Make the build step noisy and fail fast if the file isn’t present
- Explicitly stream output and add a hard failure if the flow file can’t be found. This will prove the build step is running and that the path is correct.
Try this exact structure
name: my-project
prefect-version: ">=3.0.0"
build:
- prefect.deployments.steps.pull.git_clone:
repository: <https://github.com/Intelmatix-AI/edix-forecasting-intelligence.git>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.5.0
id: build_clone
- prefect.deployments.steps.pull.set_working_directory:
directory: "{{ build_clone.directory }}"
- prefect.deployments.steps.utility.run_shell_script:
stream_output: true
script: |
set -euo pipefail
echo "Build PWD: $(pwd)"
echo "Repo contents (top-level):"
ls -la
python - <<'PY'
import pathlib, sys
p = pathlib.Path("edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py")
print("Flow file exists?", p.exists(), "->", p.resolve())
if not p.exists():
sys.exit("ERROR: flow file not found at expected path during build")
PY
pull:
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.pull.git_clone:
repository: <https://github.com/Intelmatix-AI/edix-forecasting-intelligence.git>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.5.0
id: clone
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect/{{ clone.directory }}
- prefect.deployments.steps.utility.run_shell_script:
stream_output: true
script: |
set -euo pipefail
echo "Runtime PWD: $(pwd)"
poetry install --no-root
deployments:
- name: forecasting branch items - Synthetic
entrypoint: edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py:general_forecast_flow_run
# migrate away from 'schedule:' to 'schedules:' per deprecation
# schedules:
# - cron: "0 2 * * *"
Notes
- Make sure the repo actually has the file at that path on branch/tag 1.5.0. If you're targeting a tag, branch: "1.5.0" will typically work; if not, switch to commit_sha: with the tag's SHA (see the sketch after these notes).
- The entrypoint should be repo-relative (no environments/dev prefix).
- Your earlier pull used prefect.deployments.steps.set_working_directory (missing .pull.) and the list item was not indented under pull: — that would be ignored.
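If you do pin to a commit, a minimal sketch of the clone step (the SHA below is a placeholder for whatever commit the 1.5.0 tag points to):
build:
  - prefect.deployments.steps.pull.git_clone:
      repository: <https://github.com/Intelmatix-AI/edix-forecasting-intelligence.git>
      access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
      commit_sha: "0123456789abcdef0123456789abcdef01234567"  # placeholder SHA; resolve it from the 1.5.0 tag
      id: clone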
Extra debugging if needed
- You can force more logs by setting PREFECT_LOGGING_LEVEL=DEBUG in your CI job when running prefect deploy.
- If you still don’t see the “Build PWD”/“Repo contents” lines, it means the build block isn’t being parsed (usually due to YAML structure). Paste the top of your prefect.yaml (name/prefect-version/build/pull/deployments) and I’ll validate the structure.Manoj Ravi
10/01/2025, 10:40 AMMarvin
10/01/2025, 10:40 AMMarvin
10/01/2025, 10:40 AM#!/usr/bin/env bash
set -euo pipefail
PREFECT_FILE="${1:?Usage: sync.sh path/to/prefect.yaml}"
cd "$(dirname "$PREFECT_FILE")"
PREFECT_FILE="$(basename "$PREFECT_FILE")"
echo "Using prefect file: $PREFECT_FILE"
# Prefer env vars over config set to avoid warning/no-op
# If you really want to store it in the profile instead of env, unset first:
# if [[ -n "${PREFECT_API_URL:-}" ]]; then
# unset PREFECT_API_URL
# prefect config set "PREFECT_API_URL=${YOUR_URL}"
# fi
# Optional: more verbose logs while debugging build/pull steps
# export PREFECT_LOGGING_LEVEL=DEBUG
prefect --no-prompt deploy --all --prefect-file "$PREFECT_FILE"
# Only do this in CI
if [[ "${CI:-}" == "true" ]]; then
rm -rf ~/.prefect/*
fi
Notes
- The warning you saw (“PREFECT_API_URL is also set by an environment variable…”) is because env vars override profile settings. Rely on the env var and skip config set to avoid it.
- Ensure your prefect.yaml build/pull sections have the fully qualified step names and proper indentation so you see the debug output from your build steps.
- YAML guide: prefect.yaml (3.x)
- Confirm that entrypoint: edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py:general_forecast_flow_run
is correct relative to the repo root after the build step sets the working directory.Manoj Ravi
10/01/2025, 10:48 AMschedule
key in the deployment has been
deprecated. It will not be available in new releases after Sep 2024. Please use
schedules
instead by renaming the schedule
key to schedules
and providing
a list of schedule objects.
104448.840 | DEBUG | prefect.client - Connecting to API at http://localhost:4200/api/
104448.865 | DEBUG | prefect.client - Connecting to API at http://localhost:4200/api/
Traceback (most recent call last):
File "<frozen importlib._bootstrap_external>", line 936, in exec_module
File "<frozen importlib._bootstrap_external>", line 1073, in get_code
File "<frozen importlib._bootstrap_external>", line 1130, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/home/runner/work/prefect-manifests/prefect-manifests/environments/dev/dbf2939a-42f1-48a2-aec0-8c2ed9ed3cd0/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/flows.py", line 1742, in load_flow_from_entrypoint
flow = import_object(entrypoint)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/utilities/importtools.py", line 205, in import_object
module = load_script_as_module(script_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.11.4/x64/lib/python3.11/site-packages/prefect/utilities/importtools.py", line 168, in load_script_as_module
raise ScriptError(user_exc=exc, path=path) from exc
prefect.exceptions.ScriptError: Script at 'edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py' encountered an exception: FileNotFoundError(2, 'No such file or directory')
How to know whether build step is pulling the code from github ?Marvin
10/01/2025, 10:48 AMMarvin
10/01/2025, 10:49 AMThe most direct way to see whether the build step is pulling the code is to make it print its own progress with stream_output: true and normalize the clone into a fixed folder. For example:
name: my-project
prefect-version: ">=3.0.0"
build:
- prefect.deployments.steps.pull.git_clone:
repository: <https://github.com/Intelmatix-AI/edix-forecasting-intelligence.git>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.5.0
id: clone
- prefect.deployments.steps.utility.run_shell_script:
stream_output: true
script: |
set -euo pipefail
echo "BUILD PWD: $(pwd)"
echo "Cloned dir: {{ clone.directory }}"
# Normalize path to a fixed folder under project root so entrypoint resolves
rm -rf flows-repo
mv "{{ clone.directory }}" "flows-repo"
echo "Top-level after move:"
ls -la
echo "Checking for flow file..."
python - <<'PY'
import pathlib, sys
p = pathlib.Path("flows-repo/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py")
print("Flow file exists?", p.exists(), "->", p.resolve())
if not p.exists():
sys.exit("ERROR: flow file not found during build")
PY
pull:
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.pull.git_clone:
repository: <https://github.com/Intelmatix-AI/edix-forecasting-intelligence.git>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.5.0
id: pull_clone
- prefect.deployments.steps.utility.run_shell_script:
stream_output: true
script: |
set -euo pipefail
echo "RUNTIME PWD: $(pwd)"
rm -rf flows-repo
mv "{{ pull_clone.directory }}" "flows-repo"
echo "Using flows-repo as runtime working dir"
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect/flows-repo
- prefect.deployments.steps.utility.run_shell_script:
stream_output: true
script: |
set -euo pipefail
poetry install --no-root
deployments:
- name: forecasting branch items - Synthetic
entrypoint: flows-repo/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py:general_forecast_flow_run
# Migrate to 'schedules:' (list) instead of 'schedule:'
# schedules:
# - cron: "0 2 * * *"
How to confirm build ran
- With stream_output: true, you should see “BUILD PWD”, “Cloned dir”, and the “Flow file exists?” line in your CI logs. If you don’t see them:
- The build section is not being parsed (usually YAML structure issue or you’re pointing to a different prefect.yaml).
- Echo the file being used in your sync script (you already print it). You can also run grep -n "build:" "$PREFECT_FILE" before running deploy to confirm the build block is present.
Why your earlier entrypoint failed
- Prefect tried to import edix_forecasting_intelligence/... relative to the project root, but your clone wasn’t under that path. By normalizing the clone into flows-repo and referencing flows-repo/... in entrypoint, the importer can find the file.
Optional debugging
- Keep PREFECT_LOGGING_LEVEL=DEBUG set for deploy runs so step logs are visible.Marvin
10/01/2025, 10:49 AMManoj Ravi
10/01/2025, 10:55 AMMarvin
10/01/2025, 10:55 AMMarvin
10/01/2025, 10:56 AMBuild steps run first, on the CI host at deploy time:
build:
- prefect.deployments.steps.pull.git_clone:
repository: <https://github.com/Intelmatix-AI/edix-forecasting-intelligence.git>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.5.0
id: clone
- prefect.deployments.steps.utility.run_shell_script:
stream_output: true
script: |
set -euo pipefail
echo "BUILD PWD: $(pwd)"
echo "Cloned dir: {{ clone.directory }}"
rm -rf flows-repo
mv "{{ clone.directory }}" "flows-repo"
echo "Top-level after move:"
ls -la
echo "Checking for flow file..."
python - <<'PY'
import pathlib, sys
p = pathlib.Path("flows-repo/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py")
print("Flow file exists?", p.exists(), "->", p.resolve())
if not p.exists():
sys.exit("ERROR: flow file not found during build")
PY
4) Entrypoint import and parameter inference (still on CI)
- After build, Prefect imports your entrypoint. Key rule: the entrypoint path is evaluated relative to the project root (where prefect.yaml lives).
- Choose one consistent pattern:
- If you move the repo to flows-repo/, set entrypoint to flows-repo/...
- Or keep entrypoint as edix_forecasting_intelligence/... and move the repo contents under the project root such that that path exists.
- Prefect reads the flow function signature to infer parameters; your deployment defaults under parameters: are applied and visible in the UI.
Example deployment with consistent entrypoint
deployments:
- name: forecasting branch items - Synthetic
entrypoint: flows-repo/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py:general_forecast_flow_run
# migrate away from 'schedule:'
# schedules:
# - cron: "0 2 * * *"
5) Deployment is created/updated in the API
- The deployment now contains your pull steps. These do not run during deploy.
6) A worker picks up a flow run and executes pull steps (runtime)Marvin
10/01/2025, 10:56 AMpull:
- prefect.deployments.steps.pull.set_working_directory:
directory: /opt/prefect
- prefect.deployments.steps.pull.git_clone:
repository: <https://github.com/Intelmatix-AI/edix-forecasting-intelligence.git>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.5.0
id: pull_clone
- prefect.deployments.steps.utility.run_shell_script:
stream_output: true
script: |
set -euo pipefail
rm -rf flows-repo
mv "{{ pull_clone.directory }}" "flows-repo"
echo "RUNTIME PWD: $(pwd)"
# If you need deps:
# poetry install --no-root
- Now the entrypoint flows-repo/edix_forecasting_intelligence/... is valid at runtime as well.
How to know build ran
- You’ll see the “BUILD PWD”, “Cloned dir”, and “Checking for flow file” lines in your CI logs (we set stream_output: true).
- If you don’t see them:
- The build block may not be under the same prefect.yaml you’re deploying
- Or YAML indentation is off
- Or you’re not passing the intended file to sync.sh
- In your sync.sh, print the file you use and enable debug logging:
echo "Using prefect file: $PREFECT_FILE"
export PREFECT_LOGGING_LEVEL=DEBUG
Common pitfalls to avoid
- Using absolute paths in build (e.g., /opt/prefect) — that path exists in your worker image, not on CI.
- Using entrypoint paths that don’t exist under the project root after build completes.
- Relying on working-directory changes to change how entrypoint is resolved — keep entrypoint consistent with the project-root file layout.
- Still using schedule instead of schedules (you’re seeing the deprecation).
References
- Prefect YAML (3.x): prefect.yaml guide
- Entrypoint format: Entrypoint type
- Settings: Manage settings
If you paste your exact top-level prefect.yaml (name, prefect-version, build, pull, deployments) I can sanity-check indentation and confirm the final entrypoint string for your repo layout.Manoj Ravi
10/01/2025, 11:09 AMMarvin
10/01/2025, 11:09 AMMarvin
10/01/2025, 11:10 AMA few quick checks first:
- Confirm which prefect file you're actually deploying:
echo "Deploying with prefect file: $PREFECT_FILE"
test -f "$PREFECT_FILE" || { echo "File not found"; exit 1; }
- Dump the build section that Prefect will read:
- Add:
echo "Build steps from $PREFECT_FILE:"
yq -C '.build' "$PREFECT_FILE" || true
- If this prints null or nothing, Prefect has no build steps in the file you’re passing.
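If it helps, here is a rough sketch of how those checks could sit in your GitHub Actions workflow as a step before the sync loop (yq on the runner and the file path under env are assumptions, not your actual values):
- name: Sanity-check prefect file before deploy
  env:
    PREFECT_FILE: environments/dev/example/forecaster.yaml  # placeholder; use the file your loop passes to sync-file.sh
  run: |
    set -euo pipefail
    echo "Deploying with prefect file: $PREFECT_FILE"
    test -f "$PREFECT_FILE" || { echo "File not found"; exit 1; }
    echo "Build steps from $PREFECT_FILE:"
    yq -C '.build' "$PREFECT_FILE" || true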
Force-proof that build runs
- Put an intentionally noisy, fail-fast step at the top of build. If deploy still jumps to an entrypoint import error without showing this, build is not running.
build:
- prefect.deployments.steps.utility.run_shell_script:
stream_output: true
script: |
set -euo pipefail
echo "BUILD START – this should print"
# exit 123 # uncomment to prove build executes before entrypoint import
- prefect.deployments.steps.pull.git_clone:
repository: <https://github.com/Intelmatix-AI/edix-forecasting-intelligence.git>
access_token: "{{ prefect.blocks.secret.ghb-tkn }}"
branch: 1.5.0
id: clone
- prefect.deployments.steps.utility.run_shell_script:
stream_output: true
script: |
set -euo pipefail
echo "BUILD PWD: $(pwd)"
echo "Cloned dir is: {{ clone.directory }}"
rm -rf flows-repo
mv "{{ clone.directory }}" "flows-repo"
ls -la
python - <<'PY'
import pathlib, sys
p = pathlib.Path("flows-repo/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py")
print("Flow file exists?", p.exists(), "->", p.resolve())
if not p.exists():
sys.exit("ERROR: flow file not found during build")
PY
Common reasons build doesn’t run
- Wrong file: The file your CI passes to sync.sh isn’t the one you edited. Your logs show “syncing file: environments/dev/.../forecaster.yaml” — make sure that file contains the build block exactly as above.
- YAML structure/indentation: build: must be a top-level key (same level as name, prefect-version, pull, and deployments). The list items under it must be indented two spaces (see the minimal skeleton after this list).
- Step names: Must be fully-qualified:
- prefect.deployments.steps.pull.git_clone
- prefect.deployments.steps.pull.set_working_directory
- prefect.deployments.steps.utility.run_shell_script
- Entrypoint path vs project root: Prefect resolves entrypoint: relative to the project root (the directory of the prefect file), not relative to any cd in build. That's why we move the clone into a known folder like flows-repo/ under the project root and set:
entrypoint: flows-repo/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py:general_forecast_flow_run
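For reference, a minimal skeleton of the expected top-level shape (names, repo URL, and paths are illustrative, not your actual values):
name: my-project
prefect-version: ">=3.0.0"
build:
  - prefect.deployments.steps.pull.git_clone:
      repository: <https://github.com/your-org/your-flows-repo.git>
      id: clone
pull:
  - prefect.deployments.steps.pull.set_working_directory:
      directory: /opt/prefect
deployments:
  - name: example-deployment
    entrypoint: flows-repo/path/to/flow.py:my_flow
    schedules:
      - cron: "0 2 * * *"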
Also update schedules
- You're seeing a deprecation warning for schedule:. Rename it to schedules: and give it a list:
deployments:
- name: forecasting branch items - Synthetic
entrypoint: flows-repo/edix_forecasting_intelligence/pipelines/main_forecasting_pipeline.py:general_forecast_flow_run
schedules:
- cron: "0 2 * * *"
Extra visibility
- In sync.sh, enable debug:
export PREFECT_LOGGING_LEVEL=DEBUG
- After adding the noisy build step with stream_output: true, your CI logs should show:
- “BUILD START – this should print”
- “BUILD PWD: …”
- “Cloned dir is: …”
- “Flow file exists? True …”
If you don’t see these lines, Prefect is not reading your build block (wrong file or YAML shape).
If you can share the top of the exact file you’re deploying (from the first line through the end of the build/pull/deployments sections), I’ll sanity‑check indentation and the step paths.