The main options for passing snapshot_date between flows scheduled independently (via Automations) are:
1) Event → Automation → RunDeployment with parameters (preferred)
- Have the producer flow emit an event that includes the value in its payload.
- Create an Automation that triggers on that event and runs the downstream deployment, templating the value from the event into the downstream’s parameters.
Producer flow:
from prefect import flow
from prefect.events.utilities import emit_event
@flow
def producer():
    # compute your value
    snapshot_date = "2025-10-01"
    # emit a custom event with the value in the payload
    emit_event(
        event="myapp.snapshot.created",
        resource={"prefect.resource.id": f"snapshot:{snapshot_date}"},
        payload={"snapshot_date": snapshot_date},
    )
Automation (UI):
- Trigger: Event trigger on myapp.snapshot.created
- Action: Run Deployment
- Parameters (JSON):
{
  "snapshot_date": "{{ event.payload.snapshot_date }}"
}
Downstream flow:
from prefect import flow
@flow
def consumer(snapshot_date: str):
    # use snapshot_date
    ...
Notes and links:
- This keeps your value explicit, typed, and auditable via events.
- Events utilities: see the "Event-driven flows with Prefect" docs.
- Artifacts also emit events you can trigger on if you prefer that surface; see the Artifacts API docs.
2) If you truly cannot add a parameter to the downstream: pass via job variables
- RunDeployment actions can set job variables.
- The downstream flow can read them at runtime via prefect.runtime.flow_run.job_variables.
Automation (UI):
- Trigger: Same as above (your custom event or an artifact.created if you go the artifact route)
- Action: Run Deployment
- Job variables (JSON):
{
  "SNAPSHOT_DATE": "{{ event.payload.snapshot_date }}"
}
Downstream flow:
from prefect import flow
from prefect.runtime import flow_run
@flow
def consumer():
    snapshot_date = flow_run.job_variables.get("SNAPSHOT_DATE")
    ...
Other options and tradeoffs
- Artifacts: You can create an artifact containing the value and trigger on artifact.created, then template fields from the event into parameters/job variables. This works well if you already publish artifacts, but you’ll want to ensure the artifact’s event payload reliably contains the value you need (see the Artifacts API docs). A sketch of the producer side follows this list.
- Variables/Blocks: Persist to a “latest” location and have the downstream read from it (e.g., write latest_snapshot_date to a custom Block or external store). This is simple, but “last write wins” can be racy if multiple producers run concurrently; it’s also less traceable than the event approach (see the Blocks docs for an overview). A sketch follows this list.
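If you go the artifact route, here’s a minimal sketch of the producer side, assuming prefect.artifacts.create_markdown_artifact and an illustrative key name "snapshot-date" (check the event feed to confirm the artifact’s event payload carries the field you plan to template):
from prefect import flow
from prefect.artifacts import create_markdown_artifact

@flow
def producer():
    snapshot_date = "2025-10-01"
    # Creating the artifact emits an artifact-created event that an
    # Automation can trigger on, just like the custom event above.
    create_markdown_artifact(
        key="snapshot-date",
        markdown=f"Latest snapshot: {snapshot_date}",
        description="Most recent snapshot date produced by this flow",
    )
And for the Variables option, a sketch assuming Prefect 3.x’s Variable.set/Variable.get (the name latest_snapshot_date is arbitrary); keep the last-write-wins caveat above in mind:
from prefect import flow
from prefect.variables import Variable

@flow
def producer():
    snapshot_date = "2025-10-01"
    # Last write wins: concurrent producers will overwrite each other here.
    Variable.set("latest_snapshot_date", snapshot_date, overwrite=True)

@flow
def consumer():
    # Reads whatever value was written most recently
    snapshot_date = Variable.get("latest_snapshot_date")
    ...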
What I’d recommend
- If you can, add an optional snapshot_date parameter to the downstream and use the event → automation → parameter pattern. It’s the cleanest and most maintainable.
- If you can’t modify the downstream signature, use job variables as a transport and read them via prefect.runtime.flow_run.job_variables in the flow.
Quick question to tailor this: Are you able to update the downstream flow to accept an optional parameter? And do you prefer emitting a small custom event from the producer or reusing an artifact-created event from something you already generate?