# ask-marvin
@Marvin I have a job that is scheduled from the Prefect UI and it doesn’t take parameters. This job generates a value internally (a snapshot_date). I want to pass this value to the next job using automations. Right now I’m considering embedding the value in the flow run name and then using Jinja templating + string parsing to extract it for downstream jobs, but that feels fragile. What’s the best pattern in Prefect for passing internally-generated values between scheduled jobs when parameters aren’t available? Should I be using tags, artifacts, or some other approach?
thought for 596.0 seconds
You can emit custom events from your flow runs at any time, carrying arbitrary data in their payloads. Automations can then trigger on those events.
You’re right to be wary of encoding data in the flow run name: that’s brittle and hard to evolve. In Prefect 3.x, the two most robust patterns for passing a runtime value like `snapshot_date` between independently scheduled flows (via Automations) are:

**1) Event → Automation → Run Deployment with parameters (preferred)**
- Have the producer flow emit an event that includes the value in its payload.
- Create an Automation that triggers on that event and runs the downstream deployment, templating the value from the event into the downstream’s parameters.

Producer flow:
from prefect import flow
from prefect.events.utilities import emit_event

@flow
def producer():
    # compute your value
    snapshot_date = "2025-10-01"
    # emit a custom event with the value in the payload
    emit_event(
        event="myapp.snapshot.created",
        resource={"prefect.resource.id": f"snapshot:{snapshot_date}"},
        payload={"snapshot_date": snapshot_date},
    )
Automation (UI):
- Trigger: Event trigger on `myapp.snapshot.created`
- Action: Run Deployment
- Parameters (JSON):
{
  "snapshot_date": "{{ event.payload.snapshot_date }}"
}
Downstream flow:
from prefect import flow

@flow
def consumer(snapshot_date: str):
    # use snapshot_date
    ...
Notes and links:
- This keeps your value explicit, typed, and auditable via events.
- Events utilities: Event-driven flows with Prefect
- Artifacts also emit events you can trigger on if you prefer that surface: Artifacts API

**2) If you truly cannot add a parameter to the downstream: pass via job variables**
- Run Deployment actions can set job variables.
- The downstream flow can read them at runtime via `prefect.runtime.flow_run.job_variables`.

Automation (UI):
- Trigger: same as above (your custom event, or `artifact.created` if you go the artifact route)
- Action: Run Deployment
- Job variables (JSON):
{
  "SNAPSHOT_DATE": "{{ event.payload.snapshot_date }}"
}
Downstream flow:
from prefect import flow
from prefect.runtime import flow_run

@flow
def consumer():
    snapshot_date = flow_run.job_variables.get("SNAPSHOT_DATE")
    ...
**Other options and tradeoffs**
- Artifacts: You can create an artifact containing the value and trigger on `artifact.created`, then template fields from the event into parameters/job variables. This works well if you already publish artifacts, but you’ll want to ensure the artifact’s event payload reliably contains the value you need. API: Artifacts API
- Variables/Blocks: Persist to a “latest” location and have the downstream read from it (e.g., write `latest_snapshot_date` to a custom Block or external store). This is simple, but “last write wins” can be racy if multiple producers run concurrently; it’s also less traceable than the event approach. Blocks overview: Blocks API

**What I’d recommend**
- If you can, add an optional `snapshot_date` parameter to the downstream and use the event → automation → parameter pattern. It’s the cleanest and most maintainable.
- If you can’t modify the downstream signature, use job variables as a transport and read them via `prefect.runtime.flow_run.job_variables` in the flow.

Quick questions to tailor this: Are you able to update the downstream flow to accept an optional parameter? And do you prefer emitting a small custom event from the producer, or reusing an `artifact.created` event from something you already generate?