# ask-community
f
I have created a single flow that I want to run against different environments at the same time. For example, I have dev, preprod, and prod environments. Based on the given environment I can pass in the parameters that have the correct values (usernames, passwords, etc.). How do I do this with only one flow ... how do I register that flow with different parameters? Is this possible?
k
Hi @Felipe Saldana! This is certainly a known pain point. Some people automate the registering of a flow in their CI/CD pipeline and push the flow across different environments that way. Some users on the enterprise plan take advantage of having multiple tenants and structure it that way.
You can also register with different parameters by using Schedules. Parameters can be attached to schedules, and then you can set up a schedule for each environment.
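For the CI/CD route mentioned above, here is a minimal sketch of what the per-environment registration step can look like, assuming Prefect 0.x/1.x; the module name my_flows, the TARGET_ENV variable, and the project/label naming are made up for illustration:
```python
import os

# The single flow definition, assumed to live in a hypothetical my_flows.py
from my_flows import flow

# Environment name supplied by the CI/CD pipeline, e.g. "dev", "preprod", "prod"
env = os.environ["TARGET_ENV"]

# Register the same flow once per environment, into an environment-specific
# project and with a label that routes runs to that environment's agent.
flow.register(
    project_name=f"data-{env}",
    labels=[env],
)
```
The pipeline would run this step once per target environment, with that environment's credentials and backend configuration in place.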
f
Can schedules be set at the same time? This flow happens to be of short duration, but we have some that take many hours, and it would be difficult to stagger them and still have a consistent experience across environments.
k
I was kind of wrong earlier. One schedule is composed of multiple clocks. Parameters are attached to clocks. See: https://docs.prefect.io/core/concepts/schedules.html#varying-parameter-values
f
OK, I see ... but why offset the time? Do I need to stagger the times?
```python
# from the linked docs example; the remaining arguments are truncated in the original message
clock2 = clocks.IntervalClock(start_date=now + datetime.timedelta(seconds=30), ...)
```
z
You don't need to stagger the times; your schedules can overlap
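To make that concrete, here is a minimal sketch of a single schedule with one clock per environment, assuming Prefect 0.x/1.x; the parameter names and values (env, username, ...) and the hourly interval are placeholders, not anything from the thread:
```python
import pendulum
from datetime import timedelta
from prefect.schedules import Schedule
from prefect.schedules.clocks import IntervalClock

now = pendulum.now("UTC")

# Hypothetical per-environment parameter values; real credentials would come
# from Secrets or config rather than being hard-coded here.
dev_params = {"env": "dev", "username": "dev_user"}
preprod_params = {"env": "preprod", "username": "preprod_user"}

# One clock per environment. Both clocks share the same start date and
# interval, so the runs fire at the same time with different parameters.
schedule = Schedule(clocks=[
    IntervalClock(interval=timedelta(hours=1), start_date=now,
                  parameter_defaults=dev_params),
    IntervalClock(interval=timedelta(hours=1), start_date=now,
                  parameter_defaults=preprod_params),
])

# Attach the schedule to the existing flow, e.g.
#   with Flow("data_flow_multi", schedule=schedule) as flow:
#       ...
# and register the flow once; each clock then produces runs for its environment.
```
Note that the parameter_defaults keys need to match Parameter names defined in the flow.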
👍 1
f
Thanks! I will give this a try tomorrow AM
@Kevin Kho @Zanie I tested setting the clocks with different params at the same time and that appeared to work. A problem is that I really don't know which scheduled (clock) run points at which environment without digging through the logs. On a similar note, I tested out StartFlowRun like the below.
```python
from prefect import Flow
from prefect.tasks.prefect import StartFlowRun

# qa1_params / qa2_params are environment-specific parameter dicts defined elsewhere
data_qa1 = StartFlowRun(flow_name="data_flow_multi", project_name="FelipeFirst", wait=True, parameters=qa1_params)
data_qa2 = StartFlowRun(flow_name="data_flow_multi", project_name="FelipeFirst", wait=True, parameters=qa2_params)

with Flow("data_qa1_flow") as data_qa1_flow:
    data_qa1()

with Flow("data_qa2_flow") as data_qa2_flow:
    data_qa2()
```
This gives me a descriptive name and also works when kicking off manually around the same time. The logs have a link to the underlying flow, which is helpful but not entirely ideal:
Flow Run: https://cloud.prefect.io/enverus-data/flow-run/4e8b06ab-83d5-4e9c-ad48-d3926514be76
1. Are there any plans, or is there already a way, to have the underlying flow's logs expanded directly in the outer flow's log?
2. What are your thoughts on what I have, using StartFlowRun vs. a clock, to point a single base flow (data_flow_multi) at multiple environments?
z
1. Yes, I'm working on this currently (https://github.com/PrefectHQ/prefect/pull/4563). 2. Either of these seems pretty reasonable; whichever is easier for you to work with makes sense.
👍 1
f
This is good news on 1. ... thanks for the update. I am leaning towards using the StartFlowRun pattern at the moment