I’m trying to dynamically name a flow following these instructions: https://github.com/PrefectHQ/prefect/discussions/3881. But is there some way to name the flow run with the labels used, or with the environment variables present in the flow run?
Kevin Kho
02/11/2022, 3:09 PM
Labels are hard. You might literally need to use GraphQL to pull the flow run information. For env variables, you might just be able to pull them with os.environ, if they’re not already in context in some way?
Josh
02/11/2022, 5:38 PM
Would that work dynamically in the method called by the state handler? Would it be the env value on the agent running the flow?
def change_name(flow, old_state, new_state):
    import os
    return os.getenv("ENV VALUE")
Kevin Kho
02/11/2022, 6:42 PM
I think the flow has a separate env, not the agent’s. You need to use the --env flag on the agent to propagate it, because in setups like Docker/Kubernetes/ECS the agent is not the execution environment.
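For reference, propagating a variable from the agent to flow runs looks roughly like this (a sketch assuming the Prefect 1.x CLI; the variable name and value are placeholders):

```shell
# Start a Prefect 1.x local agent, forwarding an env variable to every
# flow run it launches. MY_ENV_VALUE=some-value is a placeholder pair.
prefect agent local start --env MY_ENV_VALUE=some-value
```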
Kevin Kho
02/11/2022, 6:42 PM
That will only work on local
Josh
02/11/2022, 6:43 PM
I’m using the --env flag when creating the agent. My question is whether the method called by the state handler will get the env value from the agent that started the flow execution?
Kevin Kho
02/11/2022, 6:44 PM
No, because there is a distinction between the agent env and the flow env. But if you use the --env flag on the agent, then I believe the flow will be able to pull it from the flow env, because the agent sent it over when it started the flow process. Does that make sense?
Josh
02/11/2022, 6:50 PM
Right, I understand that the --env flag will get the environment variable set up on the agent and passed to the flow. I’m asking how to access that environment variable dynamically in the state handler method while the flow is running. Will os.environ work within the state handler method to get the flow’s env values?
Kevin Kho
02/11/2022, 6:57 PM
I think so, unless you are on Dask, because the handler runs as part of the task and a Dask worker may not have the env variable.
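Putting the thread together, a minimal sketch of such a state handler, assuming the variable was forwarded by the agent as discussed above (MY_ENV_VALUE is a placeholder name, and the actual run-renaming call is omitted):

```python
import os

def change_name(flow, old_state, new_state):
    # State handlers run inside the flow-run process, so os.environ here
    # reflects that process's environment: the variable is only visible if
    # the agent forwarded it (e.g. via --env) or the run infrastructure
    # (Docker/Kubernetes/ECS) set it.
    env_value = os.environ.get("MY_ENV_VALUE", "unknown")  # placeholder name
    print(f"flow-run env value: {env_value}")
    # A real handler would use env_value to rename the run (not shown) and
    # must return the state the flow should transition to.
    return new_state
```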