# ask-community
Italo Barros
Hello everyone! Just another question here. I have one Flow running on Prefect Cloud, with the agent running locally. The Flow is quite simple: it reads a .ini file and runs some data processing based on the .ini values (I'm using configparser). I noticed that if I change the values in the .ini file without registering the Flow again, it runs as if it still had the "old" .ini file, not detecting the changes made to the file. So, to properly "detect" the changes, I need to re-run the Flow and register it again on the Cloud. Here's the question: is it possible to detect changes in local files without registering the Flow again? I'm asking because I have some pipelines that need to read some CSVs and .txt files locally, and I'm afraid I'll have to re-run and register the script every day to pick up changes in those files.
Kevin Kho
Hey @Italo Barros, I think you are reading that file outside a task, so it gets loaded at Flow registration rather than at runtime. Could you somehow defer reading it by placing it inside a task?
🥰 1
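A minimal sketch of the underlying principle, using only the standard library (the Prefect API is omitted; the file contents and section/key names here are made up): a value read at call time reflects later edits to the file, while a value captured at import/registration time stays frozen.

```python
import configparser
import os
import tempfile

def read_config(path):
    # Re-parses the file on every call; analogous to reading inside a
    # task, which runs at flow *runtime* rather than at registration.
    parser = configparser.ConfigParser()
    parser.read(path)
    return parser["settings"]["threshold"]

fd, path = tempfile.mkstemp(suffix=".ini")
os.close(fd)
with open(path, "w") as f:
    f.write("[settings]\nthreshold = 1\n")

frozen = read_config(path)  # analogous to a module-level (registration-time) read

with open(path, "w") as f:
    f.write("[settings]\nthreshold = 2\n")

fresh = read_config(path)   # analogous to a runtime read inside a task
os.remove(path)
```

Here `frozen` keeps the old value while `fresh` sees the edit, which is exactly the behavior difference described above.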
Italo Barros
Hi @Kevin Kho, the file was a global variable, so you're right, thank you! Would it be better to create a Task that reads the file and returns it as an object, or to read it inside the Flow? And is it possible to pass this object between flows if I have a Flow-of-Flows?
Kevin Kho
The Flow is not deferred, so it has to be a task. You can pass the file_path as a parameter to the next flow if they all run on the same agent. You can also pass the whole value as a parameter if it fits.
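A small illustration of "passing the whole value if it fits" (standard library only; the section and key names are invented): converting the ConfigParser result into a plain dict makes it JSON-serializable, so it can travel between flows as an ordinary parameter.

```python
import configparser
import json

parser = configparser.ConfigParser()
parser.read_string("[db]\nhost = localhost\nport = 5432\n")

# ConfigParser objects don't pass cleanly between flows, but a plain
# dict of their contents does (note configparser values are strings).
config = {section: dict(parser[section]) for section in parser.sections()}

payload = json.dumps(config)   # JSON-safe, so it fits in a flow parameter
restored = json.loads(payload) # the downstream flow gets the same values
```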
Italo Barros
Got it, thank you!
Hey @Kevin Kho, sorry to disturb you again, but suppose I have a class that is responsible for interacting with an API endpoint (authentication, GET, POST, etc.). It would be better to return this class object as a Task result than to keep it as a global variable, right?
Kevin Kho
No worries! If you end up using the `DaskExecutor`, it's better to create that client inside the task, because functions and objects are shipped to Dask workers by serializing them with `cloudpickle`, and client objects and connections normally cannot be serialized. If you are running everything locally it might work, but you will get an error when serializing the Flow upon registration. You need to store the Flow as a script; see this for more details. This will execute the Flow when the script is downloaded. In general, to make this work with serialization, create the connection inside the task.
❤️ 1
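The serialization limitation described above can be seen with the standard library alone (plain `pickle` stands in here for `cloudpickle`, which shares the limitation; the socket stands in for any API client or database connection):

```python
import pickle
import socket

sock = socket.socket()  # stands in for an API client / live connection
try:
    pickle.dumps(sock)  # Dask ships objects to workers this way
    picklable = True
except TypeError:       # live connections refuse to be pickled
    picklable = False
finally:
    sock.close()
```

Creating the connection inside the task body sidesteps this, since only the function definition (which pickles fine) travels to the worker, and each worker opens its own connection at runtime.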