Hi there 🙂 One question about dynamic flows (built at runtime) and Prefect Server/Cloud. Our ETL process contains entities to load (stored in the database). The list of these entities can change over time (due to ongoing development).
How can I achieve a dynamic load that picks up all entities existing at runtime, and still manage this via Server/Cloud? I cannot use task mapping, because the entities have their own dependencies (also stored in the database), and these are resolved dynamically at runtime. Is there a way to solve this kind of problem with Prefect and the UI? I developed this with a local Dask run and an explicit run() call, and then it works, because the flow is rebuilt every time.
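To make the dependency-resolution part of the question concrete, here is a minimal stand-alone sketch of resolving entity load order at runtime. The entity names and the `load_entities_from_db` helper are hypothetical placeholders (the real list lives in the poster's database); the ordering itself uses Python's stdlib `graphlib`:

```python
from graphlib import TopologicalSorter

# Hypothetical stand-in for the entity/dependency tables in the database;
# the entity names here are illustrative, not from the original post.
def load_entities_from_db():
    # entity -> set of entities it depends on
    return {
        "customers": set(),
        "orders": {"customers"},
        "order_items": {"orders"},
        "invoices": {"orders", "customers"},
    }

def resolve_load_order(dependencies):
    """Return the entities in a valid load order (dependencies first)."""
    return list(TopologicalSorter(dependencies).static_order())

if __name__ == "__main__":
    # e.g. customers before orders, orders before order_items/invoices
    print(resolve_load_order(load_entities_from_db()))
```

In a Prefect 1.x flow built at runtime, each resolved entity would then typically become its own task, which is exactly why the registered flow's task set changes whenever the entity list changes.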
Now, via Server, I get errors when the tasks change:
KeyError: 'Task slug XX not found in the current Flow; this is usually caused by changing the Flow without reregistering it with the Prefect API.'
When I try to use the GitLab storage for this dynamic flow, I get this error:
raise ValueError("No flow found in file.")
But the flow is definitely built in this file. The same file works without errors when I use the Local storage instead. I think I have a more general problem, because my implementation of the flow is not the way it should be.
1 year ago
Hi @Michael Hadorn, when using the GitLab storage, are you also manually committing the file to the Git repo? Git-based storage does not auto-upload the flow file to the storage location at build time, like some of the other storage options do.
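For reference, a minimal sketch of what GitLab storage registration looks like in Prefect 1.x, assuming the flow file has already been committed to the repo. The repo path, file path, ref, and project name below are placeholders, not values from this thread:

```python
from prefect import Flow
from prefect.storage import GitLab

flow = Flow("etl-flow")  # placeholder flow name

flow.storage = GitLab(
    repo="my-group/my-etl-repo",  # placeholder GitLab project path
    path="flows/etl_flow.py",     # path to the committed flow file inside the repo
    ref="master",                 # branch or commit to fetch at run time
)

# Registration only records where to fetch the file; nothing is uploaded.
# The committed file must define the Flow object at module level, otherwise
# loading it at run time fails with "No flow found in file."
flow.register(project_name="my-project")
```

The key design point: with Git-based storage the repo is the source of truth, so the file Prefect fetches at run time is whatever is committed at `ref`, not whatever was on disk when you registered.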
1 year ago
Yes, the same setup is working with a simple flow.
(And I use the same repo that I used to register the flow initially.)
Is there an explanation somewhere of what exactly happens with the files in storage? Is the flow rebuilt when it is stored as a file? And what then gets registered in the UI (the same output of the flow)?