Hey
@Joe Schmid, I believe the answer is a bit tricky here, and we need to go over a couple of concepts.
First, you can use a Dask cluster either as a resource manager or as an executor. If you're using it as a resource manager, things are easier.
If you are using it as an executor, I think you need to create a function that builds the executor and assign its result:
```python
import prefect

def create_dask_executor():
    # pull runtime parameters out of the Prefect context
    params = prefect.context.get("parameters")
    # ... build your cluster/executor spec from params ...
    return spec

flow.executor = create_dask_executor()
```
This is possible because the executor is not serialized with the flow; it is recreated when the flow is loaded from storage. Since that happens outside the flow-run context, though, I don't think you can pass parameters in directly.
On number 2, the RunConfig does not take an executor. Clocks carry parameters, and a schedule is made up of multiple clocks. So you can attach your parameters (and maybe vary your executor) that way, but it won't live in the RunConfig.