# ask-community
t
The documentation here shows how you would configure Dask using a gateway rather than a temporary cluster:
from dask_gateway import Gateway
from prefect.executors import DaskExecutor

gateway = Gateway()
cluster = gateway.new_cluster()
executor = DaskExecutor(
    address=cluster.scheduler_address,
    client_kwargs={"security": cluster.security},
)
flow.run(executor=executor)  # "flow" is defined elsewhere in the docs example
How is this supposed to work with flows that use Docker storage? The specific executor needs to be resolved at import time, so following the example code from the docs would mean creating a cluster at import time.
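For concreteness, here is a rough sketch of what I mean (the flow, task, and registry names are placeholders, and the import paths assume Prefect 0.14+). With Docker storage the flow module gets imported while the image is built and health-checked, so the module-level gateway calls would create a cluster right then:
from dask_gateway import Gateway
from prefect import Flow, task
from prefect.executors import DaskExecutor
from prefect.storage import Docker

@task
def say_hello():
    print("hello")

# These two lines run as soon as the module is imported, which with Docker
# storage happens while the image is built and health-checked, not at run time.
gateway = Gateway()
cluster = gateway.new_cluster()

executor = DaskExecutor(
    address=cluster.scheduler_address,
    client_kwargs={"security": cluster.security},
)

with Flow("gateway-example") as flow:
    say_hello()

flow.executor = executor
flow.storage = Docker(registry_url="my-registry.example.com")  # placeholder registry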
k
You might be able to pass dask_gateway.GatewayCluster into DaskExecutor as the cluster_class, either as an import string:
executor = DaskExecutor(
    cluster_class="dask_gateway.GatewayCluster",
    cluster_kwargs={
        "image": "prefecthq/prefect:latest",
        "n_workers": 5,
        # ... any other cluster options
    },
)
Or as the callable itself:
import dask_gateway

executor = DaskExecutor(
    cluster_class=dask_gateway.GatewayCluster,
    cluster_kwargs={
        "image": "prefecthq/prefect:latest",
        "n_workers": 5,
        # ... any other cluster options
    },
)
t
Thank you! 💪