Hi all, I'm creating a dask-kubernetes cluster within a task (not using DaskExecutor). I'd like to pass the cluster around to other tasks, but it isn't serializable with cloudpickle. As a workaround, I'm passing the cluster.scheduler_address around, and I'm able to create a client to the cluster by running client = Client(scheduler_address). I want to be sure I shut down the cluster in an ending task which always runs, since I've seen the scheduler stay up when an error is thrown during the flow, but I can't find a way to do that directly from the client object. I believe I can only shut down the cluster with the original cluster object, which I'm not sure how to recover. Have y'all seen this, and if so, do you have any suggestions for workarounds?
Kevin Kho
06/09/2021, 2:26 PM
Hey @Adam Lewis, not 100% sure, but does client.shutdown() work for your use case?
Adam Lewis
06/09/2021, 2:27 PM
Oh my 🤦, yes, by looking at the documentation again, it does look like it will work. Thank you for the help 🎉
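For anyone finding this thread later, a sketch of the ending task under the same assumptions: distributed's Client.shutdown() tears down the connected scheduler and workers (not just the local connection), and Prefect 0.x triggers can make the task run even when an upstream task fails.

```python
from dask.distributed import Client
from prefect import task
from prefect.triggers import all_finished

@task(trigger=all_finished)  # run this task even if upstream tasks errored
def teardown_cluster(scheduler_address: str) -> None:
    # Reconnect from the address alone, then shut down the whole cluster.
    client = Client(scheduler_address)
    client.shutdown()  # stops the scheduler and its workers
```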