Tsang Yong
03/23/2021, 12:19 AM
cluster = KubeCluster.from_yaml(dask_worker_spec_file_path)
cluster.adapt(minimum=1, maximum=10)
executor = DaskExecutor(cluster.scheduler_address)
state = flow.run(executor=executor)
but when I try to access the state, I'm getting this:
Python 3.8.6 (default, Dec 11 2020, 14:38:29)
Type 'copyright', 'credits' or 'license' for more information
IPython 7.21.0 -- An enhanced Interactive Python. Type '?' for help.
In [1]: state
Out[1]: <Failed: "Unexpected error: TypeError('Could not serialize object of type Failed.
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
TypeError: cannot pickle '_thread.RLock' object

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 307, in serialize
    header, frames = dumps(x, context=context) if wants_context else dumps(x)
  File "/usr/local/lib/python3.8/site-packages/distributed/protocol/serialize.py", line 58, in pickle_dumps
    frames[0] = pickle.dumps(
  File "/usr/local/lib/python3.8/site-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/usr/local/lib/python3.8/site-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/usr/local/lib/python3.8/site-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
TypeError: cannot pickle '_thread.RLock' object
')">
Any idea what I'm doing wrong?

Tsang Yong
03/23/2021, 12:20 AM
cloudpickle==1.6.0
dask==2021.3.0
dask-kubernetes==2021.3.0
prefect==0.14.12
Chris White
Tsang Yong
03/23/2021, 12:50 AM
Chris White
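The reply content isn't captured in this log. As general context for this class of error (assuming the unpicklable lock travels inside a task's return value), a common pattern is to keep unpicklable handles local to the task body and return only plain values that Dask can serialize; a sketch with hypothetical names:

```python
# Hypothetical sketch: keep unpicklable handles (locks, clients, connections)
# local to the task body so only plain, picklable values cross the Dask
# serialization boundary.
import pickle
import threading

def fetch_data():
    lock = threading.RLock()           # lives and dies inside the task
    with lock:
        payload = {"rows": [1, 2, 3]}
    return payload                     # plain dict: safe to pickle

result = fetch_data()
pickle.dumps(result)                   # succeeds, unlike an object holding the lock
print(result)                          # {'rows': [1, 2, 3]}
```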