Narasimhan Ramaswamy
10/21/2020, 12:27 PM
Unexpected error: TypeError('Could not serialize object of type Success.')
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
TypeError: cannot pickle 'SSLContext' object

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/distributed/protocol/serialize.py", line 258, in serialize
    header, frames = dumps(x, context=context) if wants_context else dumps(x)
  File "/usr/local/lib/python3.8/dist-packages/distributed/protocol/serialize.py", line 61, in pickle_dumps
    frames[0] = pickle.dumps(
  File "/usr/local/lib/python3.8/dist-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/usr/local/lib/python3.8/dist-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/usr/local/lib/python3.8/dist-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
TypeError: cannot pickle 'SSLContext' object
Can't we serialize the output of a task and use DaskExecutor to run parallel mapped tasks?
josh
10/21/2020, 1:20 PM
Narasimhan Ramaswamy
10/21/2020, 10:44 PM
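[Editor's note] The traceback above typically arises when a task's return value holds an ssl.SSLContext (for example, an HTTPS client or session object). Dask must pickle each task's result, wrapped in Prefect's Success state, to move it between workers, and SSLContext objects cannot be pickled. Below is a minimal sketch of the usual workaround: build the non-picklable object inside the task that uses it and return only plain, picklable data. This assumes the Prefect 0.x API current at the time of this thread (the import path for DaskExecutor moved in later releases), and the URLs and task names are illustrative, not from the original conversation.

import ssl
import urllib.request

from prefect import task, Flow
from prefect.engine.executors import DaskExecutor  # prefect.executors in newer 0.x releases

@task
def make_client():
    # Reproduces the error: an ssl.SSLContext wraps OS-level state and
    # cannot be pickled, so Dask fails when shipping this result between
    # workers ("cannot pickle 'SSLContext' object").
    return ssl.create_default_context()

@task
def fetch(url: str) -> str:
    # Workaround: construct the SSLContext inside the task that needs it,
    # and return only picklable data (here, the response body as a string).
    ctx = ssl.create_default_context()
    with urllib.request.urlopen(url, context=ctx) as resp:
        return resp.read().decode()

with Flow("ssl-example") as flow:
    urls = ["https://example.com", "https://example.org"]
    pages = fetch.map(urls)  # mapped tasks run in parallel on Dask workers

flow.run(executor=DaskExecutor())

Because only the returned strings cross process boundaries, the mapped tasks remain fully parallelizable under DaskExecutor; the answer to the question above is yes, task outputs are serialized, which is exactly why a result containing an SSLContext breaks.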