# prefect-community
Hi guys, need some help with DaskExecutor - the agent is running on Kubernetes and I have registered the flow on Cloud. I am mapping the result of one task into another task, and I am getting the following error:
Unexpected error: TypeError('Could not serialize object of type Success.
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
TypeError: cannot pickle 'SSLContext' object

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/distributed/protocol/serialize.py", line 258, in serialize
    header, frames = dumps(x, context=context) if wants_context else dumps(x)
  File "/usr/local/lib/python3.8/dist-packages/distributed/protocol/serialize.py", line 61, in pickle_dumps
    frames[0] = pickle.dumps(
  File "/usr/local/lib/python3.8/dist-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/usr/local/lib/python3.8/dist-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/usr/local/lib/python3.8/dist-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
TypeError: cannot pickle 'SSLContext' object
')
Can't we serialize the output of a task and use DaskExecutor to run parallel mapped tasks?
Hi @Narasimhan Ramaswamy The output of a task can definitely be used when running tasks in parallel with Dask, but I believe this error is caused by your task returning some piece of data that is not pickleable by cloudpickle (e.g. a client object from some library is a common cause).
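A minimal stdlib sketch of the failure josh describes (cloudpickle falls back to the same pickling machinery for objects like this): an `SSLContext`, which client objects commonly hold, cannot be pickled, while plain data round-trips fine. This is illustrative only, not Prefect code.

```python
import pickle
import ssl

# Client-style objects often hold an SSLContext; pickling one fails,
# which matches the error Dask raises when shipping a task result.
ctx = ssl.create_default_context()
try:
    pickle.dumps(ctx)
except TypeError as e:
    print(e)  # cannot pickle 'SSLContext' object

# Plain data serializes fine, so return data from tasks, not clients.
payload = {"rows": [1, 2, 3]}
assert pickle.loads(pickle.dumps(payload)) == payload
```

The practical fix is usually to create the client inside the task that uses it, and return only plain data (dicts, lists, DataFrames) from tasks.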
Thanks @josh - I am able to check the flow and its pickling using the Docker storage check below, but it doesn't work on Cloud. https://docs.prefect.io/core/advanced_tutorials/local-debugging.html#locally-check-your-flow-s-docker-storage
It works when I log on to the container and run a pickle load.
I have tried is_serializable as well, and it outputs True.
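One way to narrow this down, sketched with the stdlib only (the helper name `roundtrips` is hypothetical, not a Prefect API): `is_serializable` checks the flow object, which can pass even when a single task's return value still fails to pickle on a Dask worker, so check each task output individually.

```python
import pickle
import ssl

def roundtrips(obj):
    """Return True if obj survives a pickle round-trip,
    which is roughly what Dask needs to move it between workers."""
    try:
        pickle.loads(pickle.dumps(obj))
        return True
    except Exception:
        return False

# Probe each candidate task output individually.
print(roundtrips({"rows": [1, 2, 3]}))            # True
print(roundtrips(ssl.create_default_context()))    # False
```

Running each task locally and passing its return value through a check like this usually pinpoints which mapped result carries the unpicklable object.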
Also, one more question - does it need to be LocalEnvironment(executor=DaskExecutor()), or do I have to run the DaskKubernetes environment? My agent is running on Kubernetes.