# prefect-community
b
I'm running into what looks like a PYTHONPATH/import error when attempting to run on a local dask cluster: Unexpected error occured in FlowRunner: ModuleNotFoundError("No module named 'utils'"), referencing shared static functions in a package called utils. Any ideas how I might go about debugging this to make sure that code is available at the time of execution? When I look at the barf from the dask workers themselves, it looks like they're failing while trying to deserialize a task:
distributed.worker - WARNING - Could not deserialize task
Traceback (most recent call last):
  File "/Users/bmcfeeley/.virtualenvs/spark3.7/lib/python3.7/site-packages/distributed/worker.py", line 1272, in add_task
    self.tasks[key] = _deserialize(function, args, kwargs, task)
  File "/Users/bmcfeeley/.virtualenvs/spark3.7/lib/python3.7/site-packages/distributed/worker.py", line 3060, in _deserialize
    function = pickle.loads(function)
  File "/Users/bmcfeeley/.virtualenvs/spark3.7/lib/python3.7/site-packages/distributed/protocol/pickle.py", line 61, in loads
    return pickle.loads(x)
ModuleNotFoundError: No module named 'utils'
Does all the code have to live in the same file as the flow/task definitions somehow?
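A minimal way to check that, sketched below under the assumption of a local cluster at the default scheduler address, is to ask each worker directly whether it can import utils, using distributed's Client.run:

import importlib.util
import sys

from dask.distributed import Client

client = Client("tcp://127.0.0.1:8786")  # assumed scheduler address

def probe():
    # Runs inside each worker process: report whether `utils` is
    # importable there, and what the worker's import path looks like.
    return {
        "utils_found": importlib.util.find_spec("utils") is not None,
        "sys.path": sys.path,
    }

# Client.run executes the function on every worker and returns a
# {worker_address: result} mapping.
print(client.run(probe))

If utils_found comes back False for a worker, that worker's interpreter simply can't see the package.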
c
do the workers also have access to this shared module?
no, it doesn’t need to live in the same file, it just needs to be importable from the same path on the workers
b
ahh ok
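Two common ways to make the shared module importable on the workers, as a sketch (the scheduler address and the single-file utils.py layout are assumptions):

from dask.distributed import Client

client = Client("tcp://127.0.0.1:8786")  # assumed scheduler address

# Option 1: ship a single-file module to every worker, including workers
# that join the cluster later.
client.upload_file("utils.py")

# Option 2: for a real package, install it into the environment each worker
# runs in (e.g. `pip install -e .` from the project root), so `import utils`
# resolves on the workers exactly as it does locally.

For anything beyond a one-off script, option 2 is the more robust choice, since it keeps the driver and the workers running identical code.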