Michael Hadorn
06/11/2021, 9:06 AM
RuntimeError:
An attempt has been made to start a new process before the
current process has finished its bootstrapping phase.
This probably means that you are not using fork to start your
child processes and you have forgotten to use the proper idiom
in the main module:
    if __name__ == '__main__':
        freeze_support()
...
The "freeze_support()" line can be omitted if the program
is not going to be frozen to produce an executable.
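[Editor's note: for reference, a minimal sketch of the idiom the error message refers to. On spawn-based platforms (Windows, and macOS on recent Python versions) each child process re-imports the main module, so any code that starts workers must sit behind the `__main__` guard. This is a generic `multiprocessing` example, not Prefect-specific.]

```python
import multiprocessing as mp

def work(x):
    return x * x

def main():
    # If this ran at import time, every spawned child would re-execute it
    # while re-importing the module and raise the RuntimeError shown above.
    with mp.Pool(2) as pool:
        print(pool.map(work, [1, 2, 3]))  # prints [1, 4, 9]

if __name__ == "__main__":
    main()
```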
I build my flow on-the-fly, and I see that this flow is built in every subprocess.

Zanie
06/11/2021, 2:20 PM
if __name__ == '__main__': block

Michael Hadorn
06/11/2021, 2:28 PM
Unexpected error: TypeError("cannot pickle 'sqlalchemy.cprocessors.UnicodeResultProcessor' object")
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
_pickle.PicklingError: Can't pickle <function Tasks.join_checks at 0x7fb04870c040>: it's not the same object as cdwhprefect.model.entity_transformation_impl.join_checks.Tasks.join_checks

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/prefect/engine/runner.py", line 48, in inner
    new_state = method(self, state, *args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/prefect/engine/flow_runner.py", line 618, in get_flow_run_state
    task_states[task] = executor.submit(
  File "/usr/local/lib/python3.8/site-packages/prefect/executors/dask.py", line 393, in submit
    fut = self.client.submit(fn, *args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/distributed/client.py", line 1586, in submit
    futures = self._graph_to_futures(
  File "/usr/local/lib/python3.8/site-packages/distributed/client.py", line 2554, in _graph_to_futures
    dsk = dsk.__dask_distributed_pack__(self, keyset)
  File "/usr/local/lib/python3.8/site-packages/dask/highlevelgraph.py", line 955, in __dask_distributed_pack__
    "state": layer.__dask_distributed_pack__(
  File "/usr/local/lib/python3.8/site-packages/dask/highlevelgraph.py", line 392, in __dask_distributed_pack__
    dsk = toolz.valmap(dumps_task, dsk)
  File "/usr/local/lib/python3.8/site-packages/toolz/dicttoolz.py", line 83, in valmap
    rv.update(zip(d.keys(), map(func, d.values())))
  File "/usr/local/lib/python3.8/site-packages/distributed/worker.py", line 3560, in dumps_task
    d["kwargs"] = warn_dumps(task[3])
  File "/usr/local/lib/python3.8/site-packages/distributed/worker.py", line 3572, in warn_dumps
    b = dumps(obj, protocol=4)
  File "/usr/local/lib/python3.8/site-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/usr/local/lib/python3.8/site-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/usr/local/lib/python3.8/site-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
TypeError: cannot pickle 'sqlalchemy.cprocessors.UnicodeResultProcessor' object
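[Editor's note: the final TypeError means a live SQLAlchemy object ended up in a task's arguments or closure, and cloudpickle cannot serialize it for the Dask worker. A minimal illustration of the general problem and the usual workaround, using an open file handle as a stand-in for any live resource:]

```python
import os
import pickle

# Stand-in for any object wrapping a live OS resource (a DB connection,
# cursor, or result processor): an open file handle is not picklable either.
handle = open(os.devnull)
try:
    pickle.dumps(handle)
except TypeError as exc:
    print(f"not picklable: {exc}")
finally:
    handle.close()

# Common workaround: pass plain data (a connection URL, a file path) into
# the task and open the resource inside it, so it never gets serialized.
def read_all(path):
    with open(path) as f:  # created inside the task, never pickled
        return f.read()
```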
Zanie
06/11/2021, 2:46 PM

Michael Hadorn
06/11/2021, 2:52 PM

Zanie
06/11/2021, 2:55 PM
Task class so nevermind.

Michael Hadorn
06/11/2021, 3:04 PM

Zanie
06/11/2021, 3:08 PM

Michael Hadorn
06/11/2021, 3:14 PM