An Hoang
10/21/2019, 9:47 PM
[2019-10-21 21:17:43,141] INFO - prefect.FlowRunner | Beginning Flow run for 'dask-example'
[2019-10-21 21:17:43,144] INFO - prefect.FlowRunner | Starting flow run.
[2019-10-21 21:17:43,145] ERROR - prefect.FlowRunner | Unexpected error: ValueError("Unexpected keyword arguments: ['processes', 'silence_logs']")
Traceback (most recent call last):
  File "/lab/corradin_data/FOR_AN/anaconda3/envs/learn-new-pkgs/lib/python3.7/site-packages/prefect/engine/runner.py", line 48, in inner
    new_state = method(self, state, *args, **kwargs)
  File "/lab/corradin_data/FOR_AN/anaconda3/envs/learn-new-pkgs/lib/python3.7/site-packages/prefect/engine/flow_runner.py", line 393, in get_flow_run_state
    with executor.start():
  File "/lab/corradin_data/FOR_AN/anaconda3/envs/learn-new-pkgs/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/lab/corradin_data/FOR_AN/anaconda3/envs/learn-new-pkgs/lib/python3.7/site-packages/prefect/engine/executors/dask.py", line 74, in start
    self.address, processes=self.local_processes, **self.kwargs
  File "/lab/corradin_data/FOR_AN/anaconda3/envs/learn-new-pkgs/lib/python3.7/site-packages/distributed/client.py", line 649, in __init__
    "Unexpected keyword arguments: {}".format(str(sorted(kwargs)))
ValueError: Unexpected keyword arguments: ['processes', 'silence_logs']
[2019-10-21 21:17:43,149] ERROR - prefect.Flow: dask-example | Unexpected error occured in FlowRunner: ValueError("Unexpected keyword arguments: ['processes', 'silence_logs']")
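Based on the traceback, the executor's start() forwards processes plus its extra keyword arguments (here silence_logs) straight into distributed.Client, and the installed distributed version rejects unknown keyword arguments. A minimal sketch of the failing call, using a hypothetical scheduler address:

# Minimal sketch of the call that fails, per the traceback above.
# The address is hypothetical; processes/silence_logs are the rejected kwargs.
from distributed import Client

client = Client(
    "tcp://10.0.0.1:8786",   # hypothetical scheduler address
    processes=False,         # forwarded by the executor as local_processes
    silence_logs=False,      # forwarded from the executor's extra kwargs
)
# Raises: ValueError: Unexpected keyword arguments: ['processes', 'silence_logs']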
conda versions:
# Name          Version   Build   Channel
dask            2.6.0     py_0    conda-forge
dask-core       2.6.0     py_0    conda-forge
dask-glm        0.2.0     py_1    conda-forge
dask-jobqueue   0.6.3     py_0    conda-forge
dask-ml         1.0.0     py_1    conda-forge
prefect         0.6.6     py_0    conda-forge
cluster configuration:
from dask_jobqueue import LSFCluster

cluster = LSFCluster(
    queue='all_corradin',
    cores=47,
    # processes=2,
    walltime='5000:00',
    memory='250GB',
    local_directory="/tmp",
    job_extra=['-o /dev/null', '-e /dev/null'],
    scheduler_port=8786,
    worker_dashboard_address=8788,
)
cluster.scale(17)
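For reference, a common way to attach Prefect 0.6.x's DaskExecutor to an existing dask-jobqueue cluster is to point it at the cluster's scheduler address. This is only a sketch of an assumed setup; the thread does not show how the executor was actually constructed here:

# Sketch only (assumed setup, not necessarily what was run in this thread):
# attach the Prefect DaskExecutor to the already-running LSFCluster scheduler.
from prefect.engine.executors import DaskExecutor

executor = DaskExecutor(address=cluster.scheduler_address)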
Chris White
dask-distributed
this issue will go away
An Hoang
10/21/2019, 10:04 PM
result = flow.run(executor=executor)
result.result
{<Task: 2>: <Success: "None">,
<Task: 14>: <Success: "None">,
<Task: 17>: <Success: "None">,
<Task: 21>: <Success: "None">,
<Task: 20>: <Success: "None">,
<Task: List>: <Success: "Task run succeeded.">,
<Task: 3>: <Success: "None">,
<Task: 13>: <Success: "None">,
<Task: 6>: <Success: "None">,
<Task: run_script>: <Mapped: "Mapped tasks submitted for execution.">,
<Task: 10>: <Success: "None">,
<Task: 16>: <Success: "None">,
<Task: 11>: <Success: "None">,
<Task: 19>: <Success: "None">,
<Task: 4>: <Success: "None">,
<Task: 5>: <Success: "None">,
<Task: 9>: <Success: "None">,
<Task: 12>: <Success: "None">,
<Task: 7>: <Success: "None">,
<Task: 8>: <Success: "None">,
<Task: 15>: <Success: "None">,
<Task: 1>: <Success: "None">,
<Task: 18>: <Success: "None">}
and I don't know where to go from here.
My task is:
import papermill
from prefect import task, Flow

@task(name="run_script")
def run_MS_analysis(chr_num):
    papermill.execute_notebook(
        "MS_HWE_QC_input.ipynb",
        f"MS_HWE_QC_chr{chr_num}.ipynb",
        parameters={"chr_num": chr_num},
    )

with Flow("dask-example") as flow:
    run_MS_analysis.map(list(range(1, 23)))
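One hedged way to dig further, grounded in the state dictionary shown above: the object returned by flow.run is a State whose .result maps each Task to its final State, so the Mapped parent for run_script can be inspected directly. Whether that parent also exposes the per-chromosome child states (e.g. via a map_states attribute) depends on the Prefect version and is an assumption here.

# Sketch: iterate the task -> state mapping shown above.
flow_state = flow.run(executor=executor)

print(flow_state.is_successful())          # overall flow outcome
for task_obj, task_state in flow_state.result.items():
    print(task_obj.name, task_state)       # "run_script" maps to the Mapped parent state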
Chris White