Marwan Sarieddine
05/21/2020, 10:56 PM
Marked "Failed" by a Zombie Killer process.
Basically, I created a simple flow that runs fine when the input data is small. But when I increase the size of the input data (while staying well below the memory limit of the Dask worker), parts of the flow run, while other tasks get marked as "Failed" by a zombie killer process.
Also, the entire flow's status gets stuck at "Running" even though the task has been marked as "Failed".
I am sharing my flow's code and Dask worker spec under this issue (https://github.com/PrefectHQ/prefect/issues/1954) to give a concrete example of the failure.