Peter Peter
02/09/2022, 2:28 PM

Kevin Kho

Peter Peter
02/09/2022, 2:29 PM

Kevin Kho

Peter Peter
02/09/2022, 2:38 PM

pod_spec_kwargs = {'memory_limit': '4G'}
return DaskExecutor(
    cluster_class=lambda: KubeCluster(make_pod_spec(image=pod_spec_image, **pod_spec_kwargs)),
    adapt_kwargs={
        "minimum": min_workers,
        "maximum": max_workers,
    },
)

Are you saying that if there is any failure in a Dask worker, then the flow will fail and there is no way around this?
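
A self-contained version of that snippet, assuming Prefect 1.x with the classic dask-kubernetes API, might look like the sketch below; build_executor is a hypothetical wrapper name, and pod_spec_image, min_workers, and max_workers are assumed to be supplied by the caller.

from dask_kubernetes import KubeCluster, make_pod_spec
from prefect.executors import DaskExecutor

def build_executor(pod_spec_image, min_workers, max_workers):
    # Hypothetical wrapper: cap each worker pod at 4G of memory.
    pod_spec_kwargs = {'memory_limit': '4G'}
    return DaskExecutor(
        # One ephemeral Dask cluster on Kubernetes per flow run.
        cluster_class=lambda: KubeCluster(
            make_pod_spec(image=pod_spec_image, **pod_spec_kwargs)
        ),
        # Scale the worker count adaptively between the two bounds.
        adapt_kwargs={"minimum": min_workers, "maximum": max_workers},
    )

In Prefect 1.x the result would then typically be attached to a flow via flow.executor = build_executor(...) or Flow(..., executor=...).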

Kevin Kho

Peter Peter
02/09/2022, 3:06 PM

Kevin Kho
pod_spec_kwargs in your snippet right?

I am not sure this will work, but maybe you can try using a trigger on the downstream task:

from prefect import task
from prefect.triggers import always_run

@task(trigger=always_run)
def something():
    return 1

so it will always run even if the upstream fails.
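
As a sketch of that workaround in a full Prefect 1.x flow (upstream and cleanup are hypothetical task names): the always_run trigger lets the downstream task execute even when its upstream task fails.

from prefect import Flow, task
from prefect.triggers import always_run

@task
def upstream():
    # Simulate a task that dies, e.g. because its Dask worker was lost.
    raise RuntimeError("boom")

@task(trigger=always_run)
def cleanup():
    # Runs regardless of the state the upstream task finished in.
    print("cleanup ran")

with Flow("always-run-demo") as flow:
    # State-only dependency, so cleanup never consumes upstream's result.
    cleanup(upstream_tasks=[upstream])

state = flow.run()  # upstream fails, but cleanup still runs

Note that in Prefect 1.x the flow's final state is judged from its reference tasks (the terminal tasks by default), so whether the run as a whole is marked Failed depends on where the failed task sits in the graph.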

Peter Peter
02/09/2022, 3:16 PM

Kevin Kho

Peter Peter
02/09/2022, 3:27 PM

Kevin Kho

Peter Peter
02/09/2022, 8:47 PM