Aleksandr Liadov
12/13/2021, 5:11 PM
With server as backend, all logs are displayed correctly (both Prefect logs and my custom logs).

Kevin Kho
12/13/2021, 5:13 PM

Aleksandr Liadov
12/13/2021, 5:13 PM
I provide the following env variables to my flows, both for KubernetesRun and for my local test with LocalRun:
"PREFECT__LOGGING__LEVEL": "DEBUG",
"PREFECT__LOGGING__FORMAT": "[%(asctime)s] %(levelname)s - %(name)s | %(message)s",
"PREFECT__LOGGING__EXTRA_LOGGERS": "['lib1', 'lib2']"
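For reference, a minimal sketch of how these variables are typically attached to the run configs above (Prefect 1.x API; the flow name and the 'lib1'/'lib2' loggers are placeholders):

from prefect import Flow
from prefect.run_configs import KubernetesRun, LocalRun

logging_env = {
    "PREFECT__LOGGING__LEVEL": "DEBUG",
    "PREFECT__LOGGING__FORMAT": "[%(asctime)s] %(levelname)s - %(name)s | %(message)s",
    "PREFECT__LOGGING__EXTRA_LOGGERS": "['lib1', 'lib2']",
}

with Flow("my-flow") as flow:
    ...  # tasks go here

# Kubernetes deployment:
flow.run_config = KubernetesRun(env=logging_env)
# Local test, same variables on LocalRun:
# flow.run_config = LocalRun(env=logging_env)
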
Kevin Kho
12/13/2021, 5:17 PM

Aleksandr Liadov
12/13/2021, 5:36 PM
For LocalRun I use LocalDaskExecutor, and for KubernetesRun I use DaskExecutor.
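A minimal sketch of that pairing (continuing the flow object from the sketch above), assuming Prefect 1.x and a hypothetical Dask scheduler address:

from prefect.executors import DaskExecutor, LocalDaskExecutor

# Local test: LocalRun paired with a local Dask executor (threads/processes).
flow.executor = LocalDaskExecutor(scheduler="processes")

# Kubernetes deployment: DaskExecutor pointed at a distributed Dask cluster.
# flow.executor = DaskExecutor(address="tcp://dask-scheduler:8786")  # hypothetical address
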
Kevin Kho
12/13/2021, 5:41 PM
DaskExecutor uses distributed, whereas LocalDaskExecutor uses dask and multiprocessing. distributed on a Dask cluster does not shuffle logs between workers and the scheduler. This happens with or without Prefect, so you need some external service to ship the logs to a centralized location. The question here would be whether you see those logs in the worker logs.
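One way to check where the logs end up is to emit both a Prefect-context log and a plain library log from the same task; a minimal sketch, assuming Prefect 1.x, a placeholder 'lib1' logger, and a hypothetical Dask scheduler address:

import logging

import prefect
from prefect import Flow, task
from prefect.executors import DaskExecutor
from prefect.run_configs import KubernetesRun

@task
def log_from_both_sides():
    # Goes through Prefect's own logger attached to the task run.
    prefect.context.get("logger").info("hello from the Prefect logger")
    # Goes through a plain library logger; with DaskExecutor this record is
    # emitted on the Dask worker, so check the worker pod's output
    # (e.g. kubectl logs) if it does not show up in the Prefect UI.
    logging.getLogger("lib1").info("hello from lib1 on a Dask worker")

with Flow("log-check") as flow:
    log_from_both_sides()

flow.run_config = KubernetesRun()
flow.executor = DaskExecutor(address="tcp://dask-scheduler:8786")  # hypothetical address
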
Aleksandr Liadov
12/13/2021, 5:43 PM
I also tried "PREFECT__LOGGING__EXTRA_LOGGERS": "['distributed']", but this doesn't work (like you said, the "Dask cluster does not shuffle logs between workers and the scheduler").
Do you know whether someone has tried to resolve a similar problem?

Kevin Kho
12/14/2021, 3:45 PM

Aleksandr Liadov
12/14/2021, 4:57 PM