# prefect-server
a
Hello guys, does anyone have a problem with displaying logs from custom libs (they use the standard Python logger) in Prefect Cloud? If I run my flows using server as the backend, all logs are displayed correctly (both Prefect logs and my custom logs).
k
Hi @Aleksandr Liadov, I assume you added the extra loggers, right? Do you have the same RunConfig and Executor for both runs?
a
"PREFECT__LOGGING__LEVEL": "DEBUG",
"PREFECT__LOGGING__FORMAT": "[%(asctime)s] %(levelname)s - %(name)s | %(message)s",
"PREFECT__LOGGING__EXTRA_LOGGERS": "['lib1', 'lib2']"
I provide the following env variables for my flows.
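For context, a minimal sketch of how these variables might be attached to a flow in Prefect 1.x; the flow name is hypothetical and the env block mirrors the variables above:

```python
from prefect import Flow
from prefect.run_configs import KubernetesRun

with Flow("example-flow") as flow:  # hypothetical flow name
    pass

# Env vars mirror the ones quoted above; extra loggers are passed
# as a string-encoded list of logger names.
flow.run_config = KubernetesRun(
    env={
        "PREFECT__LOGGING__LEVEL": "DEBUG",
        "PREFECT__LOGGING__FORMAT": "[%(asctime)s] %(levelname)s - %(name)s | %(message)s",
        "PREFECT__LOGGING__EXTRA_LOGGERS": "['lib1', 'lib2']",
    }
)
```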
No, for Prefect Cloud we use KubernetesRun, and for my local tests LocalRun.
k
And are both using a local executor?
a
For LocalRun, LocalDaskExecutor, and for KubernetesRun, DaskExecutor.
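To make the two setups concrete, here is a sketch of the combinations described above, assuming Prefect 1.x imports and a hypothetical scheduler address:

```python
from prefect import Flow
from prefect.executors import DaskExecutor, LocalDaskExecutor
from prefect.run_configs import KubernetesRun, LocalRun

with Flow("example-flow") as flow:  # hypothetical flow name
    pass

# Local test: runs on one machine, LocalDaskExecutor uses
# dask with threads/multiprocessing.
flow.run_config = LocalRun()
flow.executor = LocalDaskExecutor()

# Prefect Cloud: tasks fan out to distributed Dask workers.
# flow.run_config = KubernetesRun()
# flow.executor = DaskExecutor(address="tcp://dask-scheduler:8786")  # hypothetical
```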
k
I think this is expected, because DaskExecutor uses distributed, whereas LocalDaskExecutor uses dask and multiprocessing. distributed on a Dask cluster does not shuffle logs between workers and the scheduler. This happens with or without Prefect, so you need some external service to ship the logs to a centralized location. The question here would be whether you see those logs in the worker logs.
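One way to inspect the worker logs without leaving Python, assuming you can reach the Dask scheduler, is distributed's client API (the scheduler address is hypothetical):

```python
from distributed import Client

# Connect to the existing Dask cluster (hypothetical scheduler address).
client = Client("tcp://dask-scheduler:8786")

# get_worker_logs() returns a dict keyed by worker address, each value a
# sequence of (level, message) pairs; custom-library records should show
# up here even though they never reach the Prefect Cloud log stream.
for worker, logs in client.get_worker_logs().items():
    print(worker)
    for level, message in logs:
        print(f"  [{level}] {message}")
```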
a
Thanks @Kevin Kho, I’ll check the worker logs!
@Jean-David Fiquet
Hello @Kevin Kho, after several tests I can see my custom logs in the worker logs, so that works. I tried a very naive approach of declaring
"PREFECT__LOGGING__EXTRA_LOGGERS": "['distributed']"
, but that doesn’t work (as you said, the “Dask cluster does not shuffle logs between workers and the scheduler”). Do you know whether anyone has tried to solve a similar problem?
k
So the reason that doesn’t work is that the logger gets serialized and then sent to the workers. When it gets deserialized, it loses its configuration. I think you may find more people asking about this in the Dask GitHub issues/channels.
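One workaround, if you control the cluster, is to build the logging configuration inside each worker process at startup via a distributed WorkerPlugin, so it never has to survive serialization. A minimal sketch; the handler, format, and scheduler address are assumptions:

```python
import logging

from distributed import Client
from distributed.diagnostics.plugin import WorkerPlugin


class ExtraLoggerPlugin(WorkerPlugin):
    """Configure library loggers inside each worker process at startup,
    so the configuration never has to survive serialization."""

    def setup(self, worker):
        # Assumed handler/format; mirrors the Prefect log format above.
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter(
            "[%(asctime)s] %(levelname)s - %(name)s | %(message)s"
        ))
        for name in ("lib1", "lib2"):  # the custom libs mentioned above
            logger = logging.getLogger(name)
            logger.setLevel(logging.DEBUG)
            logger.addHandler(handler)


client = Client("tcp://dask-scheduler:8786")  # hypothetical address
client.register_worker_plugin(ExtraLoggerPlugin())
```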
a
@Côme Arvis