# ask-community
a
Hi all! Does anyone have a good solution for aggregating worker logs when executing with a DaskExecutor + external cluster? When running with LocalExecutor, it's nice to see all the logs for each task printed to stdout of wherever I'm running it. When running with an external Dask cluster, I need to go into each worker's logs to see the logs for the task that the worker receives. It would be nice to still get a global sense of how all the tasks are running.
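(For context, a minimal sketch of the setup being described, assuming Prefect 0.x where `DaskExecutor` lives in `prefect.executors` and accepts an external scheduler address; the address below is a placeholder:)

```python
from prefect import Flow, task
from prefect.executors import DaskExecutor

@task
def say_hello():
    # Task-level output goes to whichever Dask worker runs this task,
    # not to the stdout of the process calling flow.run()
    print("hello from a worker")

with Flow("example") as flow:
    say_hello()

# Point the executor at an already-running (external) Dask cluster
flow.run(executor=DaskExecutor(address="tcp://dask-scheduler:8786"))
```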
Jim Crist-Harif
If you're running with Prefect Cloud/Prefect Server, the task logs are all sent to the backend, allowing for uniform debugging. If running locally (with `flow.run()`) you'd need to manage logs yourself. You might find the `get_worker_logs` method on `dask.distributed.Client` useful. Many of dask's cluster managers also have `get_logs` methods.
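(A minimal sketch of pulling worker logs back from an external cluster with `dask.distributed.Client.get_worker_logs`, which returns a dict mapping each worker's address to its recent (level, message) log records; the scheduler address is a placeholder:)

```python
from dask.distributed import Client

# Connect to the external cluster's scheduler
client = Client("tcp://dask-scheduler:8786")

# get_worker_logs() returns {worker_address: ((level, message), ...)}
for worker, logs in client.get_worker_logs().items():
    print(f"=== {worker} ===")
    for level, message in logs:
        print(f"[{level}] {message}")
```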
a
I see, thanks @Jim Crist-Harif. I'll look into grabbing the logs manually from the cluster I'm using
s
@Jim Crist-Harif I posted a related issue here https://prefect-community.slack.com/archives/CL09KU1K7/p1616957545069800 before finding this one. Any ideas on why my task logs from the Dask worker nodes might not be appearing in Prefect Cloud?