Hi all! Does anyone have a good solution for aggregating worker logs when executing with a DaskExecutor + external cluster? When running with LocalExecutor, it's nice to see all the logs for each task printed to the stdout of wherever I'm running it. When running with an external Dask cluster, I need to go into each worker's logs to see the logs for the task that worker receives. It would be nice to still get a global sense of how all the tasks are running.
Jim Crist-Harif
03/24/2021, 2:55 PM
If you're running with Prefect Cloud/Prefect Server, the task logs are all sent to the backend, allowing for uniform debugging. If running locally (with flow.run()) you'd need to manage logs yourself. You might find the get_worker_logs method on dask.distributed.Client useful. Many of Dask's cluster managers also have get_logs methods.
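A minimal sketch of how those logs might be pulled back to the machine driving the flow run: connect a dask.distributed.Client to the external cluster, run the flow, then call get_worker_logs to collect each worker's recent log records in one place. The scheduler address below is a placeholder, and the details of what each worker logged will depend on your setup.

```python
from dask.distributed import Client

# Connect to the external Dask cluster (placeholder scheduler address).
client = Client("tcp://scheduler-address:8786")

# ... run the flow against this cluster ...

# Fetch recent log records from every worker in a single call.
# Returns a dict of {worker_address: [(level, message), ...]}.
worker_logs = client.get_worker_logs()

for worker, records in worker_logs.items():
    print(f"=== {worker} ===")
    for level, message in records:
        print(f"[{level}] {message}")
```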
Aaron Richter
03/24/2021, 3:04 PM
I see, thanks @Jim Crist-Harif. I'll look into grabbing the logs manually from the cluster I'm using.