Tim-Oliver
11/18/2022, 11:03 AM
I am running DaskTaskRunner workers with the current main-branch version of Prefect, which now works locally thanks to some very recent changes (thanks a lot 💐). However, I am having trouble getting the logs from tasks which are running on a DaskTaskRunner that uses dask_jobqueue.SLURMCluster. The logs from tasks are written into the slurm-output file, but they are not propagated back to the flow log or the cloud UI. Happy to test some things if that would be helpful.
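For context, here is a minimal sketch of the kind of task runner setup being described, assuming the prefect-dask collection is installed; the cluster_kwargs values below are illustrative placeholders, not the actual settings used in this thread:

```
from prefect_dask import DaskTaskRunner

# Illustrative only: a DaskTaskRunner backed by dask_jobqueue.SLURMCluster.
# cores/memory/walltime are placeholder values, not the poster's configuration.
runner = DaskTaskRunner(
    cluster_class="dask_jobqueue.SLURMCluster",
    cluster_kwargs={
        "cores": 4,
        "memory": "16GB",
        "walltime": "01:00:00",
    },
    adapt_kwargs={"minimum": 1, "maximum": 4},
)
```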
Tim Galvin
11/18/2022, 11:13 AM
I ran into the same thing. When I added an os.system("sleep 5") right before my flow finished, I got the logs I was missing.
Tim-Oliver
11/18/2022, 12:08 PM
Can confirm: with an os.system("sleep 5") at the end of the flow the logs get through. I will try a real long-running computation and see if the logs appear or if I always need the os.system("sleep 5"). Thanks for the pointer!

It also seems to be enough to end the flow with a logger.info(...) call. Then I don't even need the sleep.
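For reference, a sketch of the workaround being discussed, with placeholder names: the flow body ends with a short shell sleep (or, alternatively, a final log call), presumably giving the Dask workers a moment to forward their task logs before the flow run finishes.

```
import os

from prefect import flow, get_run_logger


@flow
def my_flow():
    logger = get_run_logger()
    # ... submit tasks to the SLURM-backed Dask task runner here ...

    # Workaround from the thread: pause briefly before the flow returns so the
    # worker logs have a chance to come through.
    os.system("sleep 5")

    # Reported alternative: simply ending the flow with a log call.
    logger.info("Flow ends!")
```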
Tim Galvin
11/18/2022, 1:51 PM

Zanie
11/18/2022, 5:35 PM

Tim Galvin
11/19/2022, 3:39 AM
Is there a better alternative to my os.system("sleep 4") approach? If I recall correctly I did try the more pythonic time.sleep(3) type of approach, but I feel like it caused some other strange error I was not pleased with. Would a sleep command block all threads from running code, or just the thread running the time.sleep?
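On the blocking question, a quick standalone check (plain Python, outside of Prefect) illustrates that time.sleep only blocks the thread that calls it; other threads keep running because sleep releases the GIL. os.system behaves similarly in this respect, since it just waits on a child shell process.

```
import threading
import time


def background_worker():
    # Keeps ticking while the main thread sleeps, showing that the sleep
    # only blocks the calling thread.
    for i in range(5):
        print(f"background thread tick {i}")
        time.sleep(1)


t = threading.Thread(target=background_worker)
t.start()

print("main thread: sleeping for 3 seconds")
time.sleep(3)  # blocks only the main thread
print("main thread: awake again")
t.join()
```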
Zanie
11/19/2022, 5:03 AM

Tim Galvin
11/19/2022, 6:27 AM

Zanie
11/19/2022, 5:45 PM

Tim Galvin
11/20/2022, 6:00 AM
Tim-Oliver
11/21/2022, 7:28 AM
```
from prefect import flow, task, get_run_logger

# `runner` is the DaskTaskRunner backed by dask_jobqueue.SLURMCluster,
# configured elsewhere in the script (see earlier in the thread).

@task()
def a_task(i):
    logger = get_run_logger()
    logger.info(f"I am logging from a task. {i}")
    logger.info("Done.")


@flow(
    name="Logger Test",
    task_runner=runner,
)
def flow():
    logger = get_run_logger()
    logger.info("Flow starts!")
    a_task.map(range(3))
    a_task.submit(4)
    logger.info("Flow ends!")
```
And all logs appear as expected. (Previously this only worked with the sleep command in place.)
Zanie
11/21/2022, 4:25 PM

Tim Galvin
11/22/2022, 1:11 AM