Hi!
I am deploying a flow that uses a custom Python package. This package has its own logger named "data-pipeline".
When I run a flow, I can see the logs from this logger in the agent's terminal, but I can't see them in the UI, even though I set the environment variable with the command
prefect config set PREFECT_LOGGING_EXTRA_LOGGERS=data-pipeline
I can also see in the Settings page of the UI that this variable is set, as in my screenshot below. But in the UI I only get log lines from
Task run 'task-name'
or
Flow run 'flow-name'
Could you help me display the "data-pipeline" package logs in the UI, please?
Thank you
Alric
06/14/2023, 4:10 PM
The flow is running in a docker container
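One thing worth checking here: `PREFECT_LOGGING_EXTRA_LOGGERS` is read by the process that actually executes the flow run, so setting it on the server machine with `prefect config set` is not enough if the flow runs in a container with its own environment. A minimal sketch of passing it into the container (the image name `my-flow-image` is hypothetical):

```shell
# The variable must exist in the flow-run container's environment, e.g.:
#   docker run -e PREFECT_LOGGING_EXTRA_LOGGERS=data-pipeline my-flow-image
# or an ENV line in the Dockerfile. The equivalent shell form, e.g. in the
# container's entrypoint script:
export PREFECT_LOGGING_EXTRA_LOGGERS=data-pipeline
echo "$PREFECT_LOGGING_EXTRA_LOGGERS"
```

The value shown in the server UI's Settings page reflects the server's own profile, not the environment of the container where the flow is running.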
Tim Galvin
06/15/2023, 1:37 AM
I had a similar weirdness. What are your compute cluster and task runner? I was using a Dask task runner on a SLURMCluster.
For the moment, the container the flow runs in is on the same machine as the Prefect 2 server, so there is no cluster, and the task runner is the default one. I'll try commenting out the line you mentioned in your ticket, but I'm not sure how I'll do that, since it happens during the build of the Docker image; it may be a bit tricky.
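As an aside on the logger name itself: with Python's standard `logging`, records emitted by child loggers (e.g. `data-pipeline.etl`) propagate up to the named package logger, so listing only the top-level name in `PREFECT_LOGGING_EXTRA_LOGGERS` should be enough to catch the whole package. A standalone sketch with plain `logging` (the handler and child-logger names are illustrative, standing in for the handler Prefect attaches to extra loggers):

```python
import logging

records = []

class CaptureHandler(logging.Handler):
    """Collects formatted messages, standing in for a log-forwarding handler."""
    def emit(self, record):
        records.append(record.getMessage())

# Attach a handler to the top-level package logger only.
pkg_logger = logging.getLogger("data-pipeline")
pkg_logger.setLevel(logging.INFO)
pkg_logger.addHandler(CaptureHandler())

# A record from a child logger propagates up to "data-pipeline"'s handler.
logging.getLogger("data-pipeline.etl").info("loaded 42 rows")
print(records)  # ['loaded 42 rows']
```

The flip side is that the name must match exactly: if the package's code creates its logger as, say, `logging.getLogger(__name__)` under a different import name, the value in the environment variable has to use that name.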
Alric
06/15/2023, 7:04 AM
Or is it in the prefect2 package of the Prefect 2 server?