# prefect-ui
r
Hi. I am using Prefect Server 1.0. Flows and tasks have been running OK, but now they only show log entries for the very beginning and end, i.e. I have lost the richness of all those logger.info calls I made. Logs do show up if I run locally. What have I done, and how do I fix it?
k
Are you using the Dask executor?
r
Yes I have Dask turned on
k
You can see this for more info
r
I am using LocalDaskExecutor. It has always shown logs in the past, so it must be something else…
k
Ah LocalDask definitely shows logs. How do you use the logger? Inside tasks? Could you show me a code sample?
r
I am not allowed to show code, but it is the same type of use as in all your examples: I get the logger from the Prefect context within each task and then use logger.info to write to it. That has been the same and working for at least six months now.
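(For reference, the pattern being described is presumably the standard Prefect 1.x one, roughly the sketch below; the task name and message are placeholders, not taken from the thread.)
import prefect
from prefect import task

@task
def my_task():
    # Prefect injects a run logger into the context while the task executes
    logger = prefect.context.get("logger")
    logger.info("this line should appear in the UI for the task run")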
k
I really can’t think of any reason off the top of my head why that wouldn’t work. Did you fiddle with the logging level, and do the logs show with LocalExecutor?
r
If I run locally the log is displayed as expected with all lines. This is both with and without LocalDaskExecutor turned on.
k
Local runs will not always be indicative of Cloud behavior when it comes to logging, especially with custom modules and multiprocessing. In this case I fully expect it to be, but it’s not a good gauge.
For example, if a new process is created, local execution can capture its output in the logs, I think by watching stdout, but it’s not the same for cloud-backed runs.
r
In this case I am on Core Server, with everything on the same local machine. Dunno…
k
You’re not on Prefect Cloud? Hmm…does it also behave this way for a simple flow run?
r
Good point! I took your logging example and that works. So it's something in the big stack, from config.toml to all the module imports and whatnot; something in there has stopped all but the start and end log statements. OK, well, at least I can come at it from that angle now. Thank you
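(One way to narrow that down, sketched here on the assumption that Prefect 1.x exposes the merged settings as prefect.config, is to log the effective logging configuration from inside a task; the task name is a placeholder.)
import prefect
from prefect import task

@task
def show_logging_config():
    logger = prefect.context.get("logger")
    # prefect.config reflects config.toml plus any PREFECT__* environment overrides
    logger.info("logging level: %s", prefect.config.logging.level)
    logger.info("extra loggers: %s", prefect.config.logging.extra_loggers)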
k
Of course!
r
Ok, looks like it is a LocalDaskExecutor issue after all. If I add it with scheduler “processes” to the logging example from your docs, it doesn't show all the logs in the server UI, but with “threads” it did show them. This is new behaviour; I always had logs with LocalDaskExecutor on Server before, regardless of scheduler.
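(Concretely, the comparison being described is presumably just swapping the scheduler argument on the executor attached to the docs' logging example, something like this sketch; the flow and task names are placeholders.)
import prefect
from prefect import Flow, task
from prefect.executors import LocalDaskExecutor

@task
def say_hello():
    prefect.context.get("logger").info("hello from inside a task")

with Flow("logging-example") as flow:
    say_hello()

# Logs reportedly go missing in the server UI with the process-based scheduler...
flow.executor = LocalDaskExecutor(scheduler="processes")
# ...but show up when using the thread-based scheduler instead:
# flow.executor = LocalDaskExecutor(scheduler="threads")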
k
I just tried and can’t replicate. This was my script:
from prefect import Flow, task
import prefect
from prefect.executors import LocalDaskExecutor

@task
def test(x):
    prefect.context.logger.info(f"test-{x}")
    return x+1

with Flow("processes") as flow:
    test.map([1,2,3,4,5,6,7,8,9,10])

flow.executor = LocalDaskExecutor(scheduler="processes")
flow.register("databricks")
and my logs show everything
Could you show me which example you used?
r
I used the code from the advanced tutorials (deployment, custom log)
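(That tutorial is presumably about getting log lines from custom modules into the backend; the usual Prefect 1.x approach looks roughly like the sketch below. The module name my_module is illustrative, not taken from the thread, and the extra-loggers setting is an assumption about how the tutorial wires things up.)
# my_module.py -- a hypothetical helper module imported by the flow
import logging

logger = logging.getLogger("my_module")

def do_work():
    logger.info("doing some work in my_module")

# To forward this standard-library logger to Prefect's logging, the module name
# can be added to the extra loggers setting, e.g. via an environment variable:
#   export PREFECT__LOGGING__EXTRA_LOGGERS='["my_module"]'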