
Russell Brooks

03/16/2022, 10:53 PM
Hi. I am using Prefect Server 1.0. Flows and tasks have been running OK, but now they only show log entries for the very beginning and end, i.e. I've lost the richness of all the logger.info calls I made. The logs do show up if I run locally. What have I done, and how do I fix it?

Kevin Kho

03/16/2022, 10:54 PM
Are you using the Dask executor?

Russell Brooks

03/16/2022, 11:48 PM
Yes I have Dask turned on

Kevin Kho

03/16/2022, 11:51 PM
You can see this for more info

Russell Brooks

03/17/2022, 12:05 AM
I am using LocalDaskExecutor. It has always shown logs in the past, so it must be something else…

Kevin Kho

03/17/2022, 12:10 AM
Ah LocalDask definitely shows logs. How do you use the logger? Inside tasks? Could you show me a code sample?

Russell Brooks

03/17/2022, 12:13 AM
I am not allowed to show code, but it is the same usage pattern as in all your examples: I get the logger from the Prefect context within each task and then use logger.info to write to it. That has been unchanged and working for at least six months now.
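(For reference, the pattern being described looks roughly like this; a minimal sketch with a made-up task name and message:)

import prefect
from prefect import task

@task
def my_task():
    # In Prefect 1.0 the run-time logger is available from the context inside a task
    logger = prefect.context.get("logger")
    logger.info("doing some work")  # hypothetical message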

Kevin Kho

03/17/2022, 12:16 AM
I really can’t think of any reason off the top of my head why that wouldn’t work. Did you fiddle with the logging level, and do the logs show up with LocalExecutor?
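(A minimal sketch of that check, with made-up flow and task names; the environment variable is one way Prefect 1.0 picks up its logging level, the other being [logging] level in config.toml:)

import os

# Must be set before prefect is imported to take effect
os.environ["PREFECT__LOGGING__LEVEL"] = "INFO"

import prefect
from prefect import Flow, task
from prefect.executors import LocalExecutor

@task
def say_hello():
    prefect.context.get("logger").info("hello from the task")

with Flow("logging-level-check") as flow:
    say_hello()

# Swap in the plain LocalExecutor to compare against LocalDaskExecutor
flow.executor = LocalExecutor()
flow.run()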

Russell Brooks

03/17/2022, 12:25 AM
If I run locally, the log is displayed as expected with all lines, both with and without LocalDaskExecutor turned on.

Kevin Kho

03/17/2022, 12:27 AM
Local runs will not always be indicative of Cloud behavior when it comes to logging, especially with custom modules and multiprocessing. In this case I would expect them to match, but it’s not a reliable gauge.
For example, if a new process is created, local execution can capture its output in the logs (I think by watching stdout), but the same isn’t true for cloud-backed runs.
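(Related aside: Prefect 1.0 tasks can forward printed output to the Prefect logger with log_stdout, which can help when work happens in code that only prints; a sketch with made-up names:)

import prefect
from prefect import Flow, task

# log_stdout=True ships anything printed inside the task to the Prefect logger,
# so it shows up in backend (Server/Cloud) runs as well
@task(log_stdout=True)
def noisy():
    print("this print is captured and sent to the backend logs")
    prefect.context.get("logger").info("and this goes through the logger directly")

with Flow("stdout-capture-example") as flow:
    noisy()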

Russell Brooks

03/17/2022, 12:36 AM
In this case I am on Prefect Core's server, with everything on the same local machine. Dunno…

Kevin Kho

03/17/2022, 12:40 AM
You’re not on Prefect Cloud? Hmm…does it also behave this way for a simple flow run?

Russell Brooks

03/17/2022, 1:00 AM
Good point! I took your logging example and that works. So it's something in my big stack, from config.toml to all the module imports and whatnot. Something in there has suppressed all but the start and end log statements. OK, well at least I can come at it from that angle now. Thank you.
👍 1

Kevin Kho

03/17/2022, 1:05 AM
Of course!

Russell Brooks

03/17/2022, 1:47 AM
OK, it looks like it is a LocalDaskExecutor issue after all. If I add it with scheduler="processes" to the logging example from your docs, it doesn't show all the logs in the Server UI. But with scheduler="threads" it did show the logs. This is new behaviour. I have always had logs with LocalDaskExecutor on Server before, regardless of scheduler.
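(The two configurations being compared look roughly like this; a sketch with made-up flow and task names:)

import prefect
from prefect import Flow, task
from prefect.executors import LocalDaskExecutor

@task
def log_something(x):
    prefect.context.get("logger").info(f"processing {x}")
    return x

with Flow("scheduler-comparison") as flow:
    log_something.map([1, 2, 3])

# With "threads" the task logs reached the Server UI; with "processes"
# only the start/end log lines were observed.
flow.executor = LocalDaskExecutor(scheduler="threads")
# flow.executor = LocalDaskExecutor(scheduler="processes")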

Kevin Kho

03/17/2022, 2:19 AM
I just tried and can’t replicate. This was my script:
from prefect import Flow, task
import prefect
from prefect.executors import LocalDaskExecutor

@task
def test(x):
    # Grab the run-time logger from the Prefect context inside the task
    prefect.context.logger.info(f"test-{x}")
    return x + 1

with Flow("processes") as flow:
    test.map([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])

flow.executor = LocalDaskExecutor(scheduler="processes")
flow.register("databricks")
and my logs show all the messages.
Could you show me which example you used?

Russell Brooks

03/17/2022, 3:49 AM
I used the code from the advanced tutorials (deployment, custom logs).