I think there is no simple way, but you might be able to do this with Python logging filters. For example:
import logging

import prefect
from prefect import task, Flow

class MyFilter(logging.Filter):
    def filter(self, record):
        # Drop TaskRunner records such as "Task 'x': Starting task run..."
        return 'Task' not in record.msg

@task()
def success_handler():
    logger = prefect.context.get("logger")
    logger.info("Handled task success")

with Flow("TestFlow") as flow:
    success_handler()

# Attach the filter to the TaskRunner logger before running the flow
logging.getLogger("prefect.TaskRunner").addFilter(MyFilter())
flow.run()
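A simpler variant, if the goal is just to silence the per-task INFO chatter rather than pattern-match message text, is to raise the TaskRunner logger's level (a minimal sketch using only the standard library):

import logging

# Hide INFO-level records like "Task 'success_handler': Starting task run..."
# while keeping warnings and errors from the TaskRunner visible.
logging.getLogger("prefect.TaskRunner").setLevel(logging.WARNING)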
Kevin Kho
08/25/2021, 2:27 PM
But then I think you need to store your Flow as a script for this Filter to apply.
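For reference, a minimal sketch of what that could look like with script-based storage in Prefect 1.x (the path and project name below are placeholders, not from the thread); module-level code such as the addFilter call is only re-executed when the flow is loaded from a script:

from prefect.storage import Local

# Storing the flow as a script means the module (including the
# addFilter call above) runs again when the flow is loaded for execution.
flow.storage = Local(path="/path/to/flow.py", stored_as_script=True)
flow.register(project_name="my-project")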
Tom Forbes
08/25/2021, 2:28 PM
Argh, yeah. This also won’t work if you’re using a remote Dask cluster I think.
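One possible workaround, sketched here as an untested assumption rather than a confirmed fix, is a Dask WorkerPlugin that installs the filter in every worker process, where the TaskRunner actually emits its records:

import logging

from distributed.diagnostics.plugin import WorkerPlugin

class FilterPlugin(WorkerPlugin):
    """Install MyFilter on each Dask worker (MyFilter must be importable there)."""
    def setup(self, worker):
        logging.getLogger("prefect.TaskRunner").addFilter(MyFilter())

# Register against the running cluster, e.g.:
# from distributed import Client
# Client("tcp://scheduler-address:8786").register_worker_plugin(FilterPlugin())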
Tom Forbes
08/25/2021, 2:29 PM
Just an FYI: the messages are not entirely useful, especially if you’re launching a large number of tasks. It makes it much harder to see errors or other conditions you care about.
Kevin Kho
08/25/2021, 2:33 PM
Will escalate the feedback and see if something can be done about this.