# ask-community
a
Hi! I'm trying to set up third-party (sqlalchemy) logs so they appear in flow logs. I'm using Docker storage and the KubernetesAgent. I do something similar to https://docs.prefect.io/core/concepts/logging.html#extra-loggers in my register_flow.py script, and I see some logs when Prefect is capturing the flow object for serialization, so I know it's working. But during the run, no logs from sqlalchemy are written. It seems the logging configuration is not serialized during flow serialization. What is the preferred way to set up third-party logging in flows?
k
Are you using the env variable? That should work. The normal cases where logs don't show up are when they are handled by a different process without Prefect's handler attached, or when you are using a DaskExecutor.
a
No, I'm setting up logging through `getLogger`/`setLevel`.
In this specific case I can switch to the env variable, but in the general case I would like to control the log level for each specific logger (our in-house libs at DEBUG, sqlalchemy at INFO).
I think that wrapping the logging setup into a task that runs first should do the trick, but it seems like a hack to me.
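For illustration, a minimal sketch of that workaround, assuming the loggers already have Prefect's handler attached (e.g. via the extra-loggers env variable) and the first task only adjusts per-logger levels (`our_inhouse_lib` is a placeholder name):
```python
import logging

from prefect import Flow, task

@task
def configure_logging():
    # Runs inside the flow-run process, so these levels apply
    # for the remainder of the run.
    logging.getLogger("our_inhouse_lib").setLevel(logging.DEBUG)
    logging.getLogger("sqlalchemy").setLevel(logging.INFO)

@task
def do_work():
    ...

with Flow("example") as flow:
    setup = configure_logging()
    # Force the logging task to run before any real work.
    do_work(upstream_tasks=[setup])
```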
We're using the KubernetesAgent, which spawns a Job for each run.
k
Using `getLogger` is not preferred. The way to do it would be the env variable below:
```bash
export PREFECT__LOGGING__EXTRA_LOGGERS="['snowflake.connector', 'boto3', 'custom_lib']"
```
but I think `getLogger` can work if you use script-based storage by passing `stored_as_script=True` with Docker. This stores the Flow as a script and then evaluates it during execution. Because it's evaluated, the logging configuration will stick. The default serializes the Flow, and the logger settings get lost because of this. You can read more about script-based storage here.
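For reference, a minimal sketch of the script-based Docker storage mentioned above (the registry and paths are hypothetical):
```python
from prefect.storage import Docker

# Store the flow as a script in the image instead of serializing it,
# so module-level logging setup is re-executed at run time.
flow.storage = Docker(
    registry_url="registry.example.com",              # hypothetical registry
    files={"/local/path/flow.py": "/flows/flow.py"},  # copy the script into the image
    path="/flows/flow.py",                            # where the script lives in the image
    stored_as_script=True,
)
```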
a
Thanks!
k
I know what you mean, though. You can only set one level for all of them with the env var.
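For completeness, one way to set that variable for runs launched by the KubernetesAgent is through the flow's run config (a sketch; the logger list is illustrative, and all listed loggers share a single level):
```python
from prefect.run_configs import KubernetesRun

# Every logger named here gets Prefect's handler attached, at one shared level.
flow.run_config = KubernetesRun(
    env={"PREFECT__LOGGING__EXTRA_LOGGERS": "['sqlalchemy', 'our_inhouse_lib']"}
)
```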