Samo K.
09/29/2025, 7:20 AM
The upload_from_file_object function logs every single file upload (at logging level INFO), which I never need to know about in my case; I only ever care about failed uploads. Especially since I am working in an environment with extreme storage constraints (the Postgres DB server only has 10 GB of free storage), I want to keep unnecessary logs to a minimum. Is there any way to disable those logs on every file upload (or have them logged only at DEBUG level, rather than INFO)?

Marvin
09/29/2025, 7:30 AM
There is no built-in option on S3Bucket.upload_from_file_object to silence or demote these logs, but you have a few good options to suppress them cleanly in Prefect 3.x.
Options (from most targeted to least invasive):
1) Create a Prefect logging config that demotes S3 logs to WARNING
- Point Prefect to a custom logging config and set the prefect_aws.s3 logger to WARNING, so INFO upload messages are dropped but errors still show.
- Example logging.yml:
  version: 1
  disable_existing_loggers: false
  formatters:
    standard:
      format: "%(asctime)s | %(levelname)s | %(name)s | %(message)s"
  handlers:
    console:
      class: logging.StreamHandler
      formatter: standard
      level: INFO
  loggers:
    prefect:
      level: INFO
      handlers: [console]
      propagate: false
    # Demote AWS S3 logs to WARNING so INFO-level upload messages are suppressed
    prefect_aws.s3:
      level: WARNING
      handlers: [console]
      propagate: false
    # (Optional) Demote all prefect_aws logs
    prefect_aws:
      level: WARNING
      handlers: [console]
      propagate: false
- Then set (see the quick sanity check below):
  export PREFECT_LOGGING_SETTINGS_PATH=/path/to/logging.yml
- Prefect 3 logging customization docs: Logging customization (Prefect 3)
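As a quick sanity check, you can load the same YAML yourself with the standard library and confirm the thresholds behave as intended. This is just a verification sketch (the /path/to/logging.yml path is a placeholder, and it requires PyYAML):
  import logging.config

  import yaml  # PyYAML

  with open("/path/to/logging.yml") as f:
      logging.config.dictConfig(yaml.safe_load(f))

  logging.getLogger("prefect_aws.s3").info("dropped")         # below WARNING, suppressed
  logging.getLogger("prefect_aws.s3").warning("still shown")  # at/above WARNING, emitted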
Note: If the block uses the flow/task “run logger” internally instead of the module logger, the logger name may not be prefect_aws.s3. In that case, a message-based filter (see the sketch just below) or the run-logger name would be needed. If you can paste one of the upload log lines, I can give you an exact drop-in YAML snippet.
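For the message-based route, here is a minimal sketch of a logging filter; the "Uploading" substring is an assumption, so swap in text copied from one of your actual upload log lines:
  import logging

  class DropUploadInfo(logging.Filter):
      # Suppress INFO records that contain the (assumed) upload message text.
      def filter(self, record: logging.LogRecord) -> bool:
          return not (
              record.levelno == logging.INFO
              and "Uploading" in record.getMessage()
          )

  # Logger-level filters do not apply to records propagated up from child
  # loggers, so attach the filter to the handlers that emit the messages.
  for handler in logging.getLogger("prefect").handlers:
      handler.addFilter(DropUploadInfo())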
2) Programmatically lower the S3 logger level in your flow code
- Add this once at startup (before using the block):
  import logging

  logging.getLogger("prefect_aws.s3").setLevel(logging.WARNING)
  # Optional: cover the whole integration
  logging.getLogger("prefect_aws").setLevel(logging.WARNING)
- This suppresses INFO-level upload logs while still surfacing warnings/errors.
3) Bypass the block’s logging and call boto3 directly using the block’s credentials
- You can still leverage the block for credentials, but call the boto3 client directly to avoid the block’s internal log calls:
  from prefect_aws.s3 import S3Bucket

  def upload_quiet(bucket: S3Bucket, fobj, key: str, **kwargs):
      # AwsCredentials.get_s3_client() returns a plain boto3 S3 client
      client = bucket.credentials.get_s3_client()
      extra_args = kwargs.get("ExtraArgs")  # optional
      client.upload_fileobj(fobj, bucket.bucket_name, key, ExtraArgs=extra_args)
      return f"s3://{bucket.bucket_name}/{key}"

  # usage
  with open("/path/to/file.bin", "rb") as f:
      upload_quiet(my_bucket_block, f, "path/in/bucket/file.bin")
- This avoids any INFO-level upload logging performed by the block. Note that going through the client directly also skips any bucket_folder prefix the block would normally apply to keys.
4) Coarse-grained (only if you’re okay with suppressing all INFO logs)
- Set:
  export PREFECT_LOGGING_ROOT_LEVEL=WARNING
- This cuts down on logs across the board (not just S3). Mentioning for completeness, but the above targeted options are better.
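For completeness, the rough in-code equivalent of that root-level demotion is a stdlib one-liner. It is only a sketch, since it affects just the loggers that inherit their level from the root; loggers with an explicit level (like those configured in the YAML above) are unaffected:
  import logging

  # Raise the root logger's threshold: any logger without an explicit level
  # now has an effective level of WARNING, so its INFO records are dropped.
  logging.getLogger().setLevel(logging.WARNING)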
If you can share:
- Your Prefect version (assuming 3.x)
- One example of an upload log line (to confirm the logger name)
…I can provide an exact, copy/paste logging.yml or filter that surgically silences only those messages.
If you’d like, I can also open a feature request to add a flag to S3Bucket to disable per-file INFO logs or log them at DEBUG.