# ask-marvin
@Marvin Regarding cleanup of old Postgres logs: I can write some cleanup flows, but I'm not sure which specific tables I should target for deletion to avoid interrupting any critical processes. I want to ensure I'm only deleting log-related data and nothing essential for system operation. Could you point me to the log tables, or offer guidance on which ones to focus on, so that only logs older than 15 days are deleted? I'm aiming for a solution that won't interfere with Prefect's core operations.
When cleaning up old logs in a PostgreSQL database used by Prefect, you should focus on tables that store log-related data. Based on the information gathered, here are some useful insights:

1. Event resources table: The `event_resources` table has been identified as one that accumulates data indefinitely and can be safely cleaned alongside the corresponding events. It is a common source of database growth issues (GitHub Issue).
2. Foreign key constraints: Be cautious with foreign keys when deleting data. Ensure that deleting logs does not affect other tables that might reference those rows (GitHub Discussion).
3. Retention policies: Prefect Cloud manages retention automatically; for self-hosted setups, you can use tools like `pg_cron` to schedule periodic cleanup tasks. This helps maintain database performance by purging old logs without touching critical data.

For a more tailored approach, inspect your database schema to identify the log-related tables specific to your setup; these might include tables like `flow_run_logs` or similar, depending on how your Prefect instance is configured. For further reading on managing audit logs in Prefect Cloud, refer to the Prefect documentation. If you need a more specific solution, examining your database schema directly or consulting Prefect's community resources might be beneficial.