Aaron Gonzalez
07/01/2025, 10:33 AM

Marvin
07/01/2025, 10:34 AM

Marvin
07/01/2025, 10:34 AM
```python
import asyncio

from prefect import flow, get_client

@flow
def process_s3_file(bucket: str, key: str):
    # Your processing logic here
    print(f"Processing {key} from {bucket}")

async def trigger_flow_run(bucket: str, key: str):
    async with get_client() as client:
        # Create a run of the flow with the S3 object as parameters
        return await client.create_flow_run(
            flow=process_s3_file,
            name="s3_triggered_run",
            parameters={"bucket": bucket, "key": key},
        )

# In your AWS Lambda function:
def lambda_handler(event, context):
    # Extract bucket and key from the S3 event
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]
    # The Python Lambda runtime does not await coroutines itself,
    # so drive the async Prefect client call to completion here
    asyncio.run(trigger_flow_run(bucket, key))
```
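One detail worth noting when parsing the event: S3 object keys arrive URL-encoded in event notifications (spaces become `+`), so it's worth decoding them before handing them to the flow. A small stdlib-only sketch — the sample event and helper name here are illustrative, not part of the Prefect API:

```python
import urllib.parse

def extract_s3_object(event: dict) -> tuple[str, str]:
    """Return (bucket, key) from the first record of an S3 event notification."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    # Keys are URL-encoded in S3 event notifications; unquote_plus
    # also turns '+' back into spaces
    key = urllib.parse.unquote_plus(record["object"]["key"])
    return bucket, key

# Illustrative event in the shape S3 sends to Lambda
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "my-bucket"},
                "object": {"key": "incoming/daily+report.csv"}}}
    ]
}
print(extract_s3_object(sample_event))  # → ('my-bucket', 'incoming/daily report.csv')
```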
2. Prefect Automations:
For scenarios where you need more complex triggering logic or want to respond to Prefect-specific events, you can use Prefect's Automations feature. This allows you to set up trigger conditions and corresponding actions within Prefect's ecosystem.
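For instance, an event trigger can be attached directly to a deployment so that matching events start runs with parameters templated from the event payload. A rough `prefect.yaml` sketch, assuming Prefect 2.10+ deployment triggers — the event name `s3.object.created`, the resource id, and the payload fields are placeholders you would define when emitting the event, not built-in names:

```yaml
deployments:
  - name: process-s3-file
    entrypoint: flows.py:process_s3_file
    triggers:
      - type: event
        enabled: true
        match:
          prefect.resource.id: "s3.bucket.my-bucket"   # placeholder resource id
        expect:
          - "s3.object.created"                        # placeholder event name
        parameters:
          bucket: "{{ event.payload.bucket }}"
          key: "{{ event.payload.key }}"
```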
The event-driven approach offers several advantages over traditional sensors:
- Lower resource usage (no constant polling)
- Real-time responsiveness
- Better scalability
- More reliable event detection
For more details on migrating from Airflow patterns to Prefect, check out the migration guide and event triggers documentation.
Would you like me to provide more specific details about setting up either of these approaches?

Mitch Nielsen
07/01/2025, 1:33 PM