Hello, I want to create an event-driven deployment (triggered when files land in S3). Is a Lambda function the best way to do this? If so, what's the best documentation to follow for this?
Boggdan Barrientos
12/05/2022, 6:18 PM
Hi @Ashley Felber I handle it that way and it works properly. I created a function that checks whether the file has arrived in S3, and that function is executed by a task like the code below.
I use bucket, path, and filename as arguments for my function, but you can adapt it to your requirements.
import time

from prefect import task, get_run_logger


@task(name="s3-sensor",
      description="This task senses S3 files")
def s3_sensor(s3_bucket, s3_prefix, s3_filename, incremental):
    logger_run = get_run_logger()
    while True:
        # check_if_file_arrived_in_s3 is your own helper; pass it whatever args you need
        bool_s3_object_arrived = check_if_file_arrived_in_s3(s3_bucket, s3_prefix, s3_filename)
        if bool_s3_object_arrived is True:
            logger_run.info("The file has arrived today")
            break
        if bool_s3_object_arrived is False:
            logger_run.info("Re-scheduling, the file has not arrived yet")
            time.sleep(180)
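For reference, a minimal sketch of what a helper like check_if_file_arrived_in_s3 could look like with boto3; the body and the "arrived today" check are assumptions for illustration, not necessarily Boggdan's actual implementation:

from datetime import datetime, timezone

import boto3


def check_if_file_arrived_in_s3(s3_bucket, s3_prefix, s3_filename):
    # Build the full object key from the prefix and filename (assumed layout)
    key = f"{s3_prefix}/{s3_filename}"
    s3 = boto3.client("s3")
    try:
        response = s3.head_object(Bucket=s3_bucket, Key=key)
    except s3.exceptions.ClientError:
        # Object does not exist (or is not accessible) yet
        return False
    # Only treat the file as "arrived" if it was written today (UTC)
    return response["LastModified"].date() == datetime.now(timezone.utc).date()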
Ashley Felber
12/05/2022, 11:30 PM
Thanks. Is there a reason you're doing it this way vs. just triggering the Lambda to run when an S3 file is added?
Also, where are you actually triggering the deployment?
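For the Lambda-triggered approach Ashley mentions, a minimal sketch of a handler that kicks off a Prefect 2 deployment run when an S3 object lands could look like the following. It assumes the Lambda has prefect installed and PREFECT_API_URL / PREFECT_API_KEY configured, and the deployment name "my-flow/my-deployment" and the flow parameters are made up for illustration:

from prefect.deployments import run_deployment


def lambda_handler(event, context):
    # S3 put-event notifications carry the bucket and key of the file that just landed
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # Kick off a run of an existing deployment, passing the file location
    # through as flow parameters
    flow_run = run_deployment(
        name="my-flow/my-deployment",  # hypothetical deployment name
        parameters={"s3_bucket": bucket, "s3_key": key},
        timeout=0,  # return immediately instead of waiting for the run to finish
    )
    return {"flow_run_id": str(flow_run.id)}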