# prefect-community
e
Hi everyone. I'm using a GitHub Action to register a flow. The flow uses S3 as storage and the run_config is a Fargate config. The problem is that registration fails with a boto3-related error, even though I'm not using boto3 directly. At first it failed because the boto3 library was missing, so I added a command to install boto3 with pip. But now the error is `botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied`. Is this setup feasible? Why does it ask for boto3? Does Prefect use the boto3 library for S3 flow storage? Should I add AWS auth in the GitHub Action, or AWS auth via boto3 in the flow's Python code itself?
a
It depends on your setup. If you configured S3 storage with `stored_as_script=True`, then Prefect will try to upload your flow to S3 during registration. If you upload your flows to S3 yourself, you can set it to False and your CI registration will not complain about missing AWS credentials. However, if you want your flow to be uploaded to S3 upon registration, then yes, you should add AWS auth to your GitHub Actions workflow. If you need examples of various S3 storage and ECSRun run_config configurations, you can find them in this repo: https://github.com/anna-geller/packaging-prefect-flows/blob/master/flows/s3_ecs_run.py
👍 1
❤️ 1
e
amazing! it's working now, you're the best Anna 💪
❤️ 1
thanks
I have Airbyte tasks and dbt tasks. Is there a way to run the Airbyte tasks in parallel?
a
absolutely! You can initialize your Airbyte task before the Flow block and then within the Flow() you can do:
```python
airbyte.map(connection_id=connections)
```
Kevin gave an example here
👍 1
but if you have any issue with it, LMK
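In case it helps, here's a minimal sketch of the full pattern, assuming Prefect 1.x (the server host/port and connection IDs are hypothetical). Note that mapped tasks only actually run in parallel when the flow uses a parallel executor such as `LocalDaskExecutor`:
```python
# Minimal sketch (Prefect 1.x) - host, port, and connection IDs are hypothetical.
from prefect import Flow
from prefect.executors import LocalDaskExecutor
from prefect.tasks.airbyte import AirbyteConnectionTask

# Initialize the task once, before the Flow block
airbyte = AirbyteConnectionTask(
    airbyte_server_host="localhost",
    airbyte_server_port=8000,
)

# Hypothetical Airbyte connection IDs to sync
connections = [
    "00000000-0000-0000-0000-000000000001",
    "00000000-0000-0000-0000-000000000002",
]

with Flow("airbyte-parallel") as flow:
    # .map() spawns one task run per connection ID; with a parallel
    # executor these mapped runs execute concurrently
    airbyte.map(connection_id=connections)

flow.executor = LocalDaskExecutor()
```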
e
sure, i'll give it a try. Thanks again Anna
@Anna Geller everything worked "Prefectly". Thank you again, you were really helpful
🙌 1