Hi all, has anyone toyed with deploying flows from S3? I’d like to speed up our CI/CD process, and I’m wondering if it’d make sense to have a “meta” Flow fetch a python package of other Flows from S3, download them, register them, etc.
Airflow has had this on their to-do list for years (https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-5+Remote+DAG+Fetcher), and I’d love for Prefect to beat them to the punch. Reducing deployment from 20-30 minutes to seconds would be a game-changing differentiator.
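Roughly what I have in mind (a minimal sketch, assuming Prefect 1.x and boto3; the bucket, key, and project names are placeholders, and the package is assumed to be a flat zip of flow modules):

```python
import importlib.util
import io
import zipfile
from pathlib import Path

import boto3
from prefect import Flow, task


@task
def fetch_and_register(bucket: str, key: str, project_name: str):
    # Download the zipped package of flow modules from S3.
    buf = io.BytesIO()
    boto3.client("s3").download_fileobj(bucket, key, buf)
    buf.seek(0)

    extract_dir = Path("/tmp/flows")
    extract_dir.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(buf) as zf:
        zf.extractall(extract_dir)

    # Import each module in the package and register any Flow objects it defines.
    for path in extract_dir.glob("*.py"):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        for obj in vars(module).values():
            if isinstance(obj, Flow):
                obj.register(project_name=project_name)


with Flow("meta-deploy") as meta_flow:
    fetch_and_register("my-flow-bucket", "flows/package.zip", "my-project")
```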
Kevin Kho
08/20/2021, 3:39 PM
Hey @Sepehr Sadighpour, if you use S3 storage and DockerRun, the flow is downloaded from S3 and then run on top of the Docker container. This means your dependencies stay fixed in the image, and you don’t have to rebuild the container when you rewrite the flow. Is that what you are looking for?
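Something like this (a minimal sketch, assuming Prefect 1.x; the bucket name, image tag, and project name are placeholders):

```python
from prefect import Flow, task
from prefect.run_configs import DockerRun
from prefect.storage import S3


@task
def say_hello():
    print("hello")


with Flow("s3-stored-flow") as flow:
    say_hello()

# The flow's code is uploaded to S3 at registration time; at runtime the agent
# pulls it from the bucket and runs it inside the specified image, so the image
# only needs to be rebuilt when dependencies change, not when the flow changes.
flow.storage = S3(bucket="my-flow-bucket")
flow.run_config = DockerRun(image="my-org/prefect-deps:latest")

flow.register(project_name="my-project")
```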