Another fun one with multiple deployments in one project. It seems like if we set up an S3 bucket for storage, each deployment uploads the entire project.
For example, if I have a `flows` folder filled with 20 Python files, each containing, say, 2 flows, then deploying all of those flows means every single deployment re-uploads all 20 files, which is incredibly slow.
Is there a nicer way to do this, so that I only have to upload everything a single time? I see there's a `skip_upload` option I could use, but it feels like a bit of an anti-pattern to keep a bool flag like "already_uploaded" around just to toggle `skip_upload`.
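For reference, here's roughly what I mean (a minimal sketch assuming Prefect 2's `Deployment.build_from_flow` with its `skip_upload` parameter; the flows and the S3 block name are placeholders for my actual project):

```python
from prefect import flow
from prefect.deployments import Deployment
from prefect.filesystems import S3

# Placeholder flows; in reality these live across ~20 files under flows/
@flow
def flow_a():
    ...

@flow
def flow_b():
    ...

storage = S3.load("my-s3-block")  # assumed pre-created S3 storage block

# The first deployment uploads the whole project to S3...
Deployment.build_from_flow(
    flow=flow_a,
    name="flow-a",
    storage=storage,
    apply=True,
)

# ...and every subsequent deployment skips the upload.
Deployment.build_from_flow(
    flow=flow_b,
    name="flow-b",
    storage=storage,
    skip_upload=True,
    apply=True,
)
```

That works, but it means I have to know which deployment runs "first" and special-case it, which is the part that feels awkward.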