Ilya Galperin
08/12/2022, 12:37 AM
1. Running deployment build and apply commands from a parent directory containing multiple flows. This way we use only one storage block, but if multiple developers are working in the same shared bucket (like in a staging environment), the blast radius is bigger. For example, if a developer is working on only a single flow, there is a risk of overwriting someone else's in-development code by accidentally committing code from another flow/folder.
prefect deployment build flow_a/flow.py:entry_point -n flow_a_deployment --storage-block s3/universal-storage-block
flows/ <working directory>
    flow_a/
    flow_b/
2. Every flow gets its own storage block, and we run deployment build and apply commands from that flow's root directory. This obviously requires more storage blocks, but it seems to decrease the blast radius (both options are sketched end to end below).
prefect deployment build ./flow.py:entry_point -n flow_a_deployment --storage-block s3/individual-storage-block
flow_a/ <working directory>
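To make the trade-off concrete, here is a minimal end-to-end sketch of both options, assuming the two S3 blocks referenced above already exist and that the build step uploads the contents of the current working directory to the block's bucket path. The -o output filenames are chosen here only so the apply step is explicit.

# Option 1: one shared block, run from the parent flows/ directory.
# The upload copies everything under flows/ (flow_a/ and flow_b/) into the
# shared bucket path, which is the source of the blast-radius concern.
cd flows
prefect deployment build flow_a/flow.py:entry_point \
    -n flow_a_deployment \
    --storage-block s3/universal-storage-block \
    -o flow_a-deployment.yaml
prefect deployment apply flow_a-deployment.yaml
cd ..

# Option 2: one block per flow, run from inside that flow's folder.
# Only flow_a/'s contents are uploaded, so flow_b's code is never touched.
cd flows/flow_a
prefect deployment build ./flow.py:entry_point \
    -n flow_a_deployment \
    --storage-block s3/individual-storage-block \
    -o flow_a-deployment.yaml
prefect deployment apply flow_a-deployment.yaml

In either case, a .prefectignore file in the working directory can exclude files from the upload, which also helps limit what gets pushed to a shared block.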
It seems to me like option 2 is better, but are there any disadvantages or limitations we should be aware of in using multiple storage blocks?
Anna Geller
1. prefect deployment build --path flows/flow1.py:one --path flows/flow2.py:two ...
2. Docker storage build - i.e. flow code baked into the container for all flows
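For reference, a rough sketch of the Docker option, with all flow code baked into one image; the base image tag, registry name, and paths below are illustrative assumptions rather than details from the thread.

# Dockerfile that copies every flow into the image
# (base image tag is an assumption; pick the tag matching your Prefect/Python versions)
cat > Dockerfile <<'EOF'
FROM prefecthq/prefect:2-python3.9
COPY flows/ /opt/prefect/flows/
EOF

# build and push a single image that every deployment references,
# instead of uploading flow code to an S3 storage block
docker build -t my-registry/prefect-flows:staging .
docker push my-registry/prefect-flows:staging

Deployments then run on Docker-based infrastructure pointing at this image, so the "storage" is the image itself rather than a shared bucket.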
Ilya Galperin
08/12/2022, 1:08 AM
So the --path argument would only save that particular path to storage, even if the deployment build command is run from a parent directory? In other words, would this allow us to selectively save only flows that live in different folders under a common parent folder, using a single storage block, without overwriting all the other flows?
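Whatever the answer on --path turns out to be, one low-risk way to confirm the behavior is to run a test build and then list what actually landed under the block's bucket path; the bucket name and prefix below are hypothetical and assume the AWS CLI is configured.

# after a test "prefect deployment build ..." from the parent directory,
# inspect exactly which files were uploaded to the shared block's location
aws s3 ls s3://my-staging-bucket/prefect-flows/ --recursive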