# prefect-community
j
Hello. Starting to look at migrating to Prefect 2.0, and I’m a little confused about how to structure storage blocks for flows. Is it best practice to create a single block for all flows, or to create one block per flow? I’m not seeing any way when building the deployment to specify a subfolder, which makes me think option 1 is kind of forced…
👀 2
o
Not sure what the recommendation is, but I tried to make it work with one block at first and bumped into the exact problem you mention. So now we're using a block per flow. It's very easy to create the block using Python (or the CLI, I suppose) if it doesn't already exist. You don't even need to check manually whether it exists: the overwrite option makes it safe to create the block whether or not it already exists.
🙌 1
```python
from prefect.filesystems import Azure

print(f"Creating storage block flows/{azure_block_flow_name}...")

# flow_name_clean, connection_string, env, and azure_block_flow_name are set elsewhere in our deployment script
azure_block = Azure(bucket_path=f"flows/{flow_name_clean}", azure_storage_connection_string=connection_string)
azure_block.save(f"azure-{env.value}-{azure_block_flow_name}", overwrite=True)
```
🙌 1
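For reference, a minimal sketch of how a per-flow block like that might then be attached to a deployment from Python; the flow, block name, and deployment name here are placeholders rather than anything from the snippet above:
```python
from prefect import flow
from prefect.deployments import Deployment
from prefect.filesystems import Azure


@flow
def my_flow():
    ...


# Load the flow-specific block saved earlier (placeholder name) and attach it to
# the deployment so this flow's code is uploaded under that block's bucket_path
deployment = Deployment.build_from_flow(
    flow=my_flow,
    name="my-flow-dev",
    storage=Azure.load("azure-dev-my-flow"),
)
deployment.apply()
```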
j
Yeah, that was probably the road I was going to go down if that was the case. Looking one step ahead to CI, I think that's a reasonable place to try to create the block.
o
Oh, yeah. I believe you can run an equivalent command using the Prefect CLI; I just don't know it off the top of my head.
🙏 1
i
We've also decided on this approach of one storage block per flow for now, for what it's worth. It also seems like the best way to lessen the blast radius if multiple developers are working in the same environment, like staging or development.
It does seem like `deployment build` might be able to specify a --path flag soon, but I'm not sure how or whether this relates to flow code storage.
a
Is it best practice to create a single block for all flows, or to create one block per flow?
You can treat the block that you provide on the deployment as a parent block holding generic configuration that you configure once and share across deployments. When you create a deployment, Prefect copies that configuration and creates an anonymous block per deployment (see the sketch below).
This admittedly long README can bring more clarity: https://github.com/anna-geller/dataflow-ops
🙏 1
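A rough sketch of that shared-parent-block pattern, assuming one Azure block and two example flows (flow_one, flow_two, the block name, and connection_string are placeholders):
```python
from prefect.deployments import Deployment
from prefect.filesystems import Azure

# Configure the generic "parent" storage block once and reuse it across deployments
Azure(
    bucket_path="flows",
    azure_storage_connection_string=connection_string,  # placeholder, defined elsewhere
).save("azure-shared", overwrite=True)

# Both deployments reference the same block; per the explanation above, Prefect
# copies that configuration into a per-deployment block at build time
for f, name in [(flow_one, "flow-one"), (flow_two, "flow-two")]:
    Deployment.build_from_flow(flow=f, name=name, storage=Azure.load("azure-shared")).apply()
```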
We are working on a feature that will allow multiple flows to be built in one command. This is important for projects with multiple flows and will be critical to prevent re-uploading the entire project directory with every flow's deployment build. The syntax will most likely look like:
prefect deployment build --path ./flows/hello.py:one --path ./flows/other.py:two ..
🙌 1