# ask-community
j
How do I get each deployment to store its files inside its own directory using S3 block storage?
The first deployment I applied pushed its files to the root of the S3 bucket.
c
Hi Jack, you can use a sub-path relative to the storage block's root
j
Can you provide an example?
c
```bash
prefect deployment build ./healthcheck.py:healthcheck -n azure-aci-deploy -t aci -q default -ib azure-container/boydimusprime -sb azure/boydblock/docker_test -a
```
-sb is the storage block option: azure is the block type, boydblock is the name of the block I created, and docker_test is a sub-path inside that storage (in Azure)
if you’re using s3, it would be like:
```bash
-sb s3/block_name/sub-path
```
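you can also bake the sub-path into the S3 block itself when you create it, so every deployment that uses that block uploads under the same prefix. Something like this should work (untested sketch, the bucket path and block name are placeholders):
```python
from prefect.filesystems import S3

# Placeholder bucket and prefix: everything uploaded through this block
# lands under "deployments/healthcheck" inside "my-bucket".
# AWS credentials can also be set on the block, or picked up from the environment.
s3_block = S3(bucket_path="my-bucket/deployments/healthcheck")
s3_block.save("block_name", overwrite=True)
```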
j
Using a python module to apply the deployment, where would I put that?
```python
from prefect.deployments import Deployment
from prefect.filesystems import S3

# jacks_first_flow is the @flow-decorated function being deployed (defined elsewhere)
storage = S3.load('flow-storage-block')

deployment = Deployment.build_from_flow(
    flow=jacks_first_flow,
    name='jacks-first-deployment',
    version=2,
    storage=storage,
)

# upload=True pushes the flow's files to the storage block when the deployment is applied
deployment.apply(upload=True)
```
c
it looks like this is in my storage account: [screenshot]
j
Also, is there an option we can set so that storage is automatically namespaced? (So that different users of the same storage block don't inadvertently use the root of the S3 bucket and end up overwriting each others' files)
c
I’d need to check, I haven’t tested with python
there is not
j
It appears that with prefect v1 each flow was automatically namespaced when stored to S3.
c
I think you'd just add `path="<relative path from the storage root>"`
Here’s the PR that adds that: https://github.com/PrefectHQ/prefect/pull/6518
j
The `path` keyword is working:
```python
storage = S3.load('flow-storage-block')

deployment = Deployment.build_from_flow(
    flow=jacks_first_flow,
    name='jacks-first-deployment',
    version=2,
    storage=storage,
    path='01-favorite-path',  # files are uploaded under this sub-path of the bucket
)

deployment.apply(upload=True)
```
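To get back something like the v1 behaviour, I think you could derive the path from the flow and deployment names by convention, so each flow gets its own prefix and users of a shared block don't overwrite each other. Rough, untested sketch:
```python
from prefect.deployments import Deployment
from prefect.filesystems import S3

storage = S3.load('flow-storage-block')

# Convention rather than a built-in option: namespace uploads per flow and deployment
# so different users of the same storage block don't clobber each other's files.
deployment_name = 'jacks-first-deployment'
deployment = Deployment.build_from_flow(
    flow=jacks_first_flow,
    name=deployment_name,
    storage=storage,
    path=f"{jacks_first_flow.name}/{deployment_name}",
)

deployment.apply(upload=True)
```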
🙌 2