Hello, one question regarding Storage. In the documentation it says
If no storage is explicitly configured, Prefect will use local
storage by default. Local storage works fine for many local flow run scenarios, especially when testing and getting started. However, due to the inherent lack of portability, many use cases are better served by using remote storage such as S3 or Google Cloud Storage.
If I run
prefect deployment build
I get a copy of all my files in my S3 bucket. But how can I ensure that the Deployment in Prefect Cloud always pulls the flow code from the S3 bucket?
1 month ago
I believe the agent that executes your code will pull it from the storage that is specified in the deployment.yaml file.
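For illustration, the storage section of a generated deployment.yaml might look roughly like this. The bucket path and block name here are placeholders taken from the examples in this thread, and the exact field names can vary between Prefect 2.x versions, so treat this as a sketch rather than a definitive schema:

```yaml
# Illustrative sketch only -- field names may differ across Prefect versions.
# The agent reads this section to know where to fetch the flow code from.
storage:
  bucket_path: myS3bucket/folder   # where `prefect deployment build` uploaded the files
  # Credentials come from the referenced storage block,
  # e.g. the block created as s3/s3-block-prefect-cloud
```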
Jaime Raldua Veuthey
1 month ago
Oh I see, thanks! So I understand it as follows:
prefect deployment build ./basic_flow.py:basic_flow -n test -t test --storage-block s3/s3-block-prefect-cloud
After running this command, the deployment.yaml will contain "storage: bucket_path: myS3bucket/folder". That means the manifest JSON file and the Python script with the flow code will be retrieved from this S3 bucket when the Deployment is created and every time the Deployment runs.

At this point one can delete the JSON manifest and the Python script with the flow from the local folder, since they will not be used (although Prefect creates them both locally and in the bucket anyway, not sure why).

To create the deployment using the Prefect CLI:
prefect deployment apply deployment.yaml
This will create the Deployment, and since the YAML file specifies S3 as the storage, every time the Deployment runs it will fetch the flow from the script file in the S3 bucket.
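Putting the thread together, a minimal end-to-end sketch of the workflow (the flow file, entrypoint, and block name are the ones from the example above; substitute your own):

```shell
# 1. Build the deployment: generates the manifest + deployment.yaml
#    and uploads the flow files to the S3 storage block
prefect deployment build ./basic_flow.py:basic_flow -n test -t test \
    --storage-block s3/s3-block-prefect-cloud

# 2. Register the deployment with Prefect Cloud; because deployment.yaml
#    references the S3 block, agents will pull the flow code from S3 on every run
prefect deployment apply deployment.yaml
```

These commands require a configured Prefect Cloud workspace and an existing S3 storage block, so they are shown here as a recap rather than something runnable in isolation.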