brokoli (11/20/2023, 5:05 PM)
[original question not captured in the archive]

Kevin Grismore (11/20/2023, 5:44 PM)
You can use `from_source` to either select a block defining the location from which you want to pull code:
```python
from prefect import flow
from prefect_aws.s3 import S3Bucket

if __name__ == "__main__":
    flow.from_source(
        source=S3Bucket.load("my-code-storage-bucket"), entrypoint="flows.py:my_flow"
    ).deploy(name="my-deployment", work_pool_name="above-ground")
```
or use the fsspec-compatible `RemoteStorage` class:
```python
from prefect import flow
from prefect.runner.storage import RemoteStorage

if __name__ == "__main__":
    flow.from_source(
        source=RemoteStorage(url="az://my-container/my-folder", account_name="my-account-name"),
        entrypoint="flows.py:my_flow",
    ).serve(name="deployment-from-remote-flow")
```
brokoli (11/21/2023, 10:00 AM)
With the block approach I get `AttributeError: 'S3Bucket' object has no attribute 'set_base_path'`. If I use fsspec, then for every flow I want to deploy, the contents of the (in my case) same bucket/path get downloaded ... for 14 flows or so.
brokoli (11/21/2023, 10:02 AM)
There's also the `deploy` function that is there to deploy multiple flows.
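A minimal sketch of that pattern, assuming a Docker work pool and hypothetical bucket, entrypoint, and image names: each flow is loaded with `from_source`, turned into a deployment with `to_deployment`, and all of them are passed to `deploy` in one call.

```python
from prefect import deploy, flow

if __name__ == "__main__":
    deploy(
        # load each flow from remote storage and turn it into a deployment
        flow.from_source(
            source="s3://my-code-storage-bucket/my-folder",
            entrypoint="flows.py:flow_a",
        ).to_deployment(name="deployment-a"),
        flow.from_source(
            source="s3://my-code-storage-bucket/my-folder",
            entrypoint="flows.py:flow_b",
        ).to_deployment(name="deployment-b"),
        # one work pool and image shared by every deployment
        work_pool_name="above-ground",
        image="my-registry/my-image:my-tag",
    )
```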
brokoli (11/21/2023, 10:37 AM)
Now I'm getting `'RunnerDeployment' object has no attribute 'pull_steps'`.
brokoli (11/21/2023, 3:08 PM)
I have `n` flows. In my CI pipeline I build the Docker image (if requirements.txt changes) and upload the flow code to S3. Then I want to register all the flows as new deployments on Prefect Cloud.
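The upload step could look something like this; `put_directory` is the generic deployment-storage method on filesystem blocks, and the bucket and folder names here are hypothetical:

```python
from prefect_aws.s3 import S3Bucket

if __name__ == "__main__":
    # hypothetical CI step: push the current checkout to the code bucket
    code_bucket = S3Bucket.load("my-code-storage-bucket")
    code_bucket.put_directory(
        local_path=".",  # repo root in the CI runner
        to_path="my-folder",  # prefix the deployments will pull from
        ignore_file=".prefectignore",
    )
```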
brokoli (11/21/2023, 3:09 PM)
One option is a `prefect.yaml` that has all the flows, and the other is to use fsspec.
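For the first option, a `prefect.yaml` with several deployments might look roughly like this; the pull step comes from prefect-aws, and the names are hypothetical:

```yaml
# every deployment shares one pull section that fetches the code from S3
pull:
  - prefect_aws.deployments.steps.pull_from_s3:
      bucket: my-code-storage-bucket
      folder: my-folder

# one entry per flow; `prefect deploy --all` registers them in one go
deployments:
  - name: deployment-a
    entrypoint: flows.py:flow_a
    work_pool:
      name: above-ground
  - name: deployment-b
    entrypoint: flows.py:flow_b
    work_pool:
      name: above-ground
```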
Nate (11/21/2023, 5:02 PM)
`prefect.yaml` makes a lot of sense; otherwise, if you just have a monorepo situation and you just wanna deploy them all after you upload, you could run something like this in CI:
Nate (11/21/2023, 5:04 PM)
• grab each flow with `from_source`
• create a deployment from that
• pass them all to `deploy`
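A sketch of those three steps as one CI script, under the same assumptions as the earlier example (hypothetical bucket, entrypoints, work pool, and image):

```python
from prefect import deploy, flow

# hypothetical map of deployment names to entrypoints in the monorepo
ENTRYPOINTS = {
    "deployment-a": "flows.py:flow_a",
    "deployment-b": "flows.py:flow_b",
}

if __name__ == "__main__":
    deploy(
        *[
            # step 1: grab each flow from remote storage with from_source
            # step 2: create a deployment from it with to_deployment
            flow.from_source(
                source="s3://my-code-storage-bucket/my-folder",
                entrypoint=entrypoint,
            ).to_deployment(name=name)
            for name, entrypoint in ENTRYPOINTS.items()
        ],
        # step 3: pass them all to deploy in one call
        work_pool_name="above-ground",
        image="my-registry/my-image:my-tag",
    )
```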