# prefect-kubernetes
b
I can see the Pull steps in the Web UI. But they are missing completely in the python API
k
You can use `from_source` to either select a block defining the location from which you want to pull code:
```python
from prefect import flow
from prefect_aws.s3 import S3Bucket

if __name__ == "__main__":
    flow.from_source(
        source=S3Bucket.load("my-code-storage-bucket"), entrypoint="flows.py:my_flow"
    ).deploy(name="my-deployment", work_pool_name="above-ground")
```
or use the `fsspec`-compatible `RemoteStorage` class:
```python
from prefect import flow
from prefect.runner.storage import RemoteStorage

if __name__ == "__main__":
    flow.from_source(
        source=RemoteStorage(url="az://my-container/my-folder", account_name="my-account-name"),
        entrypoint="flows.py:my_flow",
    ).serve(name="deployment-from-remote-flow")
```
b
Thanks for the help! Hmm, if I reference the S3Bucket as storage I get `AttributeError: 'S3Bucket' object has no attribute 'set_base_path'`. If I go the `fsspec` route, then for every flow I want to deploy, the contents of the (in my case) same bucket/path get downloaded ... for 14 flows or so.
Which kind of defeats the purpose of the `deploy` function that is there to deploy multiple flows.
takes around 2 minutes
I tried to hack the pull steps into the RunnerDeployment, but I can't figure out how to do it: `'RunnerDeployment' object has no attribute 'pull_steps'`
n
hey @brokoli - can you give an example of what you're trying to do? ideally with code 🙂
b
So I need to deploy and run *n* flows. In my CI pipeline I build the Docker image (if requirements.txt changes) and upload the flow code to S3. Then I want to register all the flows as a new deployment on Prefect Cloud
I have a worker in k8s that then creates Jobs, and the Jobs then pull the code from S3
One way to do it is to template the `prefect.yaml` to have all the flows, and the other is to use `fsspec`.
I do that in the CI/CD pipeline
I haven't found a way to do it from the CLI (I guess that only works for agents, not workers)
Does that use case make sense, or am I doing something fundamentally wrong?
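[Editor's note: a rough sketch of the `prefect.yaml` templating route mentioned above. The bucket, folder, entrypoints, and work-pool name are placeholders, and the pull step shown assumes the `prefect-aws` collection is available in the runtime image.]

```yaml
# prefect.yaml (sketch; all names are placeholders)
pull:
  - prefect_aws.deployments.steps.pull_from_s3:
      bucket: my-code-bucket
      folder: flows

deployments:
  - name: my-flow
    entrypoint: flows.py:my_flow
    work_pool:
      name: above-ground
  - name: my-other-flow
    entrypoint: flows.py:my_other_flow
    work_pool:
      name: above-ground
```

In CI, a templating step could generate one such `deployments` entry per flow, then `prefect deploy --all` registers them in one pass.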
n
the use case makes a lot of sense!
if you know how many deployments you need to make, using `prefect.yaml` makes a lot of sense; otherwise, if you just have a monorepo situation and you just wanna deploy them all after you upload, you could run something like this in CI
which you might want to modify a bit to search the remote dir / someplace else for flows, but the rough idea:
• find all the flows
• create a flow object with `from_source`
• create a deployment from that
• pass them all to `deploy`
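[Editor's note: the steps above could be sketched roughly as follows. This is not the snippet from the original conversation; the entrypoint list, bucket URL, work-pool name, and image are placeholder assumptions, and flow discovery is simplified to a hard-coded list that a real CI job would generate by scanning the repo.]

```python
def deployment_name(entrypoint: str) -> str:
    # Derive a deployment name from an entrypoint, e.g. "flows.py:my_flow" -> "my-flow"
    return entrypoint.split(":")[-1].replace("_", "-")


def deploy_all(entrypoints: list[str], code_url: str, work_pool: str, image: str) -> None:
    # Local imports so the helper above stays importable without Prefect installed
    from prefect import deploy, flow
    from prefect.runner.storage import RemoteStorage

    storage = RemoteStorage(url=code_url)
    deployments = [
        flow.from_source(source=storage, entrypoint=ep).to_deployment(
            name=deployment_name(ep)
        )
        for ep in entrypoints
    ]
    # A single deploy() call registers every deployment against the work pool
    deploy(*deployments, work_pool_name=work_pool, image=image)
```

A CI job could then call something like `deploy_all(["flows.py:my_flow", "flows.py:my_other_flow"], code_url="s3://my-code-bucket/flows", work_pool="above-ground", image="my-registry/flows:latest")` after uploading the code, so all flows are registered in one go.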