# prefect-cloud
Hi everyone! I’m using Prefect Cloud with Prefect Workers. My FastAPI backend has an endpoint that should trigger a flow execution on a Prefect worker process without blocking the FastAPI app. How can I do that? I tried `.serve`, `.deploy`, and `.submit`, but those commands do something else. I’m looking for something similar to Celery tasks’ `.delay()` method. Is that possible with Prefect workers?
Yes! It’s possible. The easiest way is to serve your flow with `my_flow.serve(name="my-deployment")` and then, in your FastAPI route handler, call our utility function `run_deployment(deployment_name, timeout=0)`. (Serving a flow creates a deployment of that flow for you.) When you call `run_deployment`, you use the fully-qualified deployment name, which is `[flow-name]/[deployment-name]`, so with a flow function named `my_flow` and my example `.serve()` call, you’d use `run_deployment("my-flow/my-deployment", timeout=0)`.
The `timeout=0` part means that `run_deployment` will enqueue a run of the flow and immediately give you back a flow run object (kind of like what you get back from submitting a task to Celery), but it won’t wait for the flow to actually run. You can use that flow run object to track the run’s progress, or save the flow run’s ID somewhere and check its progress later with `read_flow_run` (see the docs).

And just a side note if you’re starting with Prefect: with Prefect (vs. Celery) you can have one flow and multiple deployments of it: one that runs on one set of infrastructure, one on another, one that runs on a schedule, and so on.

Finally, we’re working on different ways to make using flows from web apps feel more natural, so DM me if you want me to keep you updated on experiments we’re running or new releases in that area.