# prefect-community
Hello everyone, thanks for this amazing product, I am just getting started with it. I was wondering about the following, though. I want to execute different flows using a local dask cluster, but as the documentation aptly describes, running `flow.run()` blocks until the flow has finished. Is there a way of submitting jobs to the dask cluster in an almost fire-and-forget scenario? Since my outputs are written to a file or DB, I really don’t need the return/result object. I basically want to dynamically create workflows, submit them to the dask cluster, and then have them finish based on available resources. Could be that I am overlooking something super simple or using prefect for something it is not built for 🤷‍♂️ but would be nice to know 🙂
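For context, this is roughly what I mean; a minimal sketch of the fire-and-forget idea using dask.distributed directly (the cluster address is a placeholder, and I'm not sure submitting flow.run to the cluster like this is the intended pattern):

```python
from dask.distributed import Client, fire_and_forget
from prefect import Flow, task


@task
def write_results():
    # pretend this writes to a file or a DB; no return value needed
    pass


with Flow("write-only-flow") as flow:
    write_results()

# connect to the already-running local dask cluster (placeholder address)
client = Client("tcp://127.0.0.1:8786")

# submit flow.run as a task on the cluster and drop the future,
# so the calling process does not block until the flow has finished
future = client.submit(flow.run)
fire_and_forget(future)
```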
Hmm, I think I might understand things a bit better now. Am I right in my understanding that if I choose a dask cluster as a RemoteExecutionEnvironment and have the backend, dask cluster, and agent up, then the agent will coordinate the workflows to be run with the resources available on the dask cluster?
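Something like this is what I picture; a sketch assuming the RemoteEnvironment class from prefect.environments (the scheduler address and project name are placeholders):

```python
from prefect import Flow, task
from prefect.environments import RemoteEnvironment


@task
def do_work():
    # writes its output to a file or a DB
    pass


with Flow("dask-backed-flow") as flow:
    do_work()

# ask the backend to execute this flow with a DaskExecutor
# pointed at the existing cluster (placeholder address)
flow.environment = RemoteEnvironment(
    executor="prefect.engine.executors.DaskExecutor",
    executor_kwargs={"address": "tcp://127.0.0.1:8786"},
)

# register with the backend; an agent polling that backend then
# launches runs, and the actual task execution lands on the dask cluster
flow.register(project_name="my-project")
```

With an agent started against the same backend (something like `prefect agent start` on a machine that can reach the cluster, if I read the docs right).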
could I also just use a TaskPool with a limited number of processes as opposed to a dask cluster?
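E.g. I could imagine Prefect's LocalDaskExecutor with a capped number of local processes instead of a standing cluster; a sketch, assuming num_workers is passed through to dask's multiprocessing scheduler (the worker count is arbitrary):

```python
from prefect import Flow, task
from prefect.engine.executors import LocalDaskExecutor


@task
def do_work():
    # writes its output to a file or a DB
    pass


with Flow("pooled-flow") as flow:
    do_work()

# local multiprocessing scheduler capped at 4 worker processes,
# a lighter-weight alternative to a standing dask cluster
flow.run(executor=LocalDaskExecutor(scheduler="processes", num_workers=4))
```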
Hi Florian and welcome! Yes, you are correct: if you use Prefect with a backend (either Server or Cloud), the execution model will behave as you expect.