1. Is there a way to pass parameters to a flow via the Dask executor? The requirement here is to run a flow for different datasets, which could be configured either from Dask or from the caller, and to have the flow runs for different datasets execute in parallel.
2. Is there a way to start a flow run from a Flask API? And can the cancellation of a flow also be controlled through an API? (I tried doing this but was getting multiprocessing errors.)
Hi @Minakshi - the answer to your first question is "sort of," and it ties into your second question. The reason it's not a resounding yes is that executors aren't currently configured at runtime: you register a flow with an executor configuration that is read at runtime but can't be modified then. That said, flow runs can be launched from any client with access to the GraphQL API, which means you can use Flask or any other framework to kick off a flow run, provided it has network access to the Prefect API (whether Server or Cloud).
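To make the GraphQL approach concrete, here is a minimal sketch of triggering a flow run over HTTP. It assumes the Prefect 1.x `create_flow_run` GraphQL mutation; the endpoint URL, `flow_id`, and the `dataset` parameter name are placeholders for this example, not values from the thread.

```python
# Sketch: kick off a Prefect flow run via the GraphQL API using only the
# standard library. A Flask view could simply call start_flow_run().
import json
from urllib import request

# Prefect Server's default local GraphQL endpoint; Cloud uses a different URL
# and requires an auth header (both are assumptions for this sketch).
PREFECT_API = "http://localhost:4200/graphql"

CREATE_FLOW_RUN = """
mutation($input: create_flow_run_input!) {
  create_flow_run(input: $input) { id }
}
"""

def build_flow_run_request(flow_id: str, parameters: dict) -> dict:
    """Build the JSON body for the create_flow_run mutation.

    Per-run parameters are how different datasets can be targeted
    without re-registering the flow.
    """
    return {
        "query": CREATE_FLOW_RUN,
        "variables": {"input": {"flow_id": flow_id, "parameters": parameters}},
    }

def start_flow_run(flow_id: str, parameters: dict) -> str:
    """POST the mutation and return the new flow-run id."""
    body = json.dumps(build_flow_run_request(flow_id, parameters)).encode()
    req = request.Request(
        PREFECT_API, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["data"]["create_flow_run"]["id"]
```

Because each call creates an independent flow run, invoking `start_flow_run` once per dataset (e.g. `start_flow_run(my_flow_id, {"dataset": "sales"})`) gives you parallel runs, each picked up by whatever executor the flow was registered with.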