
Ben Welsh

01/28/2022, 2:42 PM
Thanks to all the help in this channel, yesterday I was able to launch my first cloud deployment with a Kubernetes agent running from Docker storage. I'm very happy with the progress, but here's one thing I noticed: my tasks aren't running in parallel anymore. When I was in local mode, I had achieved that with a LocalDaskExecutor, which remains in my code after the switch to cloud hosting and execution. What is the simplest way to get my Kubernetes agent and Docker storage doing Dask again? Is it as simple as replacing LocalDaskExecutor with DaskExecutor? Do I need to install some extra dependencies in my agent Dockerfile? Include more requirements in my Storage's python_dependencies? Something else?
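A short sketch of the flow file being described, assuming Prefect 1.x APIs (the flow name, task, and mapped inputs are placeholders):

from prefect import Flow, task
from prefect.executors import LocalDaskExecutor

@task
def double(x):
    return x * 2

with Flow("k8s-dask-flow") as flow:
    double.map([1, 2, 3])  # mapped tasks can run in parallel under a Dask executor

# LocalDaskExecutor parallelizes with threads or processes inside the
# single flow-run container; DaskExecutor targets a Dask cluster instead.
flow.executor = LocalDaskExecutor()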

Kevin Kho

01/28/2022, 2:52 PM
Will be OOO today, but the most common scenario where this happens is if you have
myflow.py
but register it with a separate registration script:
from myflow import flow
and then attach your configurations there:
from myflow import flow

flow.run_config = ...
flow.executor = ...
flow.storage = ...

flow.register(...)
This is because all of the other pieces (RunConfig, Storage) are stored in the Prefect database, but the executor is not part of the serialized Flow since it can contain sensitive info (think a Dask cluster address). So the executor is pulled from the Flow file kept in Storage, which means it needs to be defined there. Running the flow with debug-level logs also tells you which executor is being used.
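A sketch of that fix, assuming Prefect 1.x (file names, project name, and dependency list are illustrative): the executor is set in the stored flow file, while the serialized pieces can stay in the registration script.

# myflow.py -- the file that ends up in Docker storage
from prefect import Flow, task
from prefect.executors import LocalDaskExecutor

@task
def double(x):
    return x * 2

with Flow("k8s-dask-flow") as flow:
    double.map([1, 2, 3])

# Defined here, in the stored file, so it is found when the flow is pulled
flow.executor = LocalDaskExecutor()

# register.py -- RunConfig and Storage are serialized to the Prefect database
from myflow import flow
from prefect.run_configs import KubernetesRun
from prefect.storage import Docker

flow.run_config = KubernetesRun()
flow.storage = Docker(python_dependencies=["dask", "distributed"])
flow.register(project_name="my-project")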
TL;DR: set the executor in the Flow file that is pulled from Storage.
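One way to get those debug-level logs on a Kubernetes run, sketched with Prefect 1.x's KubernetesRun (PREFECT__LOGGING__LEVEL is Prefect's standard logging setting; everything else here is illustrative):

from prefect.run_configs import KubernetesRun

# Debug-level flow-run logs report, among other things,
# which executor the run picked up.
flow.run_config = KubernetesRun(env={"PREFECT__LOGGING__LEVEL": "DEBUG"})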