# prefect-community
l
Hi all, I want to know if this is possible: run different projects via the Dask executor, each with its own virtual environment, in order to encapsulate its dependency versions. Thanks for your help. A little representation
j
Hi Luis, what you're asking for is possible, but adds a few complications. We generally recommend users use short-lived dask clusters when possible, and try to keep the worker environments uniform. Can you comment more on your deployment environment & restrictions? There might be a better way to accomplish what you need.
l
Hi Jin, currently I have the following architecture.
Server A: Prefect 0.14.0 in a venv with py=3.6.8; Docker for Prefect Server; all the flows registered.
Server B: Python 3.6.8 (installed globally); dask-scheduler installed globally; 1 dask-worker on localhost.
This works for me; in fact, before using Prefect I was using a simple crontab that activated each project's virtualenv and then ran its main.py (each project in its own venv). I want to know if the Prefect Dask executor can point to a specific environment on the remote server and avoid using the global Python binary.
from dask_yarn import YarnCluster

# Use a virtual environment at /path/to/my/virtual/env
cluster = YarnCluster(environment='venv:///path/to/my/virtual/env')
j
So you have one server with prefect installed on it, and another server where you want to run a dask cluster for prefect to use?
Prefect's DaskExecutor is just a thin wrapper around all the different dask cluster-manager classes, so any way you're used to creating a dask cluster can work with prefect.
That said, I recommend not mixing virtualenvs across workers in your dask cluster. It can work, but it's a recipe for weird behavior. Rather, I recommend using a single dask cluster for all your flows that share a common venv, or using a dask cluster per flow if you want to use separate environments for each flow.
Also, if you only have two servers, you might instead run your prefect agent on Server B, then you don't have to deal with remote dask clusters at all and your flows are free to start whatever clusters in whatever envs they want.
l
I understand. Currently I don't have Prefect installed on Server B, only dask-scheduler. I will work with the dask-scheduler running globally for now, then read more about remote agents, or maybe Kubernetes agents. Thanks for your help. Regards
👍 1