R Zo

02/25/2022, 2:48 AM
post 3) This is related to post 1 above: will it be possible to have a Prefect/Dask setup that runs jobs in two different Python environments, one running TensorFlow with perhaps older packages and the other running a newer environment?
Kevin Kho

02/25/2022, 3:04 AM
Hi @R Zo. Having two environments will be supported in Orion (Prefect 2.0). If you want to do this in current Prefect, your best bet is building two Docker images so that you have two containerized environments, and then splitting the work into two subflows; I don't think you can change environments mid-run. I'd need a traceback to guess at the segmentation fault, but I think it might be Dask + TensorFlow specifically that causes that. For post 2, it sounds like the Dask workers may be dying. I'd suggest looking at the Dask dashboard to see if you can figure anything out.
R Zo

02/25/2022, 3:20 AM
Thanks for the reply @Kevin Kho. Regarding post 1), will a `python -m trace flow.py` be sufficient? Trace produces a lot of information, though. For post 2), I have looked at the Dask dashboard and both the local and remote workers seem to be registered. The error in the original post is from the terminal where the worker was running during flow execution.
Kevin Kho

02/25/2022, 3:22 AM
Whatever logs would help, yeah. I can take a look at them.
R Zo

02/25/2022, 4:16 AM
@Kevin Kho, instead of having separate Docker images for running flows, I was wondering if separate flows could be run on different Dask clusters?
Kevin Kho

02/25/2022, 4:36 AM
They can, because the executor is attached to the Flow. You register the two flows separately and attach a different executor to each, then call them from a main Flow with the `create_flow_run` task.