Jackson Maxfield Brown
01/17/2020, 7:26 PM

flow.run(executor=DaskExecutor(address=dask_scheduler_address))
does it serialize the flow / computation graph and send it as a task to hold onto? Basically, if I have a Dask scheduler spun up and I am working on my local machine, can I call flow.run(...) and then shut down my machine?

Zachary Hughes
01/17/2020, 7:56 PM

It looks like you should be able to call flow.run() and shut down your computer -- we're using a fire and forget approach to how we submit work to Dask.

Jackson Maxfield Brown
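For reference, "fire and forget" here just means submitting work without holding onto a handle for its result. A minimal standard-library sketch of the pattern (illustrative only, not the Prefect/Dask API):

```python
# Fire-and-forget sketch using the standard library.
from concurrent.futures import ThreadPoolExecutor
import time

results = []

def work(x):
    time.sleep(0.01)
    results.append(x * 2)

executor = ThreadPoolExecutor(max_workers=2)
# Fire and forget: submit the task and drop the returned Future,
# so the caller never blocks on the result.
executor.submit(work, 21)
executor.shutdown(wait=True)  # drain pending work before exiting
print(results)  # [42]
```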
01/20/2020, 12:23 AM

from prefect import task, Flow
import json
import os
import random
from time import sleep

from prefect.engine.executors import DaskExecutor


@task
def inc(x):
    sleep(random.random() / 10)
    return x + 1


@task
def dec(x):
    sleep(random.random() / 10)
    return x - 1


@task
def add(x, y):
    sleep(random.random() / 10)
    return x + y


@task(name="sum")
def list_sum(arr):
    return sum(arr)


@task(name="save")
def save_total(total):
    # open() does not expand "~", so expand it explicitly
    with open(os.path.expanduser("~/dask_test.json"), "w") as f:
        json.dump({"total": total}, f)


range_list = list(range(0, int(1e5)))

with Flow("dask-example") as flow:
    incs = inc.map(x=range_list)
    decs = dec.map(x=range_list)
    adds = add.map(x=incs, y=decs)
    total = list_sum(adds)
    save_total(total)

executor = DaskExecutor(address="tcp://localhost:{PORT}")

# This will run until the computation is complete
flow.run(executor=executor)
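For intuition, the mapped flow above computes the following (a plain-Python sketch with a small range for illustration; Prefect would run the mapped tasks in parallel on Dask):

```python
# Plain-Python equivalent of the flow's fan-out / fan-in pipeline.
def inc(x):
    return x + 1

def dec(x):
    return x - 1

def add(x, y):
    return x + y

range_list = list(range(10))  # small n for illustration
incs = [inc(x) for x in range_list]
decs = [dec(x) for x in range_list]
adds = [add(x, y) for x, y in zip(incs, decs)]
total = sum(adds)  # (x + 1) + (x - 1) == 2x, so total == 2 * sum(range(10))
print(total)  # 90
```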
Will keep trying a few different configurations.

Zachary Hughes
01/21/2020, 2:24 PM

Jackson Maxfield Brown
01/21/2020, 8:29 PM