
Cody Webb

08/15/2023, 12:29 AM
Is there a way to start a deployment and then send it to the background in Python? I am connecting the flow run start to a UI and don't want it blocking the UI.

Nate

08/15/2023, 12:41 AM
hi @Cody Webb - if you use
run_deployment(... , timeout=0)
then it will return immediately instead of waiting for it to finish, but if you need the result of the flow run, you'd need to fetch that manually later
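For example, a minimal sketch of that pattern (the deployment name and parameters here are placeholders):

from prefect.deployments import run_deployment

# timeout=0 returns as soon as the flow run is created; the run continues in the background
flow_run = run_deployment(
    name="my-flow/my-deployment",   # "flow-name/deployment-name", hypothetical
    parameters={"foo": "bar"},      # hypothetical parameters
    timeout=0,
)

# hand the id back to the UI so it can poll for status/results later
print(flow_run.id)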

Cody Webb

08/15/2023, 12:44 AM
Ok cool, that's easy :) The flow kicks off jobs that eventually just save files, so I don't need the results right away

Nate

08/15/2023, 12:46 AM
nice - sounds good then

Cody Webb

08/15/2023, 12:58 AM
Thank you! Oh, one more question: I am using Prefect Dask subflows, and I saw in the forum that if you're saving results to a local filesystem, you basically use a Process block with the workdir and pass it to the infrastructure param when building the flow. Is there a simpler way? I tried with the LocalFileSystem block but no good. Eventually I want to switch to S3 when I scale more, so what's the proper way to do this?
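(For reference, a rough sketch of the Process-block approach described above, assuming the Prefect 2.x Deployment.build_from_flow API; the flow, deployment name, and path are placeholders:)

from prefect import flow
from prefect.deployments import Deployment
from prefect.infrastructure import Process

@flow
def my_flow():
    ...

# a Process block whose working_dir is the local directory the flow run
# should execute (and write files) in
deployment = Deployment.build_from_flow(
    flow=my_flow,
    name="local-workdir-deployment",                          # placeholder name
    infrastructure=Process(working_dir="/path/to/workdir"),   # placeholder path
)
deployment.apply()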

Nate

08/15/2023, 4:55 AM
if I understand correctly, you could use a variable here to store your desired storage block as slug-name/block-name, e.g. gcs/my-result-storage-bucket
from prefect import flow, task
from prefect.blocks.core import Block
from prefect.variables import get
from prefect_dask import DaskTaskRunner

# load the block whose slug is stored in the "default_result_storage" variable
storage = Block.load(get("default_result_storage"))

@task(result_storage_key="{foo}.pkl")
def task_that_writes_inputs_to_a_filesystem(foo: str) -> str:
    # just return the input for demonstration purposes
    return foo

@flow(
    task_runner=DaskTaskRunner(),
    result_storage=storage,
)
def subflow_that_writes_to_a_filesystem(some_params: list):
    return task_that_writes_inputs_to_a_filesystem.map(foo=some_params)
and prove it's working by retrieving results
if __name__ == "__main__":
    from prefect.results import PersistedResult
    from prefect.settings import PREFECT_RESULTS_PERSIST_BY_DEFAULT, temporary_settings

    with temporary_settings({PREFECT_RESULTS_PERSIST_BY_DEFAULT: True}):
        inputs = ["a", "b", "c"]
        subflow_that_writes_to_a_filesystem(inputs)
        
        # rebuild a handle to each persisted result and read it back from storage
        persisted_results = [
            PersistedResult(
                storage_key=f"{foo}.pkl",
                storage_block_id=storage._block_document_id,
                serializer_type="pickle",
            ).get()
            for foo in inputs
        ]
        assert persisted_results == inputs
in prod you could set PREFECT_RESULTS_PERSIST_BY_DEFAULT as an env var
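For the local-filesystem case above, a minimal sketch of registering a result-storage block that the default_result_storage variable could point to (block name and basepath are placeholders; you could swap in an S3 block, e.g. prefect-aws's S3Bucket, when scaling out):

from prefect.filesystems import LocalFileSystem

# saved under this name, the block's slug becomes "local-file-system/dev-result-storage",
# which is the value the "default_result_storage" variable would hold
LocalFileSystem(basepath="/tmp/prefect-results").save("dev-result-storage", overwrite=True)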

Cody Webb

08/16/2023, 1:02 AM
Awesome, this makes more sense, will try this tomorrow. Thx! Ps. Your blog has been a great help so far

Nate

08/16/2023, 1:04 AM
great to hear @Cody Webb ! what blog are you referring to? just so i know what is helpful for folks

Cody Webb

08/16/2023, 1:06 AM
I've referenced your blog a few times, as well as the recipes, when looking for answers; the forum as well, but more so the recipes

Nate

08/16/2023, 1:16 AM
cool! is there anything you’d be interested in seeing more detail on? i know there’s a lot of surface area to cover with docs as is, but anything you’d like to see a deeper dive on?

Cody Webb

08/16/2023, 1:37 AM
Yea, the usage of blocks could use a bit more coverage; I've seen like one full MLOps example, so there could probably be more examples there. Also direct comparisons and benefits of Prefect vs other orchestrators (Dagster etc.), like how to convert a Dagster pipeline to Prefect to show how much lower the barrier to entry is with Prefect. I noticed a lot of people here have problems with flow control, so maybe more examples there, like fine-grained control and starting/stopping tasks/flows based on events (like a flow failure/completion starting a conditional subflow). Also how to properly utilize prefect.runtime / context if necessary, and maybe cover how to extend Prefect, like adding your own task runner or something! Also more streaming examples; there are a couple of good blogs on it but not many recipes.

Nate

08/16/2023, 5:35 PM
this is awesome feedback - thanks!