# ask-community
t
I have an idea and would be glad for feedback on how much it makes sense (or not). While fiddling with the Docker tasks, I realized that this kind of "subflow", where I have some ready-made image (or an image I'd like to build) that I want to run as a Docker container, would always have a very similar structure, namely:
• take a ready-made image (or build one from a provided `Dockerfile`)
• check if the dedicated container for it exists; if it does, remove it
• create a new container for it
• run the container with some command
• wait for the run to finish
• log the results
• remove the container
Ideally, I'd like to reach a point where I could just point Prefect at a local or remote code repository that contains a Dockerfile and have it turn that into a single task (or, alternatively, point it at a pre-built image on ECR, with the rest being the same). Do you think it would make more sense to turn this into a custom task, or into a sort of parameterized generic flow that I can incorporate into other flows using `create_flow_run`?
My issues with the generic flow are that it cannot easily be shared with the community, it requires registering the flow, and it requires naming all the tasks dynamically to make it clear (from the outside) what it's doing. And even then, I can only name the tasks dynamically, not the flow: e.g. if the image does some data preprocessing for a model, the flow name would still be something like `run docker`, which doesn't do much to indicate what it's actually running. The problem with creating a custom task is that I couldn't re-use the existing task-library Docker tasks. Thoughts? 🤔
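The repeated structure above can be pinned down as the ordered Docker CLI calls each step would make. The helper below is purely illustrative (the function name and arguments are hypothetical, not a Prefect API); a real implementation would execute these commands, e.g. via subprocess or the Docker SDK, and handle failures.

```python
def docker_run_steps(image, container_name, command):
    """Return the ordered Docker CLI calls for the 'subflow' described above.

    Hypothetical sketch for illustration only.
    """
    return [
        ["docker", "rm", "-f", container_name],        # remove the dedicated container if it exists
        ["docker", "create", "--name", container_name, image, *command],  # create a fresh container
        ["docker", "start", container_name],           # run it
        ["docker", "wait", container_name],            # wait for the run to finish
        ["docker", "logs", container_name],            # log the results
        ["docker", "rm", container_name],              # clean up
    ]
```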
k
Still thinking, but first off: you can re-use them. You can call a task inside a task by calling its `run` method; that just treats it like normal Python instead of a Task.
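For example, with minimal stand-in classes (illustrative stand-ins, not the real Prefect task-library classes), calling `.run()` directly just executes plain Python:

```python
# Illustrative stand-ins for task-library tasks (NOT the real Prefect classes);
# each task's logic lives in its run() method.
class PullImage:
    def run(self, repository):
        return f"pulled {repository}"

class GetContainerLogs:
    def run(self, container_id):
        return f"logs for {container_id}"

def custom_docker_task(repository, container_id):
    # Inside a custom task, calling another task's run() method executes it
    # as ordinary Python: no Prefect orchestration, retries, or observability
    # are attached to these inner calls.
    PullImage().run(repository=repository)
    return GetContainerLogs().run(container_id=container_id)
```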
t
Oh, ok, gotcha. So it would be like creating the custom task, but without all the work. Then the only problem left is that I wouldn't be leveraging Prefect's advantages (monitoring, retries, etc.) for the inner tasks.
k
I personally would attempt this as a subflow that is invoked by `create_flow_run`; the subflow can utilize the tasks in the task library, and you get observability into each one.
Yeah exactly you know the tradeoff
t
Right, but then I have a bunch of `run_docker` flow runs all over the place, and it would be difficult (from the outside) to know what each is actually running until I delve into the params, etc.
It would be cool if a flow run could have an ad-hoc name to distinguish it from other runs of the same flow.
k
You can use the `RenameFlowRun` task to change the name. To make the rename happen before the first task runs, use the Python underneath by calling the task's `.run()` method inside a state handler.
t
i see
k
You can use parameters in a state handler by getting them from `prefect.context.parameters`.
a
Here is an example:
```python
from prefect import Flow, Parameter, task
from prefect.tasks.prefect import RenameFlowRun
import prefect

def rename_handler(obj, old_state, new_state):
    # Flow state handlers receive (obj, old_state, new_state)
    # and should return the (possibly modified) new state.
    if new_state.is_running():
        param = prefect.context.parameters.get("your_parameter_name")
        RenameFlowRun().run(flow_run_name=f"new_name_{param}")
    return new_state

@task
def first_task():
    return 1

with Flow("test-flow", state_handlers=[rename_handler]) as flow:
    your_parameter = Parameter("your_parameter_name", default="some_value")
    first_task()
```