# ask-community
v
Hey everyone, has somebody tried to run a custom docker image as a part of the flow? i'm using k8s workers, and want to start a pod with a custom image as a part of pipeline
b
Hey Vladislav, yes you can!
Just to clarify the pattern you're looking to achieve: Are you trying to have a parent flow that is running in one pod kick off a new pod with a custom image to run a child/downstream flow?
v
Hi, thanks! Basically I'm looking for something similar to Airflow's `GKEStartPodOperator` or `KubernetesPodOperator`. Using a k8s worker, I'd like to run a pod with a custom image (non-Prefect/Python), but I'd like Prefect to monitor that pod's status (successful/failed run, running, etc.)
> Are you trying to have a parent flow that is running in one pod kick off a new pod with a custom image to run a child/downstream flow?
yes, running a new pod with a custom image as a part of the flow
s
@Vladislav Rumjantsev Yes, you can. Build your custom image from the prefect one like so:
```dockerfile
FROM prefecthq/prefect:2.14.11-python3.10

# Do something custom
COPY requirements.txt .
RUN pip install -r requirements.txt
```
Next, point your k8s worker to pull this image. Any new pods using this worker will then communicate normally with the Prefect API and appear in the Prefect UI as usual.
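If you'd rather keep the worker's default and override the image per deployment, Prefect's `prefect.yaml` lets you set it through the work pool's `job_variables`. A minimal sketch (the deployment, pool, entrypoint, and image names below are placeholders, not from this thread):

```yaml
# prefect.yaml -- point a Kubernetes work pool deployment at a custom image
deployments:
  - name: custom-image-deploy          # placeholder deployment name
    entrypoint: flows/my_flow.py:my_flow
    work_pool:
      name: k8s-pool                   # placeholder Kubernetes work pool
      job_variables:
        image: my-registry/my-custom-image:2.14.11
```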
v
@staticnotdynamic thank you! So basically the image should be based on Prefect? I have a binary with complex business logic that I want to run as part of the pipeline, and I want Prefect to spawn a pod with a container running this binary and watch the pod until it finishes with status 0 (or an error, which should stop the pipeline right away). I can definitely run it in a Prefect container, but that adds a bit of overhead.
s
Yes, it has to be based on the Prefect client image. You can find how it's built here (if you want to build it from scratch yourself).
> I have a binary with complex business logic that I want to run as part of the pipeline, and I want Prefect to spawn a pod with a container running this binary and watch the pod until it finishes with status 0

If you are using Kubernetes, you can define it as a Kubernetes Job. Ultimately, if you are using any container orchestrator, I'm sure there's a mechanism to watch for jobs.
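For the "watch a non-Prefect container until it exits" part, one possible sketch (not a Prefect API; it assumes `kubectl` is configured inside the flow's pod, and the `run_custom_image_job` helper and all names are hypothetical) shells out to `kubectl`, creates a one-shot Job from the custom image, and polls its status conditions:

```python
import json
import subprocess
import time


def job_status(job: dict) -> str:
    """Reduce a Kubernetes Job object (parsed from JSON) to one of
    'succeeded' / 'failed' / 'running' based on its status conditions."""
    for cond in job.get("status", {}).get("conditions") or []:
        if cond.get("type") == "Complete" and cond.get("status") == "True":
            return "succeeded"
        if cond.get("type") == "Failed" and cond.get("status") == "True":
            return "failed"
    return "running"


def run_custom_image_job(name: str, image: str, namespace: str = "default",
                         poll_seconds: int = 10) -> None:
    """Create a one-shot Job from a custom (non-Prefect) image and block
    until it finishes; raise on failure so the surrounding flow stops."""
    manifest = {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": name, "namespace": namespace},
        "spec": {
            "backoffLimit": 0,  # don't retry the binary on non-zero exit
            "template": {
                "spec": {
                    "restartPolicy": "Never",
                    "containers": [{"name": "main", "image": image}],
                }
            },
        },
    }
    # Submit the Job manifest via kubectl's stdin.
    subprocess.run(["kubectl", "apply", "-f", "-"],
                   input=json.dumps(manifest).encode(), check=True)
    # Poll until the Job reports Complete or Failed.
    while True:
        out = subprocess.run(
            ["kubectl", "get", "job", name, "-n", namespace, "-o", "json"],
            capture_output=True, check=True)
        status = job_status(json.loads(out.stdout))
        if status == "succeeded":
            return
        if status == "failed":
            raise RuntimeError(f"Job {name!r} failed")
        time.sleep(poll_seconds)
```

Wrapped in a Prefect `@task`, the raised `RuntimeError` fails the task and stops downstream steps, which matches the "stop the pipeline right away on error" requirement.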
v
thanks a lot for clarifications!
> If you are using Kubernetes, you can define it as a Kubernetes Job. Ultimately, if you are using any container orchestrator, I'm sure there's a mechanism to watch for jobs.
yeah, but I have to prepare the data first, and after processing there are further steps, so I'd prefer orchestrating everything with a single solution, for better observability and control over the whole pipeline
s
I see. Yeah, Prefect/Airflow/Dagster might be better suited for that. Good luck!