# ask-community
Vikash Sharma:
Hello, I am currently using Docker as our infrastructure and GitHub as the repository for our workflow. I've come up with the following setup:
```python
from prefect import flow
from prefect.runner.storage import GitRepository

# repo, BRANCH, work_pool, test, and interval_schedule are defined elsewhere
exec_path = "main.py:crawling_tracker"
source = GitRepository(
    url=repo,
    branch=BRANCH,
)
my_flow = flow.from_source(source=source, entrypoint=exec_path)
deployment = my_flow.deploy(
    name="my_flow",
    work_pool_name=work_pool,
    parameters={"test": test},
    schedule=interval_schedule,
    job_variables={
        "env": {
            "EXTRA_PIP_PACKAGES": "prefect==2.14.16 Scrapy==2.11.0 minio==7.2.3 uptrace==1.22.0 scrapeops-scrapy==0.5.4 prefect-docker==0.4.4"
        }
    },
)
```
However, I now intend to optimize our workflow deployment process. Specifically, I aim to create Docker images with our workflows and required dependencies installed, and then deploy these images onto our Kubernetes infrastructure. Could you please provide guidance on how to achieve this effectively? Thank you.
Jeff Hale:
Hi @Vikash Sharma. You can list your dependencies in a requirements.txt file and Prefect will build a Docker image for you automatically. See the docs here. You can then create a Kubernetes-type work pool and specify it in your deployment.
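For reference, a minimal requirements.txt for the dependencies from the earlier snippet, plus the Prefect CLI command for creating a Kubernetes work pool, might look like the following (the pool name `my-k8s-pool` is a placeholder):

```text
# requirements.txt
Scrapy==2.11.0
minio==7.2.3
uptrace==1.22.0
scrapeops-scrapy==0.5.4
prefect-docker==0.4.4
```

```shell
# Create a Kubernetes-type work pool (name is a placeholder)
prefect work-pool create "my-k8s-pool" --type kubernetes
```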
Vikash Sharma:
Thank you so much for your response, @Jeff Hale! I appreciate your input. However, I have slightly different requirements for my service. Currently, I create deployments using source code from GitHub. Now, I'm exploring the possibility of utilizing a Docker image directly from the worker pool instead of relying on GitHub as the source. This would help avoid repetitive code pulls. Could you please advise if this is feasible? Additionally, I'm interested in knowing if it's possible to set an end time for deployments. Your guidance on these matters would be greatly appreciated. Thank you again for your assistance!
Jeff Hale:
Hi Vikash. Yep, no problem. You can create your own Dockerfile and reference it in your work pool. Or, Prefect will build a Docker image for you and push it to your registry; you can specify your Python dependencies in your requirements.txt file and they will be picked up when the image is built. Your code is baked into your image in this scenario. GitHub isn’t used.
```python
from prefect import flow


@flow(log_prints=True)
def buy():
    print("Buying securities")


if __name__ == "__main__":
    buy.deploy(
        name="my-code-baked-into-an-image-deployment",
        work_pool_name="my-docker-pool",
        image="my_registry/my_image:my_image_tag",
    )
```
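If you go the custom-Dockerfile route instead, a minimal sketch might look like this. The base-image tag and paths here are assumptions; pick the Prefect image tag matching your Python and Prefect versions:

```dockerfile
# Assumed base image; adjust the tag to your Prefect/Python versions
FROM prefecthq/prefect:2.14.16-python3.11

# Install the project's dependencies into the image
COPY requirements.txt /opt/prefect/requirements.txt
RUN pip install --no-cache-dir -r /opt/prefect/requirements.txt

# Bake the flow code into the image (paths are illustrative)
COPY . /opt/prefect/flows
WORKDIR /opt/prefect/flows
```

If you build and push this image yourself, you can reference it in `deploy(..., image=...)` and pass `build=False` so Prefect skips its own image build.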
For an end time for deployments — you can set a timeout for a task or a flow like this:
```python
@flow(timeout_seconds=10)
```
If the flow runs for more than 10 seconds, the run fails as timed out. You can also set a job timeout on a number of work pool types (Kubernetes, Prefect Managed, ACI, ECS, Cloud Run).